Dissertation
VALIDATION OF THE
SELF-ASSESSMENT OF NURSING INFORMATICS COMPETENCY SCALE (SANICS)
BEFORE AND AFTER ONLINE INFORMATICS TRAINING
DOCTOR OF PHILOSOPHY
IN
NURSING
May 2015
By
Judi A. Godsey
Dissertation Committee:
2015
Acknowledgements
I wish to express sincere appreciation to my advisor and chair, Dr. Debra Mark, who has
been a model mentor and a highly competent guide on my journey to a PhD. Many thanks to the
members of my dissertation committee, Drs. Codier, Ho, and Wong, and a special thanks to Dr.
Yoon, who created and graciously shared the SANICS instrument as the topic for my
dissertation, along with much expertise and guidance on psychometrics. A special thanks to my
friend and statistical coach, Greg Grady, who patiently guided and instructed me on all things
psychometric.
This dissertation is dedicated to my loving and supportive parents, Bob and Erna Godsey,
who always believed in and nurtured my potential. To my sisters, Debbie, Pam and Tonja, who
are my cheerleaders, and lifelong best friends. To my extended family members, including my
forever friends Debbie and Phyllis, who have remained steadfastly positioned as a source of
support, prior to and during the years leading up to this degree. My deepest appreciation to Dr.
Debra Breneman, who long ago coaxed, urged, and ultimately persuaded a relatively new
Associate Degree RN to leap outside of her comfort zone and relentlessly seek higher ways of
knowing. And to my wonderful children, Amy and Allen (and my brand new granddaughter,
Allyn), who inspire me to be more than I am. And finally, I extend my warmest gratitude to my
dedicated husband and life partner, Buddy, who consistently sees exciting opportunities instead
of insurmountable barriers.
I would not be here without the people acknowledged on this page, and I am deeply and
Abstract
essential tool for improving health outcomes (IOM, 2010). However, nurses frequently report
a lack of competency to perform the most basic computer functions, outside of those required
within their work environment (Hwang, 2011). Without educational or training interventions,
nurses are limited in their ability to effectively use information technology in practice (Greiner,
This study explored the psychometric performance of the Self-Assessment of Nursing
Informatics Competency Scale (SANICS) among a population of entry-level nursing students. Data collected before and after an online informatics
training intervention (SOLO-IT) confirmed the factor structure and internal consistency
reliability of the SANICS. Statistically significant increases (p < 0.001) were reported by
Significant differences (p < 0.001) in each sub-scale mean score before and after completion of
SOLO-IT confirmed the construct validity of the SANICS. Results of this study support the
the nursing workforce could depend upon the availability of on-demand training resources and
valid instruments which support nurses as competent users of informatics in an era of ubiquitous
health information technology. Findings from this study provide preliminary evidence that
among entry level nursing students. Future studies are recommended using paired samples of
nurses and nursing students from diverse populations, as well as studies which correlate
Table of Contents
Background ......................................................................................................................................2
American Association of Colleges of Nursing ...........................................................................2
Technology Informatics Guiding Education Reform ........................................................4
The National League for Nursing ................................................................................6
Nursing Informatics and the Affordable and Accountable Care Act ..........................................6
Institute of Medicine ...................................................................................................................7
Significance......................................................................................................................................8
Purpose...........................................................................................................................................10
Research Questions ........................................................................................................................11
Methods..........................................................................................................................................12
Summary ........................................................................................................................................12
The Self-Assessment of Nursing Informatics Competency Scale ............................................29
Choi ......................................................................................................................................31
Choi and De Martinas ..........................................................................................................31
Choi and Bakken ..................................................................................................................32
Informatics Training and Assessment ............................................................................................42
European Computer Driver’s License (ECDL) Certification ...................................................43
TIGER Virtual Demonstration Center .....................................................................................44
Nursing Resources: A Self-Paced Tutorial ...............................................................................45
IC3 Training and Certification Program ...................................................................................46
The Passport Project for Nursing Success ................................................................................46
Computer Competency Tutorial ...............................................................................................48
Information Literacy Tutorial ...................................................................................................48
Pre-test for Attitudes Toward Computers in Healthcare...........................................................49
Summary .......................................................................................................................................53
Introduction ....................................................................................................................................55
Research Questions ..................................................................................................................55
Research Design ............................................................................................................................56
Conceptual Definitions ..................................................................................................................56
Informatics ................................................................................................................................56
Informatics Competency ...........................................................................................................57
Informatics Knowledge ........................................................................................................57
Computer Skills ...................................................................................................................58
Operational Definitions ..................................................................................................................61
Entry Level Nursing Student ...................................................................................................61
Blackboard ................................................................................................................................61
Informatics Curricular Instruction ...........................................................................................61
Informatics Training ................................................................................................................61
Competency on the SANICS ....................................................................................................62
Instrumentation ..............................................................................................................................62
Method to Test the SANICS Instrument ...................................................................................64
Sample..................................................................................................................................65
SANICS and SOLO-IT ..................................................................................................................65
Instructional Design of SOLO-IT .............................................................................................66
Self-Directed Learning ........................................................................................................66
SOLO’s Web Based Modules ...................................................................................................67
Plan for Data Collection and Analysis ..........................................................................................73
Process to Determine Feasibility ..............................................................................................73
Data Analysis Plan ...................................................................................................................74
Statistical Power ..................................................................................................................75
Plan for Statistical Analysis .................................................................................................76
Ethical Considerations ............................................................................................................ 76
University of Hawaii IRB Approval ....................................................................................76
Xavier University IRB Approval .........................................................................................78
Limitations of the Design..........................................................................................................79
Summary ........................................................................................................................................80
Research Design ............................................................................................................81
Setting and Sample ...................................................................................................................81
Profile of Respondents ..............................................................................................................83
Presentation of Data for Research Question One ..........................................................................84
Principal Component Analysis .................................................................................................84
Parallel Analysis .......................................................................................................................86
Psychometric Analysis: Comparisons with Original SANICS Study.......................................86
Presentation of Data for Research Question Two ..........................................................................89
Value of SOLO-IT .........................................................................................................................91
Summary ........................................................................................................................................93
Discussion ......................................................................................................................................94
Sample Size and Return Rates ..................................................................................................94
Study Participants .....................................................................................................................94
Research Question One ..................................................................................................................94
Research Question Two .................................................................................................................95
Value of SOLO-IT .........................................................................................................................96
Limitations .....................................................................................................................................97
Implications for Future Research ...................................................................................................99
Summary ......................................................................................................................................100
References ...................................................................................................................................114
List of Tables
Table 4: Factor Structure Matrix Pre-and Post-SOLO-IT ..........................................................85
Table 5: Factor Structure Compared to Original SANICS Study ...............................................87
Table 6: Average Scores for Each SANICS Sub-Scale Pre/Post SOLO-IT ...............................90
Table 7: SANICS Scores Post SOLO-IT Compared to Original SANICS Study ......................91
Table 8: Value of SOLO-IT ........................................................................................................92
List of Figures:
List of Appendices
Chapter I
Introduction
Numbering more than three million, licensed Registered Nurses (RNs) make up the
largest group of healthcare professionals in the United States (Health Resources Services
dependent upon technology to deliver innovative therapies at the bedside, and to interpret
massive amounts of quality-related outcomes data in the executive boardroom. The information
age is producing data at rates that exceed the ability to use it. The Economist estimates medical
centers are home to almost one billion terabytes of data, “equivalent to almost two trillion file
and information science to manage and communicate data, information, and knowledge in
nursing practice” (American Nurses Association [ANA], 2001, p.17). Nurses capable of
the future of nursing and the promotion of health that spans “time and place” and “wellness and
health maintenance activities” (Health Information and Management System Society [HIMSS],
2012, p. 3). For more than a decade, nursing researchers have recognized that evidence based
practice (EBP) would be dependent upon nursing’s ability to demonstrate expert knowledge in
patients’ individual and collective decision making processes (McCormack, Kitson, Harvey,
Rycroft-Malone, Titchen & Seers, 2002). Such expert knowledge in practice is based on a
foundational ability to use information technology effectively and manage electronic sources of
Background
As the largest group of clinical health information technology users, nurses must
possess a minimum set of technical competencies needed for the electronic medical record
(EMR), including basic computer skills and information literacy (Ball, Douglas & Hinton-
Walker, 2011). Basic computer literacy (competency) is defined as “the use of personal
computers, including the use of software tools such as word processing, spreadsheets, databases,
presentation graphics, and e-mail” (Hebda & Czar, 2013, p. 8). Information literacy is the ability
to access, process, and use information and is an essential component of nursing informatics
competency (IOM, 2011). The IOM has identified information literacy as the recommended standard for all nurses and nursing students.
Significant contributions concerning the role of nursing informatics have been made
during the past decade. Staggers and colleagues defined and described informatics
competencies, skills, knowledge, and abilities based on educational preparation and expertise:
beginning nurse, experienced nurse, informatics nurse specialist, and informatics innovator
(Staggers, Gassert, & Curran, 2001, 2002). Their seminal research has informed the
development of informatics competency measurements for the nursing profession and has
influenced many of the standards commonly used by leading professional nursing organizations.
These organizations include the American Association of Colleges of Nursing (AACN, 2008),
the TIGER Initiative (2010), and the National League for Nursing (2008).
provides guidelines for nursing practice (2008). In 2008, the AACN included the mandate that
“computer and information literacy are crucial to the future of nursing” and provide a bridge
Care Technology” as one of the nine “Essentials” for a professional nursing education (p. 17).
2. Use telecommunication technologies to assist in effective communication in a variety of healthcare settings.
3. Apply safeguards and decision making support tools embedded in patient care
technologies and information systems to support a safe practice environment for both patients and healthcare workers.
6. Evaluate data from all relevant sources, including technology, to inform the delivery of
care.
7. Recognize the role of information technology in improving patient care outcomes and creating a safe care environment.
9. Apply patient-care technologies as appropriate to address the needs of a diverse patient
population.
10. Advocate for the use of new patient care technologies for safe, quality care.
11. Recognize that redesign of workflow and care processes should precede implementation of care technology to facilitate nursing practice.
12. Participate in evaluation of information systems in practice settings through policy and procedure development.
standards is an important step in the process of transforming nursing education in a manner that
prepares graduates to practice competently. Such skills are critical in a technology driven healthcare environment. The TIGER (Technology Informatics Guiding Education Reform) Initiative is an independent, grassroots nursing initiative whose primary focus is educational reform in nursing informatics (Ball, et al., 2011). The TIGER Initiative began in 2006 in response to the growing
need for an informatics competent nursing workforce. The TIGER Initiative catalyzed the
formation of the Alliance for Nursing Informatics (ANI), a coalition of 20 nursing informatics
professional societies, and major nursing organizations such as the ANA, the American Organization of Nurse Executives (AONE), and the American Association of Colleges of Nursing (AACN) (TIGER, 2008).
The ANI also includes an additional 170 diverse organizations which seek to make informatics
“nursing’s stethoscope by the 21st century” (Ball, et al, 2011, p. 19; TIGER, 2008, 2010). The
TIGER Initiative focuses on “the need to engage nurses in all settings in the national effort to
prepare the healthcare workforce toward effective adoption of electronic health records” (p. 10).
Include Health Information Technology (HIT) in every strategic plan, mission, and
vision statement; use of HIT is embraced by executives, deans, all point of care
clinicians and students with the goal of high quality care and safety.
Establish multidisciplinary teams that embrace a shared vision and operate cohesively to
Develop mutual respect between/among clinicians who may bring different skills and
knowledge (p.14).
The TIGER Initiative later formed the TIGER Informatics Competencies Collaborative (TICC). This group of informatics and nursing experts was charged with the establishment of a
minimum set of informatics competencies (Ball, et al, 2011). The TICC also outlined
basic computer and information literacy skills for nurses and nursing students. Recommended
information literacy and computer skills are described below (Gugerty & Delaney, 2009):
3. Evaluate information and its sources critically and incorporate selected information into his or her knowledge base
4. Use information effectively to accomplish a specific purpose
5. Evaluate outcomes of the use of information (p.5).
3. Word Processing
4. Spreadsheets
5. Using Databases
6. Presentation
The National League for Nursing (NLN) has called for the development of innovative
educational programs which improve informatics competencies and enable competent use of
information technologies (2008). For more than a decade, the nursing literature has reported that
nursing schools lack standardization as to the type and complexity of computer skills required for
nursing students (Sinclair & Gardner, 1999). The ability of nursing education to integrate
computer competency and information literacy “will require nursing curriculum reform and an
essential tool for improving health outcomes (IOM, 2010). As a result of the Affordable Care
Act (ACA) of 2010, healthcare delivery is transitioning to electronic health records for
meaningful use and exchange of information (Department of Health and Human Services
[HHS], 2010). The ACA includes broad application of informatics competencies to support the
meaningful use of information supporting quality patient outcomes. However, a chasm between
knowledge and practice prevails due to incomplete adoption of healthcare innovations (IOM,
2001; Glanz, Rimer & Viswanath, 2008). Nurses deficient in basic informatics competencies
have limited ability to use and apply communication and information technology in their practice
(IOM, 2003, p. 85). Nurses are the largest group of clinical users of health information
technologies and must master a minimum set of competencies needed for the electronic medical
record (EMR), including basic computer skills and information literacy (Ball, et al, 2011).
Employing agencies may understandably assume that requisite computer skills are being taught
during a student’s undergraduate college education. However, nurses frequently lack the
systems (Thede & Sewell, 2004; Wilbright, et al., 2006). Inexperience with technology is one of
the most common factors impeding the adoption of Electronic Medical Records (EMR) (McGinn,
et al., 2012). Effective use of electronic health systems depends on competence with the complete
system, including advanced tools for charting and data management (Saba & McCormick, 2011).
In The Future of Nursing: Leading Change, Advancing Health, the Institute of Medicine (IOM) called for nurses to embrace technology as an essential tool for improving health outcomes (2010). The Institute of Medicine defines information literacy as the
ability to access, process, and use information (2011). A key component of informatics
competency is information literacy, which is the ability to find, retrieve, and analyze information
(American Library Association, 2012). Information literacy is considered an essential
competency and is the recommended standard for all nurses and nursing students (IOM, 2011).
Significance
Informatics competency is readily accepted as basic technology skills that nurses must
have to perform their duties (McNeil, Elfrink, Pierce, Beyea, Bickford, & Averill 2005; Staggers,
Gassert & Curran, 2001; Yoon, Yen, & Bakken, 2009). Effective and efficient use of electronic
health systems rely on competence with the complete system, including the ability to use
advanced tools designed for charting and data management (Saba & McCormick, 1996). Expert
testimony provided to the Office of the National Coordinator (ONC) revealed significant
challenges for nurses when locating pertinent data “in the sea of available electronic information
in the EMR” (Staggers, 2011, p. 542). Inexperience with technology has been identified as one
of the most common factors impeding the adoption of the EMR (McGinn, et al., 2012).
For the past 30 years, nurses have been encouraged to acquire and build upon
informatics competencies to perform their role optimally (Ball & Hannah, 1984; Graves &
Corcoran, 1989; Herbert, 1999; Saba & McCormick, 1996; Scholes & Barber, 1980; Staggers,
Thompson, Happ & Bartz, 1998; Turley, 1996). Still, nurses consistently self-report a lack of
competency to perform even the most basic computer functions, outside of those required within
their work environment (Ball, Douglas, Hinton-Walker, 2011; Hwang & Park, 2011; Thede &
Sewell, 2009).
A gap exists in the literature regarding informatics competency training resources for
nurses and the availability of valid, reliable tools to measure their effectiveness. These findings
are surprising given persistent recommendations from professional nursing organizations calling
for a deliberate, systematic approach to improving the basic computer skills and information
literacy of the nursing profession (ANA, 2001; IOM, 2011; TIGER, 2008). Despite this call, the
competencies among nurses is still lacking (Hunter, McGonigle, Dee, Hebda, 2013; Virgona,
2013).
The TIGER Initiative focuses on “helping the nursing profession to adopt informatics
tools, principles, theories, and practices that make healthcare safer and more effective, efficient,
patient-centered, and equitable for all stakeholders” (p. 3). The TIGER Initiative’s Executive
Summary (2010) identified the concern of “informatics illiteracy in nursing education” and
included the imperative for nurses to become competent in the use of technology tools to support
decision making in clinical practice (p. 2). In a recent study (2013), students enrolled in the first
semester of a Doctor of Nursing Practice (DNP) Program were not competent in any of the three
categories of computer skills, informatics knowledge, and informatics skills. Despite ubiquitous
use of technology in daily life, the incorporation of basic computer skills and informatics into
nursing curricula is still considered an “essential” need (Virgona, 2013, p. 61). A common
misperception is that younger students enter traditional BSN programs with requisite computer
and information literacy skills. However, recent research suggests that RN to BSN and
Accelerated BSN students may be more competent in informatics than younger Pre-Licensure
students (Choi, 2012). Even though use of technology has been expanded in K-12 education,
“findings indicate that (nursing) students’ computer competencies may be lower than
for the nursing profession (Ball, et al., 2011). The TIGER Initiative calls for competency based
training resources which can help prepare the nursing workforce to improve healthcare delivery
using informatics (Ball, et al, 2011). However, informatics competency training resources for
nurses remain insufficient and frequently do not provide validity and reliability data for the instruments used.
Evidence exists that “without a basic education in informatics, nurses and other
healthcare professionals are limited in their ability to make effective use of communication and
information technology in their practice” (IOM, 2003, p. 85). Educational interventions aimed
directly at informatics competencies of nursing students have proven effective (Choi, 2012;
Flood, Gasiewicz & Delpier, 2010; Tarrant, Dodgson, & Law, 2008; Yoon, et al., 2009).
However, nursing education programs have been slow to support the development of informatics
competencies or to provide the training and resources necessary for nurses to enter the
professional world with adequate technology or information literacy skills (Westra & Delaney,
2008).
Purpose
The TIGER Collaborative issued a Call to Action identifying the need for educational
interventions which foster information technology innovation and adoption by nurses (2008). In
2011, the TIGER Initiative called for competency based training resources which improve basic
informatics competencies (computer skills and information literacy) and prepare the nursing
workforce to improve healthcare delivery using informatics (Ball, et al., 2011). While numerous
instruments have been utilized to measure the informatics competencies of nurses, relatively few
research reports provide psychometric data to support the reliability and validity of the
instruments (Lin, 2011). Issues related to computer competency and information literacy have
long been described by researchers, yet few training resources exist and “little attention has been paid to the validity of scales used” (Lin, 2011, p. 306).
The purpose of this study is to evaluate the psychometric performance of the Self-
Assessment of Nursing Informatics Competency Scale (SANICS) before and after completion of
informatics training. Principal component analysis will be used to examine the factor structure
of the SANICS. Internal consistency reliability will be used to examine the psychometric
properties of the SANICS among a sample of BSN entry-level nursing students. Responsiveness of
the SANICS over time will be examined following completion of Successful Online Learning
and Orientation Informatics Training (SOLO-IT) modules. Construct validity of the SANICS
will be assessed using a known group approach to compare differences in means for each
SANICS subscale (construct) before and after completion of Successful Online Learning and Orientation Informatics Training (SOLO-IT) modules.
Research Questions
The literature supports an ongoing lack of informatics competencies among the nursing
profession, as well as a lack of valid tools with which to measure them. Additionally, the literature
1) What are the psychometric properties of the SANICS among a population of BSN
entry-level students?
2) Is the SANICS responsive over time when used to measure informatics competencies before and after completion of online informatics training (SOLO-IT)?
Methods
Essential informatics concepts will provide the foundational underpinnings for this
organizations and national health collaboratives were used to inform the study’s design. A
thorough review of current and historical literature supported the lack of requisite computer
competency and information literacy skills essential for an informatics competent nursing
workforce. The literature also described a persistent lack of reliable and valid tools to measure
those competencies.
for nurses, as well as rigorous assessment tools and strategies to validate their effectiveness
(Ball, et al., 2011). However, a comprehensive literature search identified a lack of web based
training resources and accompanying instruments used to measure the outcomes of informatics
training. Due to the small number of empirical studies available, the search for training
resources was expanded to include both nursing and educational tools which measure
competencies before and/or after informatics training. Review of curriculum based informatics
instruction and/or health record simulation training sites was excluded from this search.
Summary
Researchers have developed measurement scales which define and quantify the
informatics competency (IC) levels needed for nurses and nursing students (Staggers, Gassert, &
Curran, 2002; Yoon, et al., 2009). Competency levels outlined by these instruments are
comprehensive and span from the “beginning nurse” to “informatics innovator” (Staggers, et al.,
2002). While nursing informatics competencies have been described for more than a decade, a
comprehensive review of the literature revealed a surprising lack of research studies describing
training interventions to address the problem of insufficient informatics competencies among
tools to measure the effect of informatics competency training among nurses or nursing students
Chapter II
Theoretical Framework
This study is grounded in post-positivism, which assumes an objectivist epistemology. In post-positivism, objectivity is the “regulatory ideal” but may not be
achievable due to the research process which allows claims to be refined or abandoned altogether
(Guba & Lincoln, 1994, p. 205). The post-positivist approach does not seek to prove a
hypothesis (as with positivism), but rather to indicate a failure to reject the hypothesis (Phillips &
Burbules, 2000). Human knowledge can be challenged and assumptions withdrawn, when
warranted, for further investigation (2000). Quantitative research is defined as “a means for
testing objective theories by examining the relationship among variables” (Creswell, 2009, p. 4).
bias, controlling for alternative explanations, and being able to generalize and replicate findings”
Scientific inquiry is dependent upon the adequacy of its measures (Foster and Cone,
accurately measure the informatics competencies necessary for nursing practice in a technology-
driven healthcare environment. This model is built upon the Principal Components of Scale
Theory (Guttman, 1941), which involves a vector of numerical weights or scores with
the sum of the weights for the item response” in such a manner that maximizes the “correlation
ratio” (p. 327). The resulting scoring weights support item reliability by factoring a matrix of
inter-item correlations. Principal components are considered correlated with the scores when the
for self-directed learners who work in fast paced healthcare environments. The need for an
of nursing practice (ANA), and recommendations of authoritative bodies (IOM, NLN). The
model shows the relationship between easily accessed tools for self-directed learning and the
Psychometrics
This study will measure the items contained within the Self-Assessment of Nursing
Informatics Competencies Scale (SANICS) and will analyze their psychometric properties.
Psychometric evaluation is concerned with an instrument’s reliability and validity (Carmines &
Zeller, 1979). An instrument is a device used by researchers to measure variables (Neale &
Liebert, 1986). Commonly, research instruments include questionnaires, rating scales, and
repeated on a different population, remains consistent (Cortina, 1993). The degree to which
This study will evaluate the psychometric properties of the SANICS instrument using: 1) principal component analysis to examine factor structure, 2) Cronbach’s alpha to assess internal consistency reliability, 3) a known group approach to assess construct validity, and 4) the standardized response mean to assess responsiveness.
Principal component analysis with oblique (promax) rotation will be used to determine
the factor structure of the SANICS instrument. Principal component analysis is the preferred
method for factor extraction since, unlike factor analysis, it allows for all sources of variability
(unique, shared, or error) to be analyzed (Mertler & Vannatta, 2010). The goal in principal
component analysis is to “extract the maximum variance from the data set, resulting in a few components.” Parallel analysis will be used to determine which factors should be retained. In parallel analysis, “eigenvalues from research
data prior to rotation are compared with those from a random matrix of identical dimensionality
to the research data set. Component PCA eigenvalues which are greater than their respective
component parallel analysis eigenvalues from the random data would be retained” (Franklin,
Gibson, Robertson, Pohlmann & Fralish, 1995, p. 100). An eigenvalue is “the amount of total
variance explained by each factor, with the total amount of variability in the analysis equal to the
number of original variables in the analysis (i.e. each variable contributes one unit of variability
to the total amount)” (Mertler & Vannatta, 2010, p. 234). Parallel analysis is “considered more
replicable than using eigenvalues or Scree plot to determine the cut-off for retention” (Yoon, et
al., 2009, p. 548). Factor loadings and promax rotation with Kaiser normalization will be used
to examine and confirm correlations among factors (Tabachnick & Fidell, 2007). Factorial validity is the degree to which an instrument measures what it was intended to measure (Guilford, 1946). Rotation is “a process by which a
factor solution is made more interpretable without altering the underlying mathematical
structure” (Mertler & Vannatta, 2010, p. 238). Promax rotation is an oblique rotation of factors that allows the extracted factors to correlate with one another.
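For illustration only, the eigenvalue comparison at the core of parallel analysis can be sketched in a few lines of Python using only NumPy. The function and variable names below are hypothetical and are not part of the study procedures; promax rotation itself is typically performed in statistical software and is not reproduced here.

    import numpy as np

    def parallel_analysis(responses, n_iterations=100, seed=0):
        # Horn's parallel analysis: retain components whose eigenvalues exceed the
        # mean eigenvalues obtained from random data of identical dimensionality
        # (same number of respondents and items as the observed data).
        rng = np.random.default_rng(seed)
        n, k = responses.shape
        observed = np.sort(np.linalg.eigvalsh(np.corrcoef(responses, rowvar=False)))[::-1]
        random_mean = np.zeros(k)
        for _ in range(n_iterations):
            random_data = rng.normal(size=(n, k))
            random_mean += np.sort(np.linalg.eigvalsh(np.corrcoef(random_data, rowvar=False)))[::-1]
        random_mean /= n_iterations
        # Number of components whose observed eigenvalues exceed the random benchmark
        return int(np.sum(observed > random_mean))

In this sketch, responses would be an array with one row per respondent and one column per scale item; the returned count is the number of components retained.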
Internal consistency reliability will be evaluated using Cronbach’s alpha to assess the total scale and each subscale. Cronbach’s alpha (Cronbach, 1951)
is one of the most popular reliability statistics and is used to determine “the internal consistency
or average correlation of items in a survey instrument to gauge its reliability” (Reynaldo &
Santos, 1999, p. 2). A reliability coefficient of 0.70 or higher is generally considered acceptable, though lower thresholds have been used (Nunnally & Bernstein, 1994). A scale which fails to
show variables with high correlation would be considered to have poor reliability (1994).
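As an illustration of the computation, Cronbach’s alpha can be obtained directly from an item-response matrix using the standard formula alpha = k/(k-1) x (1 - sum of item variances / variance of the total score). The minimal Python sketch below (NumPy only; names are hypothetical) is provided for clarity and is not part of the study protocol.

    import numpy as np

    def cronbach_alpha(items):
        # items: respondents x items array of scale responses (e.g., 1-5 ratings)
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)       # variance of each item
        total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
        return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)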
Construct Validity
(Shuttleworth, 2014). Construct validity is frequently measured using intervention studies, where
groups with low scores on the construct being measured are taught the construct, then re-measured. The presence of statistically significant differences between pre- and post-test scores supports construct validity.
Construct validity of the SANICS will be assessed using a known group approach to
compare differences in means (averages) for each SANICS subscale (construct) before and after completion of the Successful Online Learning and Orientation Informatics Training (SOLO-IT) modules. In a known group approach, “the instrument is administered to two groups known to
be high and low on the measured concept” (Burns & Grove, 2010). This study will also determine
al., 2009). If the SANICS is sensitive to individual differences in means (averages) before and
after completion of SOLO-IT, then the mean performance between each group should be significantly different.
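For illustration, the known group comparison can be sketched as a t-test on subscale scores collected before and after training. The sketch below uses hypothetical data values and SciPy’s independent samples t-test, mirroring the comparison described in this chapter; it is not the study’s analysis code.

    import numpy as np
    from scipy import stats

    # Hypothetical subscale mean scores (1-5 scale) measured before and after
    # completion of the training modules.
    pre_scores = np.array([2.1, 2.4, 1.9, 2.8, 2.2, 2.6])
    post_scores = np.array([3.0, 3.4, 2.7, 3.6, 3.1, 3.5])

    t_statistic, p_value = stats.ttest_ind(pre_scores, post_scores)
    print(f"t = {t_statistic:.2f}, p = {p_value:.4f}")
    # A statistically significant difference between the two groups (e.g., p < 0.05)
    # would support the construct validity of the subscale.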
Responsiveness
Responsiveness is the ability of an instrument to detect change over time between baseline and post-test (Middel & Van Sonderen, 2002). Responsiveness, or effect
size (ES), of the SANICS will be determined using a standardized response mean (SRM) which
is commonly used to measure responsiveness (2002). The SRM will be calculated as the
difference between scores before and after completing SOLO-IT, divided by the standard
deviation of the difference. Effect sizes of less than 0.20 are commonly accepted as trivial responsiveness; 0.20 ≤ ES < 0.50 as small; 0.50 ≤ ES < 0.80 as moderate; and ES ≥ 0.80 as large (Cohen,
1977). Responsiveness of the scale over time will be assessed using independent sample t-tests.
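A minimal sketch of the SRM calculation and the Cohen (1977) benchmarks cited above is shown below, assuming paired pre- and post-training scores stored as NumPy arrays; the function names are hypothetical and the sketch is illustrative rather than the study’s analysis code.

    import numpy as np

    def standardized_response_mean(pre, post):
        # SRM = mean of the pre/post change divided by the standard deviation of the change
        change = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
        return change.mean() / change.std(ddof=1)

    def interpret_effect_size(es):
        # Cohen's (1977) benchmarks as cited in the text
        es = abs(es)
        if es < 0.20:
            return "trivial"
        elif es < 0.50:
            return "small"
        elif es < 0.80:
            return "moderate"
        return "large"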
Studies Describing Informatics Competencies
Staggers and colleagues developed the nursing informatics competencies largely in use today (2001, 2002). This seminal work has served as the foundation
for studies which have followed, resulting in expanded competency descriptions for specific
populations. Despite the significant contributions by Staggers and colleagues which have
still fail to define this concept (Hunter, McGonigle, & Hebda, 2013).
In a recent study of 350 nurses, more than two-thirds (69.2%) reported below-average
informatics competency and over half (58.9%) rated their computer skills as below average
(Hwang & Park, 2011). The ability to search databases and use nursing-specific software, both critical
information technology skills, were also reported as lacking by survey participants (2011). This
study positively correlated the presence of basic computer skills and formal informatics
education with informatics competency. Hwang and Park concluded that accessibility to informatics
education, as well as training opportunities to improve basic computer skills, are needed in
nursing practice to improve the competencies necessary for managing and using healthcare
information, and improving patient safety (2011). Furthermore, the researchers recommended
Another study exploring common practices employed by RNs when accessing data to
support evidence based practice (EBP) indicated a preference for information obtained from
colleagues rather than a review of scholarly literature (Dee & Stanley, 2005). The reason cited
for this preference was unfamiliarity with navigational skills necessary to access electronic
databases (2005). Opportunities to infuse technology into nursing education include access to
real-time linkages to expansive and widely available global information networks. Nursing
programs frequently focus on the use of technology as a means to support educational
programming rather than as an essential tool to prepare students for technology enhanced nursing practice.
In further research conducted by McNeil and colleagues, college deans and directors
from 266 baccalaureate and higher nursing programs ranked their faculty’s ability to teach and
use nursing informatics as “novice” or “advanced beginner” (McNeil, et al., 2005). The study
revealed that United States (U.S.) nursing programs inconsistently taught information literacy
skills, standardized nursing language, and technology supported evidence based practice. Their
research examined informatics knowledge and skills taught in baccalaureate and master's level
nursing education programs in the U.S. Findings from this study showed that only about half of
U.S. baccalaureate nursing programs (n=135) taught information literacy skills or required basic
word processing and e-mail skills for entering students. Only 25% (n=67) expected students to
enter their nursing program with information literacy skills and only 9% (n=24) expected
students to have the ability to use presentation software. The informatics content taught most
patient record” (46%), “ethical use of information systems” (45%), “informatics nurse
competencies” (40%), and “informatics definitions” (39%). Only 37% of respondents reported
teaching informatics content which supported evidence-based practice. Graduate programs were
found to have even less informatics related content areas than undergraduate programs. The
researchers also examined the perceptions of faculty members’ informatics competency and their
use of informatics tools. Almost half of respondents ranked nursing faculty abilities as “novice”
or “advanced beginners” in using nursing informatics applications. This study determined that
a critical need exists to “include informatics concepts, informatics skills, and the use of
informatics tools in professional nursing practice within nursing curricula across the US” and to
prepare faculty who are qualified to teach these skills (p. 1029). These findings have
implications for nursing education since the ability to access and translate informatics knowledge
is essential for evidence based nursing practice (IOM, 2011). Much work is needed to prepare
current and future generations of nurses to function as competent users of the information
technologies that support evidence based practice (Desjardins, Cook, Jenkins, Bakken, 2005;
Frequently, a substantial amount of time passes between an initial nursing degree and a
return to higher education. This delay between nursing programs of study exacerbates the
problem of informatics deficiencies for nurses since technological changes in the educational
Many high functioning students (top 10% of incoming college freshmen) capable of
performing internet based literature searches lack the ability to 1) access scholarly sources, 2)
think critically about the information obtained, and 3) recognize legal and ethical issues
relating to information technology (Gross & Latham, 2009). Consensus exists among deans and
directors of nursing programs across the nation that nursing students should graduate with basic
information literacy skills, though such skills are frequently not taught in undergraduate
programs (McDowell & Ma, 2007). Additionally, nurse educators frequently possess only basic
skills in the area of applied informatics (Skiba, Connors, Jeffries, 2008). Informatics training,
therefore, shifts to the employing institution and to nursing administrators who are left to manage
the lack of informatics training and the inadequate technological preparation of nurses (Westra &
Delaney, 2008). Because graduating nurses are prepared insufficiently with technology skills,
formalized informatics training which should have been provided during the post-secondary
Information literacy (the ability to access, process, and use information) is the
recommended standard for nursing and nursing students (IOM, 2011). Nurse information
literacy skills must be highly efficacious to manage vast amounts of new information (Majid,
Foo, Luyt, Zhang, Theng, Chang, and Mokhtar, 2011). However, despite groundbreaking
outside of their own work environment (Thede & Sewell, 2009). A survey of nurse executives
nursing graduates (Nurse Executive Center Nursing School Curriculum Survey [NECNSC],
2008). Only 10.4% of respondents felt that nursing graduates are being sufficiently prepared for
practice in the area of technical skills. This finding was followed with agreement by almost 90%
Since the arrival of computer systems to the healthcare setting, nurses have been urged to
obtain competency with using technology in all practice settings (Ball, Douglas, Hinton Walker,
2011). Experts and leading professional nursing organizations have provided perspectives
regarding the need for all nurses to use and apply information technology in their practice (ANA,
Nursing experts and professional organizations have called for a nursing workforce
the nursing curriculum and development of informatics competencies among nurses is still
lacking (Hunter, McGonigle, Dee, Hebda, 2013; Virgona, 2013). While numerous informatics
competency instruments have been utilized, relatively few research reports provide psychometric
data to support the reliability and validity of the instruments (Lin, 2011). Issues related to
computer competency and information literacy have long been described by researchers, yet
“little attention has been paid to the validity of scales used” (Lin, 2011, p. 306).
competencies using a variety of instruments over the past decade. An outline of what is
(including their psychometric properties, when available). This information is also summarized
chronologically in Table 2, with studies using the SANICS instrument highlighted separately
The 43 informatics novice nurse competency items developed as part of the seminal work
by Staggers, Gassert, and Curran (2001) served as the foundation for the Graduating Nurses’ Self-Evaluation of Computer Competencies Survey (Fetter, 2009). In this
pilot study, informatics competencies were ranked by 42 graduating seniors. The majority of the
program’s graduating students rated themselves as having moderate ability on novice nurse
communication and desktop software, and lowest competencies in the use of documentation
information technology into all courses and establishing minimum performance standards for
each course. The project “lacked the rigor of formal research” since reliability data on how it
Computer Literacy Scale for Newly Enrolled Nursing College Students
Lin (2011) described the reliability and validity of computer competency and computer
literacy scales used in nursing. Of the five survey tools included in the background evaluation,
only three included any basic reliability/validity data, and this information was limited (Bataineh
& Baniabdelrahman, 2006; Cole & Kelsey, 2004; Elder & Koehn, 2009; Lupo & Erlich, 2001;
McNeil, Elfrink, Beyea, Pierce, & Bickford, 2006). The researcher described the computer
literacy of Taiwanese nursing students using the Computer Literacy Scale for Newly Enrolled
Nursing College Students (2011). This assessment scale was developed based on the Ministry of
Education (MOE) guidelines. The resource was designed for technical schools and included
computer hardware, software, networks, computer problem solving and IT society (Hinkin,
1998). The content validity index (CVI) was tested using eleven experts who rated content
relevance (2011). Principal component analysis with varimax rotation was used to estimate total
variance, and Eigenvalues greater than one guided the total number of factors used. Exploratory
factor analysis confirmed the content validity and internal consistency of the instrument. The
researcher concluded the instrument had good content validity, reliability, convergent validity,
and discriminant validity, and was an “excellent computer literacy assessment for newly enrolled
competency (Elder & Koehn, 2009). Students were offered a voluntary tutoring session on
computer skills as an incentive to participate in this study. The researchers compared computer
competency self-ratings of 79 first and second semester nursing students and 8 RN-BSN
completion students. Ratings were then compared with the actual performance of those skills on assessment. Cronbach’s alpha was used as a measure of reliability, resulting in an alpha of 0.65. Content validity, questions, and skills were derived
from concepts taught in a basic computer course. Correlation coefficients were computed for
survey and actual assessment scores which showed a low but significant correlation (r=0.282,
p<0.05). Results suggested that students self-rated computer skills higher than actual ability to
The Electronic National Nursing Informatics (eNNI) Project measured the electronic
documentation competencies of nurses and nurse educators before and after an educational intervention
(Rajalahti & Saranto, 2012). A total of 158 Finnish nurse educators and novice-to-experienced
RNs took part in this study. An e-questionnaire was developed and used based on the
foundational work of Staggers, et al. (2002) and Saranto (1997) which itemized requisite
computer competency and information literacy skills. The psychometric properties of the e-
questionnaire instrument were not evaluated during this project. The researcher concluded that
informatics competencies of the nurse participants and educators did not improve as a result of the educational intervention.
Ornes and Gassert (2007) used the beginning nurse competency items from Staggers’
(2001) earlier work to develop an informatics competency tool for beginning nurses. This tool was used to evaluate the syllabi of 18 nursing courses,
determining how informatics content was represented in the curriculum. Researchers determined
students were not routinely exposed to computerized systems in the classroom and may not be prepared to use information technology. The researchers
concluded that students received insufficient exposure to informatics and were at risk for being
exposure to nursing informatics within the BSN curriculum. No reliability/validity data was
provided for the instrument used in this study (Ornes & Gassert, 2007).
Desjardins, Cook, Jenkins & Bakken (2005) evaluated the effect of an informatics
version of the tool developed by Staggers, et al. (2001, 2002). The researchers used a repeated-measures design to evaluate
informatics competencies before and after a curriculum on Informatics for Evidence Based
Practice (IEBP). Students were not competent in information literacy at the beginning of the
BS/MS Program, despite having a bachelor’s degree in another field. Significant increases were
reported from admission to graduation in most or all of the cohorts studied. Findings suggested
incorporating informatics into the curriculum was successful. However, all data was self-
reported, and information concerning the validity or reliability of the survey tool was not provided.
Elder and Koehn (2009) observed that undergraduate nursing students may perceive their computer
skills to be higher than their ability to perform them, suggesting that actual competencies may be even
lower than reported. The researchers compared the self-rated computer competencies among
incoming nursing students (n=87) with their ability to perform those skills on assessment. The
Computer Competencies Survey tool was developed by the researchers and included Likert-type
scale items asking students to rate themselves from expert to no experience. The survey was followed by a
graded assessment also created by the researchers. Correlation coefficients showed a low, but
significant correlation for the survey and assessment (r=0.282; p<0.05). No psychometric data
describing the reliability and validity of either tool was provided. Findings from this study
suggested that students frequently “did not have an adequate grasp of basic computer
knowledge”, as demonstrated by high self-ratings and low performance scores (Elder & Koehn,
2009).
McDowell and Ma (2007) surveyed 411 students on admission and 429 upon graduation
program entrance and exit. Significant increases in experience were noted with word processing,
e-mail, and Internet experience, but little improvement was noted in information literacy, or
advanced skills, such as spreadsheets and use of databases. Results suggested that “nursing
education programs currently may not be providing beginning nurses with the tools needed to
effectively and efficiently work in the technology-rich healthcare arena” (2007, p. 30). No reliability or validity data was reported.
Hunter, McGonigle, and Hebda (2013) developed an instrument to assess nurses’ self-perceptions of nursing informatics competency. The
231 items on the instrument were selected from three other instruments already in existence
following three rounds of reviews by content experts. The basic-computer-skills items (108)
came from the European Computer Driver’s License (ECDL) computer literacy course (2012). The
information literacy items (47) were adapted from the American Library Association's
management (76) came from the Health Level Seven electronic health-record-system functional
model (2004). Content Validity Index (CVI) was calculated on each subset of NI competencies
Management=1.0; Basic Computer Skills=1.0. The instrument was piloted with 184 participants
ranging in age from 26-70 years (161 of the respondents were RNs). The majority of
participants ranked themselves as expert on most of the items on the survey, with lesser confidence
on information literacy related items. Limited reliability/validity data was provided, as this
report focused on instrument development. The researchers stated that an analysis of each item would
be forthcoming and reported in a future article (Hunter, McGonigle & Hebda, 2013).
Hwang & Park (2011) conducted a cross-sectional study among nurses in Seoul, Korea.
computer skills, attitudes toward computers and population. Informatics competency positively
correlated with basic computer skills and formal informatics education. The majority of nurses
(69.2%) rated informatics competencies below average and more than half (58.9%) rated
computer skills as below average. Content validity was examined by three informatics experts.
Principal component factor analysis with varimax rotation revealed factor loadings ranging from
.41 to .81. Based on principal component analysis, factors were considered to be in the
informatics content which includes basic computer skills (Hwang & Park, 2011).
Technology Skills Assessment Survey
A Technology Skills Assessment Survey was used to assess the self-perceptions of technology skills among graduate Registered Nursing students
(n=19) enrolled in their first semester of course work (Virgona, 2012). The survey tool included
two qualitative and ten quantitative items measuring nurses’ perceptions of current technology
skills and the barriers to using and learning new technologies. Students generally rated
themselves as novices with technologies (Word, Excel, HTML, Javascript, and online bill
payment) other than social media and smart phones. Technology skills were not seen as critical
when entering nursing, but critical for promotion. The major theme emerging from the
qualitative data was the lack of in-house technology training resources, and the perception that
certain technology skills were not valued by organizations (Virgona, 2012). Reliability and validity data were not reported.
Choi and Zucker (2013) compared the informatics competencies of Doctor of Nursing
Practice (DNP) students enrolled in the post-BS track (n=68) with students enrolled in the post-
MS (n=64) using the Competency Assessment Tool. This 86 item instrument assessed 18 areas of competency within three categories: computer skills, informatics knowledge, and
informatics skills. The instrument was based on Staggers and colleagues’ competency statements
for the beginning and experienced nurse (2001, 2002) with additional items added on
information literacy (Bakken, et al., 2004; Curran, 2003). The internal consistency
reliabilities of the instrument were high, with a Cronbach's alpha for all items of .98. The
Cronbach’s alpha for the three competency categories was: computer skills .97, informatics
knowledge .95, and informatics skills .93. Overall, students in the post-BS and post-MS tracks
were not competent in any of the three categories of informatics competency at the beginning of
their first semester in the DNP Program. However, statistically significant improvement was
curriculum to improve the informatics knowledge and competency skills of DNP students,
particularly computer skills for decision support (Choi & Zucker, 2013).
The SANICS
The Self-Assessment of Nursing Informatics Competency Scale (SANICS; Yoon, Yen, & Bakken, 2009) is the instrument chosen for this study. The internal consistency reliabilities of the instrument are high.
The SANICS was developed to assess the informatics competencies of nursing students and
practicing nurses using a valid, reliable means of measurement. This instrument is primarily
based on the informatics competencies developed by Staggers and colleagues, for beginning and
novice level informatics users (Staggers, Gassert, & Curran, 2002). Additional items were added
with descriptors ranging from 1 (not competent) to 5 (expert) (2009). The instrument was
initially tested on a sample of 336 students completing the baccalaureate portion of a combined BS/MS program. Principal
components analysis with oblique promax rotation extracted a five-factor structure which
explained 63.7% of the variance. Those factors were as follows: Basic Computer Knowledge
and Skills (Cronbach’s alpha = 0.94), Applied Computer Skills: Clinical Informatics (Cronbach’s
alpha = 0.89), Clinical Informatics Role (Cronbach’s alpha = 0.91), Clinical Informatics Attitudes
(Cronbach’s alpha = 0.94), and Wireless Device Skills (Cronbach’s alpha = 0.90).
Responsiveness of the SANICS was supported by significantly improved scores following
The studies summarized below describe how the SANICS was used to measure informatics competencies in subsequent research.
Choi
The SANICS was the chosen instrument in a study to compare the informatics competencies of students enrolled in three types of BSN programs: Pre-Licensure, RN to BSN,
and Accelerated BSN (Choi, 2012). Students from each group were found to differ significantly
in overall informatics competency (F(2, 92)=4.31, p=.02). The RN to BSN students were
(p=.02). The SANICS showed high internal consistency reliabilities. Cronbach’s alpha for the
total scale was .95. Subscale alphas ranged from .93 for “Basic computer knowledge and skills”
to .89 for “Clinical informatics role” and “Wireless Device Skills”. The researcher recommends
future research to identify factors affecting informatics competency, such as age, gender, basic
computer skills, level of nursing experience, or formal informatics education (Choi, 2012).
The informatics competencies of undergraduate and graduate nursing students (n = 289) were also examined using the SANICS (Choi & De Martinas, 2013). The internal consistency reliabilities
of the instrument were high. The Cronbach’s alpha for each subscale was calculated for each of
the five factors, as follows: Basic Computer Knowledge and Skills (0.94); Applied Computer
Skills (0.90); Clinical Informatics Role (0.89); Clinical Informatics Attitudes (0.90); and Wireless
Device Skills (0.87). Overall, students were found to be competent in informatics: Graduate
students reported a higher informatics competency mean (3.23, SD = 0.70) than undergraduate
students (3.01, SD = 0.72) (t = 2.35, p = 0.02). Study findings indicate that students in both
programs were most confident in their basic computer skills and informatics attitudes. However,
students from both programs perceived themselves to be less competent in the areas of applied computer skills and the clinical informatics role (Choi & De Martinas, 2013).
The psychometric properties of the SANICS were examined in nursing students attending
an undergraduate (n=131) and graduate (n=171) program. The five-factor structure of the
instrument was valid, and accounted for 69.38% of the variance. The Cronbach’s alpha was 0.96
for the total scale with subscales ranging from 0.94 for basic computer knowledge and skills to
0.84 for data/information management skills. The SANICS showed good responsiveness, with a large standardized response mean (0.99); significant improvement among students with diverse demographic and educational backgrounds was demonstrated following completion of an informatics course. Construct validity using a known group approach was supported by significantly higher mean scores for graduate compared to undergraduate students. Researchers concluded the SANICS is a psychometrically sound instrument for nursing students with diverse demographic and educational backgrounds (Choi & Bakken, 2013).
Table 1
Instruments Used to Measure Informatics Competencies in Nursing

Computer Competencies in the Curriculum Tool Matrix
  Author(s): Ornes & Gassert (2007)
  Basis: Categories of Informatics Competency for the Beginning Nurse (Staggers, et al., 2001, 2002)
  Sample: Not provided; the matrix was developed to evaluate informatics competencies within the syllabi of 18 courses
  Reliability/Validity: No validity/reliability data
  Intervention: Not applicable
  Findings: Students were not routinely exposed to computerized systems and may not be prepared to use IT; no syllabi addressed informatics knowledge competencies

Graduating Nurses' Self-Evaluation of Computer Competencies Survey (42 items)
  Author(s): Fetter (2009)
  Basis: Staggers, et al. (2001) novice nurse competencies
  Sample: 43 graduating seniors
  Reliability/Validity: "Lacked the rigor of formal research since reliability data were not collected"; no validity/reliability data
  Intervention: No intervention
  Findings: Majority of graduating students self-rated as having moderate ability on novice nurse standardized competencies

Computer Competencies Survey and Computer Competency Assessment Tool
  Author(s): Elder & Koehn (2009); both tools developed by the researchers
  Sample: 61 first and second semester nursing students and 40 RN-BSN students
  Reliability/Validity: Content validity; questions and skills tested were derived from textual concepts used in a basic computer course. Correlation coefficients were computed for survey and assessment scores and showed a low but significant correlation (r = 0.282, p < 0.05). Limited reliability/validity studies
  Intervention: Pre-test to self-assess current computer competency, followed by computer assessment of ability to perform skills
  Findings: Results suggested that students rated their skills higher than their actual performance of computer skills

Informatics Competency Questionnaire (40 items)
  Author(s): Hwang & Park (2011); designed by the researchers
  Sample: 292 hospital nurses
  Reliability/Validity: Cronbach alpha coefficient = .79; principal component factor analysis with varimax rotation revealed factor loadings ranging from .41 to .81; content validity was examined by three informatics experts
  Intervention: No intervention
  Findings: More than two-thirds felt they had insufficient informatics competency and more than half rated computer skills below average; based on principal component analysis, factors were considered to be in the developmental stage

The eNNI Project
  Author(s): Rajalahti & Saranto (2012)
  Basis: e-questionnaire based on Saranto (1997) and Staggers, Gassert & Curran's (2002) informatics competencies research
  Sample: 136 nurse educators, 158 RNs, and other nurse professionals in Finland
  Reliability/Validity: No validity/reliability data; no reliability/validity studies
  Intervention: Education provided on electronic nursing documentation and information communication technology (ICT)
  Findings: The NI skills of the participants and educators did not improve during the project

Technology Skills Assessment Survey (12 items)
  Author(s): Virgona (2012); developed by the researcher
  Sample: 19 graduate Registered Nursing students enrolled in first semester of course work
  Reliability/Validity: No validity/reliability data
  Intervention: No intervention
  Findings: Students perceived technology skills as not critical when entering nursing but critical for promotion; lack of training resources identified as a barrier

TIGER Initiative Competencies

Self-Assessment of Nursing Informatics Competency Scale (SANICS)

SANICS (93 items)
  Author(s): Yoon, Yen & Bakken (2009)
  Basis: Primary source was Staggers, et al. (2001, 2002) beginning and experienced nurse informatics competencies
  Sample: Combined BS/MS nursing students in 2006-07 (N=336)
  Reliability/Validity: Principal component analysis with oblique promax rotation extracted five factors: clinical informatics role (α = .91), basic computer knowledge and skills (α = .94), applied computer skills: clinical informatics (α = .89), nursing informatics attitudes (α = .94), and wireless device skills (α = .90)
  Intervention: Informatics curriculum which emphasized informatics tools to support patient safety, mindfulness, modeling, and monitoring
  Findings: Preliminary evidence exists for the reliability and validity of the SANICS; sample dependent - young with a high level of basic computer knowledge and skills

SANICS (30 items)
  Author(s): Choi (2012); Yoon, Yen & Bakken (2009)
  Sample: 131 nursing students from three tracks: Traditional Pre-Licensure, RN to BSN, and Accelerated BSN
  Reliability/Validity: Cronbach's alpha for the total scale was .95; subscale alphas ranged from .93 for "Basic computer knowledge and skills" to .89 for "Clinical informatics role" and "Wireless device skills"
  Intervention: No intervention
  Findings: RN to BSN students were significantly more competent (mean = 3.21) than Traditional Pre-Licensure students (mean = 2.82) (p = .02); the SANICS had high internal consistency reliabilities

SANICS (30 items)
  Author(s): Choi & Bakken (2013); Yoon, Yen & Bakken (2009)
  Sample: 131 undergraduate and 171 graduate students
  Reliability/Validity: Cronbach's alpha of 0.96 for the total scale with subscales ranging from 0.94 for basic computer knowledge/skills to 0.84 for data/information management skills; the SRM was large (0.99), indicating the SANICS was responsive
  Intervention: 14-week online informatics course
  Findings: Construct validity using a known group approach was supported by significantly higher mean scores for graduate students compared to undergraduate students; the SANICS is psychometrically sound for nursing students with diverse demographic and educational backgrounds

SANICS (30 items)
  Author(s): Choi & De Martinas (2013); Yoon, Yen & Bakken (2009)
  Sample: 289 undergraduate and graduate students
  Reliability/Validity: Cronbach's alpha: total (0.96); basic computer knowledge and skills (0.94); applied computer skills (0.90); clinical informatics role (0.89); clinical informatics attitude (0.90); wireless device skills (0.87)
  Intervention: No intervention
  Findings: Students were competent in informatics; graduate students reported a higher informatics competency mean (3.23, SD = 0.70) than undergraduate students (3.01, SD = 0.72) (t = 2.35, p = 0.02); the internal consistency reliabilities of the instrument were high
Informatics Training and Assessment
The TIGER Collaborative issued a Call to Action requesting education and training interventions which foster information
technology innovation and adoption by nurses (2008). Essential skills for clinicians were
identified as basic computer competencies and information literacy (2008). While the need for
improved informatics competencies has been well established for more than 25 years (Ball,
1984; Graves, 1989; Herbert, 1999; Saba & McCormick, 1996; Scholes & Barber, 1980;
Staggers, Thompson, Happ & Bartz, 1998; Turley, 1996), there remains a lack of training
resources for nurses (Ball, et. al, 2011). Informatics competency is also essential for students
enrolled in web based and web enhanced coursework commonly used to deliver nursing
education (Virgona, 2013). Education and training interventions described in the literature are
typically limited to informatics content modification for a single course (Bakken, Sheets, Cook,
Curtis, Soupios, Curran, 2003), or curricular revisions which measure informatics competency
from program entry to program exit (McDowell & Ma, 2007). Research has consistently
supported the effectiveness of courses which address the lack of informatics competencies of
nursing students (Choi & Bakken, 2013; Desjardins, Cook, Jenkins & Bakken, 2005; Shorten,
Wallace & Crookes, 2001; Tarrant, et al, 2008; and Wallace, Shorten, Crookes, McGurk, &
Brewer, 1999). Unfortunately, only half of nursing programs teach information literacy skills.
Nurses work in complex, fast-paced environments where they must quickly process data and form clinical judgments which ultimately guide patient care (Kossman, Bonney & Kim). Meeting these demands requires training resources available “at any time and from any site” (Ball, et al., 2011, p. 445). Educational strategies limited to the occasional informatics course, or to initiatives which require matriculation through an entire nursing program, are insufficient to meet the informatics training needs of the nursing workforce.
A comprehensive review of the literature was conducted for training interventions which measured informatics competencies (computer skills and information literacy) following a training intervention. This review identified a lack of web based training
resources and accompanying instruments to measure outcomes resulting from the delivery of
informatics training. Due to the paucity of empirical studies, the search for informatics training
resources for nurses with valid/reliable instruments was expanded to include 1) works published
since 1995, and 2) published resources/tools used in general education to measure computer
competency or information literacy before and/or after informatics training. These results
yielded only eight such informatics training resources in the scholarly literature or via public
access. Of these resources, only two included measurement tools. Review of curriculum-based
informatics instruction and/or health record simulation training sites was excluded from this
search. Training resources identified by this review are described in the following discussion and summarized in Table 2.
The Task Force of the Council of European Professional Informatics Societies (1997-
2014) developed the European Computer Driver’s License (ECDL) Certification Program as a
way to promote standardization of information technology skills within the IT industry and for
the general population. The subscription based ECDL online modules include computer
essentials, online essentials, word processing and spreadsheets. Additional training options (fee
based) include presentations, using databases, web editing, image editing, project planning, IT
security, online collaboration, and health information storage. The TIGER Informatics
Competencies Team offered their endorsement of the ECDL and recommended training for nurses
using the Basic Computer Competencies Modules. The ECDL is the world’s largest end-user
computer skills certification program, has been used by more than 7 million users, and has “a
very well developed and mature training program, work book, and testing process” (TIGER,
2008). The ECDL’s Basic Computer Competency training is comprised of the seven modules
outlined below:
Module 4 – Spreadsheets
Module 5 – Database
Module 6 – Presentation
While ECDL training is endorsed by the TIGER Collaborative, its modules are
generalized for use in a wide range of industries and do not address essential computer skills
required in healthcare, or the unique challenges of the healthcare setting (e.g., HIPAA, meaningful
use, interoperability, etc.). Validity and reliability measurements for the tools used for ECDL
assessments are not available online, and a request by this researcher for this information was declined.
As mentioned above, the TIGER Collaborative has recommended the ECDL program as
a computer competency training option for nurses (2008). However, a new subscription based,
Virtual Demonstration Center (2013) sponsored by the TIGER Collaborative, was recently
launched as a virtual conference, simulation and training site for nurses. The Demonstration
Center contains web based informatics resources, including links to clinical simulations, as well
as observable and interactive demonstrations. Links are embedded within the Center’s web
pages allowing users to navigate to other sites which support the development of informatics
competencies. Access to the Center also includes webinars and educational sessions covering
electronic health records, usability, clinical decision support, meaningful use, health information
exchange, and interoperability. The vision of the TIGER Virtual Demonstration Center is to prepare nurses for the “delivery systems of the next three to ten years” (2013). An online instrument (the TIGER
Online Self-Assessment Tool) has been designed and piloted which measures self-perceptions of
informatics competencies. This instrument has demonstrated moderate content validity (Hunter, McGonigle & Hebda, 2013).
A self-paced tutorial, “Nursing Resources: Self-Paced Tutorial and Refresher,” is available via open source through New York University’s (NYU’s) Library
(Jacobs, 2014). The site includes four self-paced online modules covering the following topics:
Evidence Based Nursing- Locating the best evidence, developing the research question,
expanding the search for evidence, specialized databases, and clinical guidelines.
Web Resources – Searching the web, search engines, web directories, and consumer health resources.
Tools – Saving search histories.
Learners move through the site using navigational arrows at the bottom of each page. The site is not interactive and does not include skill demonstrations, assessments, or evaluation instruments.
The IC³ Training and Certification Program is available online through Pearson Vue
Business (2000). The site targets incoming college students who may need computer
remediation or employers who may wish to use its assessment capabilities as a screening tool for new job candidates. Certification examinations cover computing fundamentals, key applications (word processing, spreadsheet and presentation software), and
living online (internet and networking). A convenience feature found on this site is the “IC³ Fast
Track” which allows literacy skills to be quickly assessed. This resource is only available
commercially, with an option to purchase bulk user licenses. Certification is achieved through individual examinations; approximately 175,000 exams are administered monthly in 148 countries and in 27 languages. Benefits of IC³ certification are
described as a means to “validate digital literacy skills”, although no published data regarding
the site’s effectiveness, or the psychometric properties of the instrument, could be located. Pearson Vue denied requests by this researcher to provide data describing the performance of the training program or its assessment instruments.
The purpose of the Passport Project for Nursing Success was to “assess the skills,
knowledge, and informatics comfort level of students, while providing computer training and
teaching for beginning nursing students in an undergraduate nursing program” (Edwards &
O’Connor, 2011). A survey was created by the researchers since “no tools with measured
reliability and validity were available upon a review of the literature” (p. 5). The survey was administered to assess students’ baseline skills, knowledge, and learning needs. Following this initial assessment, students completed seven self-paced
online learning modules housed in the Blackboard LMS. On-line tutorials with questionnaires
and demonstration assignments were incorporated into the modules to facilitate the learning needs of students; module topics included Netiquette (Module 4).
Evaluation of learning was based on the ability to perform various functions, including completing an interactive evaluation, constructing and downloading a document, and using the quiz function within Blackboard. Five qualitative program evaluation questions yielded themes of
appreciation for the training, and improved understanding of informatics and Blackboard. No
reliability/validity data were available for any of the quantitative assessments used by the
researchers. While the Passport Project was described in the literature by the authors, the actual training resource could not be located for public access or review.
Computer Competencies Tutorial
St. Edward’s University developed the Computer Competency Tutorial as an online computer skills remediation tool for all incoming freshmen. Completion of this training was
required during the years 1995-2007, but became optional during the years 2008-2011.
Descriptions of this online computer training resource are provided here for completeness only,
as the Tutorial was removed as an active training site in 2013. No assessment instrument was
used to measure the effect of training. The Computer Competency Tutorial consisted of five modules: 1) Introduction to Computers, 2) World Wide Web, 3) Introduction to Word Processing, 4) Introduction to Spreadsheets, and 5) Multimedia.
The Rutgers University Library Information Literacy Tutorial is a web based training site for nursing students (2008). This tutorial consists of three sections
covering animated and/or video recorded content intended to improve information literacy. The
Tutorial’s content guides learners to formulate a research question, navigate CINAHL and Medline, find
full-text articles, and employ RefWorks to store reference sources and produce bibliographies in
proper APA format. The site is not interactive, requires no assignments, and uses no instrument
to measure outcomes.
Pre-test for Attitudes Toward Computers in Healthcare (PATCH)
The Pre-test for Attitudes Toward Computers in Healthcare (PATCH) is an open source,
online tool for self-assessment of general nursing informatics competencies (Kaminski, 2011).
Modules on the site provide information, links, and self-assessment checklists for a variety of
computer based tasks. Attitudes regarding various computer skills are measured via an online
survey, covering the following competency items: Word processing, keyboarding, spreadsheets,
presentations, databases, desktop publishing, internet, e-mail, expert and decision support systems, nursing information systems, hospital information systems, peripherals, PDAs, nursing data and information, current computer
configurations, protection for client data & information, and evidence based practice. A study of
Turkish nurses (n=200) was conducted to assess the validity and reliability of the PATCH Scale
(Kaya & Turkinaz, 2008). Test-retest reliability ranged from 0.20 to 0.77 for individual items and was 0.85 for the total scale. Item-total correlations ranged from 0.06 to 0.68, and Cronbach's alpha was 0.92 (2008).
Table 2
Informatics Training Resources and Associated Assessment Instruments

Passport Project for Nursing Success (Edwards & O'Connor, 2011)
  Modules (partial listing): 4: Netiquette; 5: Managing Documents; 6: Research & APA; 7: On-line Testing

Pretest for Attitudes Toward Computers in Healthcare (PATCH) Self-Assessment Tool
  Author(s): Kaminski (2011)
  Sample: Registered Nurses; N=200 Turkish nurses
  Description: An online tool for self-assessment in general nursing informatics competencies; various competency taxonomies have been reviewed and integrated in the process. Modules include: Word Processing, Keyboarding, Spreadsheets, Presentations, Databases, Desktop Publishing, Internet, Email, Decision Support Systems, Multimedia, Web Development, Telecommunications, Nursing Information Systems, Hospital Information Systems, Peripherals, PDAs, Data and Information, Computer Literacy, Life Long Learning, Computer-Human Interface, and EBP
  Psychometrics: Test-retest reliability was 0.20-0.77 and was 0.85 for the total scale; item total correlation was 0.06-0.68; Cronbach's alpha was 0.92
  URL: http://nursing-informatics.com/niassess/plan.html

Nursing Resources: Self-Paced Tutorial and Refresher
  Author(s): Kaplan (2014), New York University
  Users: Nursing students
  Description: Online instructional tool consisting of 4 modules; encompasses a beginner's research guide, web resources and tools, and evidence based practice only
  Instrument: No instrument
  URL: http://nyu.libguides.com/nursingtutorial

Certiport IC³ Internet and Computing Certification
  Author(s): Pearson Vue Business (2000)
  Users: Incoming college students, or a screening tool for new job candidates
  Description: One hour online competency test; IC³ Fast Track can be used to gauge digital literacy skills; certification comprises individual examinations for computing fundamentals, key applications (word processing, spreadsheet and presentation software), and living online (internet and networking); 175,000 exams administered monthly in 148 countries and 27 languages
  Instrument: One-time fee based test; no published data; no data on psychometric properties of instrument
  URL: https://www.certiport.com/portal/common/documentlibrary/IC3_GS4_Program_Overview.pdf

Information Literacy Tutorial
  Author(s): Rutgers University Library (2008)
  Users: Nursing students
  Description: 3 online sections used to formulate a research question, navigate CINAHL and Medline, find full-text articles, and employ RefWorks to store reference sources and produce bibliographies in proper APA format
  Instrument: No instrument
  URL: http://www.libraries.rutgers.edu/rul/rr_gateway/research_guides/nursing/tutorial/

Computer Competency Tutorial
  Author(s): St. Edward's University (1995-2011)
  Users: Incoming freshmen (requirement until 2009; currently optional)
  Description: 5 self-paced online modules: 1) Introduction to Computers, 2) World Wide Web, 3) Introduction to Word Processing, 4) Introduction to Spreadsheets, 5) Multimedia
  Instrument: No instrument; test link inactive
  URL: http://academic.stedwards.edu/competency/module2/Lesson1/wwwandinternetgetconnected.htm

The TIGER Virtual Demonstration Center
  Author(s): TIGER; Hunter, McGonigle & Hebda (2013)
  Users: Nurses and nursing students
  Description: Web based conference and simulation center; web based informatics resources, including links to simulations, as well as observable and interactive demonstrations
  Instrument: The TIGER Online Self-Assessment Tool, designed and piloted; measures self-perceptions of informatics competency; moderate content validity
  URL: http://www.thetigerinitiative.org/virtuallearning.aspx
Summary
For more than 30 years, authoritative nursing bodies have urged the nursing profession to
strategically address informatics competency for all nurses. Despite persistent recommendations
from the ANA, AACN, NLN, TIGER and IOM regarding the critical need for an informatics
competent nursing workforce, nurses still frequently lack essential informatics competencies,
including basic computer skills. Evidence based practice depends upon the ability to use and apply
information technology skills, such as searching data bases and using nursing-specific software.
Studies of nurse executives and academicians describe a nursing profession which remains
insufficiently prepared with basic informatics skills upon entry to practice. Research supports
the effectiveness of courses which address the lack of informatics competencies of nursing
students, with recommendations that priority be given to groups possessing low computer
proficiency. Unfortunately, nursing programs have been slow to incorporate information literacy
skills into the curricula. Informatics training programs which could support nurses in practice
are sparse in the literature. Curriculum based instruction alone is no longer sufficient for nurses
attempting to keep pace in a fast-paced, technology-rich healthcare environment. Finally, this
review identified few or insufficiently validated instruments which could be used to measure
empirical outcomes resulting from delivery of nursing informatics training. The sparse number
of empirically supported informatics training resources and evaluative tools is inconsistent with
recommendations from official nursing and health organizations calling for a deliberate,
systematic approach to improve the basic computer skills and information literacy of the nursing
profession.
The SANICS has become a nationally recognized instrument with numerous empirical
studies attesting to its internal consistency reliability, responsiveness, and factorial validity
(Choi, 2012; Choi & Bakken, 2013; Choi & De Martinas, 2013). Studies of the SANICS to date have measured informatics competencies in the context of curriculum-based instruction. However, no studies to date have examined the performance of the SANICS instrument when used in a population of undergraduate nursing students before and after a self-directed, web-based informatics training intervention. Improving informatics competencies throughout the nursing workforce could depend upon the availability and usability
of on-demand training resources focused on the needs of the self-directed learner. Further
testing is necessary to examine the psychometric properties of the SANICS in a sample of BSN
entry-level nursing students. Principal component analysis will be used to examine the factor
structure of the SANICS. Internal consistency reliability will be used to determine if the psychometric properties of the SANICS remain consistent in a sample of BSN entry-level nursing students.
Chapter III
Research Methodology
Introduction
The review of the literature identified a significant gap regarding the availability of informatics training resources, and a lack of valid, reliable tools to measure informatics competencies. This study will use psychometric analyses as a means to evaluate the Self-Assessment of Nursing Informatics Competency Scale (SANICS). The psychometric performance of the SANICS will be evaluated before and after
completion of Successful Online Learning and Orientation: Informatics Training (SOLO-IT) for
nursing students. Principal component analysis will be used to examine the factor structure of
the SANICS. Internal consistency reliability will be used to determine if the psychometric
properties of the SANICS remain consistent in a sample of BSN entry-level nursing students and
if the SANICS is responsive over time when used to measure informatics competencies. This
study will also evaluate the mean rating differences of the SANICS instrument before and after
completion of SOLO-IT. Construct validity of the SANICS will be assessed using a known
group approach to compare differences in means for each SANICS subscale (construct) before and after completion of SOLO-IT.
Research Questions
1) What are the psychometric properties of the SANICS among a population of BSN
entry-level students?
2) Is the SANICS responsive over time when used to measure informatics competencies before and after completion of SOLO-IT?
Hypothesis: There will be a significant difference in SANICS scores between the pre- and post-SOLO-IT surveys.
Research Design
Successful Online Learning and Orientation: Informatics Training (SOLO-IT) is a web-based remediation tool designed and created by the researcher to enhance the informatics
competencies (computer skills and information literacy) of nursing students (Godsey, 2011).
Online modules were designed to be self-guided and self-paced to allow nursing students to
learn, practice and demonstrate informatics competencies using common technological tools
embedded into the Blackboard™ learning management system (Godsey, 2011). This study will
explore the psychometric performance of the SANICS instrument using archived survey data
collected before and after completion of SOLO-IT. A one-group pre/post-test design using the
30-item SANICS was utilized in a sample of undergraduate nursing students. Surveys comprised of competency items contained within the SANICS instrument were administered via Survey
Monkey to student volunteers prior to and at the completion of SOLO-IT. Permission to use the
SANICS (2009; Appendix A) was requested and granted by Dr. Yoon and colleagues.
Conceptual Definitions
Informatics
The ANA defines nursing informatics as “a specialty that integrates nursing science,
computer science, and information science to manage and communicate data, information, and
knowledge in nursing practice” (2008, p. 177). The ANA’s (2001) position statement also
notes that the purpose of nursing informatics is to support patients, nurses, and other providers in their decision-making in all roles and settings.
Informatics Competency
Staggers and colleagues made a significant contribution in the area of nursing informatics
when they defined the competencies, skills, knowledge, and abilities necessary for nurses based
on educational preparation and expertise: beginning nurse, experienced nurse, informatics nurse
specialist, and informatics innovator (Staggers, Gassert, & Curran, 2002). Since the present study focuses on entry-level nursing students, competencies defined for the beginning nurse are of primary relevance.
Informatics Knowledge. Informatics knowledge needed for nurses at the beginner level
has been categorized into data, impact, privacy/security, and systems information, and is outlined below:
1. Data
• Recognizes the use and/or importance of nursing data for improving practice
2. Impact
• Recognizes that a computer program has limitations due to its design and capacity
of the computer
• Recognizes that it takes time, persistent effort, and skill for computers to become
an effective tool
• Recognizes that the computer is only a tool to provide better nursing care and that
3. Privacy/security
management
4. Systems
• Identifies the basic components of the current computer system (e.g., features of a
Computer Skills. Staggers and colleagues outlined beginner level computer skills in the areas of administration, data access, documentation, education, monitoring, basic desktop software and systems skills (2002). A beginner level nurse should be able to perform skills such as the following:
1. Administration
• Uses administrative applications for practice management (e.g., searches for
• Uses applications for structured data entry (e.g., patient acuity or classification
applications)
• Uses the Internet to locate, download items of interest (e.g., patient, nursing
resources)
• Data access
• Accesses, enters, and retrieves data used locally for patient care (e.g., uses HIS,
• Documentation
• Education
• Monitoring
• Systems
• Is able to navigate Windows (e.g., manipulate files using file manager, determine
• Identifies the appropriate technology to capture the required patient data (e.g.,
• Demonstrates basic technology skills (e.g., turn computer off & on, load paper,
(pp. 1-2).
Operational Definitions
For the purpose of this study, an entry level nursing student is defined as any student
enrolled in the first year (first or second semester) of a traditional Baccalaureate of Science in
Nursing (BSN) program. Students enrolled in the Registered Nurse to Masters of Science in
Nursing (RN to MSN) Program were excluded from data collection procedures.
The online modules of SOLO-IT are housed within a learning management system (LMS) called Blackboard. An LMS uses web-based technologies to plan, organize and deliver on-
demand course content and also assess student performance and learner outcomes (Blackboard,
2004). Blackboard is one of the most popular LMSs nationally (Bradford, Porciello, Balkon, &
Backus, 2007; Ferrin, 2013), and was the system of choice for the mid-western university which
approved the conduct of this research and the completion of SOLO-IT by their BSN student
population.
Curricular instruction refers to informatics content delivered as part of a nursing school’s curriculum. This type of instruction typically occurs as part of a didactic course and
lasts for the duration of an academic semester or quarter. Curricular instruction may also refer to
informatics content delivered during the time of the student’s matriculation through the nursing program. Informatics training, by contrast, refers to instruction as part of a brief, intensive, episodic program designed to increase the informatics competencies of learners.
Competency on the SANICS. The SANICS instrument uses a Likert Scale ranging from 1 (not competent) to 5 (expert).
Instrumentation
The instrument used in this study is the Self-Assessment of Nursing Informatics Competency Scale (SANICS), developed by researchers at the Columbia University School of Nursing (Yoon, et al., 2009; Appendix A). Development of this tool evolved primarily from a
Delphi Study conducted by Staggers and colleagues which incorporates competencies for the
beginner and experienced nurse (Staggers, Gassert & Curran, 2001). Additional items were
added to the SANICS by the researchers in areas relating to standardized nursing terminologies,
evidence based practice, and wireless communications (2009). The reliability and validity of the
SANICS has been established using a combined sample of 336 BSN/MSN students (Yoon, et al.,
2009). Cronbach’s α measurements confirmed the internal consistency of related items within
the scale; no subscale reliability coefficient was less than α = .89. Reliability coefficients of .70
or higher are usually considered acceptable by most researchers (Bruin, 2006). Independent t-
tests confirmed the responsiveness of the SANICS over time across the five
categorical factors of the scale: 1) Clinical Role, 2) Basic Computer Knowledge, 3) Applied
Computer Skills, 4) Clinical Informatics Attitudes, and 5) Wireless Device Skills. The sub-
scales and individual items contained within the SANICS instrument are described below:
evaluation of systems
• Promote the integrity of and access to information to include but not limited to
• Act as advocate of leaders for incorporating innovations and informatics concepts into
• Navigate Windows
• Use applications to develop testing materials
• Recognize that one does not have to be a computer programmer to make effective use of
As described previously, the TIGER Collaborative has issued a call to action urging the development of education and training interventions which foster information technology innovation and adoption by nurses (2008). In response to
this call, TIGER urges the development of competency based training strategies utilizing pre-
and post- test data as measure of effectiveness in meeting competencies (Ball, et al, 2008).
SOLO-IT is a web-based remediation tool designed and created by the researcher to enhance and measure the
informatics competencies (computer skills and information literacy) of nursing students (Godsey,
2011). Online modules were designed to be self-guided and self-paced to allow nursing students
to learn, practice and demonstrate informatics competencies using common technological tools embedded into the Blackboard™ learning management system (Godsey, 2011).
Sample. The sample for this study will be taken from archived data collected from a
population of 271 entry level nursing students during the Spring 2014 semester. All students
enrolled in the first year of the Baccalaureate of Science in Nursing (BSN) Program were
enrolled into SOLO-IT as a required assignment for most sections of an entry-level nursing
course. Students were instructed to complete SOLO-IT’s six modules during the first two weeks
of the semester in order to prepare for web enhanced instruction. The location for this study was
a medium sized, private university located in Cincinnati, Ohio with an enrollment of 6700
students, including 637 nursing students. Entry level nursing students were selected as the
population for this study since they may be at risk for lacking the informatics competencies
necessary to support and enhance their nursing education. Additionally, this population of
students will have had minimal exposure to nursing informatics or the Blackboard learning
management system (LMS) which houses course content and provides the educational
Each competency measured by the five sub-scales of the SANICS has been incorporated
(either specifically or broadly) into the six educational modules of SOLO-IT. The SANICS was
(Yoon, et al., 2009; Appendix A). Analytical surveys “go beyond simple description; their
intention is to illuminate a specific problem through focused data analysis, typically by looking
at the effect of one set of variables upon another set” (Kelley, et al, 2003). An advantage of
survey research is that it allows for the production of empirical data resulting from “real-world”
situations (Kelley, et al., 2003, p. 261). Survey research can also produce a large amount of data
within a relatively short amount of time. Data resulting from representative samples can be
generalizable to other populations (2003). Certain disadvantages can also be present with survey
research. Response rates may be difficult to control, and data may lack detail or sufficient depth
(2003).
The SANICS survey was administered before and after completion of SOLO-IT. The
post survey also included Likert scale items, developed by the PI, measuring the experiences of
learners taking the course, and evaluations of overall course quality. Completion of SOLO-IT
also included successful submission of informatics assignments within the Blackboard ™ LMS.
Each module included quizzes and/or hands-on activities requiring actual demonstrations of informatics competencies. Students were required to complete SOLO-IT with a minimum score of 92%, with no limit on the number of attempts.
The structure of SOLO-IT provides an online framework where nursing students can
improve competence with commonly used technological tools (Godsey, 2011). Content was
designed to be consistent with the “beginner level” ICs identified by Staggers and colleagues
(2001).
Self-directed learning. The principles of self-directed learning (SDL) were the guiding framework for the design of SOLO-IT. Self-directed learning emphasizes the responsibility of the student to contribute to his or her own learning (Chang, 2006; Fisher et al.,
2001). Self-Directed Learning occurs “proactively, independently, and patiently” (Chang, 2006,
p. 269). Students engaged in SDL are charged with a responsibility to learn, schedule time for
learning, and plan for integral learning as a means to meet an objective (Chang, 2006). The
concept of individuals who are capable of understanding their own learning needs, goals, and
requirements for learning has been widely studied since its introduction (Knowles, 1975). Self-
directed learning means “learning something proactively, independently, and patiently; being
responsible to learn; learning which is a challenge; a self- training ability and a high curiosity”
(Chang, 2006, p. 269 ). Self-directed learning involves a process by which “learners take
responsibility for planning, carrying out, and evaluating their own learning experiences”
(DeMaris, 2012, p. 42). Self-directed learning requires that learners assume an active role as
investors in their own learning (Campbell, Campbell & Dickenson, 1996; Gureckis & Markant,
2012). Self-directed learning encourages students to assume some of the responsibility for their
own learning and views the instructor as facilitator (Hunt, Sproat, & Kitzmiller, 2004). The flow
of information intake is controlled by the learner in a manner that processes information into a usable form (2012). In the online environment, self-directed learners can progress through content at their own pace,
increasing the likelihood for retention of information which might otherwise be lost (Hunt,
Sproat, & Kitzmiller, 2004). Self-directed learning is among the most recommended educational models for adult learners in online environments.
Each of SOLO-IT’s six modules are embedded within the Blackboard Learning
System™, the most popular distance learning platform used by institutions of higher learning
(Ferriman, 2013). One of the key features of SOLO Informatics Training is the availability of
supplemental tutorials which are easy to identify and select for the struggling new learner.
Supplemental links lead students to additional demonstrations, screen shots, narrative
descriptions and competency based activities that reinforce content. Hands-on computer
activities occur liberally throughout the course and are structured so that competency
demonstrations are practiced repeatedly, in the absence of submission or time limits. A non-
punitive approach to evaluation is also a feature of SOLO-IT. All quizzes, demonstrations and
written assessments permit unlimited submissions and unlimited time for completion (see
Appendix B for SOLO-IT Course Overview and Grading Criteria; Godsey, 2011). This ‘safe
grading’ environment promotes repeated practice of computer and information literacy skills in a low-stakes setting. SOLO-IT provides easily accessible, digitally based resources where professional students can learn and develop as
competent informatics users. Students remain enrolled in SOLO-IT during the duration of their
MSN Program progression, and may access or repeat training modules, as often as desired.
The six modules of SOLO-IT are outlined below.
• Module I: Introduces nursing students to the SOLO-IT course and the Blackboard platform. The objectives for this module include:
2. Describe the purpose of the Successful Online Learning Orientation (SOLO-IT) course
3. Perform a browser and software check to enable the necessary functions for successful
course completion
• Module II- Navigating the Computer and Web: Introduces nursing students to the parts and
functions of the computer and presents practice exercises for browsing the web. The objectives for this module include:
2. Explore the use of the internet as a tool to inform nursing practice
• Module III: Introduces software applications commonly used by nurses, including spreadsheets, documents, and presentation software. The objectives for this
module include:
1. Apply the document construction principles of Microsoft Word, Excel, and Power Point
• Module IV- Information Literacy: Presents library and web based research tools that support
information literacy and Evidence Based Practice. The objectives for this module include:
3. Locate scholarly articles and journals that support nursing research and Evidence Based
Practice
4. Demonstrate the process for locating and evaluating scholarly articles and websites
2. Demonstrate the principles of scholarly writing using American Psychological Association (APA) style
• Module VI-Issues for the Professional e-Nurse: Introduces future e-nurses to professional
issues encountered in the clinical setting. The objectives for this module include:
4. Describe the roles and responsibilities of nursing in the area of health IT and Personal Health Records (PHRs)
5. Explain the difference between the electronic health record (EHR) and the electronic medical record (EMR)
(Godsey, 2011)
SOLO-IT addresses beginner level competencies in basic computer skills and information literacy. The following list outlines the competencies which students must demonstrate in order to complete SOLO-IT:
• Understand the term software
• Name files/folders
• Copy files
• Create a PowerPoint presentation containing a background and at least one graphic (as specified in the assignment)
• Understand and use a search engine
• Send an email
• Forward an email
• Delete an email
Plan for Data Collection and Analyses
This study required access to a significant population of nursing students and use of the
university’s proprietary LMS, Blackboard. As such, a process was initiated during the 2012-13
academic school year to determine the feasibility of conducting dissertation research exploring
1. Permission was granted by the Director of the School of Nursing and the Nursing Faculty
Organization (NFO) to conduct future dissertation research using a population of entry level nursing students.
2. Permission was requested and obtained from the university’s Distance Learning Department to deliver SOLO-IT within the university’s Blackboard LMS.
3. Curricular policies were developed outlining the process for students to follow to complete SOLO-IT; the policy was unanimously approved by the Nursing Faculty Organization. The policy included suggested guidelines for course implementation.
4. Nursing faculty granted permission for the PI of this study to be enrolled (as a Teaching Assistant) in the course sections hosting SOLO-IT.
5. Completion of SOLO-IT will be required as a course assignment and will be worth 10% of the total course grade for select courses.
6. A Completion Certificate was awarded for all students who successfully completed SOLO-
IT with a grade of a 92% or higher. SOLO-IT has a maximum of 100 points possible and is
worth 10% of the total course grade for select courses. The faculty determined that skills
obtained (or strengthened) through SOLO-IT are critically important and students would be
more likely to take the course seriously if assignment credit was associated with completion
of training.
Following more than one year of planning, it was concluded that dissertation research involving a population of entry level nursing students and the use of proprietary LMS software was feasible at this university.
Unanimous approval by faculty of the School of Nursing, permission from the Director of the
School of Nursing and the Educational Technology Department, and the preliminary approval
of the sponsoring university’s IRB have been successfully obtained to conduct dissertation
research. Approval from the University of Hawaii’s IRB has been obtained to extract and
analyze archived survey data as a means of evaluating the psychometric properties and responsiveness of the SANICS.
This research study will use principal component analysis to examine the factor structure and internal consistency reliability of the SANICS, to examine the psychometric properties of the SANICS
among a sample of BSN entry-level nursing students, and to determine if the SANICS is
responsive over time when used to measure informatics competencies before and after
implementation of SOLO-IT modules. Construct validity of the SANICS will be assessed using
a known group approach to compare differences in means for each SANICS subscale (construct)
before and after completion of Successful Online Learning and Orientation Informatics Training
(SOLO-IT) modules. Means, standard deviations, and sample sizes will be computed for items
within each SANICS sub-scale category to test the hypothesis that differences in scores from
pre- to post- survey (using archived data) will be significant following completion of SOLO-IT
training. Statistically significant differences from pre-survey to post-survey in amounts greater than zero may indicate that SOLO-IT training had a positive effect on students’ self-perceptions of informatics competency. Significant results with a negative difference would indicate an adverse effect.
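For illustration only, the short sketch below shows how such a pre/post comparison might be computed for a single SANICS sub-scale in Python; the numpy and scipy dependencies, the simulated placeholder arrays, and the variable names are assumptions made for this example and do not represent study data or the SPSS procedures actually planned.

    # Illustrative sketch of a pre/post comparison for one SANICS sub-scale.
    # The arrays below are simulated placeholders, not study data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    pre_scores = rng.normal(loc=3.0, scale=0.9, size=256)   # hypothetical pre-SOLO-IT sub-scale scores
    post_scores = rng.normal(loc=3.4, scale=0.8, size=242)  # hypothetical post-SOLO-IT sub-scale scores

    # Two-sample t-test at alpha = 0.05; groups are treated as independent
    # because the anonymous surveys cannot be paired at the individual level.
    t_stat, p_value = stats.ttest_ind(post_scores, pre_scores)
    mean_diff = post_scores.mean() - pre_scores.mean()
    print(f"Mean difference (post - pre) = {mean_diff:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")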
Completion requirements for SOLO-IT produced an unusually high overall course average of 94%. Because all students were required to successfully complete SOLO-IT, the resulting final scores were high and did not provide the data spread necessary to correlate demonstrated competency scores within SOLO-IT with self-reported competencies on the SANICS. Plans are underway to re-design the assessment portion of the course so that correlations between perceived and demonstrated competencies can be made.
Statistical Power. This study will use a sample size of almost 500 nursing students (256
pre and 242 post SOLO-IT). Recommendations for sample size in factor analysis vary widely in
the literature. Traditional recommendations have included 10 respondents per item (Sapnas &
Zeller, 2002). However, “hypothetical and real research examples illustrate the usefulness of
sub-sample analysis in determining that a sample size of at least 50 and not more than 100
subjects is adequate to represent and evaluate the psychometric properties of measures of social constructs” (Sapnas & Zeller, 2002).
A power analysis using the G*Power program (Faul & Erdfelder, 2007) was performed to
determine the sample size required for a t-test comparison of means at alpha= 0.05. The result of
this analysis indicated a total sample size of 202 participants (101 for each group) would be
necessary to obtain a medium effect size (d=0.5) and 95% power (Faul & Erdfelder, 2007;
Appendix C).
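The same calculation can be approximated outside of G*Power; the sketch below assumes the availability of the Python statsmodels package and returns approximately 105 participants per group, which differs slightly from the G*Power figure reported above because of rounding and approximation differences.

    # Approximate replication of the sample size calculation (two-sample
    # t-test, medium effect size d = 0.5, alpha = 0.05, power = 0.95).
    from statsmodels.stats.power import TTestIndPower

    n_per_group = TTestIndPower().solve_power(
        effect_size=0.5,        # Cohen's d (medium effect)
        alpha=0.05,             # two-sided significance level
        power=0.95,             # desired statistical power
        ratio=1.0,              # equal group sizes
        alternative="two-sided",
    )
    print(f"Required participants per group: {n_per_group:.1f}")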
Plan for Statistical Analysis. Survey Monkey™ was the software program used to
administer anonymous surveys using a 1-5 Likert Scale. Course evaluation procedures were also
conducted and included user satisfaction items ranked on a 1-5 Likert Scale, and open-ended
A SPSS 21 Grad Pack will be used for all psychometric analyses. Principal component
analysis will be used to “extract the maximum variance from the data set” (Tabachnick & Fidell, 2007), reducing the thirty items to a small number of components. Parallel analysis will be conducted using a simulation with the same number of variables and observations. Factor loadings and promax rotation with Kaiser normalization will be used to examine and confirm correlations
among factors (Tabachnick & Fidell, 2007). The performance of the SANICS and the five
categorical factors of the scale will be assessed using Cronbach’s alpha (Tabachnick & Fidell,
2007). A standardized response mean will be used to evaluate the responsiveness of the
SANICS over time (Neale & Liebert, 1986). Differences between pre- and post- survey scores
will be analyzed using two sample t-tests at p = 0.05. The factor structure and internal
consistency reliability of the SANICS will be compared with the study conducted by the authors
of the SANICS instrument (Yoon, Yen & Bakken, 2009) to determine if the instrument’s
psychometric properties remained consistent when used in a sample of BSN entry level students
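To make the planned extraction concrete, the sketch below outlines the same steps (principal components extraction, promax rotation, and proportion of variance explained) in Python; the factor_analyzer package, the hypothetical file name, and the DataFrame named items are illustrative assumptions, since the analyses themselves will be run in SPSS 21.

    # Illustrative sketch of the planned factor analysis. The 30 SANICS item
    # responses are assumed to be columns of a DataFrame named `items`; the
    # file name below is hypothetical.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    items = pd.read_csv("sanics_pre_responses.csv")

    fa = FactorAnalyzer(n_factors=5, rotation="promax", method="principal")
    fa.fit(items)

    loadings = pd.DataFrame(fa.loadings_, index=items.columns)   # rotated factor loadings
    _, _, cumulative = fa.get_factor_variance()                   # cumulative variance explained
    print(loadings.round(2))
    print(f"Variance explained by five factors: {cumulative[-1] * 100:.1f}%")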
Ethical Considerations
University of Hawaii Institutional Review Board (IRB). Approval has been granted
by the University of Hawaii’s IRB to conduct a psychometric analysis of archived SANICS data
(Appendix D). The request was made for exempt review since the study will involve no risk to
former study participants who completed SOLO-IT and participated in pre/post SANICS data
collection procedures. All data were collected in accordance with study procedures, as approved
by the sponsoring agency (Xavier University) at the time the SOLO-IT course was implemented.
All SANICS pre- and post-survey data were collected anonymously on an off-site third party
secure server. The software (Survey Monkey) has a decoding feature which provides the
following security functions: Firewall restricts access; intrusion detection systems and other
systems that detect and prevent interference or access from outside intruders; QualysGuard
network security audits performed weekly; McAfee SECURE scans performed daily; all data is
stored on servers located in the United States; backups occur hourly internally, and daily to a
centralized backup system for offsite storage; backups are encrypted (Survey Monkey, 2013).
Since the time of data collection, all study results have been stored securely within Survey Monkey. Upon enrollment, students accessed a SOLO-IT course page where they were introduced to the modules and provided a link to the pre-course survey.
Students were informed: 1) participation in the pre/post survey portion of the course was voluntary, 2) responses would be anonymous, 3) no attempt would be made to match survey responses with identifiers in Blackboard, and 4) survey data would be collected
and stored anonymously on Survey Monkey (secure, off-site, encrypted, survey software).
Students were also advised that any survey response (or lack of response) would have no impact
on assignment or course grade. Students were also informed that aggregate evaluative data
would facilitate course evaluation strategies and support quality improvement measures.
Xavier University IRB. To ensure compliance with university level policies and
procedures and to determine the level of cooperation by the university’s Institutional Review
Board (IRB), a SOLO-IT research proposal was submitted requesting advanced permission to
conduct future PhD student research using a population of nursing students attending the
sponsoring university. The preliminary request for IRB approval was made to determine the
feasibility of performing future research at the sponsoring institution in partial fulfillment of the
dissertation requirements leading to a PhD in Nursing from the University of Hawaii for the
study’s Principal Investigator (PI). Study approval was granted by the sponsoring university’s
IRB and included an open ended date for collection and analyses of study data (Appendix E).
Informed consent was obtained from each student participant. Students were advised of
study procedures and the voluntary nature of survey completion, as evidenced by: 1) the option
to click the link to gain voluntary access to the survey, 2) the option to voluntarily complete pre-
and post- surveys, and 3) the anonymous nature of the survey via an off-site secure server.
Nursing faculty of select entry level courses volunteered to include SOLO-IT completion
as one of the required assignments leading to partial fulfillment of one or more course objectives
(see Appendix C for Course Overview and Grading Criteria; Godsey, 2011). Completion of the
SOLO-IT modules was required for select entry level courses. However, completion of all
SOLO-IT related pre- and post- competency surveys was strictly voluntary.
During the first week of the semester, students enrolled into the SOLO-IT course
received an introductory e-mail message from the PI and SOLO-IT instructional designer
describing the training modules and outlining access to SOLO-IT training. To measure
perceptions of IC, students were asked to complete voluntary surveys prior to and immediately
following completion of SOLO-IT. Student volunteers were informed that data resulting from
completion of surveys would inform this research study and would support quality initiatives
within the nursing program. Instructions accompanying pre- and post-surveys restated and
emphasized the voluntary and anonymous nature of study participation. Students were also
advised that survey responses would be analyzed as aggregate data and no attempt would be
made to match enrollment data resulting from SOLO-IT training with anonymous survey data.
Additionally, students were informed that, while completion of SOLO-IT training is a course requirement, participation in the surveys would have no impact on assignment or course grades. Finally, students were advised that
survey data would be collected via an off-site, secure server with encryption features. Refusal to
participate in the data collection procedures associated with SOLO-IT survey completion would
not be tracked in any way, further providing assurance that penalties could not be incurred for non-participation.
The one-group pre/post-test design used in this study is considered superior to a post-test only design because it allows for the effect of the intervention to be estimated. However, this design does not control for potentially confounding variables, such as history, maturation or
regression artifact (Johnson & Christensen, 2004). This study was taken from one sample of
undergraduate nursing students from a single private university in the mid-west, which may limit generalizability. In addition, self-reported competency ratings could not accurately represent actual ability to demonstrate competency. Finally, the possibility for a conflict
of interest exists given the intervention for this study (SOLO-IT) was also designed and
implemented by the researcher. Full disclosure of this association will be made in all forms of dissemination of study findings.
Summary
The components of this study included a thorough literature review describing computer
competency, information literacy skills, and the current state of informatics competency among nurses and nursing students. The review identified a significant gap in the literature regarding the availability of valid and reliable informatics training resources and valid, reliable tools to measure informatics competencies. The SOLO-IT intervention was designed to enhance the informatics competencies (computer skills and information literacy) of nursing students. A one-group pre/post-test design using the SANICS was utilized in a sample of undergraduate nursing students. Surveys comprised of competency items contained within the SANICS instrument were administered via Survey Monkey to student volunteers prior to and at the completion of SOLO-
IT.
This study will explore the psychometric performance of the SANICS. Archived pre/post
SOLO-IT data will be used to assess the factor structure and internal consistency reliability of
the SANICS, and to examine the psychometric properties of the SANICS among a sample of
BSN entry-level nursing students, and to determine if the SANICS is responsive over time when
used to measure informatics competencies following completion of SOLO-IT. This study will
also evaluate the mean rating differences of SANICS scores before and after completion of
SOLO-IT. Construct validity of the SANICS will be assessed using a known group approach to
compare differences in means for each SANICS subscale (construct) before and after completion of SOLO-IT.
Chapter IV
Results
This chapter provides descriptions of the study sample and presents the factor structure,
internal consistency reliability and responsiveness of the SANICS instrument. Findings reported in this chapter address the following research questions:
What are the psychometric properties of the SANICS among a population of BSN entry-level
students?
Is the SANICS responsive over time when used to measure informatics competencies before and after completion of SOLO-IT?
Research Design
This study explored the psychometric performance of the SANICS instrument using
archived survey data collected before and after completion of SOLO-IT. A pre/post-test design
using the 30-item SANICS was utilized in a sample of undergraduate nursing students. Surveys
comprised of competency items contained within the SANICS instrument were administered via Survey Monkey to student volunteers prior to and at the completion of SOLO-IT.
During the period of January-March, 2014, two hundred seventy one (271) entry level
BSN students from a medium sized mid-western university were enrolled into SOLO-IT. A total
of 229 students (85%) successfully completed all SOLO-IT assignments, requiring an average of
two attempts per assignment in order to achieve the minimum passing score of 92% (overall
course average was 97%). Forty two students started, but did not complete all assignments in
SOLO-IT (although 13 of these students still chose to complete the post-training SANICS
survey). Most, but not all, course instructors required SOLO-IT completion. Courses which
lacked the formal requirement of SOLO-IT completion likely negatively influenced completion rates.
Of the 271 students originally enrolled in SOLO-IT, 256 completed the pre-course
SANICS survey (response rate of 94.4%) and 242 completed the post-training SANICS
survey (response rate of 89.2%), for a total sample size of 498 BSN students (see Figure 2).
Both the pre- and post- training surveys were administered using Survey Monkey software. All
data were entered and analyzed using Microsoft Excel and SPSS version 18.0 statistical
software.
Figure 2
Profile of Respondents
The sample for this study was 89.5% female and 10.5% male. Most of the student
volunteers were aged 20-29 years (58.4%), followed by 30-39 years (25.3%). Students in the combined 40-64 age group made up 16.4% of the study population (12.1% in the 40-49 age group and 4.3% in the 50-64 age group).
Almost all of the BSN students in this study used computers for more than two years
(97.7%). Six students (2.3%) reported using computers for less than six months. The majority
of students reported using computers several times a day or daily (94.2%), followed by several
times a week (5.1%). Two students (0.8%) reported using computers only several times a month or never.
Table 3
BSN Demographics

Computer Experience                              n        %
Just started using in the past 6 months          6      2.3%
In the past 2 years                               0      0.0%
More than 2 years                               251     97.7%
Total                                           257    100.0%

Frequency of Computer Use                        n        %
Several times a day                             195     76.2%
Once a day                                       46     18.0%
Several times a week                             13      5.1%
Several times a month                             1      0.4%
Never                                             1      0.4%
Total                                           256    100.0%
The first research question addressed by this study is: What are the psychometric properties of the SANICS among a population of BSN entry-level students? In response to this question, the following discussion outlines the factor structure, internal consistency reliability, and construct validity of the SANICS.
Principal component analysis was performed to determine the factor structure of the 30-
item SANICS (Table 4). Promax rotation with Kaiser Normalization was used to examine
correlations among five factors: Basic Computer Knowledge and Skills, Clinical Informatics Role, Applied Computer Skills, Clinical Informatics Attitudes, and Wireless Device Skills. Almost all (27 of 30) factor loadings increased from pre- to post-SOLO-IT. Five factors accounted for 71.6% of the variance in the 30-item scale (pre-SOLO-IT), and the percentage of variation explained increased following completion of SOLO-IT.
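The factor analyses reported here were conducted in SPSS. For readers who wish to replicate the variance-explained calculation, the following is a minimal Python sketch, not the study's actual code; the DataFrame sanics_items (one column per SANICS item, one row per respondent) is a hypothetical placeholder, and the promax rotation with Kaiser normalization reported above would additionally require a dedicated factor-analysis routine.

import numpy as np
import pandas as pd

def variance_explained(items: pd.DataFrame, n_components: int = 5) -> float:
    # Proportion of total variance captured by the first n_components
    # principal components of the item correlation matrix.
    corr = np.corrcoef(items.to_numpy(), rowvar=False)   # 30 x 30 item correlation matrix
    eigenvalues = np.linalg.eigvalsh(corr)[::-1]          # eigenvalues, largest first
    return eigenvalues[:n_components].sum() / eigenvalues.sum()

# Hypothetical usage: sanics_items holds the 30 item ratings (1-5), one row per respondent.
# sanics_items = pd.read_csv("pre_solo_it_sanics.csv")
# print(f"Five components explain {variance_explained(sanics_items):.1%} of the variance")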
Table 4
Parallel Analysis
Parallel analysis was conducted to determine which factors to retain. This simulation
method compared observed eigenvalues with eigenvalues from a random sample consisting of
the same number of variables and observations. All primary factor loadings exceeded 0.50 after rotation, the cut-off used for retention of items. Observed and simulated eigenvalues intersected at the five-factor level, further validating the five-factor structure of the SANICS (see Figure 3).
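The retention decision can be reproduced with a short simulation. The sketch below is a hypothetical Python illustration of Horn's parallel analysis rather than the study's code: it generates random data with the same number of observations and variables, averages the simulated eigenvalues over repeated draws, and counts how many observed eigenvalues exceed their simulated counterparts.

import numpy as np

def parallel_analysis(observed: np.ndarray, n_sims: int = 100, seed: int = 0) -> int:
    # Number of components whose observed eigenvalues exceed the mean
    # eigenvalues of random data with the same shape.
    rng = np.random.default_rng(seed)
    n_obs, n_vars = observed.shape
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(observed, rowvar=False)))[::-1]
    sim_eig = np.zeros(n_vars)
    for _ in range(n_sims):
        random_data = rng.standard_normal((n_obs, n_vars))
        sim_eig += np.sort(np.linalg.eigvalsh(np.corrcoef(random_data, rowvar=False)))[::-1]
    sim_eig /= n_sims
    return int(np.sum(obs_eig > sim_eig))

# Hypothetical usage with a 247 x 30 matrix of pre-training item responses:
# print(parallel_analysis(pre_training_matrix))   # a five-factor solution would return 5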
Figure 3
Parallel Analysis
(Scree-style plot of observed and simulated eigenvalues, 0 to 14, against component number, 0 to 30; the lines intersect at the five-factor level.)
Construct validity was further assessed by comparing group means before and after the training intervention (Shuttleworth, 2014). The presence of statistically significant differences between pre- and post-SOLO-IT scores suggests that the construct validity of the SANICS is good. Effect sizes for the thirty items were also examined.
Internal consistency reliability was determined using Cronbach’s α for each of the five
sub-scales: Applied Computer Skills, Clinical Informatics Role, Wireless Device Skills, Basic Computer Knowledge and Skills, and Clinical Informatics Attitudes. Alpha values were compared between 1) scores obtained before and after completion of SOLO-IT and 2) those reported by Yoon and colleagues (2009) (see Table 5). Cronbach's α following SOLO-IT (.95-.97) exceeded the values reported in the original SANICS study (.89-.94) for each of the five sub-scales of the instrument. These alphas
would be considered in the excellent range (Nunnally & Bernstein, 1994). Almost half (14 of
30) of the factor loadings were .90 or greater following SOLO-IT completion, compared to 23%
(7 of 30) pre-SOLO-IT.
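Cronbach's α for each sub-scale can be computed directly from the item responses using the standard formula, α = (k / (k - 1)) * (1 - sum of item variances / variance of the scale total). The Python sketch below is an illustrative re-implementation with hypothetical variable names, not the SPSS procedure used in the study.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    # Cronbach's alpha for an (n_respondents x k_items) array of item scores.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)    # variance of the sub-scale total
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: columns of wireless_items are the two Wireless Device Skills items.
# print(round(cronbach_alpha(wireless_items), 2))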
Table 5
Cronbach's α, Sub-scale Means (SD), and Item Factor Loadings for the Original SANICS Study (Yoon et al., 2009), Pre-SOLO-IT, and Post-SOLO-IT (values in each row are ordered: original study, pre-SOLO-IT, post-SOLO-IT)
Basic Computer Knowledge and Skills (15 items): α = .94, n = 321, M (SD) = 3.86 (.71) | α = .94, n = 247, M (SD) = 3.66 (1.15) | α = .97, n = 238, M (SD) = 4.10 (.82)
Use telecommunication devices: .73 | .60 | .77
Use the Internet to locate, download items of interest: .70 | .60 | .86
Use database management program to develop a simple database: .68 | .63 | .74
Use database applications to enter and retrieve information: .81 | .71 | .85
Conduct on-line literature searches: .74 | .69 | .86
Use presentation graphics to create slides, displays: .74 | .81 | .85
Use multimedia presentations: .74 | .84 | .86
Use word processing: .72 | .63 | .86
Use networks to navigate systems: .75 | .78 | .86
Use operating systems: .74 | .86 | .84
Use existing external peripheral devices: .79 | .80 | .78
Use computer technology safely: .80 | .53 | .89
Navigate Windows: .77 | .60 | .87
Identify the basic components of the computer system: .77 | .58 | .83
Perform basic trouble-shooting in applications: .81 | .64 | .82

Applied Computer Skills: Clinical Informatics (4 items): α = .89, n = 330, M (SD) = 2.45 (1.03) | α = .93, n = 255, M (SD) = 2.25 (1.36) | α = .95, n = 240, M (SD) = 3.24 (1.07)
Use applications for diagnostic coding: .71 | .90 | .89
Use applications to develop testing materials: .69 | .92 | .92
Access shared data sets: .75 | .88 | .93
Extract data from clinical data sets: .77 | .90 | .92

Clinical Informatics Attitudes (4 items): α = .94, n = 332, M (SD) = 3.74 (.97) | α = .95, n = 255, M (SD) = 3.82 (1.14) | α = .97, n = 242, M (SD) = 4.15 (.85)
Recognize that health computing will become more common: .82 | .92 | .93
Recognize human functions that cannot be performed by computer: .83 | .92 | .94
Recognize that one does not have to be a computer programmer to make effective use of the computer in nursing: .83 | .92 | .96
Recognize the value of clinician involvement in the design, selection, implementation, and evaluation of applications, systems in health care: .78 | .93 | .94

Wireless Device Skills (2 items): α = .90, n = 328, M (SD) = 2.75 (1.16) | α = .95, n = 255, M (SD) = 3.44 (1.25) | α = .96, n = 242, M (SD) = 4.07 (.84)
Use wireless device to download safety and quality care resources: .77 | .82 | .90
Use wireless device to enter data: .76 | .84 | .90
88
Presentation of Data for Research Question Two
The following discussion answers the next question posed by this research: Is the
SANICS responsive over time when used to measure informatics competencies following completion of SOLO-IT? Students rated their perceived competencies for each of the 30 SANICS items using a five-point Likert scale, from one (not competent) to five (expert). Means, standard deviations, and sample sizes were calculated for each SANICS sub-scale category (Clinical Informatics Role, Basic Computer Knowledge and Skills, Applied Computer Skills, Clinical Informatics Attitudes, and Wireless Device Skills). It was not possible to pair sample data due to the unplanned loss of an item on the post-survey (during a Blackboard upgrade) which requested the participant's numerical identification number. Independent two-sample t-tests were therefore used to test the hypothesis that there would be a significant difference in SANICS scores between pre- and post-SOLO-IT.
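Because responses could not be paired, each sub-scale was compared with an independent two-sample t-test. The sketch below illustrates that comparison in Python, with hypothetical arrays pre_scores and post_scores holding one sub-scale mean per respondent; Cohen's d is included as one common effect-size estimate, although the specific effect-size procedure used in the study is not reproduced here.

import numpy as np
from scipy import stats

def compare_groups(pre_scores: np.ndarray, post_scores: np.ndarray):
    # Independent two-sample t-test plus Cohen's d for unpaired pre/post data.
    t_stat, p_value = stats.ttest_ind(pre_scores, post_scores)
    n1, n2 = len(pre_scores), len(post_scores)
    pooled_var = ((n1 - 1) * pre_scores.var(ddof=1) +
                  (n2 - 1) * post_scores.var(ddof=1)) / (n1 + n2 - 2)
    cohens_d = (post_scores.mean() - pre_scores.mean()) / np.sqrt(pooled_var)
    return t_stat, p_value, cohens_d

# Hypothetical usage with one value per respondent for a single sub-scale:
# t, p, d = compare_groups(pre_applied_skills, post_applied_skills)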
All mean differences were significantly higher on each of the five SANICS sub-scale
categories following completion of SOLO-IT (p < 0.001). Differences from pre-score to post-score, ordered from greatest to least, are delineated in Table 6.
Interestingly, differences in mean SANICS scores from pre- to post-SOLO-IT were greatest in two sub-scale categories ("Applied Computer Skills" and "Clinical Informatics Role"). An explanation for this finding could relate to the advanced wording of some of the competencies listed under these sub-scales, and will be discussed further in Chapter V.
Table 6
Comparison of SANICS Sub-scale Scores Pre- and Post-SOLO-IT

Sub-scale                    Pre M    Pre SD    Post M    Post SD    p         Direction
Applied Computer Skills      1.87     1.10      3.89      0.87       < .001    Post-SOLO-IT is higher
Clinical Informatics Role    2.65     1.27      3.52      0.97       < .001    Post-SOLO-IT is higher
Wireless Device Skills       3.44     1.25      4.07      0.84       < .001    Post-SOLO-IT is higher
Mean SANICS scores following SOLO-IT completion were then compared with scores
from the original SANICS study conducted by Yoon et al. (2009) in a population of BS/MS students taking an informatics course. The population for that study consisted of all students
participating in a curriculum which emphasized informatics tools for patient safety, modeling,
and monitoring. The curriculum included didactic lectures on informatics for patient safety and
web-based reporting of hazards and near misses. Significantly higher mean SANICS scores
were reported post-SOLO-IT on all five sub-scales compared to the mean SANICS scores
reported on the original SANICS study (p < 0.001) (see Table 7).
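This comparison relies on summary statistics alone (means, standard deviations, and sample sizes). The sketch below shows how such a comparison can be reproduced with SciPy's summary-statistics t-test, using the Wireless Device Skills values reported in Tables 5 and 7; it is an illustrative calculation rather than the SPSS procedure used in the study.

from scipy import stats

# Wireless Device Skills: original SANICS study vs. post-SOLO-IT
# (means, SDs, and sample sizes taken from Tables 5 and 7).
t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=2.75, std1=1.16, nobs1=328,   # original study (Yoon et al., 2009)
    mean2=4.07, std2=0.84, nobs2=242,   # post-SOLO-IT
)
print(f"t = {t_stat:.2f}, p = {p_value:.2g}")   # p falls well below .001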
Table 7
Comparison of Original SANICS Study Scores (Yoon et al., 2009) with Post-SOLO-IT Scores

Sub-scale                 Original M    Original SD    Post-SOLO-IT M    Post-SOLO-IT SD    p         Direction
Wireless Device Skills    2.75          1.16           4.07              0.84               < .001    Post-SOLO-IT is higher
Value of SOLO-IT
Post-training SANICS surveys were identical to pre-training SANICS surveys, except for
the addition of 10 optional Likert scale items rating the overall effectiveness of SOLO-IT. At the
completion of the final assignment in SOLO-IT, students were asked to rate the value of the
training on a 1-5 Likert scale. Composite rating of all items was 4.0. Scores were highest on the
items “I found instructions in SOLO-IT clear and easy to understand” (4.1), “SOLO-IT provided
information in a manner that was easy to comprehend” (4.1), and “I feel confident I could now
use Blackboard” (4.2). Scores were lowest on the items “using SOLO-IT was appropriate for my
learning as a nursing student" (3.8), and "I would recommend SOLO-IT to other students" (3.6) (see Table 8).
Table 8
Value of SOLO-IT
Item                                                     n      M      SD
I found the SOLO-IT site easy to use and follow.        241    4.0    0.82
The layout and design of SOLO-IT was user friendly.     241    4.0    0.79
Composite                                                      4.0
Summary
This chapter presented the factor structure, internal consistency reliability, and
responsiveness of the SANICS instrument before and after SOLO-IT training, and compared
those findings with the original SANICS research conducted by Yoon and colleagues (2009).
Evidence was presented which supports the reliability and validity of the SANICS as a tool to measure perceived informatics competencies before and after completion of an online informatics training intervention. The next chapter will discuss implications of these findings for nursing practice and recommendations for future research.
Chapter V
Discussion
The previous chapter presented findings from this study. This chapter will include a
discussion of those findings, their implications for nursing practice, and recommendations for future research.
This study used a sample of 498 nursing students tested on the 30 item SANICS before
(n=256) and after (n=242) completion of SOLO-IT. The number of participants in this study exceeded the minimum sample size recommended for psychometric evaluation of an instrument (Sapnas & Zeller, 2002). Response rates for both pre- and post-SOLO-IT surveys were high (94.4% and 89.2%, respectively).
Study Participants
As is typical for an undergraduate nursing program, the majority of students were female
(89.5%) and between the ages of 20-39 (83.7%). Most students reported frequent use of computers, with 94.2% reporting use at least daily and 97.7% reporting more than two years of computer experience. The younger age of participants and the frequency of computer usage suggest that most students were exposed to computers well before entering the nursing program. Despite this familiarity with computers, this younger group of nursing students still reported significant increases in perceived competencies for each SANICS sub-scale following completion of SOLO-IT.
Research Question #1
What are the psychometric properties of the SANICS among a population of BSN entry-level
students?
The psychometric properties of the SANICS have been well described by Yoon and
colleagues (2009) with their combined sample of 337 BSN/MSN students. Reliability and
validity of the instrument were evident with their sample. Findings from this study also support the SANICS as a psychometrically sound instrument when used before and after informatics training with entry-level BSN students. Principal component analysis supported the five-factor structure of the SANICS and was consistent with findings from the original study. Reliability of the instrument was high, and the percentage of variation explained, as well as loadings within factors, increased from pre- to post-training. Primary factor loadings were all greater than .50 after rotation, and parallel analysis of observed versus simulated eigenvalues validated the five-factor structure of the scale.
Internal consistency reliability was also high for the 30-item scale. Cronbach's α was in the excellent range (.95-.97) following SOLO-IT when compared with pre-training and original SANICS values. None of the sub-scale alphas was less than .89. Reliability coefficients of .70 or higher are usually considered acceptable by most researchers (Nunnally & Bernstein, 1994).
Research Question #2
Is the SANICS responsive over time when used to measure informatics competencies following completion of SOLO-IT?
The independent two sample t-test is “the most basic statistical test that measures group
differences…[and] analyzes significant differences between two group means” (Mertler &
Vannatta, 2010, p. 14). Independent t-tests confirmed the responsiveness of the SANICS over
time and were conducted for each of the five categorical factors of the scale: 1) Clinical Informatics Role, 2) Basic Computer Knowledge and Skills, 3) Applied Computer Skills, 4) Clinical Informatics Attitudes, and 5) Wireless Device Skills. Significantly higher mean scores following SOLO-IT were reported on each of the five SANICS sub-scales, compared to pre-SOLO-IT (p < 0.001) and when
compared to the original SANICS study (p < 0.001). This finding further supports the
responsiveness of the SANICS over time, when used to measure perceived informatics
competencies of nursing students. Each SANICS sub-scale score increased following SOLO-IT
(from a low of 1.87 pre-training to 3.89 post-training for the “Applied Computer Skills”
category, to the highest sub-scale scores for the “Basic Computer Knowledge and Skills”
category, which went from 3.66 pre- to 4.10 post-training). A competency score of 3.0 or greater was achieved on every sub-scale following completion of SOLO-IT.
Differences from pre-score to post- score were greatest in the sub-scale categories
“Applied Computer Skills” (+2.02) and “Clinical Informatics Role” (+.87). These sub-scales
contained items such as "use applications for diagnostic coding", "extract data from clinical data sets", and "act as an advocate or leader for incorporating innovations and informatics concepts into
their area of specialty”. Such a large increase from pre- to post-training suggests SOLO-IT may
have effectively introduced more advanced practice application concepts to entry level nursing
students who would not be expected to have experience or familiarity with these principles. The
areas with the least change from pre- to post-SOLO-IT were "Basic Computer Knowledge and Skills" and "Wireless Device Skills", as might be expected in a younger population of nursing students who reported frequent computer use.
Value of SOLO-IT
Post-training items to evaluate program effectiveness indicate that students valued the
online presentation and content offered by SOLO-IT. Composite score for all items was 4.0,
with the highest score reported for the item “I feel confident I could now use the online learning
course site, Blackboard” (4.2). Program evaluation scores further support the potential value of
online informatics training as a tool to prepare and familiarize students with the educational technologies they will encounter during the nursing program.
Limitations
The sample population for this study was confined to one academic institution in the Midwest, which may limit generalizability to other settings. A one-group pre/post-test design does not control for potentially confounding variables such as history, maturation, or regression toward the mean.
Lack of randomization and the requirement (rather than the option) for entry level
students to complete SOLO-IT training could have also confounded study findings. Students
who felt they already possessed requisite computer and information literacy skills may have
reluctantly completed the training, and may have even resented the lack of an option to waive the
training requirement. Likewise, students who struggled with technology skills may have found
the online nature of the training insufficient for their learning needs, since it lacked immediate, face-to-face instructor support. Repeated attempts at assignments may have given struggling students a sense of informatics competency, even when further remediation was needed. In addition, self-reported competencies may not accurately represent ability to demonstrate competency. Participants in this study may have
wanted to present themselves in a positive light, thus enhancing self-perceptions and biasing
study findings. Study subjects "have a tendency to want to present (themselves) in the best
light, and this may conflict with the truth” (Polit & Hungler, 1995, p. 312-13).
Completion of SOLO-IT provided many competency demonstration exercises and
knowledge acquisition assignments (quizzes). An average of two attempts per assignment was
necessary for most students to successfully pass each module and achieve the total minimum
completion score of 92%. Repeated attempts allowed students to review content and
practice/repeat demonstration exercises, but also resulted in high scores for every student
successfully completing the training (overall completion average was 97%). Such high
completion scores made the point spread within the data quite narrow and prevented any meaningful analysis of the relationship between course performance and perceived competency.
The confounding variables of age and lack of nursing experience may have impacted
students’ understanding of some of the SANICS items. The majority of students (58.4%) fell
into the 20-29 year old age group. As first year students, it can be assumed these students had
little, if any, direct nursing experience. Competency on the SANICS assumed that students
possessed a certain degree of critical thinking skills. However, entry level nursing students may
not have understood some of the more complex items on the SANICS. For example, students
with no nursing experience may have lacked the background to fully comprehend and accurately
self-assess the competency item, "Recognize the value of clinician involvement in the design, selection, implementation, and evaluation of applications, systems in health care." Higher post-training scores could have simply reflected more familiarity with the item following informatics training, rather than a true gain in competency. Objective demonstration of informatics competencies could provide comparative, objective data that would more accurately measure
competency, and facilitate remediation strategies specific to the learning need. Sufficient
research exists delineating the lack of informatics competency among nurses and nursing
students. However, the lack of published studies examining the role of nursing informatics training limits the ability to compare these findings with those of other educational interventions.
Finally, the possibility for a conflict of interest exists given the intervention for this study
(SOLO-IT) was also designed and implemented by the researcher. Full disclosure of this
association will be made in all forms of reporting and in the dissemination of research findings.
This study was limited to a single population of undergraduate nursing students. Future
studies should correlate demonstrated and perceived competencies, and should include more diverse populations drawn from undergraduate and graduate programs, as well as nurses in clinical practice.
During the course of the study, the version of Blackboard was updated by the university,
requiring the SOLO-IT course be imported into the new version. During this process, the item
on the post-survey which prompted students to enter their numerical identification number was
inadvertently omitted, making it impossible to pair post- with pre-training responses. Future
studies should include numerically paired codes which could allow for paired comparisons of
scores for each participant, as well as correlations of actual skill demonstrations with perceived
competency scores.
While nursing informatics competencies have been described for more than a decade, a
comprehensive review of the literature revealed a surprising lack of research studies describing informatics training interventions for nursing students and nursing professionals. Published literature describing valid, reliable assessment tools to measure
the effect of informatics competency training among nurses or nursing students was also sparse.
Limitless opportunities exist for the creation, delivery, and evaluation of validated informatics
training products and instruments which can help ensure a highly qualified nursing profession prepared to practice in an era of ubiquitous health information technology.
Summary
This study explored the psychometric performance of the SANICS. Evidence was
presented which supports the SANICS as a psychometrically sound instrument when used in a
population of BSN entry-level nursing students. Archived pre/post SOLO-IT data were used to
confirm the factor structure and internal consistency reliability of the SANICS. This study
effectively demonstrated the responsiveness of the SANICS over time and supported its use as a
valid tool to measure pre- and post- informatics competencies associated with an online
informatics training intervention. Significant differences in each sub-scale mean score before
and after completion of SOLO-IT further supported the construct validity of the SANICS.
Results of this study suggest that SOLO-IT may be an effective tool for improving
perceptions of computer competencies among entry level BSN students. Future studies are
recommended which include paired samples of nurses and nursing students from various
populations, which would allow for more extensive correlations. Finally, research is needed that keeps pace with rapidly evolving technologies and regulatory demands. The concept of "competency" must be continually re-defined if metrics are
to remain current and reflective of a requisite informatics skillset for informed nursing practice.
As the nation’s largest consumers of health information technologies, it is no longer acceptable
for a coalition of 20 nursing informatics societies to endorse the European Computer Drivers’
License (ECDL) as the recommended solution for the lack of informatics competencies among
nurses. The ECDL was developed for use by a wide range of industries and does not address the
essential computer skills required in healthcare, or the unique challenges faced in the healthcare
the face of ubiquitous change in healthcare will likely remain unrealized until the profession of
nursing responds with empirically grounded training innovations and validated instruments
Appendix A: Self-Assessment Nursing Informatics Competency Scale
For each statement, indicate your current level of competency on the scale of 1 to 5, where:
1 = Not competent, 2 = Somewhat competent, 3 = Competent, 4 = Proficient, and 5 = Expert.
Not competent | Somewhat competent | Competent | Proficient | Expert
1. As a clinician (nurse), participate in the selection process, design, 1 2 3 4 5
implementation and evaluation of systems
6. Use different options for connecting to the internet (phone line, mobile 1 2 3 4 5
phone, cable, wireless, satellite) to communicate with other systems (e.g.,
access data, upload, download)
16. Use existing external storage devices (e.g., network drive, CD, DVD, 1 2 3 4 5
USB flash drive, memory card, online file storage)
18. Navigate Windows (e.g., manipulate files using file manager, determine 1 2 3 4 5
active printer, access installed applications, create and delete directories)
19. Identify the basic components of the computer system (e.g., features of 1 2 3 4 5
a PC, workstation)
23. Access shared data sets (e.g., Clinical Log Database, Minimum Data 1 2 3 4 5
Set)
24. Extract data from clinical data sets (e.g., Clinical Log Database, 1 2 3 4 5
Minimum Data Set)
26. Recognize that the computer is only a tool to provide better nursing 1 2 3 4 5
care and that there are human functions that cannot be performed by
computer
29. Use wireless device (PDA or cellular telephone) to locate and download 1 2 3 4 5
resources for patient safety and quality care
Appendix B: Course Overview and Grading Criteria
In order to prepare you for success in the nursing program, the nursing faculty at Xavier University
would like you to complete a nursing informatics training course called SOLO for a DELTA (SOLO). This
training course is embedded into an off-site distance learning platform called CourseSites© and is
operated by Blackboard©.
SOLO (Successful Online Learning and Orientation) for a DELTA (Digitally Enhanced Learning and
Technological Arena) helps prepare nursing students for success by providing a self-guided, online
framework where new and experienced learners can work at their own pace to develop (or improve)
overall informatics competencies. One of SOLO’s key features is the ability for students to self-select
supplemental resources and tutorials to use as much or as often as needed. In SOLO, all students are
encouraged to repeat exercises, assignments, or quizzes with no limit on time or number of submissions.
Is SOLO for a DELTA required?
Yes. All new incoming graduate and undergraduate students will be required to complete SOLO for a
DELTA as a graded assignment in one of their courses: N 130 [BSN], N505 [MSN], N550 [MIDAS], N 496
[RN-MSN] and N556 [CNL]. SOLO for a DELTA is also a course pre-requisite for N 854: Advanced
Informatics [ALL MSN, including FNP].
SOLO is recommended early in the nursing program since the information contained in SOLO will help
prepare you with the foundational technological skills necessary for the nursing program and for nursing
practice.
Successful completion of SOLO is a pre/co-requisite for N 854 and is associated with assignment credit in
the courses listed above. Once you have completed SOLO, you will not have to repeat it during the
course where it is required (but you will still receive assignment credit/grade, if appropriate). Once SOLO
is completed, you will receive a Certificate of Completion with your final score. Be sure to keep this
certificate, and show it to your instructor as proof that you successfully completed the SOLO for a DELTA
Informatics Training Course.
The average time to complete SOLO is approximately 4 hours (with a range of 2-22 hours, depending
upon your experience and comfort level with computer applications).
Length of time varies widely, since students are encouraged to repeat exercises, assignments, and
quizzes as often as needed, in order to demonstrate competency. There is no limit in the amount of
time spent on a module, or the number of times an assignment or quiz can be re-submitted.
SOLO can also be accessed as often as needed in order to complete all six modules listed below. Since
each module builds upon information from previous modules, it is recommended the course be
completed within a one-two week period from the start date.
Module I- Introduction to SOLO: Introduces nursing students to the features of SOLO and
the Blackboard platform.
Module II- Navigating the Computer and Web: Introduces nursing students to the parts and
functions of the computer and presents practice exercises for browsing the web.
Module III- Computer Applications: Presents computer applications commonly used by
nurses, including spreadsheets, documents, and presentation software.
Module IV- Information Literacy: Presents library and web based research tools that
support information literacy and Evidence Based Practice
Module V- Preparation for Research: Presents an overview of professional writing and
publication principles.
Module VI-Issues for the Professional e-Nurse: Introduces future e-nurses to professional
issues encountered in the clinical setting.
Once you have enrolled into the SOLO course, you will first be asked to participate in a survey as part of
a research study. Completion of surveys is strictly voluntary, but will greatly assist us in evaluating the
effectiveness of the SOLO course. As part of the study, you will do the following:
1) Prior to beginning SOLO, complete an anonymous 30 item survey. The survey will involve
ranking your present informatics skills and competencies on a scale from 1 to 5.
2) After completing SOLO, repeat the same anonymous survey (with some additional items
included for course evaluation purposes).
o Survey data are collected via a secure off-site server with SSL encryption.
o Refusal to participate in the data collection procedures associated with SOLO cannot be
tracked, and there are no penalties for lack of participation or type of response.
o Choosing to enter and complete the survey will indicate your permission to participate.
You will need to complete SOLO during the first two months of the semester as a required assignment in
one of the courses listed at the beginning of this document.
How do I enroll into the SOLO Course?
1) It’s easy! Check your Xavier email! We will e-mail your enrollment invitation. When you
receive it, click on the link provided in the e-mail to be taken to CourseSites where you will
register.
Select a user name that does not include your actual name or other identifiers (e.g.,
Nurse1234), since pre and post course survey data will be paired anonymously by user
names. Pairing of anonymous data will allow us to determine how much (if any),
perceptions of informatics competency changed as a result of SOLO.
Data will be reviewed and reported in aggregate form only and cannot be traced to
individual users.
2) Once you create your account, you will be automatically enrolled and can begin.
3) For your convenience, a SOLO Instructional Manual (Word document) is available at the
bottom of the first page of the SOLO course.
4) When you complete the course and all modules have been graded, a Completion Certificate
will be available in CourseSites. The certificate can be printed or saved. Your instructor may
require you to submit this certificate as proof that you successfully completed the course.
5) It is recommended that you add completion of SOLO for a DELTA Informatics Training Course
to your resume!
An application requesting four continuing education (CE) units of credit has been submitted to the Ohio
Board of Nursing. If/when that application is approved, CE credit will be awarded and a certificate
provided to those who successfully completed all six modules of SOLO for a DELTA.
No problem! We are here to help! Simply contact the SOLO Help Desk at help@
. We will look into your issue and respond within 12 hours (excluding weekends and
holidays).
Wishing you every success on your SOLO journey and in your nursing education,
SOLO for a DELTA©: An Online Informatics Training Course
Successful Online Learning and Orientation (SOLO)
for a DELTA (Digitally Enhanced Learning and Technological Arena)©
Grading Criteria
SOLO for a DELTA Module Objectives and Competency Demonstrations/Activities | Needs to Repeat Module and Assignment | Needs to Repeat Module and Repeat Assignment | Needs to Review and/or Repeat Module and Re-submit Assignment | Recommend Review of Module; Repeat or Correct and Re-submit Assignment | Achieved Maximum Points Allowable; No need to repeat Module or Assignment | POINTS
Module I Objectives:
Introduction to SOLO: Introduces nursing
students to the features of SOLO and the
Blackboard platform. Objectives for this
module include:
Module II Objectives
Navigating the Computer and Web:
Introduces nursing students to the parts
and functions of the computer and presents
practice exercises for browsing the
web. Objectives for this module include:
Describe the parts and functions of a
personal computer
Explore the use of the internet as a
tool to inform nursing practice
Demonstrate the ability to use and
manage an email account
Demonstrate understanding of the
importance of anti-virus protection
Demonstrate the ability to back up
computer files
For Modules I and II, you will demonstrate competencies by performing these activities:
1. Show What You Know: Computers Quiz (5 points); point ranges: 0-1 / 2 / 3 / 4 / 5
2. Show What You Know: Internet Quiz (5 points); point ranges: 0-1 / 2 / 3 / 4 / 5
Module IV Objectives
Information Literacy: Presents library and web based research tools that support information literacy and Evidence Based Practice (EBP). Objectives for this module include:
Demonstrate the process for
locating and evaluating scholarly
websites
Discuss safeguards that should be
applied when using social media
Module V Objectives
Preparation for Research: Presents an
overview of professional writing and
publication principles. Objectives for this
module include:
Describe how to avoid plagiarism
Explain some of the common
issues associated with copyright
Demonstrate the basic principles
of scholarly writing using American
Psychological Association (APA)
format
Explore the role of Evidence Based
Practice (EBP) in nursing
Describe the importance of
nursing research posters as a
means to disseminate research
findings
For Modules IV and V, you will demonstrate competencies by performing these activities:
1. Show What You Know: Locate a Research Article; Complete an Assignment (30 points); point ranges: 0-9 / 10-19 / 20-25 / 26-29 / 30
2. Show What You Know: Plagiarism Assignment (10 points); point ranges: 0-2 / 3-5 / 6-7 / 8-9 / 10
3. Show What You Know: Research and APA Quiz (10 points); point ranges: 0-2 / 3-5 / 6-7 / 8-9 / 10
Module VI
Issues for the Professional e-
Nurse: Introduces future e-nurses to
professional issues encountered in the
clinical setting. Objectives for this module
include:
Explore the role of informatics in
healthcare
Describe how Information
Technology (IT) is transforming
health care
Describe changes to healthcare
due to the ARRA and ACA
Describe the roles and
responsibilities of nursing in the
area of health IT and PHI
Explain the difference between
EHR and EMR
Outline opportunities for nursing
in today’s technology rich
healthcare environment
TOTAL POINTS
Appendix C: Power Analysis
Appendix D: University of Hawaii IRB Approval
Appendix E: Xavier University IRB Approval
References
http://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/ils_recomm.pdf
American Nurses Association (2001). Scope and standards of nursing informatics practice.
Nursesbooks: Silver Spring, MD.
Bakken, S., Sheets, Cook, S., Curtis, L., Soupios, M., & Curran, C. (2003). Informatics
Bakken, S., Cook, S., Curtis, L., Desjardins, K., Hyun, S., Jenkins, M., John, R., Klein, W.,
Paguntalan, J., Roberts, W., & Soupios, M. (2004). Promoting patient safety through
581—589.
Ball, Douglas & Hinton-Walker (2011). Where caring and technology meet. Springer
Ball M. & Hannah K. (1984). Using computers in nursing. Reston Publishing: Reston, Va.
http://library.corporateir.net/library/17/177/177018/items/153636/BBBB%2004AR.pdf?q
=blackboard-inc
Bradford, P., Porciello, M., Balkon, N., & Backus, D. (2007). The Blackboard Learning
System: The be all and end all in educational instruction? Journal of Educational
http://baywood.metapress.com/link.asp?id=x137x73l52615656
Bruin, J. (2006). Programs for data analysis. UCLA: Academic Technology Services,
http://www.ats.ucla.edu/stat/stata/ado/analysis/
Campbell, L., Campbell, B., & Dickinson, D. (1996). Teaching & learning through multiple
Carmines, E. & Zeller, R. (1979). Reliability and validity assessment, vol. 17. Sage
p. 1970-1976.
Choi, J., & Bakken, S. (2013). Validation of the self-assessment of nursing informatics
Chou, P. (2012). The relationship between engineering students' self-directed learning abilities
Cohen, J. (1977). Statistical power analysis for the behavioural sciences. New York:
Academic Press.
Cole, I. & Kelsey, A. (2004). Computer and information literacy in post-qualifying education.
http://www.ecdl.org/programmes/index.jsp?p=2931&n=2954
Courey, T., Benson-Soros, J., Deemer, K., & Zeller, R. (2006). The missing link: Information
literacy and evidence-based practice as a new challenge for nurse educators. Nursing
Creswell, J. (2009). Research design: Qualitative, quantitative, and mixed methods approaches.
Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika (16):
Curran, C. R. (2003). Informatics competencies for nurse practitioners. AACN Clinical Issues,
14, 320–330.
Dee, C. & Stanley, E. (2005). Information-seeking behavior of nursing students and clinical
nurses: Implications for health sciences librarians. Journal of the Medical Library
Delaney, C., & Piscopo, B. (2007). There really is a difference: Nurses' experiences with
DeMaris, A. (2012). Self-directed learning for the adult vocational voice student and
Desjardins, K. Cook, S., Jenkins, M., Bakken, S. (2005). Effect of an informatics for evidence
Economist (2012). All too much: Monstrous amounts of data. The Economist Newspaper,
http://www.thejeo.com/Archives/Volume8Number2/EdwardsandOConnorPaper.pdf
Elder, B. & Koehn, M. (2009). Assessment tool for nursing student computer competencies.
Faul, F., Erdfelder, E., Lang, A., & Buchner, A. (2007). G*Power 3: A flexible statistical
power analysis program for the social, behavioral, and biomedical sciences. Behavior
Journal of Nursing Education 48(2), p. 86-90.
Ferriman, Justin (2013). Blackboard still rules higher ed. Learn Dash Retrieved from:
http://www.learndash.com/survey-blackboard-still-rules-higher-ed/
Fisher, M., King, J., Tague, G. (2001). The development of a self-directed learning
readiness scale for nursing education. Nurse Education Today, 21, 516–525.
Flood, L., Gasiewicz, N., & Delpier, T. (2010). Integrating information literacy across a BSN
Foster, S. & Cone, J. (1995). Validity issues in clinical assessment. Psychological Assessment,
(7), p. 248–260.
Franklin, Gibson, Robertson, Pohlmann & Fralish (1995). Parallel analysis: a method for
http://opensiuc.lib.siu.edu/cgi/viewcontent.cgi?article=1004&context=pb_pubs
Godsey, J. (2011). Successful online learning and orientation (SOLO) for a DELTA (Digitally
Enhanced Learning and Technological Arena). Web based informatics training modules
Glanz, K., Rimer, B., Viswanath, K. (2008). Health Behavior and Health Education, 4th
Graves, J., & Corcoran, S. (1989). The study of nursing informatics. Image: The Journal of
Gross, M. & Latham, D. (2009). Information literacy: Defining, attaining, and self-assessing
Qualitative research. Thousand Oaks: Sage, p. 195–220
Gugerty, B. & Delaney, C. (2009). The TIGER informatics competencies collaborative final
10.1177/1745691612454304
Guttman, L. (1941). The quantification of a class of attributes: a theory and method of scale
Health Information and Management System Society (2012). Nursing informatics position
http://www.himss.org/asp/topics_FocusDynamic.asp?faid=578.
Findings from the 2008 national sample survey of registered nurses. Washington, DC:
Hebda, T. & Czar, P. (2013). Handbook of informatics for nurses and healthcare professionals.
Hebert, C. (1999). Nurses and informatics. National Nursing Informatics Project Discussion
Hinkin, T. R. (1998). A brief tutorial on the development of measures for use in survey
questionnaires. Organizational Research Method, 1(1).
Hunt, E., Sproat, S., & Kitzmiller, R. (2004). The nursing informatics implementation guide.
Hunter, K., McGonigle, D., & Hebda, T. (2013). The integration of informatics content in
baccalaureate and graduate nursing education: a status report. Nurse Educator, 38(3),
p. 110-113.
Hunter, K. McGonigle, D. & Hebda, T. (2013). The TIGER virtual demonstration center.
Husted J., Cook, R., Farewell, V. & Gladman, D. (2000). Methods for assessing responsiveness:
Hwang, J., & Park, H. (2011). Factors associated with nurses' informatics competency. CIN:
http://www.iom.edu/Reports/2003/Health-Professions-Education-A-Bridge-to-
Quality.aspx
Institute of Medicine (2010). The future of nursing: Leading change, advancing health.
from: http://www.iom.edu/Reports/2010/The-Future-of-Nursing-Leading-Change-
Advancing-Health/Report-Brief-Education.aspx?page=1
Jacobs, S. (2014). Nursing resources: A self-paced tutorial and refresher. New York University
Johnson, B., & Christensen, L. (2004). Educational research: Qualitative, quantitative, and
Kaminski, J. (2011). Pre-test for attitudes toward computers in healthcare (PATCH) self-
Kaplan, S (2014). Nursing resources: Self-paced tutorial and refresher. New York University.
Available at http://nyu.libguides.com/nursingtutorial
Kaya, N. & Turkinaz, A. (2008). Validity and reliability of turkish version of the pretest for
Kelley, K., Clark, B., Brown, V., & Sitzia, J. (2003). Good practice in the conduct and reporting
of survey research. International Journal for Quality in Health Care, 15(3), p. 261-266,
doi: 10.1093/intqhc/mzg031.
Knowles, M.S. (1975). Self-directed learning: A guide for learners and teachers. New York:
Association Press.
Kossman, S., Bonney, L., & Kim, M. (2014). Electronic health record tools’ support of
Nursing, 539-544.
Lin, T. (2011). A computer literacy scale for newly enrolled nursing college students:
Lord, F. (1958). Some relations between Guttman’s principal components of scale analysis and
Lupo, D. & Erlich, Z. (2001). Computer literacy and applications via distance e-learning.
Majid, S., Foo, S., Luyt, B., Zhang, X., Theng, Y., Chang, Y., Mokhtar, I. (2011). Adopting
McCormack, B., Kitson, A., Harvey, G., Rycroft-Malone, J., Titchen, A., Seers, K. (2002).
Getting evidence into practice: The meaning of ‘context’. Journal of Advanced Nursing,
38(1): 94-104.
McDowell, D. & Ma, X. (2007). Computer literacy in baccalaureate nursing students during
Mcginn, C., Gagnon, M., Shaw, N., Sicotte, C., Mathieu, L., Leduc, Y., Grenier, S., Duplantie,
J., Abdeljelil, A. & Legare, F. (2012). Users’ perspectives of key factors to implementing
electronic health records in Canada: A Delphi study. BMC Medical Informatics and
McNeil, B., Elfrink, V., Pierce, S., Beyea, S., Bickford, C., Averill, C. (2005). Nursing
pp. 1021-1030.
Mertler, C. A., & Vannatta, R. A. (2010). Advanced and multivariate statistical methods:
Middel, Berrie, & van Sonderen (2002). Statistical significant change versus relevant or
Psychological Science. Retrieved from:
http://www.psychologicalscience.org/index.php/news/releases/what-makes-self-directed-
learning-effective.html.
National League for Nursing (2008). Position Statement: Preparing the next generation of
Neale, J., Liebert, M. (1986). Science and Behaviour: An introduction to methods of research,
Norman G., Wyrwich, K. & Patrick, D. (2007). The mathematical relationship among different
Nunnally, J. & Bernstein, I. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.
Nursing Executive Center (2008). Bridging the preparation-practice gap: Quantifying new
graduate nurse improvement needs. The Advisory Board Company: Washington, D.C..
development/Shared%20Documents/Manager%20Tools/Published%20Articles/Bridging
%20the%20Preparation%20Practice%20Gap.10.10.pdf
Ornes, L. & Gassert, C. (2007). Computer competencies in a BSN Program. Journal of Nursing
Pearson Vue (2000). Certiport IC³ internet and computing certification. Available at:
https://www.certiport.com/portal/common/documentlibrary/IC3_GS4_Program_Overvie
w.pdf
Phillips, D. & Burbules, N. (2000). Post-positivism and educational research. Lanham, MD:
Journal of Nursing Informatics, 5(4).
Polit, D. & Hungler, B. (1999). Nursing research: Principles and methods. Philadelphia:
Lippincott, p. 312-13.
Rajalahti, E., Saranto, K. (2012). Nursing informatics competences still challenging nurse
Reynaldo, J., & Santos, A. (1999). Cronbach's Alpha: A Tool for assessing the reliability of
http://www.libraries.rutgers.edu/rul/rr_gateway/research_guides/nursing/tutorial/
Saba, V. & McCormick, K. (2011). Nursing informatics: Essentials of computers for nurses.
Scholes M. & Barber B. (1980). Towards nursing informatics. In Medinfo, Lindberg DAD,
Shorten, A., Wallace M. & Crookes P. (2001). Developing information literacy: A key to
from: https://explorable.com/construct-validity
Skiba, D., Connors, H, Jeffries, P. (2008). Information technologies and the transformation of
Spratley, E., Johnson, A., Sochalski, J., Fritz, M., & Spencer, W. (2000). The registered nurse
population: Findings from the national sample survey of registered nurses. Rockville,
MD: U.S. Dept. of Health & Human Services, Public Health Service, Health Resources
http://academic.stedwards.edu/competency/module2/Lesson1/wwwandinterntgetconnecte
d.htm
Staggers, N., & Gassert, C. (2000). Competencies for nursing informatics. In B. Carty
(Ed.), Nursing informatics: Education for practice (pp. 17-34). New York: Springer.
Staggers, N., Gassert, C., & Curran, C. (2001). Informatics competencies for nurses at
Staggers, N., Gassert, C., & Curran, C. (2002). A Delphi study to determine informatics
competencies for nurses at four levels of practice. Nursing Research, 51(6), p. 383-390.
Staggers N., Thompson C., Happ B., Bartz C. (1998). A new definition for nursing informatics
Tabachnick, B., & Fidell, L. (2007). Using multivariate statistics. Boston: Pearson/Allyn &
Bacon.
the information literacy and academic writing skills of part-time post-registration nursing
Thede, L., & Sewell, J. (2010). Informatics and nursing: Competencies and applications
Initiative: Informatics competencies for every practicing nursing. Recommendations
http://www.thetigerinitiative.org/docs/TigerReport_VirtualLearningCenter_000.pdf
Turley, J. (1996). Toward a model for nursing informatics. Image: Journal of Nursing
US Department of Health and Human Services (2010). Affordable Care Act: About the Law.
http://www.sciedu.ca/journal/index.php/jnep/article/viewFile/1425/1035
Wallace, M., Shorten, A., Crookes, P., McGurk, C., & Brewer, C. (1999). Integrating
19(2), p. 136-41.
Waltz, C.F., Strickland, O.L., & Lenz, E.R. (2005). Measurement in nursing and health research
Wang, V. C., & Cranton, P. (2012). Promoting and implementing self-directed learning
Waters, Rochester & McMillan (2012). Drivers for renewal and reform of contemporary nursing
Westra, B. & Delaney, C. (2008). Informatics competencies for nurses and healthcare leaders.
Yoon, S., Yen, P, Bakken, S. (2009). Psychometric properties of the self-assessment of nursing