
Toward Web Browsers that Make or Break Trust: A Usability Evaluation of Certificate Management in Mozilla Firefox 3

Hazim Almuhimedi, School of Computer Science, Carnegie Mellon University, hazim@cmu.edu
Amit Bhan, Heinz School of Public Policy and Management, Carnegie Mellon University, abhan@andrew.cmu.edu
Dhruv Mohindra, Heinz School of Public Policy and Management, Carnegie Mellon University, dmohindr@andrew.cmu.edu
Joshua S. Sunshine, School of Computer Science, Carnegie Mellon University, josh.sunshine@cs.cmu.edu

ABSTRACT

Web browsers are the gateway to the Internet and control how a user interacts with remote resources. Browsers have traditionally made basic security decisions automatically and left important decisions to the end user. For example, when confronted with a website offering an invalid SSL certificate to establish a secure connection, the decision to accept the certificate was left to the user's discretion. The browser's mechanism involved showing a series of warning messages urging the user to exercise caution. In this paper, we discuss the usability-versus-security trade-offs of a new certificate management scheme implemented in Mozilla Firefox version 3.0. We highlight issues with placing trust decisions on the end user and related usability concerns. Through a pilot study, an online survey, and an in-lab user study, we uncover the mental model of the user when faced with tasks such as opening trusted and untrusted websites using the new interface. Additionally, we analyze the presence or absence of security indicators and warnings in the new Firefox interface and their relative intuitiveness compared to previous versions.

Keywords

Certificates, Browsing, Firefox, Usable Security

Categories and Subject Descriptors

C.2.0 [Computer-Communication Networks]: General: Security and Protection; H.1.2 [Models and Principles]: User/Machine Systems: Human factors, Software psychology; H.5.2 [Information Interfaces and Presentation]: User Interfaces: Evaluation/methodology, Graphical user interfaces

General Terms

Security, Human Factors

Copyright is held by the author/owner. Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee. Symposium on Usable Privacy and Security (SOUPS) 2008, July 23-25, 2008, Pittsburgh, PA, USA.

1. INTRODUCTION

The internet as we know it today depends on secure communication: electronic commerce relies on the private transmission of payment information, social networking sites and configurable portals on the secret exchange of login credentials, and so on. Certificates and encryption are the fundamental building blocks of the Transport Layer Security (TLS) and Secure Sockets Layer (SSL) protocols used to protect transmitted data. However, current browsers rely on users to perform (at least) two tasks to support secure communication: recognize encrypted websites and respond to invalid certificates. Both tasks are more complicated than they seem at first glance. In the first task, users are asked to consider, every time they send data over the internet, whether the data needs to be kept secret. If it does, they are further asked to check whether the website is encrypted, based on notification mechanisms they may not recognize or comprehend. In the second task, a user visits a website with an invalid certificate and is asked to decide whether or not to proceed, without any knowledge of the man-in-the-middle vulnerability the certificate protects against.
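The first of these tasks reduces, in the simplest reading, to checking the URL scheme. The sketch below is illustrative only: an https scheme indicates an encrypted connection, but it says nothing about who holds the certificate on the other end.

```python
from urllib.parse import urlparse

def uses_encryption(url: str) -> bool:
    """The check users are implicitly asked to perform: an https URL
    means the connection is protected by TLS/SSL; an http URL is not."""
    return urlparse(url).scheme == "https"

# The hard part is not the check itself but knowing to make it, and
# knowing that it says nothing about whether the site is trustworthy.
```

The gap between this one-line check and what users actually do is the subject of the survey in Section 3.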
Studies have confirmed that these tasks are unusable: users do not notice or do not understand the encryption identification mechanisms in browsers, and they connect to websites with invalid certificates just as they connect to websites with valid certificates. Browser designers have recognized this fundamental flaw in their designs, and both major browsers, Mozilla Firefox and Internet Explorer, have implemented substantial changes to their certificate management interfaces in their upcoming releases (Firefox 3 and IE 8, respectively). In this paper we describe a two-pronged experimental evaluation of the relevant features of Firefox 3. The first prong is a 270-user online survey which asked users to evaluate a screenshot of a website displayed in Firefox 2 or 3 and identify whether the website was encrypted. The second prong is a 10-user, in-lab user study which compared the reactions of users to invalid certificates from both familiar and unfamiliar websites, in Firefox 2 and 3. The survey was designed to evaluate the encryption identification task discussed earlier, and the user study the invalid certificate reaction task.

The remainder of this paper is organized as follows. In Section 2, we present the certificate management interface and encryption notification features of Firefox 3 and compare these features to those in Firefox 2. Section 3 details the encryption notification survey methodology and presents the results. Section 4 describes the lab-based certificate management user study methodology and summarizes the results. We review related work in Section 5, present future work in Section 6, and conclude in Section 7.

2. FIREFOX 3 SECURITY

A digital certificate is a representation of information that identifies the certification authority issuing the certificate, names or identifies the certificate's subscriber, contains the subscriber's public key, identifies its operational period, and is digitally signed by the issuing certification authority. Certificates can be obtained from public CAs, from system administrators or special CAs within an organization, or from web sites offering specialized services that require a means of identification more reliable than an ID and a password.

2.1 Certificate Management

Certificate management in Firefox has the following basic tasks:

1. Manage personal certificates.
2. Load certificates into web browsers for authentication in intercommunication activities like registration, electronic signature, etc.
3. Manage server certificates.

The Certificate Manager can be used to manage the certificates that are available in the browser. Certificates may be stored on the computer's hard disk, on smart cards, or on other security devices attached to the computer. To open the Certificate Manager:

1. In Windows, open the Tools menu and choose Options. (In Mac OS X or Linux, open the Edit menu and choose Preferences.)
2. Under the Privacy and Security category, choose Certificates. (If no subcategories are visible, click to expand the list.)
3. In the Manage Certificates section, click Manage Certificates. The Certificate Manager appears.

The "Your Certificates" tab in the Certificate Manager allows the user to examine and work with the certificates that identify the user. The user can view, back up, or delete each certificate.

The "Authorities" tab in the Certificate Manager allows the user to examine and work with the certificates on file that identify certificate authorities (CAs). Websites also use certificates to identify themselves. Such identification is essential before the web site can encrypt information transferred between the site and the user's computer, so that no one can read the data while it is in transit. If the URL for a web site begins with https://, the website has a certificate. If a user visits such a website and its certificate was issued by a CA that the Certificate Manager doesn't know about or doesn't trust, the user is asked whether to accept the web site's certificate. When a new certificate is accepted, the Certificate Manager adds it to its list of web site certificates.

2.2 Invalid Certificates

In Firefox 2, when a website with an invalid certificate is encountered, the user sees the dialog box shown in Figure 1.

Figure 1: Firefox 2 Invalid Certificate Warning

The dialog box appears as a popup as soon as the user directs the browser to a website with an invalid certificate. Users are asked to verify the identity of the site by checking whether it is trustworthy. They are also given reasons why the error could have occurred. Before trusting the CA for any purpose, users should examine the site's certificate and its policies and procedures by clicking "Examine Certificate". After examination, users have three choices: they can accept the certificate permanently, which indicates that they completely trust the website; they can accept the certificate temporarily for the session; or they can choose not to connect to the website. Most users, inevitably, click OK and move on to the desired location without reading what the dialog box says. The dialog's first line states, "Unable to identify mail.gtllimited.com as a trusted site," followed by a few reasons why the browser generated the error. By default, "Accept this certificate temporarily for this session" is selected, and therefore users choose this option. To improve this situation, Firefox 3.0 introduced a new look for this error message in its Beta 4 release. The new message is displayed whenever there is a problem with a site's identity certificate.
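The reasons listed in the Firefox 2 dialog correspond to a handful of checks on the certificate's fields. The following sketch models those checks with invented names; real browsers verify the full signature chain and much more, so this is a teaching model, not an implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Certificate:
    subject: str          # hostname the certificate names
    issuer: str           # CA that signed it
    not_before: datetime  # start of the operational period
    not_after: datetime   # end of the operational period

# Stand-in for the browser's built-in CA store (hypothetical entry).
TRUSTED_CAS = {"Example Root CA"}

def validate(cert: Certificate, hostname: str, now: datetime) -> list:
    """Return the reasons a browser would warn, as in the Firefox 2 dialog."""
    problems = []
    if cert.subject != hostname:
        problems.append("certificate is for a different site")
    if cert.issuer == cert.subject:
        problems.append("certificate is self-signed")
    elif cert.issuer not in TRUSTED_CAS:
        problems.append("issuing CA is not trusted")
    if not (cert.not_before <= now <= cert.not_after):
        problems.append("certificate is outside its operational period")
    return problems  # an empty list means no warning is shown
```

Under this model, a self-signed certificate such as the one presented by mail.gtllimited.com fails the issuer check even though every other field may be perfectly in order, which is exactly the case the warning dialogs are trying to explain to users.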
Instead of giving users a choice of what they should do, Firefox 3.0 Beta 4 presents a simple error message. It looks very similar to the "The connection timed out" message. The error message states plainly: "mail.gtllimited.com uses an invalid security certificate. This certificate is not trusted because it is self signed." Unlike Firefox 2, there is no OK button; instead there is a blue link, "Or you can add an exception..."

Figure 2: Firefox 3 Invalid Certificate Warning

Once the user clicks "Or you can add an exception...", the box in Figure 3 opens in the same window. The user now has the choice of clicking either "Get me out of here!" or "Add Exception...". If the user trusts the website, he will click "Add Exception..."; if not, he will click "Get me out of here!", which takes him to his homepage. Clicking "Add Exception..." opens another dialog box (Figure 4).

Figure 3: Firefox 3, "Get Me Out of Here!" and "Add an Exception" buttons

After the user clicks "Get Certificate", he has the option of viewing the certificate. The dialog also offers the option to "permanently store this exception", which is selected by default. If the user wants to store the certificate only for the session, he can uncheck this box. The user then clicks "Confirm Security Exception" and is finally taken to the website. These certificates are stored in the Certificate Manager.

Figure 4: Menus to add an exception in Firefox 3. Notice the "permanent checkbox" at the bottom of the dialog.

3. ENCRYPTION NOTIFICATION SURVEY

We evaluated the encryption notification mechanisms in Firefox 2 and 3 with an online survey. The survey was completed by 290 participants. The favicon popup message was found to be particularly effective at communicating to users that a given website is encrypted. The following subsections detail the methodology, results, and limitations of the survey.
3.1 Methodology

The survey was broken up into three pages. The first page asked for a range of demographic information: browser use, gender, age, and level of education. In addition, participants were asked four questions designed to determine their level of technical sophistication: 1) Have you ever designed a website? 2) Have you ever registered a domain name? 3) Have you ever used SSH? 4) Have you ever configured a firewall? We thoroughly examined the data for correlations between the results and demographic information but did not find any significant correlation, so we do not mention the demographic information again. The main surprise is that technically sophisticated users, as determined by our four questions, were no better at determining whether a website is secure.

The second and third pages each presented a screenshot of a website in Firefox 2 or Firefox 3. Three random numbers were drawn to determine the websites and the order in which to display them. The first random number selected one screenshot of the Sanford Institution for Savings website from the eight displayed in Figure 5. Two of the eight screenshots (Figure 5(a) and (b)) are from Firefox 2 and the remainder are from Firefox 3. The second random number selected a screenshot, from the same browser as the first image, of one of three popular websites: Google, Wikipedia, and Ask. The final random number determined whether the image of the popular website or of Sanford was displayed first. To summarize, all users saw two randomly selected images in a random order: one of Sanford and one of Google, Ask, or Wikipedia. Users were asked two questions about each image. The first, a yes-or-no question, asked "Does the webpage displayed in the image to the left use encryption?". The second, a free-response question, asked "How do you know?" Users were recruited to the survey with a promise to give a $100 Amazon gift card to one participant.
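The three-draw assignment just described can be sketched as follows. This is a reconstruction from the text, not the authors' code (the survey was actually implemented in C# and ASP.NET), and the label strings are invented.

```python
import random

# Figure 5 screenshots of the Sanford site; (a) and (b) are Firefox 2.
SANFORD = ["a", "b", "c", "d", "e", "f", "g", "h"]
POPULAR = {2: ["Google-ff2", "Wikipedia-ff2", "Ask-ff2"],
           3: ["Google-ff3", "Wikipedia-ff3", "Ask-ff3"]}

def assign_condition(rng=random):
    # Draw 1: one of the eight Sanford screenshots.
    sanford = rng.choice(SANFORD)
    browser = 2 if sanford in ("a", "b") else 3
    # Draw 2: a popular site rendered in the same browser version.
    popular = rng.choice(POPULAR[browser])
    # Draw 3: which of the two images the participant sees first.
    pair = [sanford, popular]
    if rng.random() < 0.5:
        pair.reverse()
    return pair
```

Keeping the popular-site screenshot in the same browser version as the Sanford screenshot means each participant's two answers probe one browser's indicators rather than mixing the two.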
(On a fourth page of the survey, users were asked to enter their email address to be eligible as the winner.) Advertisements were posted online in five locations: prizey.blogspot.com, Facebook, Craigslist in Pittsburgh and Washington D.C., and direct email correspondents. Web tracking data indicates that 62% of the traffic was directed from Prizey and 15% from Facebook. The survey was implemented in C# and ASP.NET and deployed on a Windows-based Carnegie Mellon web server.

3.2 Results

The most interesting results are displayed in Figure 5, which shows the responses to the encryption question for all Sanford screenshots.

Figure 5: One of the eight screenshots was displayed to every survey participant, who was asked, "Does the webpage displayed in the image use encryption?" Results for each screenshot:
(a) Firefox 2, Encrypted: Yes 71%, No 29%
(b)* Firefox 2, Unencrypted: Yes 29%, No 71%
(c) Firefox 3, Encrypted: Yes 58%, No 42%
(d)* Firefox 3, Unencrypted: Yes 30%, No 70%
(e) Firefox 3, Encrypted with Favicon Popup: Yes 79%, No 21%
(f) Firefox 3, Encrypted without Extended Validation: Yes 63%, No 37%
(g) Firefox 3, Encrypted with Favicon Popup off the Browser Chrome: Yes 80%, No 20%
(h)* Firefox 2, Unencrypted with Favicon Popup off Browser Chrome: Yes 69%, No 31%
*Screenshots b, d, and h are unencrypted, so the correct answer was "No".

Image (c), which represents the Sanford website as it is displayed by default in Firefox 3 (i.e., how the vast majority of users will see it in Firefox 3), confuses more users than Image (a), the Firefox 2 equivalent (58% correct in Firefox 3 vs. 71% correct in Firefox 2). Sanford pays GoDaddy for extended validation, and Image (c) reflects that fact. Users are not yet used to extended validation, so one might think that the difference just discussed reflects extended validation's newness. However, Image (f) displays the Sanford website as if it did not have extended validation, and the situation hardly improves.

Users are dramatically better at determining that the Sanford website is encrypted when shown the favicon popup, as in Image (e): they correctly decide that the website is encrypted 79% of the time. However, there is a fairly straightforward spoof that takes advantage of the popup. Note that for (b), (d), and (h) the correct answer is "No": the website does not use encryption. One immediately notices that Image (h), which displays a spoofed popup that looks exactly the same as the popup in (e), has by far the most incorrect responses; it fools 69% of respondents.

When one looks at the responses to the "How do you know?" question (Figure 6), explanations for the above phenomena emerge.

Figure 6: Categorized responses to the question "How do you know?" for all Sanford screenshots. Response categories: Guess, Page, Address Bar Color, Lock, https, Popup.

As the figure shows, 54 of the 270 users relied on the popup to determine whether the page is encrypted. This becomes especially striking when one realizes that only 98 of the respondents even saw the popup: fully 55% of popup-viewing users based their response on the popup window. The popup window explicitly states that the website is "encrypted", so it is no surprise that confused users found their answer there. One user who wrongly indicated that Image (h) was encrypted wrote: "Because the pop up says so...but the url doesn't say https:". The popup was even able to override his otherwise correct thinking! However, some users were not fooled. For example: "I am suspicious of domain name and the drop-down menu with the lock. The site looks like it is trying too hard to convince me it is safe, but there is no lock symbol from Firefox." This user noticed that the popup was not coming from Firefox: a wise user.

One can also see in Figure 6 an explanation for the comparatively worse performance of Firefox 3 without the popup relative to Firefox 2. The lock icon was cited by 44 users as an indicator of encryption. However, Firefox 3 does not present a lock icon unless one enables the popup. We suspect that the responses to Images (f) and (a) would have been identical if the lock icon had still been in the address bar.

Several other features of Figure 6 are worth noting. The sheer number of guesses (76) is jarring: many users either cannot understand the question or do not know how to determine whether a website is encrypted. The presence or absence of https in the URL is by far the number one explanation given by users. Only 11 users mentioned the color of the address bar in their responses. Internet Explorer 8 and Firefox 3 both indicate extended validation with a green address bar; this survey suggests, however, that such changes will go unnoticed unless users are given substantial training. No one notices the yellow address bar; why would they notice green? Finally, many users (34) relied on information about the page itself to determine their answer. For example, they mentioned the fact that the page contained a login link, that it is a bank website, or that there is an anti-phishing icon on the page. These users have a very dangerous misunderstanding of the web. Browser designers will have to be very creative to help them.

Finally, it is worth highlighting two of the most outrageous responses to the "How do you know?" question. These users have constructed very strange mental models. The first puts nonexistent meaning into the domain name: "There is no www.
in the URL, which means that this site is not hosted on the world wide web; rather, it is likely hosted on a secure server owned and operated by the bank." The second wrongly decides that the website is encrypted, with the simple explanation "it uses flash."

3.3 Limitations

The most noteworthy limitation of the survey is that users saw screenshots rather than a live browser. The popup indicator in Firefox 3 only appears after clicking the favicon, which is obviously more difficult to spoof than a static popup. We hope to address this problem in future work. In addition, we only tested users with one particular website; we do not know how much the particular design of the Sanford site affected the results.

4. LABORATORY STUDY

We recruited 10 subjects for the in-lab user study. We gathered demographic data, which is shown in Figure 7. Users were recruited by posting flyers on the Carnegie Mellon campus and sending a message to a Carnegie Mellon email list. All participants were Carnegie Mellon students.

Figure 7: Demographic information for lab-study participants.

The study was divided into two parts. The first involved having the user visit an untrusted and unfamiliar website, canada.com. The second involved two tasks with a trusted and familiar website, the CMU library catalog (Cameo). The first subtask was to reinforce the users' trust in the familiar website, which used a self-signed certificate, a type of certificate most browsers treat as invalid. Firefox 2.0 was used for this purpose, with the hypothesis that most users would unknowingly accept the certificate by clicking through the usual warnings. The second subtask involved switching to the Firefox 3 beta and opening the same website, which would throw a "secure connection failed" error message. For the purpose of gathering timing metrics, the tasks in Firefox 3 were divided into three intervals. Interval 1 starts with the display of the warning (Figure 2) and ends when the user clicks the "Or you can add an exception..." link. Interval 2 runs from this link click to the click of "Add Exception..." (Figure 3). Interval 3 starts when the user clicks "Add Exception..." and ends when the user actually adds the certificate and clicks "Confirm" (Figure 4). The summary of results is shown in Figure 8.

Figure 8: Timing data for all study participants

There are important differences between the trusted and untrusted cases (Figure 9). A few subjects spent much less time reading the warning message before clicking the blue link, while others took very long. One possibility is that in the untrusted case, people were not sure whether the website was designed to work, while in the trusted case they were surprised when it failed to work and tried other means, such as opening Internet Explorer to check. The mean value in Interval 1 was greater for the trusted website than for the untrusted one, which suggests that users may switch browsers the moment one blocks access to the websites they love to visit. In Interval 2, the distinction is more dramatic. In the trusted case, very few users spent time perusing the text in the yellow message box; this shows that a user will readily add an exception instead of clicking "Get me out of here!" if he trusts the website he has been visiting. On the other hand, when there was a lack of trust, some users took more time to decide and even clicked the back and forward buttons repeatedly. In fact, two users in this case ended up clicking "Get me out of here!". This goes to show that the new scheme does work when the website is unknown; however, it fails to stop people from proceeding when they have visited the website before. There was one exception out of six in the trusted case, which partially invalidates this hypothesis: one astute user clicked the "Cancel" button after importing the certificate, thinking that it could be a spoof of the website he was intending to visit. In effect, the user's trust did break in one case in the trusted-website scenario.

There are no significant timing deviations in Interval 3, as the task was straightforward; however, a small increase in time while adding certificates can be observed in the untrusted case, which may suggest that users are more likely to examine the management interface when they are dealing with untrusted websites (e.g., viewing the certificate before downloading it, or selecting/unselecting the "permanently store this exception" option).

Figure 9: Timing information for trusted vs. untrusted websites

The second task was to delete the recently added certificate. We did not help any of the participants, and 90% could finish the task as expected, with a mean time of 4 minutes and 17 seconds. To balance out the difficulty, we highlighted the main tab in half the cases; however, when a user chose to view the certificates from the "Advanced" tab, we always made the certificate deletion screen visible by default. The key observation was a difference between the mental model of the user and how options are arranged in the Options toolbar. Figure 10 shows users' first reactions when challenged with certificate deletion. Eight out of nine users first clicked the Security or Privacy tab, since they associated those tabs with certificate management. Almost half of them went on to select the "Exceptions" or "Settings" button within these tabs, since this directly maps to the "Add an exception..." link with which the users were by now familiar. Of particular interest is the fact that an equal number of users could not tell where certificates should be deleted from and hunted in system areas such as the control panel, sometimes even retrying the original website. While certificate deletion may not be a regular task, it is important, given that very few users will uncheck the default "permanently store this exception" checkbox while adding a certificate; thus they will inadvertently end up adding invalid certificates that stay forever.

Figure 10: Actions taken by users before correctly deleting the certificate.

Another usability fallout is that users are often confused about why deletion is not working as expected, when the real reason is that they highlight the root CA's name instead of selecting a subnode before pressing the delete button. The only user who gave up the task of deletion actually got as far as the "View Certificates" option, but hit cancel, saying, "No, it can't be View Certificates. I want to delete them." Clearly, such instances will be more common with a larger data set. These observations are summarized in Figure 11.

Figure 11: Interesting observations from the laboratory study.

Two of the users shifted to Internet Explorer, and we expect these numbers to go up if users are confronted with a difficult procedure for adding certificates, especially at sites they trust. We also noticed that seven out of eight users hovered their mouse over "Get me out of here!" before eventually choosing "Add an exception...". While this may only indicate that this button is the visual focus of the page, it may also indicate that they were thinking about whether they wanted to get out of there. It would be interesting to see what happens if the position of this button were swapped with that of the "Add Exception..." button. Two of the users closed their browsers during the deletion task, reopened them, and navigated back to the same website. Both were surprised when it loaded without warnings and commented, "Oh! It loads just fine now."
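The surprise those two participants experienced follows directly from the default in Figure 4: with "permanently store this exception" checked, the exception outlives the browser session. A minimal sketch of that behavior (the class and method names are invented, not Firefox's internals):

```python
class CertificateExceptionStore:
    """Models Firefox 3's certificate exceptions: permanent exceptions
    survive the session; session-only exceptions are dropped on close."""

    def __init__(self):
        self._permanent = {}  # host -> certificate fingerprint
        self._session = {}

    def confirm_security_exception(self, host, fingerprint, permanent=True):
        # "Permanently store this exception" is checked by default.
        store = self._permanent if permanent else self._session
        store[host] = fingerprint

    def is_trusted(self, host, fingerprint):
        return (self._permanent.get(host) == fingerprint
                or self._session.get(host) == fingerprint)

    def close_browser(self):
        # Session exceptions vanish; permanent ones are kept, which is
        # why a revisited site can load with no warning at all.
        self._session.clear()
```

Because the permanent branch is the default, a user who clicks through the dialog once will never see the warning for that site again, exactly the behavior that surprised the two participants.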
This shows that users do not understand how the system works: they store the certificate permanently without realizing the consequences. These comments were unexpected but very useful in understanding a typical user's mental model.

5. RELATED WORK

In 2006, a study by the MIT Computer Science and Artificial Intelligence Laboratory [2] examined how useful security toolbars are in preventing phishing attacks. The study covered three different types of security toolbars (the Neutral Information toolbar, which shows website information such as domain name, hostname, registration date, and hosting country; the SSL-Verification toolbar, which differentiates sites that use SSL from those that do not; and the System-Decision toolbar, which displays a red light and the message "Potential Fraudulent Site" if it decides that a web page constitutes a phishing attack), as well as browser address and status bars, and found that they are ineffective at preventing phishing attacks. In addition, the study found that many users do not understand phishing attacks. As a result, the authors suggested guidelines for designing effective anti-phishing solutions.

Another related study [4], by Stanford and Microsoft researchers, evaluated extended validation and picture-in-picture phishing attacks. Users were asked to classify 12 websites as fraudulent or legitimate. The users were divided into three groups: control, untrained, and trained, where the trained group used extended validation certificates and a help file about IE 7 security features to classify the websites. The study found that extended validation did not provide a significant advantage in identifying phishing attacks, although it could become more effective as it is adopted by more websites. It also found that extended validation is vulnerable to picture-in-picture user interface spoofing attacks, and that the phishing filter's help document should be designed carefully and should explicitly mention that the filter is not 100% accurate.

Two researchers at the University of Pittsburgh [3] found that even though web server certificates constitute a secure mechanism in theory, this is not so in practice, because many users do not understand certificate-related concepts or ignore certificate validation warnings. As a result, the researchers proposed context-sensitive certificate verification (CSCV), in which the browser asks the user about the context in which a certificate verification warning occurs and then guides the user in handling the warning. They also proposed specific password warnings (SPW) to make users more aware when they are about to send passwords in a form vulnerable to eavesdropping. They performed three user studies to evaluate CSCV against existing certificate warning mechanisms and to evaluate SPW, and found that CSCV and SPW can greatly improve web browsing security and are easy to use even without training.

Another study [1] concentrated on users' perceptions of web security. It asked users to define "secure connection", identify it based on screenshots, provide reasons for their thinking, and draw visual portrayals of a secure connection. The authors found that many users perceive insecure connections as secure, and that some users evaluated a connection as secure based on incorrect reasoning. They also confirmed that highly technical users do not always accurately understand web security concepts.

6. FUTURE WORK

In the very near future we plan to rerun the same study (or a very similar one) with a larger and more diverse set of users. We hope our findings will be supported by qualitative evidence from a larger and broader participant set and by quantitative data with statistical significance. In addition, we plan to expand the survey to include Internet Explorer 7 and 8.
In conjunction with the expanded lab study described above, we will evaluate the Firefox 3 encryption notification mechanism (which we evaluated in this work by online survey). We hope to see whether users click on the favicon, something we could not observe with screenshots. In the longer term, we would like to redesign and implement a new certificate management interface for Firefox that will both support easy certificate installation and convince users not to visit websites with invalid certificates without proper due diligence. We hope to evaluate this redesign against current implementations. Finally, we hope to perform a Firefox field study to better evaluate users' security decisions regarding untrusted websites. We think that our instruction in the lab to visit an unfamiliar website gave users confidence in the website that they would not have felt in the field.

7. CONCLUSIONS

Before the study began, our hypothesis could be summarized as: "The more things change, the more they stay the same." In particular, we thought that the differences between familiar and unfamiliar websites, Firefox 2 and Firefox 3, extended validation and normal certificates, and favicon popups and lock icons would be minimal. We now believe that we were basically correct. However, we discovered many subtleties in this process that together have substantial impact; these have been discussed at length throughout the paper. The proposed changes in Firefox 3 that have been successful are those that take advantage of users' existing mental models and do not try to impose new models on users. Extended validation was not successful in our study because it is a new concept. However, changing the button labels when one reaches a website with an invalid certificate ("Add Exception..." and "Get me out of here!" instead of "OK" and "Cancel") has been very effective.

8. ACKNOWLEDGMENTS

The authors would like to thank Lorrie Faith Cranor for her advice and guidance throughout this process.
We would also like to thank our classmates in the Usable Privacy and Security course for their questions and comments during in-class presentations. Finally, we are grateful to Microsoft for its financial support.

9. REFERENCES

[1] B. Friedman, D. Hurley, D. C. Howe, E. Felten, and H. Nissenbaum. Users' conceptions of web security: a comparative study. In CHI '02 Extended Abstracts on Human Factors in Computing Systems, pages 746-747, New York, NY, USA, 2002. ACM.

[2] M. Wu, R. C. Miller, and S. L. Garfinkel. Do security toolbars actually prevent phishing attacks? In CHI '06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 601-610, New York, NY, USA, 2006. ACM.

[3] H. Xia and J. C. Brustoloni. Hardening web browsers against man-in-the-middle and eavesdropping attacks. In WWW '05: Proceedings of the 14th International Conference on World Wide Web, pages 489-498, New York, NY, USA, 2005. ACM.

[4] Z. E. Ye, S. Smith, and D. Anthony. Trusted paths for browsers. ACM Transactions on Information and System Security, 8(2):153-186, 2005.