1. Introduction
Privacy threats from social media and other large entities are widely acknowledged, given that these entities collect personal information and use it to increase their profit margins [1]. Even so, Internet users employ these platforms and accept the privacy risks. The role of government as a potential threat to digital privacy, on the other hand, is seldom considered. Citizens’ privacy can easily be sacrificed by the heavy-handed actions of government agency employees. For example, in 2020, Police Scotland introduced ‘kiosks’ to triage mobile devices during police investigations. The kiosk software was able to extract extensive private information from a smart mobile device. After protests by NGOs and consequent debates in the Scottish Parliament [2], the Information Commissioner condemned the kiosks for potentially violating the privacy rights of citizens [3]. The UK government is proposing the use of AI-powered facial recognition across the country [4], as are the Metropolitan Police in London [5] and railway stations across the UK [4]. The privacy watchdog Big Brother Watch has called out these proposals for their potential to violate privacy. The UK’s Regulation of Investigatory Powers Act 2000¹ gives a wide range of public authorities the power to intercept the content of communications, for example, by listening to telephone conversations or voicemail messages. As such, UK citizens’ privacy is under threat from a range of entities, both Big Brother (government) and Middle Brother (organisations).
Privacy-Enhancing Tools (PETs), such as Virtual Private Networks (VPNs) and anonymous browsers, are available to online users who want to protect themselves from these kinds of privacy threats. Although some of these are widely advertised, the uptake of PETs remains modest, thereby reducing the potential of users to protect their privacy online. A recent (April 2024) survey [6] found that 80% of a UK sample (N = 201) had heard of at least one of the following PETs: VPNs, device encryption, webcam covers, non-tracking search engines, anonymous browsers and Faraday bags. However, 49% had not used any of these PETs in the last year and 63% were not currently using any of them. Moreover, even if users use a particular PET, this would only partly protect them, as different PETs protect from different privacy threats.
Although perfect protection is infeasible, using a range of different PETs that each protect against a specific class of privacy threat will result in a more comprehensive privacy protection regime. This paper explains how a decision support tool called PEDRO was developed to help online users protect themselves by encouraging the adoption of PETs. The design of this tool builds on an existing classification of privacy threats [7], our novel staged model of PET adoption and the empirical research presented in this paper.
The aim of PEDRO was to deconstruct common barriers to adoption based on the staged support of adoption requisites [8], advancing from privacy and threat awareness towards achieving self-protection via PET adoption. Similar to the transtheoretical model [9], our approach challenges existing dominant ‘stageless’ theories of tool adoption, such as Protection Motivation Theory [10].
To develop the tool, we carried out three studies. The first was an expert survey of PETs, in which cybersecurity experts analysed the effectiveness and feasibility of PET adoption by lay users. The second was a lay user survey of PETs, in which we asked crowd workers to rate their current adoption of PETs according to our adoption model and to identify PET adoption barriers. The third study developed the PET decision support tool (PEDRO), building on the insights gained from Studies 1 and 2.
2. Background
2.1. Current State of Research
Existing research on classifying and adopting PETs is reviewed here, as this fed into our creation of the PEDRO decision support tool.
2.1.1. Classification of Privacy-Enhancing Tools
Support for privacy-enhancing tool (PET) adoption decisions needs to build on a solid classification of privacy threats. Such a classification allows researchers to compare PETs not only in terms of their capabilities [7], but also to explicitly link each PET to the privacy threat(s) it mitigates. Heurix et al.’s [7] taxonomy meets this need by linking privacy threats (called ‘aims’ in [7]) to the tools that address them. The taxonomy builds on four distinct threats:
Indistinguishability “makes it impossible to unambiguously distinguish an entity from another entity”. For example, if a snooper is able to distinguish one particular user from another, they can track that user’s activities and so violate their privacy; a VPN can prevent this.
Confidentiality is the requirement to keep personal data “protected from unintended disclosure”. Encryption keeps users’ data and information protected from unintended disclosure, even if the data are leaked.
Deniability is “the ability to plausibly deny a fact, possession or transaction” and “is the direct opposite of accountability”. For example, when online users employ a private search engine, no one can link them to their searches, enhancing deniability.
Unlinkability “indicates that an entity cannot be linked to another entity where the entities need not necessarily be of the same class”. For example, when online users make use of a private browser, they cannot be linked to another piece of data (such as, for instance, their personal identity and/or other visited sites).
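As a rough illustration, the mapping from these four threat categories to example PETs discussed in this paper could be captured in a small lookup table. The pairings below are illustrative examples inferred from the definitions above, not an exhaustive rendering of Heurix et al.’s taxonomy:

```javascript
// Hypothetical lookup table pairing each privacy threat with example
// PETs from this paper (illustrative, not the full taxonomy).
const threatToPets = {
  "distinguishability": ["VPN"],
  "linkability": ["anonymous browser", "VPN"],
  "lack of confidentiality": ["encryption"],
  "lack of deniability": ["non-tracking search engine"],
};

// Return the example PETs that mitigate a given threat (empty if unknown).
function mitigatingPets(threat) {
  return threatToPets[threat] ?? [];
}
```

A decision support tool can then walk from the threat a user cares about to the candidate tools, rather than presenting PETs as an undifferentiated list.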
2.1.2. Adoption of Privacy-Enhancing Tools
Existing research on privacy self-protection has focused on awareness or education [11,12,13]. Specifically, research has been conducted on a possible learning taxonomy for PETs [14,15] as well as on privacy awareness and knowledge [16,17], but this work does not address other factors that contribute to PET adoption, such as barriers to adoption and challenges faced by adoptees.
Stageless, multifactor technology acceptance modelling has a long tradition. Researchers have investigated the influence of a variety of factors on technology adoption. Examples include the technology acceptance model (TAM) [18] and the unified theory of acceptance and use of technology (UTAUT) [19]. Influential factors on adoption include perceived usefulness and perceived ease of use. The technology acceptance model has been applied and extended to help understand users’ acceptance of privacy-enhancing tools [20,21], but it does not address PET adoption process stages.
Another line of technology adoption research has used technology diffusion theory [22], which distinguishes a knowledge stage from a persuasion stage. This research has focused on (workers in) organisations rather than on personal adoption. Influential factors include relative advantage, ease of use, compatibility, image, result demonstrability, visibility, voluntariness, and trialability ([23], p. 507). However, this work has not addressed the adoption of privacy protection tools.
According to stageless protection models such as Protection Motivation Theory (PMT) [24], the Health Belief Model (HBM) [25] and the Theory of Planned Behaviour (TPB) [26], users’ intention to protect themselves has a positive effect on self-protection behaviours, and intention itself is influenced by other social-cognitive variables, such as threat and coping appraisal (in protection motivation theory). The theory of planned behaviour [27] and protection motivation theory [28] have been applied to understand the determinants of the adoption of privacy-enhancing tools. However, by their nature, stageless models do not address the adoption process.
In staged protection models, such as the transtheoretical model of change (TTM) [9], “The stage dimension defines behaviour change as a process that unfolds over time and involves progress through a series of stages” (p. 845). The TTM has mainly been used in health, but also in other domains, such as reducing energy consumption [29]. Nevertheless, it has not been applied to privacy-enhancing tool adoption.
In sum, missing from the existing research is a model that explicitly represents the staged PET adoption process. The current study proposes such a model and uses this as a basis for developing the PEDRO tool to promote the adoption of PETs.
2.2. Current Study
In this project, we study and aim to cultivate citizens’ self-protective behaviour by developing, using and testing a novel staged PET adoption model (Figure 1). The central idea of the model is the staged development of adoption requisites [8], advancing from privacy and threat awareness towards achieving adoption of a range of PETs. Similar to the transtheoretical model [9], our approach challenges existing dominant ‘stageless’ models of self-protection, such as protection motivation theory [10].
PET adoption is not a simple one-off A-or-B decision; rather, adoption is a process [30], similar to other kinds of adoption in this domain [31]. For a PET to be adopted, the adopter needs to proceed through the following stages, as shown in Figure 1:
- Stage 1. Awareness of privacy threats [32,33].
- Stage 2. Wanting to preserve their privacy [34].
- Stage 3. Knowing about privacy-enhancing tools (PETs) [35,36].
- Stage 4. Believing that PETs will enhance privacy [37].
- Stage 5. Knowing how to use the PET [38,39].
- Stage 6. Feeling empowered to use the PET [40].
- Stage 7. Not being afraid to use the PET [41].
If all these stages are successfully traversed, adoption becomes possible.
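As a sketch, the staged nature of the model can be expressed as an ordered checklist, where the first unmet requisite identifies the adoption barrier to address next. The stage labels and profile format below are hypothetical, purely for illustration:

```javascript
// The seven adoption stages of the staged model, in order (cf. Figure 1).
// Labels are paraphrased for illustration.
const STAGES = [
  "aware of privacy threats",
  "wants to preserve privacy",
  "knows about PETs",
  "believes PETs enhance privacy",
  "knows how to use the PET",
  "feels empowered to use the PET",
  "not afraid to use the PET",
];

// Given a profile mapping each stage to true/false, return the first
// unmet stage (the barrier to target), or null if adoption is possible.
function firstBarrier(profile) {
  return STAGES.find((stage) => !profile[stage]) ?? null;
}
```

In this framing, a decision support tool should deliver the intervention matching the earliest unmet stage rather than, say, usage instructions to someone who is not yet aware of the threat.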
3. Study 1: Experts
We recruited cybersecurity experts to complete a survey to gain insight into the feasibility of PET adoption by lay users. Our expert survey of PETs was guided by the following research question: Which PETs do experts believe are feasible for lay users to use? For each of the broad threat categories of (a) distinguishability, (b) linkability, (c) lack of confidentiality, and (d) lack of deniability, we identified both software- and hardware-based tools that effectively mitigate the privacy threats (Table 1).
We developed an online survey². For each of the chosen PETs, the survey asked about the effectiveness, feasibility and challenges of, and ways to encourage, the use of PETs that home users could install and deploy.
3.1. Materials and Methods
A list of PETs feasible for home users, together with their features, was produced (Table 2).
We surveyed 12 experts to gauge the feasibility of PET usage by non-expert home users (i.e., choose, install and deploy) and the effectiveness thereof in mitigating the applicable privacy threat.
Metric. We considered a PET to be infeasible for home usage if a majority of experts believed the PET was infeasible for home users (i.e., non-experts) or questioned its effectiveness in mitigating the threat.
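This retention rule can be sketched as a majority vote, assuming (as a simplification of the actual survey responses) that each expert’s judgement is reduced to two booleans:

```javascript
// Sketch of the retention metric: a PET is kept only if a majority of
// experts rate it feasible for home users AND a majority rate it
// effective. `ratings` is a hypothetical array of
// { feasible: boolean, effective: boolean } judgements, one per expert.
function retainPet(ratings) {
  const majority = (key) =>
    ratings.filter((r) => r[key]).length > ratings.length / 2;
  return majority("feasible") && majority("effective");
}
```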
Recruitment. We used personal contacts and snowball sampling to contact privacy experts in the UK and USA. The UK participants were compensated with shopping vouchers.
Participants. There were 12 participants (10 male; 2 female).
3.2. Results and Discussion
The experts did not consider “Switching the Microphone off” and “Anonymous Letters” to be feasible PETs, nor did they consider the former particularly effective (Table 2). There was general agreement that all the others could be adopted and are effective to a certain extent, with a general lack of awareness being considered the major deterrent (Table 2). We also considered the experts’ personal usage of a specific PET in deciding whether to retain it or not.
3.3. Conclusion
From Study 1, the conclusion is that the PETs that should be considered for inclusion in our decision support tool are VPN, encryption, non-tracking search engine, anonymous browser, Faraday bag and WebCam cover.
4. Study 2: Lay Users
We surveyed 500 crowd workers to identify PET adoption barriers to address the following research questions: (a) what is the level of PET adoption by lay Internet users and (b) what barriers prevent people from using PETs?
4.1. Materials and Methods
4.1.1. Research Design and Procedure
We used a one-factor survey design with two survey conditions (one for software PETs and another for hardware PETs). The factor was PET (Table 1), and most of the PETs were the same as in Study 1 (see Section 3), with a face mask substituted for switching off the microphone on a smart TV. For each PET, we (a) explained the threat, (b) introduced the mitigating PET, and (c) took participants step by step up the ladder shown in Figure 1, asking for their position regarding the adoption barrier at each of the stages.
4.1.2. Instrumentation and Participants
We constructed an online survey³ that was implemented in two versions, one for hardware PETs and another for software PETs. For each of the PETs, the survey posed a set of questions according to the step model (Table 3 and Figure 1). Five hundred crowd workers were recruited from an online survey panel to take part in the survey. In the hardware-PET condition, 255 took part, and in the software-PET condition, 245. There were 100 participants in each of the age bands 18-30, 31-40, 41-50, 51-60 and over 60. There were 252 female and 248 male participants.
4.2. Results
4.2.1. Adoption Step Model Stages
Model Step 1. A majority of participants were familiar with the privacy principle of confidentiality (67% hardware condition; 76% software condition), but only a minority were familiar with the principles of deniability, indistinguishability and unlinkability (18%-35%) (Figure 2).
Model Step 2. A majority (79% hardware; 75% software) found confidentiality very important or extremely important but, in comparison, for the other principles the figure varied around 50% (38%-57%) (Figure 3).
Model Step 3. A majority was familiar with the following PETs: encryption (85%), VPN (82%) and webcam cover (67%), but a majority was unfamiliar with tin foil (83%), face mask (78%) and anonymous letter (62%) (Figure 4). Roughly equal numbers were familiar or unfamiliar with non-tracking search engines (45%/47%) and anonymous browsing (46%/47%).
Model Step 4. A majority (58%) found encryption either quite effective or very effective. Most of the other PETs were likewise found to be quite effective or very effective by a majority (Figure 5). However, for face mask (32%) and tin foil (34%) this was a minority.
Model Step 5. For none of the PETs did a majority know how to use it (Figure 6). Compared to the other PETs, the number of those who knew how to use a VPN was relatively high (49%), but for encryption the number without this knowledge was relatively high (56%).
Model Step 6. A majority felt empowered to use a webcam cover (64%) or a VPN (60%) (Figure 7). The feeling of empowerment was equally split for encryption and non-tracking search engine. A majority did not feel empowered to use anonymous browsing (57%), anonymous letter (75%), face mask (82%) or tin foil (83%).
Model Step 7. Fear of use was relatively low for each PET (1%-18%) (Figure 8).
PET use. Current users were a minority for each PET (Figure 9). There were relatively large minorities for webcam cover (28%) and VPN (25%). Next were encryption (16%), anonymous browsing (14%) and non-tracking search engine (11%). Anonymous letter, face mask and tin foil were used by 5% or less.
4.2.2. Extent of Privacy Self-Protection
By definition, users self-protect their privacy more fully the more PETs they use. Therefore, we undertook an analysis of the extent of self-protection (‘defence in breadth’) in terms of the number of PETs in relation to the adoption model steps, including PET use (Figure 10).
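The tally underlying this analysis can be sketched as a histogram over the number of PETs each respondent uses. The input format below is a hypothetical simplification, not the actual survey data structure:

```javascript
// Sketch of the 'defence in breadth' tally: count how many PETs each
// respondent uses, then tally how many respondents fall at each count.
// `responses` is a hypothetical array of per-respondent PET-name arrays.
function breadthHistogram(responses) {
  const histogram = {};
  for (const pets of responses) {
    const n = new Set(pets).size; // de-duplicate repeated mentions
    histogram[n] = (histogram[n] ?? 0) + 1;
  }
  return histogram;
}
```

Defence in breadth would show up as substantial mass at counts of two or more; the survey results instead concentrate the mass at zero.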
Hardware PETs. There was limited evidence for defence in breadth for the different model steps, and even less so for PET use. For each of the model steps, the percentage of users declined with the number of PETs.
Software PETs. There was limited evidence of defence in breadth for familiarity with the privacy threat and knowledge of how to use a PET, and even less so for PET use. For the model steps familiarity with the privacy threat, knowing how to use a particular PET and PET use, the percentage of users declined with the number of PETs. However, this trend did not occur, and the distribution was more even, for familiarity with PETs and feeling empowered to use PETs.
Overall, the majority of users did not employ any of the PETs; correspondingly, only a minority of users employed even one PET, and fewer still employed more than one. In conclusion, respondents did not protect themselves against a range of threats to their online privacy by adopting PETs.
4.2.3. Barriers to PET Adoption
A thematic content analysis was conducted of the open-ended questions asking about reasons for not using a PET, reasons for stopping PET use, reasons why others may not use a PET and reasons for fear of using a PET (Table 4). The sub-themes (with more than one response providing evidence for a theme) from the analysis are barriers to PET use. These were organised into main themes and are presented in Table 4. The largest main themes (in terms of the number of sub-themes) are a lack of awareness or perceived benefit and incompatibility with ways of working or other technology. Other main themes are a lack of knowledge, a lack of empowerment, a lack of social acceptance and a lack of trust. The main themes provide further support for our step model of PET adoption. In particular, a lack of PET awareness represents the model steps awareness of privacy threat and awareness of PET. The theme of a lack of perceived benefit represents the model step effectiveness. Incompatibility is not explicitly represented in the step model but could cause a lack of empowerment. A lack of knowledge represents the model step knowing how to use a particular PET, and a lack of empowerment represents the step empowerment to use the PET. A lack of social acceptance could be a cause of a lack of empowerment, and a lack of trust could be a cause of not using the PET, although neither of these is explicitly represented in the step model.
4.3. Conclusion
The level of PET adoption varied considerably between model stages. In particular, in Stage 1 (awareness of privacy principle) and Stage 2 (importance of privacy principle) the level was either high or low; in Stage 3 (awareness of PET), high or middling; in Stage 4 (effectiveness of PET), predominantly high, but also middling or low; in Stage 5 (knowing how to use PET) and Stage 6 (empowerment), middling or low; and in Stage 7 (fear), low. In addition, PETs varied in the extent to which they were used, but a majority did not use each of the PETs. The step model results provide a PET adoption baseline; the introduction of a PET decision support tool may increase adoption. We also identified barriers to address in PET decision support design, and the step model stages were used to organise the guidance for non-specialist users within the tool (Study 3). Based on the barriers that were identified, the anonymous letter and face mask were not included in the design of the PET decision support tool (Study 3), because our sample did not consider these to be socially acceptable or effective. In the remaining set of PETs, there is considerable variation in the level of adoption at the different adoption model stages and between PETs. The conclusion from Studies 1 and 2 collectively is that the PETs that should be considered for inclusion in our decision support tool are VPN, encryption, non-tracking search engine, anonymous browser, Faraday bag and webcam cover.
5. Study 3: PEDRO Design and Implementation
5.1. Design and Implementation
The design of the PEDRO website (PEt Decision suppoRt tOol; see Figure 11 and Figure 12) was implemented in plain HTML, to preserve the privacy of users, with JavaScript to support interactivity. It was initially developed on the [ORGANISATION’S NAME REDACTED] development server. PEDRO addresses each of the privacy threats introduced in the background section, and also explains WHY privacy is important (Section 2), WHAT privacy threats exist (Section 1), and HOW privacy can be assured in the face of these threats (Section 3). In some cases, advice is provided directly and, in others, helpful YouTube videos are embedded. URLs for advice sources are provided.
The core page for each of the six PETs is interactive, addressing each of the adoption requisites shown in Figure 1 and providing information that can remove that barrier.
Each of these pages opens with a story, originally generated by ChatGPT and tweaked in response to expert feedback. All images on the site are either non-copyrighted or generated by ChatGPT to match the context. Each emoticon in the left panel can be clicked, upon which the panel on the right changes to provide stage-specific information.
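A minimal sketch of this interaction, assuming the guidance text for a PET is keyed by adoption stage; this is illustrative only and not the actual PEDRO source, in which such a handler would be attached to each emoticon via addEventListener:

```javascript
// Hypothetical content model for one PET page: guidance text for the
// right-hand panel, keyed by the adoption stage each emoticon represents.
const vpnGuidance = {
  "threat awareness": "A snooper on your network can see which sites you visit.",
  "how to use": "Install the VPN app, sign in, and switch it on.",
};

// Handler sketch: return the text the right-hand panel should display
// when the emoticon for `stage` is clicked.
function panelText(guidance, stage) {
  return guidance[stage] ?? "Select a stage on the left to learn more.";
}
```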
5.2. Expert Evaluation
We carried out two studies to validate the website: the first with five cybersecurity experts and the second with five usability experts. All feedback was used to iteratively improve the website as each evaluation was carried out. Next, one cybersecurity expert and one usability expert evaluated the final version of the tool and, based on their feedback, the tool was improved one final time. The production version is hosted at https://pedro.infinityfreeapp.com/index.html.
6. General Discussion
The aim of this study was to cultivate citizens’ self-protective behaviour by developing, refining and using a novel staged PET adoption model. Three studies were conducted. First, in our expert survey of PETs, cybersecurity experts analysed the feasibility of PET adoption by lay users.
Second, in our lay user survey of PETs, crowd workers rated their level of adoption according to the step model and identified PET adoption barriers. The conclusion from these two studies regarding the PETs to include in the tool to be developed was the same: VPN, encryption, non-tracking search engine, anonymous browser, Faraday bag and webcam cover.
Third, we then developed a PET decision support tool called PEDRO based on the results of Study 1 and Study 2. Specifically, the tool explicitly represents and addresses the adoption model steps. Specific barriers that were identified in Study 2 were explicitly addressed in the content of the tool to promote PET adoption.
The decision support tool that has been developed opens opportunities for future work in several areas. Our conceptual framework consisting of the step adoption model together with the classification of privacy threats allows for potential new PETs to be added.
Empirical evaluation of the tool will be important to establish to what extent using the tool leads to the adoption of PETs. In addition, once the tool has been publicised, responses by online users will be useful to further improve the tool design. In the first instance, non-specialist UK citizens and, more generally, citizens from English-speaking countries are the target user population. A potential further development is a foreign-language version in collaboration with international partners. The potential impact of the tool includes increased PET use by online users and, as a result, better privacy self-protection against a range of privacy threats (defence in breadth).
Supplementary Materials
The following supporting information can be downloaded at the website of this paper posted on Preprints.org. URL of PEDRO Website goes here.
Institutional Review Board Statement
The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Ethics Committee of [NAME OF INSTITUTION REDACTED FOR ANONYMOUS PEER REVIEW], protocol code [CODE REDACTED], [DATE OF APPROVAL REDACTED].
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Data Availability Statement
Data are available on request from the authors.
Acknowledgments
The authors are grateful to the [FUNDER’S NAME REDACTED] for financial support.
Conflicts of Interest
The authors do not report any conflicts of interest.
References
- Debatin, B.; Lovejoy, J.P.; Horn, A.K.; Hughes, B.N. Facebook and online privacy: Attitudes, behaviors, and unintended consequences. Journal of computer-mediated communication 2009, 15, 83–108. [Google Scholar] [CrossRef]
- BBC. Police Scotland cyber kiosks ’could be unlawful’, 2018. https://www.bbc.com/news/uk-scotland-46225771.
- Tibbitt, A. Privacy watchdog orders Police Scotland to up standards at mobile phone labs, 2021. https://theferret.scot/privacy-watchdog-orders-police-scotland-to-up-standards-at-mobile-phone-labs/.
- Gross, A.; Murgia, M. UK government seeks expanded use of AI-based facial recognition by police, 2023. https://www.ft.com/content/858981e5-41e1-47f1-9187-009ad660bbbd.
- BBC. Met Police to deploy facial recognition cameras, 2020. https://www.bbc.com/news/uk-51237665#.
- PureProfile. Prevalence of the use of privacy-enhancing technology, 2024. personal communication.
- Heurix, J.; Zimmermann, P.; Neubauer, T.; Fenz, S. A taxonomy for privacy enhancing technologies. Computers & Security 2015, 53, 1–17. [Google Scholar] [CrossRef]
- Rosenberg, A. Philosophy of social science; Westview Press Boulder, CO, 2008.
- Prochaska, J.O. Decision making in the transtheoretical model of behavior change. Medical Decision Making 2008, 28, 845–849. [Google Scholar] [CrossRef] [PubMed]
- Prentice-Dunn, S.; Rogers, R.W. Protection motivation theory and preventive health: Beyond the health belief model. Health Education Research 1986, 1, 153–161. [Google Scholar] [CrossRef]
- Alshehri, A.; Clarke, N.; Li, F. Privacy enhancing technology awareness for mobile devices. In Proceedings of the International Symposium on Human Aspects of Information Security & Assurance (HAISA 2019), University of Plymouth, 2019; pp. 73–88. [Google Scholar]
- O’Hagan, J.; Saeghe, P.; Gugenheimer, J.; Medeiros, D.; Marky, K.; Khamis, M.; McGill, M. Privacy-enhancing technology and everyday augmented reality: Understanding bystanders’ varying needs for awareness and consent. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2023, 6, 1–35. [Google Scholar] [CrossRef]
- Basyoni, L.; Tabassum, A.; Shaban, K.; Elmahjub, E.; Halabi, O.; Qadir, J. Navigating Privacy Challenges in the Metaverse: A Comprehensive Examination of Current Technologies and Platforms. IEEE Internet of Things Magazine 2024, 7, 144–152. [Google Scholar] [CrossRef]
- Paul, S.K.; Knox, D. A taxonomy and gap-analysis in digital privacy education. In Proceedings of the International Symposium on Foundations and Practice of Security. Springer; 2022; pp. 221–235. [Google Scholar] [CrossRef]
- Klymenko, A.; Meisenbacher, S.; Messmer, F.; Matthes, F. Privacy-Enhancing Technologies in the Process of Data Privacy Compliance: An Educational Perspective. In Proceedings of the CIISR@ Wirtschaftsinformatik; 2023; pp. 62–69. [Google Scholar]
- Gerber, N.; Gerber, P.; Drews, H.; Kirchner, E.; Schlegel, N.; Schmidt, T.; Scholz, L. FoxIT: enhancing mobile users’ privacy behavior by increasing knowledge and awareness. In Proceedings of the 7th Workshop on Socio-Technical Aspects in Security and Trust, 2018; pp. 53–63.
- Ghazinour, K.; Messner, K.; Scarnecchia, S.; Selinger, D. Digital-PASS: a simulation-based approach to privacy education. In Proceedings of the 18th ACM Workshop on Privacy in the Electronic Society, 2019; pp. 162–174. [CrossRef]
- Davis, F.D.; Venkatesh, V. Toward preprototype user acceptance testing of new information systems: implications for software project management. IEEE Transactions on Engineering Management 2004, 51, 31–46. [Google Scholar] [CrossRef]
- Blut, M.; Chong, A.; Tsiga, Z.; Venkatesh, V. Meta-analysis of the unified theory of acceptance and use of technology (UTAUT): challenging its validity and charting a research agenda in the red ocean. Journal of the Association for Information Systems 2022, 23, 13–95. [Google Scholar] [CrossRef]
- Harborth, D.; Pape, S. Examining technology use factors of privacy-enhancing technologies: The role of perceived anonymity and trust. In Proceedings of the AMCIS 2018 Proceedings; 2018; p. 15. [Google Scholar]
- Lucier, D.M.; Howell, R.T.; Okabe-Miyamoto, K.; Durnell, E.; Zizi, M. We make a nice pair: Pairing the mID with a NeuroTechnology privacy enhancing technology improves mID download intentions. Computers in Human Behavior Reports 2023, 11, 100321. [Google Scholar] [CrossRef]
- Eaton, J.; Kortum, S. International technology diffusion: Theory and measurement. International Economic Review 1999, 40, 537–570. [Google Scholar] [CrossRef]
- Yuen, K.F.; Cai, L.; Qi, G.; Wang, X. Factors influencing autonomous vehicle adoption: An application of the technology acceptance model and innovation diffusion theory. Technology Analysis & Strategic Management 2021, 33, 505–519. [Google Scholar] [CrossRef]
- Rogers, R.W. A protection motivation theory of fear appeals and attitude change. The Journal of Psychology 1975, 91, 93–114. [Google Scholar] [CrossRef] [PubMed]
- Maiman, L.A.; Becker, M.H. The health belief model: Origins and correlates in psychological theory. Health Education Monographs 1974, 2, 336–353. [Google Scholar] [CrossRef]
- Ajzen, I. The theory of planned behavior. Organizational behavior and human decision processes 1991, 50, 179–211. [Google Scholar] [CrossRef]
- Yao, M.Z.; Linz, D.G. Predicting self-protections of online privacy. CyberPsychology & Behavior 2008, 11, 615–617. [Google Scholar] [CrossRef]
- Matt, C.; Peckelsen, P. Sweet idleness, but why? How cognitive factors and personality traits affect privacy-protective behavior. In Proceedings of the 2016 49th Hawaii International Conference on System Sciences (HICSS); IEEE, 2016; pp. 4832–4841. [Google Scholar]
- AlSkaif, T.; Lampropoulos, I.; Van Den Broek, M.; Van Sark, W. Gamification-based framework for engagement of residential customers in energy applications. Energy Research & Social Science 2018, 44, 187–195. [Google Scholar] [CrossRef]
- Morton, A.; Sasse, M.A. Privacy is a process, not a PET: A theory for effective privacy practice. In Proceedings of the 2012 New Security Paradigms Workshop, 2012; pp. 87–104.
- Siponen, M. Stage theorizing in behavioral information systems security research. In Proceedings of the Hawaii International Conference on System Sciences, Honolulu, 3-6 January, 2024; Available online: https://hdl.handle.net/10125/106952.
- Caviglione, L.; Lalande, J.F.; Mazurczyk, W.; Wendzel, S. Analysis of human awareness of security and privacy threats in smart environments. In Proceedings of the Human Aspects of Information Security, Privacy, and Trust: Third International Conference, HAS 2015, Held as Part of HCI International 2015, Los Angeles, CA, USA, August 2-7, 2015; Proceedings 3. Springer, 2015; pp. 165–177. [Google Scholar]
- Alkhalifah, A.; Al Amro, S. Understanding the Effect of Privacy Concerns on User Adoption of Identity Management Systems. J. Comput. 2017, 12, 174–182. [Google Scholar] [CrossRef]
- Deuker, A. Addressing the privacy paradox by expanded privacy awareness–the example of context-aware services. In Proceedings of the Privacy and Identity Management for Life: 5th IFIP WG 9.2, 9.6/11.4, 11.6, 11.7/PrimeLife International Summer School, Nice, France, September 7-11, 2009; Revised Selected Papers 5. Springer, 2010; pp. 275–283. [Google Scholar]
- Story, P.; Smullen, D.; Yao, Y.; Acquisti, A.; Cranor, L.F.; Sadeh, N.; Schaub, F. Awareness, adoption, and misconceptions of web privacy tools. In Proceedings of the Proceedings on Privacy Enhancing Technologies; 2021. [Google Scholar]
- Alsaleh, M.; Alomar, N.; Alarifi, A. Smartphone users: Understanding how security mechanisms are perceived and new persuasive methods. PloS one 2017, 12, e0173284. [Google Scholar] [CrossRef] [PubMed]
- Gürses, S. PETs and their users: a critical review of the potentials and limitations of the privacy as confidentiality paradigm. Identity in the Information Society 2010, 3, 539–563. [Google Scholar] [CrossRef]
- Krontiris, I.; Benenson, Z.; Girard, A.; Sabouri, A.; Rannenberg, K.; Schoo, P. Privacy-ABCs as a case for studying the adoption of PETs by users and service providers. In Proceedings of the Privacy Technologies and Policy: Third Annual Privacy Forum, APF 2015, Luxembourg, Luxembourg, October 7-8, 2015; Revised Selected Papers 3. Springer, 2016; pp. 104–123. [Google Scholar]
- Vemou, K.; Karyda, M. A classification of factors influencing low adoption of pets among sns users. In Proceedings of the Trust, Privacy, and Security in Digital Business: 10th International Conference, TrustBus 2013, Prague, Czech Republic, August 28-29, 2013; Proceedings 10. Springer, 2013; pp. 74–84. [Google Scholar]
- Poireault, K. Russia Blocks VPN Services in Information Crackdown, 2024. https://www.infosecurity-magazine.com/news/russia-blocks-vpn-services-2024/.
- HIDE.me. Using a VPN in Restrictive Countries – How To Bypass Censorship, 2024. https://hide.me/en/blog/using-a-vpn-in-restrictive-countries/.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).