Hashtag: #DataPrivacyConcerns
Microsoft BitLocker Encryption Raises Privacy Questions After FBI Key Disclosure Case

Microsoft's BitLocker encryption, long viewed as a safeguard for Windows users' data, is under renewed scrutiny after reports revealed the company provided law enforcement with encryption keys in a criminal investigation. The case, detailed in a government filing [PDF], alleges that individuals in Guam illegally claimed pandemic-related unemployment benefits. According to Forbes, this marks the first publicly documented instance of Microsoft handing over BitLocker recovery keys to law enforcement.

BitLocker is a built-in Windows security feature designed to encrypt data stored on devices. It operates through two configurations: Device Encryption, which offers a simplified setup, and BitLocker Drive Encryption, a more advanced option with greater control. In both configurations, Microsoft generally stores BitLocker recovery keys on its servers when encryption is activated using a Microsoft account. As the company explains in its documentation, "If you use a Microsoft account, the BitLocker recovery key is typically attached to it, and you can access the recovery key online." A similar approach applies to organizational devices. Microsoft notes, "If you're using a device that's managed by your work or school, the BitLocker recovery key is typically backed up and managed by your organization's IT department."

Users are not required to rely on Microsoft for key storage. Alternatives include saving the recovery key to a USB drive, storing it as a local file, or printing it. However, many customers opt for Microsoft's cloud-based storage because it allows easy recovery if access is lost. This convenience, though, effectively places Microsoft in control of data access and reduces the user's exclusive ownership of encryption keys.

Apple provides a comparable encryption solution through FileVault, paired with iCloud. Apple offers two protection levels: Standard Data Protection and Advanced Data Protection for iCloud. Under Standard Data Protection, Apple retains the encryption keys for most iCloud data, excluding certain sensitive categories such as passwords and keychain data. With Advanced Data Protection enabled, Apple holds keys only for iCloud Mail, Contacts, and Calendar.

Both Apple and Microsoft comply with lawful government requests, but neither can disclose encryption keys they do not possess. Apple explicitly addresses this in its law enforcement guidelines [PDF]: "All iCloud content data stored by Apple is additionally encrypted at the location of the server. For data Apple can decrypt, Apple retains the encryption keys in its US data centers. Apple does not receive or retain encryption keys for [a] customer's end-to-end encrypted data."

This differs from BitLocker's default behavior, where Microsoft may retain access to a customer's encryption keys if the user enables cloud backup during setup. Microsoft states that it does not share its own encryption keys with governments, but it stops short of extending that guarantee to customer-managed keys. In its law enforcement guidance, the company says, "We do not provide any government with our encryption keys or the ability to break our encryption." It further adds, "In most cases, our default is for Microsoft to securely store our customers' encryption keys. Even our largest enterprise customers usually prefer we keep their keys to prevent accidental loss or theft. However, in many circumstances we also offer the option for consumers or enterprises to keep their own keys, in which case Microsoft does not maintain copies."

Microsoft's latest Government Requests for Customer Data Report, covering July 2024 through December 2024, shows the company received 128 law enforcement requests globally, including 77 from US agencies. Only four requests during that period (three from Brazil and one from Canada) resulted in content disclosure.

After the article was published, a Microsoft spokesperson clarified, "With BitLocker, customers can choose to store their encryption keys locally, in a location inaccessible to Microsoft, or in Microsoft's cloud. We recognize that some customers prefer Microsoft's cloud storage so we can help recover their encryption key if needed. While key recovery offers convenience, it also carries a risk of unwanted access, so Microsoft believes customers are in the best position to decide whether to use key escrow and how to manage their keys."

Privacy advocates argue that this design reflects Microsoft's priorities. As Erica Portnoy, senior staff technologist at the Electronic Frontier Foundation, stated in an email to The Register, "Microsoft is making a tradeoff here between privacy and recoverability. At a guess, I'd say that's because they're more focused on the business use case, where loss of data is much worse than Microsoft or governments getting access to that data. But by making that choice, they make their product less suitable for individuals and organizations with higher privacy needs. It's a clear message to activist organizations and law firms that Microsoft is not building their products for you."
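The local-storage alternatives described above can be exercised from the command line. The following is a minimal sketch using Windows' built-in manage-bde tool; it assumes the encrypted volume is C: and an elevated (administrator) Command Prompt, and the output file path is purely illustrative.

```shell
:: List every key protector on the C: volume, including the
:: 48-digit numerical recovery password
manage-bde -protectors -get C:

:: Redirect the same output to a local file so the recovery key
:: does not have to live in the Microsoft account (store the file offline)
manage-bde -protectors -get C: > D:\backup\bitlocker-recovery.txt
```

Note that a recovery key already synced to a Microsoft account remains there until it is deleted from the account's recovery-key page, so opting out of cloud storage later also means removing the existing cloud copy.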

Microsoft BitLocker Encryption Raises Privacy Questions After FBI Key Disclosure Case #BitLockerencryptionkeys #CyberSecurity #DataPrivacyConcerns

Data Collection Ethics: Are Tech Giants Going Too Far?
The Unseen Transaction: Navigating the Murky Waters of Data Collection Ethics
Let's be honest. You've clicked "I Agree" on a terms of service document you haven't read. We all have.…

Data Collection Ethics: Are Tech Giants Going Too Far? #personalinformationsecurity #bigtechdatacollection #techgiantaccountability #onlineprivacylaws #userdataprotection #GDPRcompliance #datamonetization #surveillancecapitalism #dataprivacyconcerns #digitalfootprintethics

Ethical Concerns of AI: A Guide to Responsible Tech
The Double-Edged Sword: Navigating the Murky Waters of AI Ethics
It seems like you can't scroll through a news feed without seeing something about Artificial Intelligence. It's either a groundbreaking…

Ethical Concerns of AI: A Guide to Responsible Tech #artificialintelligencemorality #AIjobdisplacement #AIsafety #AIaccountability #AIethics #AItransparency #algorithmicbias #dataprivacyconcerns #responsibleAI #FutureofAI

The Ethics of Geo-targeting: Persuasion or Manipulation?
The Ethics of Geo-targeting and Algorithmic Persuasion
Ever walked past a coffee shop and, seconds later, gotten a notification on your phone for a discount on their new latte? Or…

The Ethics of Geo-targeting: Persuasion or Manipulation? #behavioraltargeting #dataprivacyconcerns #filterbubbles #targetedadvertising #personalizedmarketing #admanipulation #datacollection #geotargetingethics #algorithmicpersuasion #consumerprivacy

Committee does not advance bill requiring coroners to report psychiatric drugs after violent deaths amid privacy and resource concerns
A proposed bill that would require coroners to test for and publicly report therapeutic levels of psychiatric medications in all suicides and homicides drew broad opposition from coroners, public health officials and family witnesses and was not moved by the committee. Opponents cited privacy, administrative burden on understaffed county coroner

A controversial bill aimed at requiring coroner reports on psychiatric drugs in violent deaths has sparked heated debate in Wyoming, with concerns over privacy and administrative burdens leading to its rejection.

Read the full story!

#WY #CitizenPortal #PublicHealthPolicy #DataPrivacyConcerns

Ethical Concerns of AI: Bias, Privacy, and Our Future
The Double-Edged Sword: Navigating the Murky Ethical Concerns of AI
Artificial intelligence isn't science fiction anymore. It's here. It's in your phone, your car, the way you get a loan,…

Ethical Concerns of AI: Bias, Privacy, and Our Future #responsibleAI #AIregulation #artificialintelligencerisks #FutureofAI #AItransparency #algorithmicbias #jobdisplacementbyAI #AIethics #machinelearningfairness #dataprivacyconcerns

Smart Glasses Face Opposition as Gen Z Voices Privacy Concerns

The debate over technology and privacy is intensifying as Meta prepares to announce the third generation of its Ray-Ban smart glasses, a launch greeted with both excitement and unease in the tech community. The new model, to be marketed as Meta Ray-Ban Glasses Gen 3, refines the features that have attracted more than two million buyers since the line was introduced in 2023. Yet even as Meta's success attests to the growing popularity of wearable technology, the company faces significant scrutiny over discussions of potential facial recognition capabilities, which raise serious privacy and data security concerns. Smart glass adoption has climbed over the past couple of years, and observers believe that the addition, or even the prospect, of such a feature could alter both the trajectory of smart glasses and the public's willingness to embrace them.

The industry-wide surge in wearable innovation has produced some controversial developments, including AI-powered glasses from two Harvard dropouts who recently raised $1 million in funding to advance their product line. Originally known for experimenting with covert face recognition, the entrepreneurs are now focusing on eyewear that records audio, processes conversations in real time, and delivers instant insights. The technology shows striking potential to transform human interaction, but it has also drawn a wave of criticism over the risks of unchecked surveillance.

Social media platforms have become a venue for widespread unease, with many users warning of a future in which privacy is eroded by constant surveillance. Comparisons with the ill-fated Google Glass project are increasingly common, and critics argue that such innovations could drift into dystopian territory without adequate safeguards and explicit consent mechanisms. Regulators and digital rights advocacy groups are also pushing for clearer ethical frameworks, emphasising the delicate balance between fostering technological development and protecting individual freedoms.

Most members of Generation Z remain sceptical of smart glasses, citing concerns about privacy, trust, and social acceptance. Although most models ship with a small LED indicator that lights up when the camera is active, online tutorials have already shown that this safeguard can be bypassed to conceal recording. Examples of such "hacks" circulate on platforms like TikTok, fuelling fears of being unknowingly filmed in classrooms, public spaces, or private gatherings. These anxieties are compounded by a broader mistrust of Big Tech, with companies like Meta, maker of Ray-Ban Stories, still struggling with reputational damage from past data abuse scandals. Having grown up with a keener awareness than older generations of how personal information is gathered and monetised, Gen Z is especially suspicious of devices that could function as portable surveillance tools.

There are also cultural challenges beyond regulation. Wearing glasses places recording technology directly at eye level, a prospect many find invasive. Some establishments, including restaurants, gyms, and universities, have moved to restrict their use, signalling resistance at a social level. Critics also note a generational clash over values: Gen Z prizes authenticity and spontaneity in digital expression, while the discreet recording capabilities of smart glasses risk breeding distrust and eroding genuine human connection. Analysts say manufacturers should prioritise transparency, enforce tamper-proof privacy indicators, and shift towards applications that emphasise accessibility or productivity; otherwise, the technology is likely to remain a niche novelty rather than a mainstream necessity, particularly among the very demographic it aims to reach.

Meta emphasises that safeguards are built into its devices. Company spokesperson Maren Thomas said Ray-Ban smart glasses carry an external light that indicates when recording is active, along with a sensor that detects if the light is blocked, and that the company's user agreement prohibits disabling the light. Despite these assurances, younger consumers remain sceptical of the measures' effectiveness. Critics point out that online tutorials already circulate showing how to bypass recording alerts, raising concerns that the devices could be misused in workplaces, classrooms, and other public settings. People in customer-facing positions feel especially vulnerable to being covertly filmed.

Researchers contend that these concerns stem from a generational gap in attitudes towards digital privacy: millennials tend to share personal content freely, whereas Generation Z weighs the consequences of exposure, especially as social media footprints increasingly influence job opportunities and college admissions. A growing movement within this generation sets informal boundaries with peers and family about what should and should not be shared, and wearable technology threatens to upend those unspoken rules in an instant.

Despite the controversy, US demand for Meta's Ray-Ban glasses is forecast to reach almost four million units by the end of this year, up sharply from 1.2 million units in 2024, and social media monitoring by Sprout Social shows that while most online mentions remain positive or neutral, younger users are disproportionately concerned about privacy. Industry experts believe the future of smart glasses may hinge not purely on technological innovation but on companies' ability to navigate the ethical and social dimensions of their products. Although privacy concerns dominate the current conversation, advocates maintain that the technology can be genuinely beneficial if deployed responsibly: smart glasses could help people with visual impairments navigate the world, provide real-time language translation, and enable hands-free communication in healthcare and industrial settings, delivering meaningful gains in accessibility and productivity.

To get there, manufacturers will need to demonstrate transparency, build trust through non-negotiable safeguards, and work closely with regulators to develop clear consent and data usage standards. Social acceptance will also require a cultural shift, one that reassures people that innovation and respect for individual rights can coexist. Gen Z in particular, a generation that values authenticity and accountability, will demand products that empower rather than monitor, and connect rather than alienate. Striking that balance could let smart glasses evolve from a polarising novelty into a widely adopted tool that changes how people see the world, interact with it, and process information.

Smart Glasses Face Opposition as Gen Z Voices Privacy Concerns #DataPrivacyConcerns #datasecurity #EthicalInnovation

Own Your Data
Delete your account or access the personal data organizations have on you using this free service.

Here is a free tool that allows you to request that companies delete your personal data. Search for the company you want, select your jurisdiction, and it generates the email for you.

#privacy #digitalrights #dataprivacy #dataprivacyconcerns #meta #facebook #OpenAI #palantir #surveillance

yourdigitalrights.org

Musk's DOGE Pushes His Grok AI Into Government, Raising Ethics Concerns
Elon Musk's efficiency team is pushing his Grok AI into federal agencies while he profits from each contract. Three government sources reveal how sensitive data on millions of Americans may be training his chatbot.

Musk pushes Grok AI into federal agencies through efficiency role while profiting from contracts. Government sources say sensitive data on millions of Americans may train his chatbot without oversight. #GrokAI #DataPrivacyConcerns

Lawmakers discuss record sealing, access, and concerns over misuse by law enforcement
Legislators address record sealing processes and ensure appropriate access for law enforcement.

Lawmakers in Vermont are grappling with the fine line between law enforcement access to criminal records and protecting individual rights—what's at stake?

Click to read more!

#VT #CitizenPortal #DataPrivacyConcerns #LawEnforcementOversight #VermontLawmakers #CriminalRecordAccess
