Tag Archives: privacy

What CCPA Press Release SHOULD Say

On Jan. 1, 2020, the California Consumer Privacy Act of 2018 (CCPA) went into effect. So, too, did the law governing so-called data brokers. To understand the CCPA, it is sometimes important to suspend disbelief. What follows is a parody, a form of communication that seems particularly appropriate for the CCPA and its $55 billion compliance price tag.

California Attorney General Announces Issuance of Subpoenas Over Privacy Law Violations

Feb. 1, 2020

SACRAMENTO – The California Department of Justice today announced that North Pole Enterprises, LLC, dba “Santa Claus” has been issued an investigative subpoena to address concerns over widespread misuse and improper collection of personal information. The potential numerous violations of the California Consumer Privacy Act of 2018 (CCPA) include:

Improper collection of biometric data. Santa Claus is alleged to know when consumers are sleeping and when they are awake. When this biometric data, as defined in Civil Code § 1798.140(b) to include, “an individual’s physiological, biological or behavioral characteristics” is collected, upon information and belief, Santa Claus has shown a pattern and practice of failing to inform consumers as to the categories of personal information to be collected and the purposes for which the categories of personal information shall be used as required by Civil Code § 1798.100(b).

The violations of this part of the CCPA may also extend to biometric information indicating when California consumers are naughty or nice, are bad or good or are pouting or crying.

Improper collection of geolocation data. Santa Claus delivers gifts on Christmas Eve to consumers throughout California. To do this, Santa Claus has developed a comprehensive database of consumers’ residential locations. This is within the definition of “personal information” as defined in Civil Code § 1798.140(o)(1)(G). Santa Claus obtains this personal information through soliciting and receiving “Christmas Lists” from California consumers, which generally contain attestations that the consumer has been “nice,” which, as noted above, is also biometric information.

See also: Vast Implications of the CCPA  

To the extent these lists and the personal information contained therein are generated by traffic through the Santa Claus website, upon information and belief there are no posted online privacy policies to advise consumers of their rights under the CCPA. This is a violation of Civil Code § 1798.130(a)(5).

Failure to provide notice of right to opt out. The gifts Santa Claus delivers on Christmas Eve are allegedly crafted in a workshop at the North Pole. Upon information and belief, the workshop is a cooperative corporation, “Santa’s Co-op Workshop” (SCW), located just outside Alturas in Humboldt County. As such, Santa Claus is selling personal information, as defined in Civil Code § 1798.140(t), to a third party without giving California consumers notice of their rights to opt out of the sale of their personal information. This is a violation of Civil Code § 1798.120 and Civil Code § 1798.130.

In addition, if Santa Claus is selling the personal information of California consumers to third parties, it is acting as a data broker and as such has failed to register with the Department of Justice as required by Civil Code § 1798.99.82.

To the extent Santa Claus is selling the personal information of minors to third parties, additional violations of the CCPA may have taken place. The Department of Justice reserves the right to revise its charges once there is compliance with the subpoena.

It should be noted that there is an allegation by the members of SCW that they are operating exclusively for and under the control of Santa Claus and as such are employees of Santa Claus per Assembly Bill 5 (Gonzalez). (See: “Potential Labor Law Violations”, below)

Denial of goods or services. Upon information and belief, Santa Claus may be engaged in a pattern of discrimination against California consumers who have not attested to being “nice” in their Christmas Lists as noted above. If Santa Claus is discriminating against California consumers because they have exercised their right not to disclose personal information, this may be a violation of Civil Code § 1798.125.

Potential labor law violations. Upon information and belief, SCW may not be a bona fide business, as that term is used in Labor Code § 2750.3(e). As such, per the Supreme Court’s decision in Dynamex Operations West, Inc. v. Superior Court of Los Angeles (2018) 4 Cal.5th 903, and Assembly Bill 5 (Gonzalez), the Division of Labor Standards Enforcement (DLSE) has opened a concurrent investigation of Santa Claus for possible wage and hour violations and failure to maintain workers’ compensation insurance.

See also: How CCPA Will—and Won’t—Hit Insurance  

California takes its consumers’ privacy seriously, as it does violations of its laws protecting workers. Potential penalties under the CCPA, if not cured, could reach $2,500 per violation. Because every California resident has had their personal information collected by Santa Claus, total penalties could be as high as $100 billion, assuming no violations were intentional.
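The arithmetic behind that headline number can be checked directly. A minimal sketch, assuming one non-intentional violation per resident and an approximate state population of 40 million (the per-violation figure comes from the text; the one-violation-per-resident count is an assumption):

```python
# Rough back-of-the-envelope check of the quoted penalty ceiling.
# Assumptions: one CCPA violation per California resident, none intentional.
PENALTY_PER_VIOLATION = 2_500       # dollars, non-intentional cap per violation
CALIFORNIA_RESIDENTS = 40_000_000   # approximate state population (assumption)

total = PENALTY_PER_VIOLATION * CALIFORNIA_RESIDENTS
print(f"${total:,}")  # $100,000,000,000 -- the $100 billion figure cited
```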

We will keep you informed as events develop.

Where to Turn for Cyber Assistance?

Virtually every company owns, licenses or maintains personal information and other sensitive data. Until recently, companies were not legally required to implement cybersecurity policies, procedures or controls specifically designed to protect personal and other sensitive information. Some companies might even have decided not to comply because of perceived high implementation costs and complex operational changes. However, recent expansions in a number of laws have changed this dynamic. Across the U.S., state regulations are being promulgated that require companies to implement and maintain a reasonable level of cybersecurity controls. Some of these laws provide for significant penalties in cases of non-compliance. As companies begin to take steps toward compliance with these regulations, one significant source of assistance is the cyber insurance market.

New Privacy and Cybersecurity Regulations

As of the writing of this article, at least 25 state laws impose obligations on their corporate citizens to have reasonable data and information security practices to protect sensitive data from unauthorized disclosure. Some laws go even further and prescribe specific standards that corporate citizens must follow to protect the privacy rights of those states’ residents.

Two of the most stringent regulations currently in effect are in New York and Massachusetts, while a third, which may very well be the most stringent regulation, becomes effective in California on Jan. 1, 2020.

New York state’s recently passed “Stop Hacks and Improve Electronic Data Security Act,” or SHIELD Act, applies to businesses that maintain private information of New York residents, regardless of whether such entities actually conduct business within New York. SHIELD requires covered entities to implement “reasonable safeguards,” taking into account administrative, technical and physical safeguards such as training, risk assessments, regular testing of key controls and procedures, and the disposal of private information within a reasonable time after it is no longer needed. Similar requirements exist in Massachusetts, Ohio, Oregon and Vermont.

See also: Hidden Dangers for Cybersecurity  

SHIELD also allows for possible fines for violations of the notification requirements up to $250,000. Notably, the imposition of the “reasonable safeguards” requirements brings the new law closer to New York’s 2017 Department of Financial Services’ Cybersecurity Regulation, which prescribes holistic security measures applicable to a broad swath of financial services companies operating under New York’s banking, insurance and financial services laws.

In addition, many of New York’s small and medium-sized businesses in industries unaccustomed to the regulations applicable to the financial sector will now be required to address their security measures and implement controls, including risk assessments, to protect sensitive information and systems from unauthorized use or access.

Massachusetts’ current regulation, 201 CMR 17.00, et seq., establishes minimum standards to be met in connection with the safeguarding of personal information contained in both paper and electronic records. The regulations apply to all persons who own or license personal information about a resident. 201 CMR 17.03 and 17.04 impose obligations on covered entities to implement prescribed safeguards, including (this is a small sample from the list set forth in the statute):

  • A comprehensive, written information security program that contains administrative, technical and physical safeguards.
  • Designation of one or more employees to maintain the information security program.
  • Identification and assessment of reasonably foreseeable internal and external risks, and evaluation of the effectiveness of current safeguards for limiting such risks.
  • Continuing employee education and training.
  • Measures to oversee third-party service providers, including requiring such vendors by contract to implement and maintain appropriate security measures for personal information.
  • Password management and controls.

California’s much-discussed California Consumer Privacy Act (CCPA), Assembly Bill 375 (see oag.ca.gov/privacy/ccpa), becomes effective on Jan. 1, 2020. The CCPA significantly expands the privacy rights granted to consumers and shifts the burden of compliance regarding consent and the collection, use, storage and destruction of personal data to businesses. The expansive requirements the CCPA places on businesses relative to the collection and use of personal data bring California much closer in line with the privacy culture of the European Union and its GDPR.

While California, Massachusetts and New York have been at the forefront of imposing affirmative obligations on businesses requiring implementation of comprehensive cybersecurity policies and protocols, more than 25 states in total have laws that address data security practices of private-sector entities. Most of these data security laws require businesses that own, license or maintain personal information concerning a resident of that state (in several cases, including even entities who maintain such personal information but who are not doing business in the state) to implement and maintain “reasonable security procedures and practices,” taking into account the size and resources of the entity and the nature of the information it holds.

Accordingly, businesses with personal information in their systems – for example, unencrypted combinations of names with Social Security numbers, driver’s license numbers, account numbers, PINs, biometrics, etc. – are required, by law, to adopt, implement, maintain and regularly update their information security programs. The conjunction of new obligations imposed by states like Massachusetts and New York, with the expansion in privacy rights granted to California residents in the CCPA, leaves companies with little choice but to treat cybersecurity and corporate privacy with the utmost seriousness and to spend time and resources in advance of any type of occurrence.

The Role of Cyber Insurers

It is clear now that instituting and managing cybersecurity protections are no longer merely options for companies in all industry sectors. One key resource for any business in developing and deploying a cybersecurity program is its cyber insurer.

In today’s environment, businesses of all sizes should purchase cyber insurance. Cyber insurance is designed to cover numerous risks associated with both privacy and technology. Moreover, cyber insurance is there to respond after an incident occurs.

See also: It’s Time for the Cyber 101 Discussion  

Most traditional insurance policies do not cover measures like those contemplated by the various regulations. However, a few cyber insurers provide value-added risk control services along with free or reduced-cost access to cyber security vendors that can help an insured through this process. These services are available to the policyholder upon binding coverage with the insurer in advance of any actual cyber occurrence.

CNA, for example, recently launched a suite of cyber security services. CNA CyberPrep is a program of cyber risk services designed to aid cyber policyholders in cyber threat identification, mitigation and response.

  • Identifying cybersecurity posture is a critical beginning. Services include detailed analyses from a network of free or reduced-cost cybersecurity experts, reports that provide a snapshot of policyholder security posture and numerous recommendations for improvement.
  • Working in collaboration with their broker, CNA Risk Control, and Cyber Underwriting, policyholders execute the cybersecurity experts’ recommendations to mitigate their cyber risks and improve their cybersecurity posture.
  • CNA CyberPrep continues to benefit policyholders. In the event of a cyber breach, CNA’s panel of experienced incident response vendors provide guidance and strategies to help expedite recovery and minimize loss.

Offering these types of services is in the best interests of both insurers and policyholders, as an ounce of prevention is worth a pound of cure. Also, given the regulatory developments across the U.S., companies should, more than ever, take advantage of their cyber insurer’s ability to help facilitate this process.

Simply put, companies can no longer fail to implement cybersecurity policies, procedures and controls. Failure to comply with the state and federal laws and regulations can easily result in substantial fines and mandated corrective action. Some states also permit individuals to sue when failures result in loss of their personal information, potentially resulting in treble damages under unfair competition laws.

Given the staggering costs of non-compliance, businesses must implement appropriate cybersecurity policies and procedures, keep them current, train their employees and use the latest technical protections. When selecting your cyber insurer, look for one that can help supply the requisite resources for a comprehensive – and compliant – cybersecurity program.

In Race to AI, Who Guards Our Privacy?

Way back in 1975, geochemist Dr. Wallace Broecker of Columbia University published his article “Climatic Change: Are We on the Brink of a Pronounced Global Warming?” Today, almost 45 years later, the debate has intensified but still rages, even as some believe the clock is running out. The U.N. Intergovernmental Panel on Climate Change warns that we have only 11 years to limit the chances of a climate change catastrophe.

I see very strong parallels between Dr. Broecker’s warnings and those related to our loss of personal data privacy. Society is facing the threat of climate change, which some experts say will reach a tipping point; we may be reaching a similar tipping point with privacy and cyber security.

In their paper presented at the 1965 Fall Joint Computer Conference titled “Some Thoughts About the Social Implications of Accessible Computing,” E. E. David, Jr. of Bell Labs and R. M. Fano of MIT warned that “the same technology which has given us new dimensions in communication has been used to implement eavesdropping equipment.” They went on to say that “the very power of advanced computer systems makes them a serious threat to the privacy of the individual.”

See also: Untapped Potential of Artificial Intelligence  

Just as we continued to contribute to climate change, we continue to surrender personal privacy in exchange for the lure of instant gratification delivered through simple, easily accessible technologies.

Insurance Industry Opportunity

The insurance industry is uniquely positioned to take the lead in safeguarding data privacy; few other industries have the same depth and breadth of personal information or the same level of dependency on the trust and loyalty of their customers.

Many insurers of property, life and health, along with numerous supply chain intermediaries, are employing a wide range of connected digital technologies to gather individual data, then store and analyze it, use it to train AI and offer new, different and attractive products and services. And, as of now, there is no easy way for customers to reclaim their data. People may consciously understand the trade-offs of using digital services, but few understand how extensively their data is captured, used and shared. And because that data exists in digital form, it persists virtually forever, most certainly long after we are gone.

Without applicable data laws, we’re left with a decentralized patchwork system, devoid of human control. Privacy concerns are surfacing almost daily now, but successful, high-profile applications of analytics are drowning out the cautionary voices. Facial recognition, which is not unlike taking your fingerprints without your permission, is being used by China to keep track of all of their citizens and has been deployed by law enforcement agencies all over the world.

Too Little, Too Late

In a relatively small victory for opponents of this rapid adoption, San Francisco recently became the first U.S. city to ban the use of facial recognition by local agencies. And California’s tough new law, the California Consumer Privacy Act, which takes effect in January 2020, will significantly limit how companies handle, store and use consumer data. The law will require businesses to be more transparent, give consumers the ability to delete and download collected data and give them the chance to opt out of the sale of their information. Still, according to a new survey by TrustArc, most companies still aren’t ready to comply.

See also: 3 Steps to Demystify Artificial Intelligence  

Elsewhere, the European Union’s General Data Protection Regulation (GDPR), a set of new privacy laws, went into effect in May 2018. And Hawaii, Massachusetts and Washington are all considering their own state privacy laws, while Brazil passed its own regulations, which will take effect in 2020.

Insurance Industry Call To Action

What we really need, however, is a standardized, global set of rules and regulations on the permissible uses of personal data and a process governing and enforcing them. The global insurance industry would gain much by taking the lead in this effort – and sooner rather than later.

Mobile Apps and the State of Privacy

Mobile applications, mobile apps or just plain apps are software programs designed and developed to run on a mobile device.

Mobile apps can be downloaded and accessed directly by users on their smartphone, tablet, mobile phone, PDA, etc., and they can be obtained in one or more of the following ways:

  • Via the mobile operating system owner’s online app store or the internet (e.g., the Apple App Store);
  • Preloaded by your internet provider.

Some apps are “free” – meaning they are not purchased with real money by the user but are funded by advertisers (whose ads dominate and sometimes interfere with the use of the app) – while other apps must be purchased with real money by the user.

According to Ericsson, as of March 2018 there were 7.9 billion mobile device subscriptions worldwide. There were 98 million new subscriptions during the first quarter of 2018. Mobile application subscriptions associated with smartphones now account for 68% of all mobile phone subscriptions. That number exceeds the population in many countries. It is estimated that by 2020 almost 75% of the global population will be connected by mobile. Much of this growth will come from Asia, and in particular China, which will account for almost half of app users in 2020 (source: Ericsson.com – Mobility Report, June 2018).

This rise in mobile use, and the ever-increasing departure by marketers from traditional marketing to selling brands and products through mobile applications, has led to developments in technology that will continue to transform how the world communicates.

So, if you use a smartphone or other mobile device to access the internet, chances are you have downloaded, or your mobile device came pre-loaded with, mobile apps that you are accessing and using for many of your online activities instead of just an internet browser.

There are hundreds of thousands of apps available. They are easy to download and extremely convenient. These mobile apps allow users to:

  • Access and read the news/books
  • Play games
  • Stream music
  • Take photos
  • Watch videos
  • Monitor their heart rate
  • Work out with a fitness regime
  • Get directions and maps
  • Find a nearby restaurant
  • Get the weather report
  • Pay for purchases on the spot
  • And a whole bunch more

Awesome, yes.

But…

Along with the exciting capabilities mobile apps offer, it is prudent to keep in mind that as the functionality mobile apps provide when integrated into mobile devices expands, so do the worldwide online privacy risks and the concern over how to protect the user’s (your) privacy.

Why?

Because mobile apps can collect all sorts of data and transmit it to:

  • The app developer;
  • The app store;
  • The internet provider;
  • The platform owner of the mobile device operating system; and
  • Third-party advertisers or an ad network

Some apps access only the data they need to function; others access data that’s not related to the purpose of the app.

The bottom line is: This data being collected from you, including your personal and private information, may then be shared or sold by these entities in their sole discretion to other companies or entities around the world and oftentimes without the user’s (your) permission or knowledge.

A case in point: In FTC (Federal Trade Commission) v. Frostwire LLC, the FTC sued the developer of a peer-to-peer file sharing mobile app. The complaint alleged that the app’s default settings were configured so that, immediately on a user’s installing and setting up the app on a mobile device, it would publicly share files stored on that device. According to the FTC complaint, the default settings were likely to cause users to unwittingly disclose personal files stored on their mobile devices. Among other things, the settlement:

  • Bars the company from using default settings that share users’ files.
  • Requires the app to provide clear and prominent disclosures about file sharing and how to disable it.

The question then inevitably becomes:

How private and secure is your private and personal information when accessing and using a mobile application that is now integrated within your mobile device(s)?

This article is intended to explore and answer this question from the perspective of the risks to your (the user’s) private and personal information in the access and use of mobile apps, as well as recommendations on how to manage these risks.

Onward!

See also: Do Health Apps Threaten Privacy?  

Using Mobile Apps

When you directly download and install an app, or your internet service provider pre-downloads and installs an app or applications you decide to activate on your mobile device, you are instantly allowing that app or those applications to access data stored on your smartphone or other mobile device.

The app’s access to your data could be limited, or it could be an app capable of accessing large amounts of information, including:

  • Your personal and private information
  • Information on and of your friends and associates
  • Family photos and videos
  • Your phone and email contacts
  • Call logs
  • Internet data
  • Calendar data
  • Health data
  • Data about the device’s location
  • The device’s unique IDs
  • Information about how you use the app itself
  • Your web browsing history, etc. that is stored on your mobile device

So before you download an app or use a pre-loaded app it may be wise to understand at a minimum:

  • What of your data the app is going to collect
  • How it stores your data
  • Where and with what other devices or entities your data is going to be shared

To get the answers is easy, right? You just go to the app’s privacy policy.

Yet, the reality is that it is foolish to assume that any data is private in the mobile app world, or that the mobile app world has taken the responsibility to protect a user’s right of privacy seriously, because almost all mobile apps do not have privacy policies. Are you shocked to learn such a fact? I certainly was!

So why don’t the majority of apps have privacy policies? Because:

  • Most developers think it is technically too complicated and time-consuming as they rush to develop apps;
  • Some developers are focused on getting new products to market to meet a deadline at the behest of an organization, and adequate consideration of privacy and security is not a priority, if considered at all; and
  • There is a belief among some developers and organizations that no one (e.g., the user, the FTC or the courts) is really enforcing the laws governing privacy in the mobile world.

At this point, I believe it is worth noting again: These apps collect and store a tremendous amount of information. Even apps that appear to ask for permissions during installation can become a back door to your mobile devices and your private and personal information, along with that of your friends and family.

So, what does this mean for organizations (as well as the developers) of the apps they offer?

Well, first and foremost, for organizations (and developers) to dismiss the safeguarding of a user’s privacy, whether technically, legally or morally, in the interest of following the money suggests a failure of transparency to the user in how those organizations collect, use and share personal and private information.

So what can be done to address this concern?

As a start, certain attorneys general and legislators in certain states in the U.S. have started to advocate and support new laws as well as to enforce current laws governing privacy in the mobile world.

So let’s take a moment to discuss some states’ actions:

California has long been a leader in privacy legislation to ensure that cutting-edge innovations, inclusive of mobile apps, are developed responsibly to protect users’ private and personal information.

To that end, in 2004 California enacted the California Online Privacy Protection Act (CalOPPA), which requires commercial operators of websites and online services, inside and outside of California, to conspicuously post clear, detailed privacy policies that promote transparency, are reasonably accessible to consumers of the online service and enable consumers to understand how companies collect, use and share personal information and the third parties with which they share that information.

One of the principles agreed on is making mobile apps’ privacy policies available to users on the app platform before they download the app. This gives them the opportunity to opt in or opt out before they download or activate the app, as opposed to having no real choice after the fact.

If developers and companies do not comply within 30 days after being notified of noncompliance, they can be prosecuted under California’s Unfair Competition Law or False Advertising Law.

For example:

The attorney general considered any service available over the internet or that connects to the internet, including mobile apps, to be an “online service.” Based on this interpretation, letters were sent to up to 100 non-compliant apps at the time, starting with those available for mobile users that were the most popular. The companies were given 30 days to conspicuously post a privacy policy within their app that informed users of what personally identifiable information about them was being collected and what would be done with that private information.

Delta Airlines was among the recipients of this letter. In December 2012, the attorney general of California, Kamala D. Harris, announced the first legal action under California’s online privacy law against Delta Airlines, for failing to comply with the 30-day notice letter to conspicuously post a privacy policy within the mobile app “Fly Delta.”

The suit sought to enjoin Delta from distributing its app without a privacy policy and penalties of up to $2,500 for each violation. The suit was filed in the San Francisco Superior Court.

It is no secret that California is currently unique in applying its privacy law to mobile apps, and many states look to California as a leader in this area. It is anticipated that more dedicated state laws will be forthcoming based on these actions.

But it is not just states in the U.S. that are concerned about mobile app privacy. This concern reaches across the pond. It is, therefore, important to note the actions of other countries, as well.

See also: Blockchain, Privacy and Regulation  

The European Union

The ePrivacy directive (2002/58/EC, as revised by 2009/136/EC) sets specific standards for all parties worldwide that wish to access and store information already stored in the mobile devices of users located in the European Economic Area.

The most important of these standards with regard to developing for mobile platforms is Article 5(3), which states that the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user is allowed only on condition that the subscriber or user concerned has given his or her consent. This consent must be based on the user having been provided with clear and comprehensive information by the mobile platform, in accordance with Directive 95/46/EC. For example: a clear explanation of the purposes for which the mobile platform is processing and storing the user’s information.

So the bottom line is this: It is important for organizations and app developers to know that these directives are imperative laws in that the individual’s rights are non-transferable and not subject to contractual waiver. This means that the applicability of European privacy law cannot be excluded by a unilateral declaration or contractual agreement.

As a result, the mobile app developer or organization must:

Provide a readable, understandable and easily accessible privacy policy, which at a minimum informs users about:

  • Who they are (identity and contact details)
  • What precise categories of personal data the app wants to collect and process
  • Why the data processing is necessary (for what precise purposes)
  • Whether data will be disclosed to third parties (not just a generic but a specific description to whom the data will be disclosed)
  • What rights users have, in terms of withdrawal of consent and deletion of data

Note: Similar laws exist in other countries as well with slight modifications. It may be of interest to you to read in their entirety such similar laws, particularly your own country’s law.

Multiplying the Risks

The online worldwide privacy risks associated with the use of mobile devices increases with the use of mobile applications, not only because of the lack of privacy policies and transparency associated with the applications, but because mobile apps have their own unique set of challenges for the user who cares about mobile privacy, such as:

  • Mobile devices hold personal information for a long time by design; in other words, nothing is ever erased. This information is provided to and accessed by the developer in designing the mobile app, which is then disseminated to the world. For example: If an organization requests or pays a developer to build an app, the organization gives the developer access to the user information stored on the mobile device or devices to which the app will be downloaded. That information is then stored in the new app for dissemination to the world.
  • Encrypting information is not a foolproof way to protect privacy, as encryption on both Android and iPhone devices can be broken with minimal effort. In addition, it is not that difficult to extract data from a passcode-protected device. In other words: Never underestimate a hacker.
  • Mobile app developers rely on and use hardware device identifiers (hardware IDs) to track users and to enable:
    • Their apps’ functionality
    • Content
    • Advertising providers to track users across many mobile apps

The key difference between hardware IDs and the identifiers associated with website browser cookies is that hardware IDs are permanently tied to the device. By deleting cookies and local shared objects, an end user can typically limit tracking and retain some degree of anonymity from third parties, because each time a third party’s servers reconnect with that user, the third party must set new, different, unique identifiers.

In the mobile app context, by contrast, hardware IDs are unique, permanent identification numbers or character strings that, for practical purposes, cannot be deleted or reset by the user. Even if a user deletes the app, clears all web content, wipes all storage and restores factory defaults, the hardware ID remains unchanged. Third parties that have tracked the user’s network traffic and stored that information can still associate it with the user’s device and identify that device for its lifetime. This has prompted objections from privacy advocates to the use of hardware IDs for tracking.
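The asymmetry between resettable cookie identifiers and a permanent hardware ID can be illustrated conceptually. This Python sketch (the IMEI-style value is made up) shows that even a *hashed* hardware ID stays stable across resets, so a tracker can re-link old and new records, while a cookie identifier changes every time it is cleared:

```python
import hashlib
import secrets

def new_cookie_id() -> str:
    """Browser-style identifier: regenerated whenever the user clears cookies,
    so old tracking records cannot be linked to the new identifier."""
    return secrets.token_hex(16)

def hardware_fingerprint(hardware_id: str) -> str:
    """Even a hashed hardware ID is stable: the same device always maps to the
    same value, so a tracker can re-link records after a factory reset."""
    return hashlib.sha256(hardware_id.encode()).hexdigest()

DEVICE_IMEI = "356938035643809"  # illustrative IMEI-style value, not a real device

before_reset = hardware_fingerprint(DEVICE_IMEI)
# The user clears cookies, wipes storage, restores factory defaults...
cookie_before, cookie_after = new_cookie_id(), new_cookie_id()  # cookies change
after_reset = hardware_fingerprint(DEVICE_IMEI)  # ...the fingerprint does not
```

This is why the text says that wiping the device does not help: the tracker's key into its database is derived from something the user cannot change.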

Types of hardware IDs include:

  • Cell phone radio identifiers: the mobile equipment identifier (MEID) and the international mobile station equipment identity (IMEI)
  • Wi-Fi radio (media access control (MAC)) address
  • Bluetooth radio identifier
  • Platform-specific identifiers, e.g. Apple’s unique device identifier (UDID). Note: Although Apple prohibits its developers from accessing the UDID, in an analysis conducted by Appthority in 2013, 5.5% of the tested iOS apps were accessing it anyway.

A related risk is the integration of apps with social media platforms, which gives those platforms even more of a user’s private and personal information. For example: Facebook, in response to pressure from its stakeholders to make more revenue via mobile advertising, streams advertisers’ ads via mobile applications that also leverage the Facebook Connect feature, which invites users to sign into numerous apps and websites using their Facebook identity. This gives Facebook and its advertisers the ability to monitor the actions users take in all such apps, which in turn creates many monetarily satisfying commercial opportunities for Facebook, its partners and advertisers. These mobile ads are getting more and more aggressive, going as far as accessing and transmitting personal information and changing phone settings without user consent (reference: Lookout, a mobile security firm). And even when a developer understands the importance of providing users with a privacy policy that actually protects their private and personal information, and does so, such a policy is often long and difficult to read on devices with smaller screens. (Try reading the Apple App Store privacy policy on your mobile device.)

Other Risks

Wow! After that litany of unique risks, it may seem difficult for some of our readers to believe there are other risks a user needs to be aware of – but there are.

For instance: children and mobile applications.

  • The apps collect personal information
  • The apps let children spend real money even if the app was free. For example: The game Roblox is free. It also allows the user to enhance one’s character in the game by “purchasing” various add-ons (swords, helmets, the Phoenix, etc.) with points earned during the game. However, if you do not have enough points, you can use real money (usually from mom or dad’s credit card) to buy the points you lack.
  • Apps include ads (which is extremely annoying to most children) and raise the question: Is there any violation of the Children’s Online Privacy Protection Act (COPPA), as amended effective July 2013 to cover the mobile app space?
  • Apps link children to social media web services without the parental notice and consent COPPA requires (reference: see the Children’s Online Privacy and Apps section of COPPA, 16 C.F.R. Part 312), and
  • Surprise of surprises, the apps most likely will not tell you they are transferring data (how can they when most of them choose not to be transparent with the user?)

The point is: Mobile applications can pose significant privacy risks for organizations, their customers/clients and individuals worldwide if they are not made aware of how their personal and private data is used.

So how can you, as the user of these apps that organizations provide for you to download or that you buy directly from developers (such as Rovio, the developer of Angry Birds), manage the risks threatening your mobile app privacy?

Well, the truth of the matter is: There is no easy way to know what data a specific app will access or how it will be used.

However, if possible:

Before you download or access and activate a pre-loaded app, find out who created the app and for what purpose; look at screenshots; read the description, content rating and any user reviews.

In other words: Do your due diligence, and only access and use apps from trusted sources.

Managing the risks of how an app stores your data (as an individual or an organization)

For mobile apps, as well as social media platforms, user data can be stored remotely on servers on the web. However:

In the social media platform or website context, most user data stored locally is stored centrally in browser files, while in the mobile app environment it is stored locally by each app.

Therefore, your information stored in a mobile app is not centrally located but is splintered and app-specific, making it more difficult, if not impossible, for users to know how much of their data is stored in each app and disseminated externally to third parties.

Additionally, mobile apps generally do not provide tools to the user to:

  • Access local storage to review what the app has stored of the user’s information; or
  • Manage the content of the information stored

The foregoing is another way of reinforcing that, realistically and practically, users do not, as a rule, have any control over or access to their data stored in a mobile app. This lack of control includes the ability to manage the use of their personal and private data, or any other part of their data for that matter.

Don’t provide your credit/ATM card information

Some mobile payment acceptance applications that are marketed and sold to retailers, airports, etc. for processing of credit/debit card information will store such information on the user’s mobile device if there is no internet connection available at the time and then send it when a network connection can be made.

The point? Any time data lingers on a device, even if encrypted, there is a higher risk of that data being compromised (need we say “Target”?).

Currently, a user has no means to manage this risk except to not provide this information.

See also: Wearable Tech Raises Privacy Concerns  

On the other hand:

To manage your risk for those mobile payment acceptance applications you have on your own mobile device, check to see if your payment acceptance application has a “store and forward” feature, and, if it does, turn it off.

  • Location information. Many apps track your location. Location-based mobile application services like Yelp and Foursquare need your location to function properly. However, there are also apps (such as a simple flashlight) that do not need your location to function and yet still track it.
  • Some apps provide location data to ad networks, which may combine it with other information in their database to target ads based on your interest and your location
  • Once an app has your permission to access your location data, it can do so until you change the settings on your phone
  • However, if you don’t want to share your location, you can turn off location services in your phone’s settings. The downside: Even if you turn off location services, it may not be possible to completely stop the app from broadcasting your location data.

Bottom line: Now that you have the information, use it wisely in deciding whether to download or activate a pre-loaded app that will provide specific location data.

Managing where and what other devices or entities your data is going to be shared with

Users should not assume any of their data is private in the mobile app world, or that the mobile app world takes the responsibility of protecting their right of privacy seriously.

For instance: Many apps send user data via unencrypted connections, potentially exposing users’ personal and private data to everyone on a worldwide network without the user’s knowledge or permission.

The lesson, then, in managing the risk of a mobile application violating the privacy rights of an organization, its customers/clients and the individual user is this: Understand that there is currently little or no privacy protection for users of mobile applications; do your due diligence; and decide whether to access and use an app accordingly.

Follow (or I will be writing about them, as well) the developments of:

  • The Federal Trade Commission’s increasing focus on mobile app privacy (or the lack of it), to see how the FTC’s regulation and enforcement develop.
  • The multi-stakeholder process facilitated by the National Telecommunications and Information Administration to develop an enforceable code of conduct on mobile app transparency.
  • The implementation of the recommendations of Kamala Harris in her white paper “Privacy on the Go,” describing an approach for developers and other players (like the mainstream social media platforms, which provide the user information to the developers) in the mobile app world to consider when designing the app.
  • State/country legislative and enforcement actions to achieve privacy controls that allow users to make, review and change their privacy choices based on widely accepted fair information practice principles (FIPPs) that form the basis for many privacy codes and laws in different parts of the world.

Takeaway

Users care about mobile privacy, and, yes, they do find value in mobile apps. They are also eager to try them as they are released (as opposed to waiting for several versions to have been tested first).

However, as Harris said: “Losing your personal privacy should not be the cost of using mobile apps, but all too often it is. Users of those apps deserve to know [and have the ability to control] what is being done with their personal information.”

I would submit that it should now be clear that the risks to one’s personal information are substantial when using mobile apps. These risks are reason enough why a developer, the organization that engages the developer and the other stakeholders in the world of mobile apps should begin, first and foremost, with a mindset of worldwide privacy and security of users’ personal data in the initial design of any mobile app.

What GDPR Means for Insurtech

After Solvency II, the European Union is ready for its next big, comprehensive regulation: the General Data Protection Regulation (GDPR). GDPR was approved by the EU Parliament in April 2016 and, after a two-year grace period, took effect in May 2018. The regulation replaces the Data Protection Directive 95/46/EC.

Regulatory Landscape and Breaches

The first key point of the new regulation is that it protects all EU citizens’ data privacy with an extended territorial scope: the new data privacy rules apply to all personal data of data subjects residing in the European Union, regardless of where the company is located.

With GDPR, fines for breaches increased sharply, to as much as 4% of annual global revenue or 20 million euros, whichever is greater. Another radical change is that the regulation applies not just to controllers but also to processors, so cloud processors are covered as well. Under GDPR, the data subject must give consent through a document that is understandable, simple and easily accessible, and withdrawal of consent for data usage must be equally easy.
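The "whichever is greater" rule means the 20-million-euro figure acts as a floor, not a cap, for small companies. A quick sketch of the upper bound on a fine for the most serious infringements:

```python
def max_gdpr_fine(annual_global_revenue_eur: float) -> float:
    """Upper bound on a GDPR fine for the most serious infringements:
    4% of annual global revenue or EUR 20 million, whichever is greater."""
    return max(0.04 * annual_global_revenue_eur, 20_000_000)

# A company with EUR 1 billion in revenue faces up to EUR 40 million;
# a company with EUR 10 million in revenue still faces up to EUR 20 million.
large = max_gdpr_fine(1_000_000_000)   # 40,000,000
small = max_gdpr_fine(10_000_000)      # 20,000,000 (the floor applies)
```

Note that revenue only becomes the binding figure once annual global revenue exceeds 500 million euros, since 4% of that is the 20-million floor.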

With GDPR, breach notification becomes mandatory: the supervisory authority must be notified within 72 hours of the breach being spotted, and affected data subjects must also be informed without undue delay when the breach poses a high risk to them.
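The 72-hour clock starts when the breach is spotted, not when it occurred, which makes the deadline easy to compute. A small sketch (the dates are illustrative):

```python
from datetime import datetime, timedelta

# GDPR: notification within 72 hours of becoming aware of the breach.
BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(breach_detected_at: datetime) -> datetime:
    """Latest time by which the notification must be made,
    counted from detection of the breach."""
    return breach_detected_at + BREACH_NOTIFICATION_WINDOW

detected = datetime(2018, 5, 25, 9, 0)   # breach spotted 09:00 on May 25
deadline = notification_deadline(detected)  # 09:00 on May 28
```

Three calendar days is not long for incident response, which is why detection and notification procedures need to be in place before a breach happens.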

The Key Point for Insurtech

These changes are key for insurtech. Data security and privacy had seemed to be the concerns most likely to hold insurtech back, because of the dangers created by the increased use of connected IoT devices, real-time data collection and high-profile cyberattacks. But customers will be much more comfortable with insurtech because GDPR will alleviate concerns about data privacy, regardless of a company’s scale. With GDPR, drivers of insurtech like IoT and machine learning won’t be seen as open doors to data breaches. GDPR will be a spontaneous trigger of insurtech!