Avoid Privacy by Disaster by Adopting Privacy by Design

A presentation by Commissioner Cavoukian to Cancer Care Ontario on how adopting Privacy by Design can avoid Privacy by Disaster.

Slide notes
  • Presentation Outline
  • Vital Need to Protect Personal Health Information
  • Unique Characteristics of PHI
  • Why Protecting PHI is So Critical
  • Importance of Health Research
  • Big Data
  • The Case for Health Research and Analysis
  • Recognition of Value of Health Research
  • Consequences of Inadequate Attention to Privacy: For example, section 29 of PHIPA permits PHI to be collected, used and disclosed for research purposes with consent, unless obtaining consent is impractical, in which case section 44 permits its collection, use and disclosure without consent, provided that appropriate safeguards are in place, such as a research plan and research ethics board approval. As another example, sections 39(1)(c) and 45 of PHIPA permit health care providers to disclose PHI to prescribed registries (such as the Cardiac Care Network) to facilitate the provision of health care, and to prescribed entities (such as ICES) for analysis of the health care system. In addition, section 37(1) of PHIPA permits PHI to be used for activities to improve or maintain the quality of care provided, and to plan, evaluate or allocate resources to the programs and services provided.
  • Consequences of Inadequate Attention to Privacy
  • IPC Investigation Powers
  • Statutory/Common Law Actions
  • Cost of Legal Liabilities: In Jones v. Tsige, the Ontario Court of Appeal recognized a new common law action for "intrusion upon seclusion." The case involved Winnie Tsige, who formed a common-law relationship with the former husband of Sandra Jones. Both women worked for the Bank of Montreal, but at different branches. Winnie Tsige accessed the bank records of Sandra Jones at least 174 times over a period of four years. There are three required elements of the cause of action: (1) intentional or reckless conduct by the defendant; (2) the defendant must have invaded, without lawful justification, the plaintiff's private affairs or concerns (the Court of Appeal stated that intrusions into health records are highly offensive); and (3) a reasonable person would regard the invasion as highly offensive, causing distress, humiliation or anguish. The Court of Appeal stressed that such an action will only arise for deliberate and significant invasions of privacy. Proof of actual loss is not required; however, due to the intangible nature of the interest, damages will "ordinarily be measured by a modest conventional sum," generally to a maximum of $20,000.
  • Cost of Privacy Breaches: A class action is also expected in Ontario after seven staff at Peterborough Regional Health Centre inappropriately accessed the records of more than 280 patients. On October 11, 2012, a group of affected residents retained legal counsel to start a class-action lawsuit over the breach. The seven staff members who inappropriately accessed the records were terminated by the hospital.
  • Common Causes of Privacy Breaches
  • Insecure Disposal of Records
  • Order HO-001
  • Order HO-006
  • Mobile and Portable Devices
  • Mobile and Portable Devices
  • Unauthorized Access to Records
  • Orders HO-002 and HO-010
  • Minimizing the Risk of Privacy Breaches: Examples of unauthorized access in other jurisdictions: In 2011, a pharmacist in Alberta was fined $15,000 for accessing the information of 11 women who attended her church and posting prescription information on Facebook; in 2011, a physician in Alberta accessed the information of his partner's former spouse, and of the mother and girlfriend of the former spouse, for a divorce and custody dispute; in 2012, a clerk at Western Health in Newfoundland inappropriately accessed the information of over 1,000 people; and in 2012, Eastern Health in Newfoundland terminated five employees and suspended six others for unauthorized access to the information of 122 individuals.
  • Privacy by Design
  • Privacy by Design – 7 Foundational Principles
  • Jerusalem Resolution
  • Operationalizing PbD
  • Build a Culture of Privacy
  • Data Minimization
  • Dispelling the Myths About De-Identification
  • Data De-Identification Tool: Dr. Khaled El Emam, Canada Research Chair in Electronic Health Information, CHEO Research Institute and University of Ottawa. Dr. El Emam is an Associate Professor in the Faculty of Medicine at the University of Ottawa, a senior investigator at the Children's Hospital of Eastern Ontario Research Institute, and Canada Research Chair in Electronic Health Information at the University of Ottawa. His main area of research is developing techniques for health data anonymization and secure disease surveillance for public health purposes. Previously, he was a Senior Research Officer at the National Research Council of Canada, and prior to that he was head of the Quantitative Methods Group at the Fraunhofer Institute in Kaiserslautern, Germany. He has co-founded two companies to commercialize the results of his research work. In 2003 and 2004, he was ranked as the top systems and software engineering scholar worldwide by the Journal of Systems and Software, based on his research on measurement and quality evaluation and improvement, and ranked second in 2002 and 2005. He holds a Ph.D. from the Department of Electrical and Electronics, King's College, University of London (UK).
  • Evidence that the Tool Works: Publication issued March 30, 2010.
  • Evidence that Re-Identification is Extremely Difficult: This information was taken from the article De-identification Methods for Open Health Data: The Case of the Heritage Health Prize Claims Dataset, J Med Internet Res 2012;14(1):e33, by Khaled El Emam et al. In April 2011, the Heritage Provider Network launched the Heritage Health Prize, a competition to construct a model to predict the number of days a patient will be hospitalized in the following year, using current and previous years' claims data contained in a data set that was de-identified using Dr. El Emam's de-identification tool. Dr. El Emam de-identified the data set in order to make the data publicly available while meeting the requirements of the HIPAA Privacy Rule. Unbeknownst to Dr. El Emam, the Heritage Provider Network retained an expert to attempt to re-identify the de-identified data set. The expert was unsuccessful.
  • Data Minimization for Record Linkages: This information was taken from the article A Systematic Review of Re-Identification Attacks on Health Data by Khaled El Emam et al. Some recent articles in the medical, legal, and computer science literature have argued that de-identification methods do not provide sufficient protection because they are easy to reverse. Dr. El Emam and his colleagues conducted a systematic review of 14 published articles demonstrating successful re-identification attacks on data sets. The review found that only two of the data sets had been de-identified according to existing standards, thereby demonstrating that improperly de-identified data can be re-identified. Further, only one of the two properly de-identified data sets was re-identified, and the success rate was very low: 0.013%.
  • De-Identification of Genetic Information: This information is based on an unpublished article by Khaled El Emam and Craig Earle entitled Secure Probabilistic De-duplication of Databases. The protocol uses an additive homomorphic encryption system that allows mathematical operations to be performed on encrypted values. The protocol involves the following steps: (1) generation of a public-private key pair; (2) using the public key to encrypt the first dataset; (3) using the public key to encrypt the second dataset; (4) applying a secure comparison algorithm to the encrypted values in both datasets, which generates an encrypted record of match results; (5) securely transferring the encrypted match results to the private key holder, who decrypts them; (6) the private key holder uses a probabilistic linking algorithm (for example, the commonly used Fellegi-Sunter method) to compute the probability that records match; and (7) the private key holder securely transfers an index of matching record pairs to the organizations requesting the data linkage. (A simplified code sketch of this style of protocol appears after this list of notes.)
  • Homomorphic Encryption: The study was published in the journal Science. After seeing how easy it was to find the individuals and their extended families, the N.I.H. removed people's ages from the public database, making it more difficult to identify them. But Dr. Jeffrey R. Botkin, associate vice president for research integrity at the University of Utah, which collected the genetic information of some research participants whose identities were breached, cautioned against overreacting. Genetic data from hundreds of thousands of people have been freely available online, he said, yet there has not been a single report of someone being illicitly identified. He added that "it is hard to imagine what would motivate anyone to undertake this sort of privacy attack in the real world." But he said he had serious concerns about publishing a formula to breach subjects' privacy. By publishing, he said, the investigators "exacerbate the very risks they are concerned about." The project was the inspiration of Yaniv Erlich, a human genetics researcher at the Whitehead Institute, which is affiliated with M.I.T. He stresses that he is a strong advocate of data sharing and that he would hate to see genomic data locked up, but when his lab developed a new technique, he realized he had the tools to probe a DNA database, and he could not resist trying. He and his colleagues calculated that they would be able to identify, from just their DNA sequences, the last names of approximately 12 percent of middle-class and wealthier white men, the population that tends to submit DNA data to recreational sites like the genealogical ones. Then, by combining the men's last names with their ages and the states where they lived, the researchers should be able to narrow their search to just a few likely individuals. When the subjects in the 1000 Genomes Project agreed to participate and provide DNA, they signed a form saying that the researchers could not guarantee their privacy.
  • A Policy is Not Enough
  • Implement a Privacy Breach Protocol and Conduct Privacy Impact Assessments
  • Circle of Care: The privacy breach protocol should: require those acting on your behalf to notify you of a privacy breach or suspected privacy breach; identify the person, and set out the procedure to be followed, in notifying individuals and senior management of a privacy breach; clarify responsibilities for containing and investigating a privacy breach; and outline the procedure to be followed in containing, investigating and remediating a privacy breach. The purpose of a privacy impact assessment is to: review the privacy impact of an information system, technology or program; identify, address and mitigate actual or potential risks to privacy; ensure the collection, use, disclosure, retention and disposal of identifying information complies with privacy statutes; ensure steps that are reasonable in the circumstances are taken to protect identifying information from unauthorized use or disclosure; and ensure identifying information is retained, transferred and disposed of securely.
  • Stop. Think. Protect. Issued in September 2009.
  • Deter and Prevent Unauthorized Access: STOP: Ask yourself: Do I really need to store personal health information on this device? THINK: Consider the alternatives: Would de-identified or coded information serve the purpose? Could you access the information remotely through a secure connection or virtual private network instead? PROTECT: If you need to retain personal health information on a mobile device: ensure it is strongly encrypted; ensure it is protected with strong passwords; retain the least amount of identifying information for the shortest amount of time; develop a policy and procedures for secure retention on mobile or portable devices; provide training on the policy and procedures; and audit compliance with the policy and procedures.
  • Conclusions: Ben Parks, a practising family physician running a group practice of 200 physicians in 75 sites across two states (Indiana and Ohio), implemented a policy of zero tolerance for physicians and staff who access personal health information in the EMR for unauthorized purposes. This has reduced the incidence of unauthorized access. In the year prior to the implementation of the zero tolerance policy, which involves immediate termination of the individual's employment or contract, there were 11 breaches involving unauthorized access. Since then, there has been an average of two to four breaches a year involving unauthorized access.
  • Privacy by Disaster
  • How to Contact Us
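A simplified illustration of the secure linkage idea described in the De-Identification of Genetic Information note above. This sketch is not the El Emam and Earle protocol: it assumes the open-source python-paillier package ("phe") for additive homomorphic encryption, reduces each record to a single hashed identifier, and tests exact equality with random blinding instead of running Fellegi-Sunter probabilistic linkage. The datasets and helper names are hypothetical.

    # Sketch only: exact-match linkage over encrypted identifiers (not the published protocol).
    import hashlib
    import secrets
    from phe import paillier   # assumes the python-paillier ("phe") package is installed

    def encode(identifier: str) -> int:
        # Hypothetical encoding step: map an identifier to a fixed-size integer.
        return int.from_bytes(hashlib.sha256(identifier.lower().encode()).digest()[:8], "big")

    # Step 1: the linking centre generates the key pair and shares only the public key.
    public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

    # Step 2: custodian A encrypts its encoded identifiers and shares only the ciphertexts.
    dataset_a = ["alice@example.com", "bob@example.com"]
    encrypted_a = [public_key.encrypt(encode(x)) for x in dataset_a]

    # Steps 3-4: custodian B compares its own encodings against A's ciphertexts.
    # E(r * (a - b)) decrypts to 0 only when the identifiers match; the random factor r
    # blinds the difference when they do not.
    dataset_b = ["bob@example.com", "carol@example.com"]
    blinded = []
    for i, enc_a in enumerate(encrypted_a):
        for j, b in enumerate(dataset_b):
            r = secrets.randbelow(2**64) + 1
            blinded.append(((i, j), (enc_a - encode(b)) * r))

    # Steps 5-7: only the private key holder decrypts, and reports matching index pairs.
    matches = [pair for pair, c in blinded if private_key.decrypt(c) == 0]
    print(matches)   # [(1, 0)] -> record 1 of dataset A links to record 0 of dataset B

In the protocol summarized in the note, the decrypting party would apply a probabilistic linkage method (such as Fellegi-Sunter) over several quasi-identifiers rather than a single exact-match test, but the division of roles is the same: the data custodians see only ciphertexts, and the key holder sees only match results.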
Transcript

    • 1. Avoid Privacy by Disaster by Adopting Privacy by Design. Ann Cavoukian, Ph.D., Information and Privacy Commissioner, Ontario. Cancer Care Ontario, January 24, 2013
    • 2. Presentation Outline: 1. Vital Need to Protect Personal Health Information; 2. Importance of Health Research and Analysis; 3. Consequences of Inadequate Attention to Privacy; 4. Common Causes of Privacy Breaches; 5. Minimizing the Risk of Breaches; 6. Privacy by Design: The Gold Standard; 7. Conclusions
    • 3. Vital Need to ProtectPersonal Health Information
    • 4. Unique Characteristics of Personal Health Information• Highly sensitive and personal in nature – in need of strong protection;• But must be shared immediately among a range of health care providers, for the benefit of the patient;• Also used and disclosed for secondary purposes seen to be in the public interest (e.g., research, health system planning and evaluation, quality assurance);• This dual nature of personal health information is reflected in PHIPA.
    • 5. Why Protecting Personal Health Information Is So Critical• Extreme sensitivity of personal health information;• Massive growth in online connectivity;• Increasing number of persons involved in the delivery of health care;• Emphasis on information technology, including electronic medical records and electronic health records;• Health information forms the basis of invaluable research, seen to be in the public interest, but which could be jeopardized if the public’s trust is eroded.
    • 6. Importance of HealthResearch and Analysis
    • 7. “Big Data”• Each day we create 2.5 quintillion bytes of data – 90% of all data was created in the past 2 years;• Big Data analysis and data analytics promise new opportunities to gain valuable insights and benefits;• However, it may also enable expanded surveillance and will increase the risk of unauthorized use and disclosure, on a scale previously unimaginable.
    • 8. The Case for Health Research and AnalysisHealth research and analytics are vital in: • Understanding the determinants of health; • Informing and improving clinical practice guidelines; • Identifying and achieving cost efficiencies; • Facilitating health promotion and disease prevention; • Assessing the need for health services; • Evaluating the services provided; • Allocating resources to the health system; • Educating the public how to improve their health.
    • 9. Recognition of the Value of Health Research and Analysis• The Personal Health Information Protection Act (PHIPA) came into effect on November 1, 2004;• It recognizes the value of health research and analysis;• PHIPA permits health care providers to collect, use and disclose personal health information for purposes beyond the provision of health care, in appropriate circumstances;• PHIPA attempts to ensure that these other purposes are achieved in a manner that minimizes the impact on privacy.
    • 10. Consequences of Inadequate Attention to Privacy
    • 11. Consequences of Inadequate Attention to Privacy• Individuals may suffer discrimination, stigmatization and economic or psychological harm;• Individuals may withhold or falsify information provided;• Loss of trust or confidence in the health system;• Damage to the reputation of the health care provider;• Individuals may be deterred from seeking testing or treatment, or may engage in multiple doctoring;• Lost time and expenditure of resources needed to contain, investigate and remediate privacy breaches;• Costs of legal liabilities and ensuing proceedings.
    • 12. Investigations by the Information and Privacy Commissioner of Ontario• My office has fairly broad order-making powers under PHIPA;• An Order may also include comments, guidance or recommendations;• A final Order may be filed with the court and upon filing, is enforceable as an Order of the Court.
    • 13. Statutory or Common Law Actions• A person affected by a final order may commence a proceeding for damages for actual harm suffered as a result of the contravention of PHIPA or its regulations;• A person affected by conduct that gave rise to a conviction for an offence under PHIPA that has become final may also commence a proceeding for damages for actual harm suffered;• Where the harm was caused wilfully or recklessly, the court may award an amount, not exceeding $10,000, for mental anguish;• The Court of Appeal for Ontario has also recently recognized a new common law cause of action in tort for invasion of privacy, called “intrusion upon seclusion.”
    • 14. Costs of Legal Liabilities and Proceedings• In December 2009, a public health nurse lost a USB key containing the unencrypted health information of 83,524 individuals attending an H1N1 immunization clinic;• Following my Order in January 2010, a $40 million class action was initiated by individuals affected by the breach;• A settlement was reached and approved by the Ontario Superior Court of Justice on July 12, 2012;• In the U.S., numerous fines have been issued for violating the Health Insurance Portability and Accountability Act, including a fine of $4.3 million for failing to provide access and a fine of close to $1 million for improper access to an electronic medical record.
    • 15. Cost of Privacy Breaches• A U.S. study found that the cost of a data breach was $194 per record; the average cost per operating company was more than $5.5 million per breach. — 2011 Annual Study: Cost of a Data Breach, Ponemon Institute• A U.S. report found that the average time it takes to restore an organization’s positive reputation following a data breach is one year – minimum brand damage is a 12% loss.• A hospital in Ontario reported breach management costs of between $100 and $200 per individual, not including costs to reputation and erosion of trust. — Health Care Quarterly, Vol. 12, No. 1, 2009
    • 16. Common Causes of Privacy Breaches
    • 17. Insecure Disposal of Records
    • 18. Order HO-001• A clinic hired a company to shred records of personal health information dated between 1992 and 1994;• Due to a misunderstanding, the records were given to a recycling company instead of being shredded;• The recycling company sold the records to a special effects company, and they were used publicly in a film shoot.
    • 19. Order HO-006• Employees of a laboratory placed records of personal health information in boxes designated for recycling instead of shredding;• The boxes designated for recycling were located immediately beside those for shredding;• The recycling boxes were put out curbside for pick-up;• The records of personal health information were found scattered on the street outside the laboratory.
    • 20. Mobile and Portable Devices
    • 21. Mobile and Portable Devices• The IPC has issued three orders involving mobile and portable devices in the health context: Order HO-004 – Theft of a laptop containing the unencrypted personal health information of 2,900 individuals; Order HO-007 – Loss of a USB memory stick containing the unencrypted personal health information of 83,524 individuals; Order HO-008 – Theft of a laptop containing the unencrypted personal health information of 20,000 individuals.
    • 22. Unauthorized Access to Records
    • 23. Orders HO-002 and HO-010• I have issued two orders involving unauthorized access to electronic records of personal health information: Order HO-002 – A girlfriend of the patient's estranged husband, who was a nurse at the hospital but was not involved in the care of the patient, viewed the patient's electronic record on numerous occasions; Order HO-010 – A former spouse of the patient's current spouse, who was a diagnostic imaging technologist at the hospital but not involved in the care of the patient, viewed the patient's electronic records on multiple occasions.
    • 24. Minimizing the Risk of Privacy Breaches
    • 25. Privacy by Design: The 7 Foundational Principles1. Proactive not Reactive: Preventative, not Remedial;2. Privacy as the Default setting;3. Privacy Embedded into Design;4. Full Functionality: Positive-Sum, not Zero-Sum;5. End-to-End Security: Full Lifecycle Protection;6. Visibility and Transparency: Keep it Open;7. Respect for User Privacy: Keep it User-Centric. www.ipc.on.ca/images/Resources/7foundationalprinciples.pdf
    • 26. Adoption of "Privacy by Design" as an International Standard: Landmark Resolution Passed to Preserve the Future of Privacy, by Anna Ohlden, October 29th 2010. JERUSALEM, October 29, 2010 – A landmark Resolution by Ontario's Information and Privacy Commissioner, Dr. Ann Cavoukian, was unanimously passed by International Data Protection and Privacy Commissioners in Jerusalem today at their annual conference. The resolution ensures that privacy is embedded into new technologies and business practices, right from the outset – as an essential component of fundamental privacy protection. Full Article: http://www.science20.com/newswire/landmark_resolution_passed_preserve_future_privacy
    • 27. Operationalizing Privacy by Design: 9 PbD Application Areas• CCTV/Surveillance cameras in mass transit systems;• Biometrics used in casinos and gaming facilities;• Smart Meters and the Smart Grid;• Mobile Communications;• Near Field Communications;• RFIDs and sensor technologies;• Redesigning IP Geolocation;• Remote Home Health Care;• Big Data and Data Analytics. www.privacybydesign.ca
    • 28. Build A Culture of Privacy• Build a culture of privacy – privacy must be built into your practices and procedures;• The commitment to privacy must come from the top down;• Think of privacy as a means of building trust rather than just a matter of compliance;• Ensure those acting on your behalf know how to apply privacy policies and procedures in their day-to-day work;• Provide on-going privacy and security training;• Use multiple means to communicate privacy messages;• Measure the effectiveness of your privacy program.
    • 29. Data Minimization• Data minimization is an essential safeguard in protecting personal health information, including for purposes involving health research and analysis;• Do not collect, use or disclose personal health information if other types of information (i.e. de-identified, aggregated, or anonymized) will serve the purpose;• Do not collect, use or disclose more personal health information than is reasonably necessary to meet the intended purpose.
    • 30. Dispelling the Myths about De-Identification…• The claim that de-identification has no value in protecting privacy due to the ease of re-identification is a myth;• If proper de-identification techniques and re-identification risk management procedures are used, re-identification becomes a very difficult task;• While there may be a residual risk of re-identification, in the vast majority of cases, de-identification will strongly protect the privacy of individuals when additional safeguards are in place. www.ipc.on.ca/English/Resources/Discussion-Papers/Discussion-Papers-Summary/?id=1084
    • 31. Data De-Identification Tool• Developed by Dr. Khaled El Emam, a leading investigator at the Children's Hospital of Eastern Ontario Research Institute and Canada Research Chair in Electronic Health Information;• De-identification tool that minimizes the risk of re-identification based on: - The low probability of re-identification; - Whether mitigation controls are in place; - Motives and capacity of the recipient; - The extent to which a breach would invade privacy;• Simultaneously maximizes privacy and data quality while minimizing distortion to the original database (a generic illustration of equivalence-class re-identification risk appears after the transcript). www.ipc.on.ca/images/Resources/positive-sum-khalid.pdf
    • 32. Evidence that the Tool Works• Dr. El Emam was approached to create a longitudinal public use dataset using his de-identification tool for the purposes of a global data mining competition – the Heritage Health Prize;• Participants in the Heritage Health Prize competition were asked to predict, using de-identified claims data, the number of days patients would be hospitalized in a subsequent year;• Before releasing the dataset created using Dr. El Emam's tool, the de-identified dataset was subjected to a strong re-identification attack by a highly skilled expert;• The expert concluded the dataset could not be re-identified – Dr. El Emam's de-identification tool was highly successful!
    • 33. Evidence that Re-Identification is Extremely Difficult• A literature search by Dr. El Emam et al. identified 14 published accounts of re-identification attacks on de-identified data;• A review of these attacks revealed that one quarter of all records and roughly one-third of health records were re-identified;• However, Dr. El Emam found that only 2 out of the 14 attacks were made on records that had been properly de-identified using existing standards;• Further, only 1 of the 2 attacks had been made on health data, resulting in a very low re-identification rate of 0.013%.
    • 34. Data Minimization for Record Linkages• Dr. El Emam has also developed a protocol for securely linking databases without sharing any identifying information;• The protocol uses an encryption system to identify and locate records relating to an individual that exist in multiple datasets;• This involves encrypting personal identifiers in each dataset and comparing only the encrypted identifiers, using mathematical operations, resulting in a list of matched records, without revealing any personal identifiers;• The protocol promotes compliance with the existing prohibitions in PHIPA by allowing linkages of datasets without the disclosure of any identifying information – a win/win solution – positive-sum!
    • 35. De-identification of Genetic Information• A researcher was able to re-identify five randomly selected individuals participating in the 1000 Genomes Project;• The 1000 Genomes Project posts genetic information, age and region of individuals from around the world for researchers to use freely;• That information, a genealogy website, and Google searches, were sufficient to locate complete family trees;• While the methods for extracting relevant genetic data from the raw genetic sequence files are specialized, the possibility of re-identification was not expected;• Dr. Khaled El Emam is working on a process to de-identify genetic information and anticipates that a tool will be available in 3 to 4 years.
    • 36. Homomorphic Encryption• A form of encryption that allows computations to be carried out on encrypted data to obtain an encrypted result;• Homomorphic describes the transformation of one data set into another while preserving relationships between data elements in both sets;• Homomorphic encryption allows you to make computations or engage in data analytics on encrypted values – data you cannot "read" because it is not in plain text and is therefore inaccessible;• Can also be used to link two or more databases without the disclosure of any unique identifiers – positive-sum – win/win (a short sketch of the additive case appears after the transcript).
    • 37. A Policy is Not Enough: It Must be Reflected in Concrete Actions• Implement a privacy policy that reflects your privacy needs and risks;• Link each requirement in the privacy policy to a concrete, actionable item;• Demonstrate how each will be implemented;• Conduct privacy education and awareness training;• Designate a central "go to" person for privacy-related queries;• Verify compliance with privacy policies, procedures and processes;• Prepare for a possible breach. www.privacybydesign.ca
    • 38. Implement A Privacy Breach Protocol and Conduct Privacy Impact Assessments http://www.ipc.on.ca/English/Resources/Discussion-Papers/Discussion-Papers-Summary/?id=433 http://www.ipc.on.ca/English/Resources/Best-Practices-and-Professional-Guidelines/Best-Practices-and-Professional-Guidelines-Summary/?id=574
    • 39. Circle of Care: Sharing Personal Health Information for Health Care Purposes• In 2009, we produced a guide to clarify the circumstances in which health information custodians may collect, use or disclose personal health information within the Circle of Care;• Members of the working group included: • College of Physicians and Surgeons of Ontario • Ontario Long Term Care Association • Ontario Hospital Association • Ontario Medical Association • Ontario Ministry of Health and Long Term Care • Ontario Assoc. of Community Care Access Centres • Ontario Assoc. of Non-Profit Services for Seniors • Information and Privacy Commissioner of Ontario www.ipc.on.ca
    • 40. Stop. Think. Protect.… Protect Personal Health Information on Mobile and Portable Devices
    • 41. Deter and Prevent Unauthorized Access• Immediately terminate access to records pending an investigation into the issue of unauthorized access;• Implement appropriate access controls;• Log and audit access to records;• Implement a policy of “zero tolerance;”• Impose appropriate discipline for unauthorized access;• Provide ongoing training using multiple means of raising awareness on appropriate access such as: • Confidentiality agreements; • Reminder notices displayed upon log-in to electronic records.
    • 42. Conclusions• Make privacy a priority – ensure that privacy is embedded into health research and analysis;• It is easier and far more cost-effective to build in privacy up-front, rather than after-the-fact;• Privacy risks are best managed by proactively embedding the principles of Privacy by Design;• Get smart – lead with Privacy – by Design, not privacy by chance or, worse, privacy by Disaster!
    • 43. Privacy by Disaster
    • 44. How to Contact Us: Ann Cavoukian, Ph.D., Information & Privacy Commissioner of Ontario, 2 Bloor Street East, Suite 1400, Toronto, Ontario, Canada M4W 1A8. Phone: (416) 326-3948 / 1-800-387-0073. Web: www.ipc.on.ca. E-mail: info@ipc.on.ca. For more information on Privacy by Design, please visit: www.privacybydesign.ca
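The de-identification discussion on slides 30 through 33 turns on how re-identification risk is measured. The sketch below is not Dr. El Emam's tool; it is a generic illustration of the equivalence-class view of risk that many de-identification methods build on: a record's worst-case re-identification probability is estimated as one divided by the number of records sharing its quasi-identifier values, and generalizing those values (birth date to an age band, postal code to a prefix) lowers that probability. The fields, records and generalization rules are invented for the example.

    # Generic illustration of equivalence-class re-identification risk (not any specific tool).
    from collections import Counter

    # Hypothetical records containing only quasi-identifiers (no names or health numbers).
    records = [
        {"birth_year": 1974, "postal": "M4W 1A8", "sex": "F"},
        {"birth_year": 1974, "postal": "M4W 2B1", "sex": "F"},
        {"birth_year": 1981, "postal": "K1A 0A9", "sex": "M"},
        {"birth_year": 1981, "postal": "K1A 0B2", "sex": "M"},
    ]

    def raw(rec):
        return (rec["birth_year"], rec["postal"], rec["sex"])

    def generalized(rec):
        # Illustrative generalization: five-year birth band, first three postal characters.
        return (rec["birth_year"] // 5 * 5, rec["postal"][:3], rec["sex"])

    def max_reid_risk(recs, key):
        # Worst-case record-level risk: 1 / size of the smallest equivalence class.
        classes = Counter(key(r) for r in recs)
        return 1.0 / min(classes.values())

    print(max_reid_risk(records, raw))          # 1.0 -> every record is unique on raw values
    print(max_reid_risk(records, generalized))  # 0.5 -> each class now has at least 2 records

Tools like the one described on slide 31 also weigh contextual factors (mitigating controls, the motives and capacity of the recipient) against this statistical risk, which the sketch does not attempt.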

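Slide 36 describes computing on encrypted data. Below is a minimal sketch of the additive (partially homomorphic) case, again assuming the python-paillier ("phe") package; it shows encrypted values being summed and scaled without decryption, which is the property the record-linkage protocol on slide 34 relies on. The counts are made up.

    # Minimal additive homomorphic encryption demo (assumes the "phe" package).
    from phe import paillier

    # The party allowed to see results holds the private key; everyone else sees only ciphertexts.
    public_key, private_key = paillier.generate_paillier_keypair()

    # Custodians encrypt their counts (e.g., cases per site) before sharing them.
    encrypted_counts = [public_key.encrypt(c) for c in (120, 75, 310)]

    # Holding only the public key, an analyst can add ciphertexts or scale them by plaintext
    # constants without ever seeing the underlying values.
    encrypted_total = encrypted_counts[0] + encrypted_counts[1] + encrypted_counts[2]
    encrypted_weighted = encrypted_counts[0] * 2 + encrypted_counts[1]

    # Only the private key holder can read the results.
    print(private_key.decrypt(encrypted_total))     # 505
    print(private_key.decrypt(encrypted_weighted))  # 315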