Case study on communicating with research ethics committees about minimizing risk through software: an application for record linkage in secondary data analysis


Bibliographic Details
Published in: JAMIA Open, 2024-04, Vol. 7 (1), p. ooae010-ooae010
Authors: Schmit, Cason; Ferdinand, Alva O; Giannouchos, Theodoros; Kum, Hye-Chung
Format: Article
Language: English
Online Access: Full text
Description:

Objective: In retrospective secondary data analysis studies, researchers often seek a waiver of consent from Institutional Review Boards (IRBs) and minimize risk by using complex software. Yet little is known about the perspectives of IRB experts on these approaches. To facilitate effective communication about risk-mitigation strategies that use software, we conducted two studies with IRB experts to co-create appropriate language for describing software to IRBs.

Materials and Methods: We conducted structured focus groups with IRB experts to solicit ideas on questions regarding benefits, risks, and informational needs. Based on these results, we developed a template IRB application and template responses for a generic study using privacy-enhancing software. We then conducted a three-round Delphi study to refine the template IRB application and template responses based on expert panel feedback. To facilitate participants' deliberation, we shared the revisions and a summary of participants' feedback during each Delphi round.

Results: Eleven experts in two focus groups generated 13 ideas on risks, benefits, and informational needs. Seventeen experts participated in the Delphi study, with 13 completing all rounds. Most agreed that privacy-enhancing software will minimize risk but that all secondary data studies nonetheless carry an inherent risk of unexpected disclosures. The majority (84.6%) noted that subjects in retrospective secondary data studies experience no greater risk than the risks encountered in ordinary life in a modern digital society; hence, all retrospective data-only studies with no contact with subjects would be minimal-risk studies.

Conclusion: First, we found fundamental disagreements in how some IRB experts view risks in secondary data research. Such disagreements are consequential because they can affect determination outcomes and suggest that IRBs at different institutions might come to different conclusions about similar study protocols. Second, the highest-ranked risks and benefits of privacy-enhancing software in our study were societal rather than individual. The highest-ranked benefits were facilitating more research and promoting responsible data governance practices; the highest-ranked risks concerned invalid results from systematic user error or erroneous algorithms. These societal considerations are more characteristic of public health ethics than of the bioethical approach of research ethics, possibly reflecting the difficulty of applying a bioethical approach (eg, informed consent) in secondary data studies. Finally, the development of privacy-enhancing technology for secondary data research depends on effective communication and collaboration between privacy experts and technology developers. Privacy is a complex issue that requires a holistic approach and is best addressed through privacy-by-design principles, yet privacy expert participation is often neglected in the design process. This study suggests best-practice strategies for engaging the privacy community by co-developing companion documents for software through participatory design to facilitate transparency and communication. In this case study, the final template IRB application and responses released with the open-source software can be easily adapted by researchers to better communicate with their IRB when using the software. This can help increase responsible data governance practices, given that many software developers are not research ethics experts.

Lay Summary

Objective: We wanted to learn how the experts who check and approve studies feel about using special computer programs that keep information safe and private. We did two studies with experts to make guidelines on how to talk about the software in a way that is easy to understand.

Materials and Methods: We talked with experts to get their thoughts on the good and bad things about using this software and on what information is needed. Using their ideas, we made a form and a guide for researchers to use when asking for approval to do studies with this software. We improved these forms by getting feedback from a group of experts.

Results: We had 11 experts in two discussion groups, and they had 13 ideas about the good and bad things and what information is needed. Then 17 experts gave feedback, with 13 finishing all three rounds. Most experts agreed that this privacy software reduces risks, but there is a small chance of unexpected problems when using existing data. Most (84.6%) thought that the risks for people in these studies were not higher than what people face in their daily lives with technology. So, studies that use existing data without talking to people directly can be seen as low risk.

Conclusion: We found that experts do not always agree on the risks of using existing data in research. This matters because it can affect decisions by review boards, and different boards might think differently about similar studies. Our study also showed that the big risks and benefits of this privacy software affect society more than individual people. Important benefits were allowing more research and helping manage data responsibly; the main risks were mistakes or problems with the software. This shows that we need to think about public health ethics along with traditional research ethics, especially when it is hard to get permission from people for studies that only use their existing data. Finally, we found that good communication between privacy experts and software makers is important for making tools that keep data private. Privacy is a complex issue that requires careful attention, and privacy should be included from the beginning when making software, yet privacy experts are often forgotten during this process. Our study suggests that including privacy experts in making these tools, and in deciding how to share information, is important. We made a form and a guide that researchers can use to talk to their review boards when they use this software. This can help make sure data is managed responsibly, especially since many software makers are not experts in research ethics.
DOI: 10.1093/jamiaopen/ooae010
ISSN: 2574-2531
PMID: 38425705
Publisher: Oxford University Press, United States
Rights: The Author(s) 2024. Published by Oxford University Press on behalf of the American Medical Informatics Association. Open access (free to read).
Full text: PubMed Central, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10903982/
ORCID: 0000-0002-6882-8053
Subjects: Algorithms; Case studies; Computer software industry; Ethical aspects; Medical ethics; Public software; Rankings; Research and Applications; Research ethics; Technology application