Study  |  11/14/2024

Climate Impact of Carbon Crediting Projects Is Substantially Overestimated

A new meta-study published in Nature Communications has revealed that emission reductions from climate mitigation projects are significantly lower than claimed. Benedict Probst, Head of the Net Zero Lab at the Institute, and coauthors systematically reviewed more than 60 empirical studies, uncovering substantial quality issues with carbon credits.

Deforestation. Photo: Adobe Stock

Carbon markets play a critical role in firms’ and governments’ climate strategies by enabling the purchase and sale of carbon credits. These credits represent a specific amount of carbon dioxide (CO2) emissions mitigated through projects such as avoiding deforestation or destroying potent greenhouse gases. They help organizations and countries meet their climate targets by offsetting a portion of their own emissions.

The Problem

A pressing question is whether these carbon credits reflect genuine emission reductions or whether the claimed effects are illusory. Do these projects truly benefit the environment, or are we paying for something that lacks tangible value? Carbon crediting mechanisms allow project developers to earn credits through emission reduction projects. However, numerous studies have raised concerns about the environmental integrity of these credits. A systematic assessment has been lacking to date.

The New Study and its Findings

The new meta-study published in Nature Communications analyzes 14 studies covering 2,346 climate projects and 51 studies of comparable projects for which no carbon credits were issued. All studies considered were based on experimental or rigorous observational methods. The analysis covers one fifth of the total credit volume issued to date, which corresponds to almost a billion tonnes of CO2 emissions.

The analysis shows that less than 16% of carbon credits issued to the evaluated projects represented actual emission reductions. Specific examples of this are:


  • For clean cookstove projects, in which traditional stoves are replaced by cleaner ones, actual emission reductions corresponded to only 11% of the carbon credits issued.
  • In the abatement of the potent greenhouse gas sulphur hexafluoride (SF6), actual emission reductions amounted to only 16% of the carbon credits issued.
  • For avoided deforestation, actual emission reductions amounted to only 25% of the carbon credits issued.
  • Reducing the potent greenhouse gas HFC-23 performed comparatively well, with actual emission reductions amounting to 68% of the credits issued.

With regard to wind energy, the data shows that the projects would probably have been implemented even without the sale of carbon credits, meaning that issuing credits did not lead to any additional mitigation. Improved forest management was likewise implemented to the same extent in reference areas without access to carbon credits as in areas that benefited from them.

In projects destroying the industrial waste gases HFC-23 (a hydrofluorocarbon) and SF6, however, the data shows that waste gas generation increased once plant operators were able to generate carbon credits.
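To make the headline numbers concrete, here is a back-of-the-envelope sketch in Python that applies the per-project-type effectiveness rates reported above to hypothetical issued-credit volumes. The rates come from the article (wind energy and improved forest management are treated as non-additional, per the findings above); the volumes are illustrative placeholders we chose, not the study’s data.

```python
# Illustrative arithmetic only: effectiveness rates are the study's headline
# figures quoted above; the issued-credit volumes are hypothetical
# placeholders (million tonnes CO2e), not the study's per-project-type data.

effectiveness = {           # share of issued credits backed by real reductions
    "clean_cookstoves": 0.11,
    "sf6_abatement": 0.16,
    "avoided_deforestation": 0.25,
    "hfc23_abatement": 0.68,
    "wind_energy": 0.0,     # projects likely built anyway (not additional)
    "forest_management": 0.0,
}

issued_mt = {               # hypothetical issued volumes, million tonnes CO2e
    "clean_cookstoves": 100,
    "sf6_abatement": 50,
    "avoided_deforestation": 300,
    "hfc23_abatement": 100,
    "wind_energy": 300,
    "forest_management": 150,
}

achieved_mt = {p: issued_mt[p] * effectiveness[p] for p in issued_mt}

for p, a in achieved_mt.items():
    print(f"{p}: {issued_mt[p]} Mt issued -> {a:.1f} Mt actually reduced")

share = sum(achieved_mt.values()) / sum(issued_mt.values())
print(f"Overall: {share:.0%} of issued credits represent real reductions")
```

With these placeholder volumes the aggregate comes out near the study’s headline figure of roughly 16%; the per-type rates, not the volumes, drive that result.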

Urgent Need for Improved Certification Rules

Dr. Benedict Probst, Head of the Net Zero Lab at the Max Planck Institute for Innovation and Competition, emphasizes, “There is an urgent need to establish better rules for issuing carbon credits. All project types face systemic quality issues, and the quantification of emission reductions needs substantial improvement.”

Co-author Dr. Lambert Schneider from the Oeko-Institut in Berlin points out that there is too much leeway when calculating emission reductions. “The rules of the carbon crediting programs often give project developers too much flexibility. This can lead to unrealistic assumptions being made or inaccurate data being used, resulting in an overestimation of reductions.”

The carbon crediting programs have a particular responsibility to improve the quality of the carbon credits, the authors indicate. Carbon crediting programs should enhance their approaches to assessing projects and calculating emission reductions to ensure they are based on conservative assumptions and the latest scientific findings.

The Societal Importance of the Study

Major climate goals are at risk: if carbon credits do not lead to real emission reductions, we will not make the progress we think we are making in combating climate change.

A potential trust issue looms: governments and firms rely on carbon credits to meet their climate pledges. If these credits are ineffective, it could undermine trust in carbon markets, which are seen as an essential tool in the fight against global warming.

Avoiding potential greenwashing is critical: some firms could use ineffective carbon credits to claim “carbon neutrality” without actually reducing their emissions, misleading consumers and regulators.

Conclusion

The study shows that carbon markets are not delivering the necessary and expected impact. Reforms are urgently needed to ensure that carbon crediting mechanisms truly contribute to mitigating climate change. If these mechanisms are not reformed, we risk missing climate targets and allowing firms to appear more environmentally friendly than they really are.

About the Net Zero Lab


Environmental economist Benedict Probst has led an independent Max Planck Research Group at the Max Planck Institute for Innovation and Competition in Munich since May 2024. The Net Zero Lab aims to accelerate the development of green technologies that are crucial for replacing fossil fuels in industry, as well as technologies that directly remove CO2 from the atmosphere.


For more information on the Net Zero Lab, see: https://www.netzerolab.science/

Directly to the study:

Probst, Benedict S., et al. (2024). Systematic assessment of the achieved emission reductions of carbon crediting projects, Nature Communications, 15, 9562. Available at https://doi.org/10.1038/s41467-024-53645-z


All data in filterable graphics: https://www.carboncredits.fyi/


Other scientists and institutions contributing to the study:

Malte Toetzke (1,4), Andreas Kontoleon (3), Laura Díaz Anadón (3,5), Jan C. Minx (6,7), Barbara K. Haya (8), Lambert Schneider (9), Philipp A. Trotter (10,11), Thales A.P. West (3,12), Annelise Gill-Wiehl (13), Volker H. Hoffmann (2)


(1) Net Zero Lab, Max Planck Institute for Innovation and Competition, (2) Group for Sustainability and Technology, ETH Zurich, (3) Department of Land Economy, Centre for Environment, Energy, and Natural Resource Governance, University of Cambridge, (4) Public Policy for the Green Transition, Technical University of Munich, (5) Harvard Kennedy School, Harvard University, (6) Mercator Research Institute on Global Commons and Climate Change, (7) Priestley International Centre for Climate, School of Earth and Environment, University of Leeds, (8) Goldman School of Public Policy, University of California, Berkeley, (9) Oeko-Institut, Berlin, (10) Schumpeter School of Business and Economics, University of Wuppertal, (11) Smith School of Enterprise and the Environment, University of Oxford, (12) Institute for Environmental Studies (IVM), Vrije Universiteit Amsterdam, (13) Energy & Resources Group, University of California, Berkeley.

Study  |  10/10/2024

NBER Study Confirms Strong Performance of Institute’s PaECTER Model for Patent Analysis

A recent study published by the National Bureau of Economic Research (NBER) has confirmed the strong performance of PaECTER, a patent analysis model developed by a team of researchers at the Max Planck Institute for Innovation and Competition. The model came out on top in a comparison with other models in tasks critical to patent examination and innovation research.

Developed by Mainak Ghosh, Sebastian Erhardt, Michael E. Rose, Erik Buunk, and Dietmar Harhoff, PaECTER (Patent-Level Representation Learning Using Citation-Informed Transformers) uses advanced transformer-based machine learning techniques fine-tuned with patent citation data. The model is specifically designed to address the complex challenges of patent text analysis and provides significant improvements in the identification and categorization of similar patents, making it highly valuable for both patent examiners and innovation researchers.


The new NBER working paper “Patent Text and Long-Run Innovation Dynamics: The Critical Role of Model Selection” rigorously compares PaECTER with other Natural Language Processing (NLP) models. The authors Ina Ganguli (University of Massachusetts Amherst), Jeffrey Lin (Federal Reserve Bank of Philadelphia), Vitaly Meursault (Federal Reserve Bank of Philadelphia), and Nicholas Reynolds (University of Essex) assessed the models’ performance on patent interference tasks, where multiple inventors claim similar inventions.


The study concluded that PaECTER significantly reduces false positives and improves efficiency compared to traditional models like TF-IDF (Term Frequency – Inverse Document Frequency). It also highlighted PaECTER’s capabilities relative to other modern models such as GTE (General Text Embeddings) and S-BERT (Sentence-BERT), methods that represent texts as numerical vectors capturing semantic information about words or entire sentences. While PaECTER performed exceptionally well in expert-driven tasks like interference identification, it also held its own in broader patent classification tasks, further reinforcing its versatility. A minimal TF-IDF baseline is sketched below for contrast.
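To see why purely lexical baselines struggle, the brief sketch below computes TF-IDF cosine similarity for two invented, semantically related patent snippets. It uses standard scikit-learn calls and is illustrative only; it is not the NBER study’s code.

```python
# Illustrative TF-IDF baseline: similarity comes from overlapping terms,
# not learned semantics. The two texts are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

texts = [
    "battery electrode comprising a silicon-carbon composite",
    "anode material for rechargeable cells based on silicon particles",
]

tfidf = TfidfVectorizer().fit_transform(texts)
score = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
print(f"TF-IDF cosine similarity: {score:.2f}")  # low despite related meaning
```

Because the snippets share almost no vocabulary, TF-IDF scores them as dissimilar even though they describe closely related inventions; embedding models such as PaECTER are designed to capture exactly this kind of semantic overlap.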


“We are pleased that PaECTER’s performance has been validated by the NBER study, which shows its strengths in patent similarity analysis and confirms its role as a reliable tool for those working in the field of innovation and intellectual property,” says Mainak Ghosh, one of PaECTER’s developers. “This independent validation further strengthens its relevance in the field of patent examination.”


The PaECTER model is available for use on the Hugging Face platform, making it accessible to researchers, policymakers, and patent professionals worldwide. Its robust performance, as demonstrated by the NBER study, underscores its value in improving the way patent data is processed, contributing to more accurate and efficient analysis of patent innovations over time. As of today, PaECTER has been downloaded more than 1.4 million times.
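For readers who want to try the model, here is a minimal sketch using the sentence-transformers library. The model identifier mpi-inno-comp/paecter is our assumption of the Hugging Face ID, and the patent texts are invented examples; consult the model card on the platform for authoritative usage notes.

```python
# Minimal sketch of patent-similarity scoring with PaECTER via
# sentence-transformers. The model ID below is our assumption of the
# Hugging Face identifier; check the model card before relying on it.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("mpi-inno-comp/paecter")

patents = [
    "A lithium-ion battery electrode comprising a silicon-carbon composite.",
    "An anode material for rechargeable batteries based on silicon particles.",
    "A method for brewing coffee using pressurized hot water.",
]

# Encode patent texts (e.g., titles/abstracts) into dense vectors
embeddings = model.encode(patents, convert_to_tensor=True)

# Cosine similarity between the first patent and the other two
scores = util.cos_sim(embeddings[0], embeddings[1:])
print(scores)  # higher score -> more similar patent text
```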


More information:


PaECTER on Hugging Face


Ganguli, Ina; Lin, Jeffrey; Meursault, Vitaly; Reynolds, Nicholas F. (2024). Patent Text and Long-Run Innovation Dynamics: The Critical Role of Model Selection (No. w32934). National Bureau of Economic Research. Available at https://www.nber.org/papers/w32934


Ghosh, Mainak; Erhardt, Sebastian; Rose, Michael; Buunk, Erik; Harhoff, Dietmar (2024). PaECTER: Patent-Level Representation Learning Using Citation-Informed Transformers, arXiv preprint 2402.19411. Available at https://arxiv.org/abs/2402.19411

Miscellaneous  |  10/09/2024

Automation of Contracting Simplified by UN Model Law

The United Nations Commission on International Trade Law (UNCITRAL) has adopted a Model Law that is intended to standardize and facilitate the formation and performance of automated contracts online. As a member of the E-Commerce Working Group, Jörg Hoffmann, a researcher at the Institute, participated in several discussion rounds on the drafting of the Model Law.

Meeting room at the United Nations in Vienna with MPI sign. Photo: Jörg Hoffmann
UNO building in Vienna. Photo: Jörg Hoffmann

The Institute is officially recognized by UNCITRAL for its research focus on Artificial Intelligence (AI) and data law. Its technology-specific research approach, pursued in particular with regard to invention and creation processes in the IP field, informed both the preliminary work and the final version of the Model Law, which the UNCITRAL Commission adopted this summer at its 57th session in New York.


The Model Law provides a legal framework for the use of automation in contracts, including the use of AI technologies and “smart contracts”, as well as for machine-to-machine transactions. It is intended to complement existing laws on electronic contracting, particularly those based on other UNCITRAL texts on electronic commerce, which have already been adopted in over a hundred jurisdictions worldwide. The Model Law is the first piece of legislation to emerge from UNCITRAL's exploratory work on legal issues relating to the digital economy and digital commerce. One of its primary objectives is to reduce legal uncertainty and transaction costs, thus promoting AI-driven innovation. Meanwhile, work continues on related areas such as data contracts and distributed ledger technology (blockchain).


About UNCITRAL
The United Nations Commission on International Trade Law (UNCITRAL) is the core legal body of the United Nations system in the field of international trade law. Its mandate is to remove legal barriers to international trade through the progressive modernization and harmonization of trade law. It prepares legal texts in a number of key areas such as international commercial dispute settlement, electronic commerce, insolvency, international payments, sale of goods, transport law, procurement and infrastructure development. 
 

Press release of UNCITRAL
Documents of the UNCITRAL Working Group Electronic Commerce

Study  |  09/16/2024

Open Access is Shaping Scientific Communication

Open access (OA) represents a transformative shift in scientific publishing, aiming to ensure unrestricted access to taxpayer-funded research and data. In a new study published in Science, Frank Mueller-Langer, Affiliated Research Fellow at the Institute and Professor at the University of the Bundeswehr Munich, and Mark McCabe, Professor at the SKEMA Business School, make a significant contribution to the discourse on OA by analyzing the economic, political, and institutional dynamics shaping the transformation of scholarly publishing.

Cover page of the journal Science from 13 September 2024

The OA movement seeks to eliminate cost barriers for accessing scientific knowledge. This is supported by major international policies like the Berlin Declaration (2003) and mandates such as the U.S. White House Directive (2022). By 2025, federally funded U.S. research must be freely available without delay, underscoring OA’s expanding global influence.


Impact on Researchers and Institutions


For researchers, publishing in reputable, high-impact journals is essential for career advancement. Historically, the costs of accessing research were borne by university libraries through journal subscriptions, whose steep price increases became known as the “serials crisis”. OA, through models like “Gold OA” (funded by article processing charges – APCs) and transformative agreements (TAs), has sought to address these challenges by shifting costs from reading to publishing.


Yet, debates persist. Critics worry that APC-based systems might compromise research quality by incentivizing volume over rigor. Additionally, while OA enhances visibility and downloads, its impact on citation metrics remains modest. Meanwhile, transformative agreements between publishers and institutions aim to cap costs but show varying success in fostering competition and transparency.


Comprehensive Analysis of Open Access Impacts


The authors examine how OA policies, particularly TAs, are reshaping the relationships between researchers, publishers, and institutions. They highlight that while OA increases the visibility of research, it also introduces challenges, such as the potential impact of APC-based models on research quality.


Evaluation of Transformative Agreements (TAs)


A major focus is on TAs, which aim to replace subscription costs with OA publishing fees. The authors analyze these agreements for their effectiveness and incentive structures, pointing out that many TAs (with notable exceptions like those at the University of California) lack sufficient mechanisms to control costs effectively.


Emphasis on Market Structures and Competition


The article highlights the opportunities and risks OA poses for competition among publishers. While APC-based OA theoretically enhances competition among publishers for submissions by shifting decision-making from readers under the traditional reader-pays model (multi-homing) to authors under the author-pays model (single-homing), the authors warn that OA “Big Deals” (where one or more universities and a single publisher negotiate the APCs for publishing in any of the latter’s journals) may further concentrate the market and stifle innovation.


Linking Publishing and Data Analytics


The authors explore how major publishers like Elsevier are increasingly integrating data analytics tools into their business models, potentially enhancing their market dominance. This trend is critically evaluated for its implications on competition and the independence of scientific data analytics.


Policy Recommendations and Evidence-Based Experiments


Mueller-Langer and McCabe advocate for experimental approaches to accompany policy measures like the U.S. Office of Science and Technology Policy (OSTP) initiative. These could help better understand the effects of new funding and publishing models, allowing for more informed and effective policymaking.


Conclusion


The authors provide valuable insights into how OA strategies could reshape the balance between publication costs, access to research, and quality assurance. Their article emphasizes the need for careful monitoring and evaluation of market mechanisms to ensure a sustainable and competitive scholarly communication landscape in the long term.


Directly to the study:

McCabe, Mark J.; Mueller-Langer, Frank (2024). Open Access Is Shaping Scientific Communication – Funders and Publishers Should Roll Out Policies in Ways to Support Their Evaluation, Science, 385 (6714), 1170-1172. Available at https://doi.org/10.1126/science.adp8882

Study  |  08/13/2024

Marketing Authorization and Strategic Patenting: Evidence from Pharmaceuticals

Patents are designed to incentivize innovation, but pharmaceutical firms often extend market exclusivity with secondary patents on marginally beneficial improvements. Such behavior has prompted discussions about raising patentability standards. Lucy Xiaolu Wang and Dennis Byrski have now received the Program Chair Award from the American Society of Health Economists (ASHEcon) for a new study on this topic.

Lucy Xiaolu Wang and Dennis Byrski received the Program Chair Award of the American Society of Health Economists (ASHEcon).

The new study examines whether pharmaceutical firms move away from filing strategic patents once the focal drug gains marketing authorization and the disclosed trial-related information becomes novelty-threatening prior art.


The authors construct novel patent-drug dyadic data and leverage unique European drug patent and marketing contexts. Using an event study methodology, they exploit plausibly exogenous variation in the length of time from patent filing to drug approval. First, they illustrate that drugs with early and late marketing authorization share similar ex ante patent and drug characteristics. Second, they support the hypothesis that strategic patenting behavior decreases substantially after marketing authorization. In contrast, meaningful follow-on innovations remain unaffected. Third, they show that these effects are likely driven by obstacles in the enforceability of marginal patents filed after approval.
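As a rough illustration of this class of design, the sketch below sets up an event-study regression of strategic patent filings around marketing authorization. The panel structure, column names (drug_id, quarter, event_time, strategic_filings), file name, and event-time bins are all hypothetical; this illustrates the method class, not the authors’ actual data or specification.

```python
# Minimal event-study sketch around marketing authorization.
# Hypothetical panel columns: drug_id, quarter, event_time (quarters
# relative to approval), strategic_filings (marginal secondary filings).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("patent_drug_panel.csv")  # hypothetical dyadic panel

# Bin event time; the bin just before approval serves as the baseline
df["event_bin"] = pd.cut(
    df["event_time"],
    bins=[-20, -8, -4, 0, 4, 8, 20],
    labels=["pre3", "pre2", "pre1", "post1", "post2", "post3"],
)
df = df.dropna(subset=["event_bin"])

# Two-way fixed effects: drug and calendar-quarter dummies absorb level
# differences; event-time coefficients trace changes in strategic filings.
model = smf.ols(
    "strategic_filings ~ C(event_bin, Treatment(reference='pre1'))"
    " + C(drug_id) + C(quarter)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["drug_id"]})

print(model.summary())
```

In a design of this kind, a drop in the post-approval coefficients relative to the pre-approval baseline is the pattern consistent with the study’s finding that strategic patenting decreases after marketing authorization.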


The results of the study suggest that post-marketing increases in patentability standards are welfare-enhancing when combined with examiner scrutiny and firm self-adjustment. They highlight the importance of better data provision to patent examiners to increase the quality of follow-on inventions.


Lucy Xiaolu Wang was a Senior Research Fellow at the Institute and is now a tenure-track Assistant Professor at the University of Massachusetts Amherst. She continues to be closely associated with the Institute as an Affiliated Research Fellow.


Dennis Byrski was a Junior Research Fellow at the Institute. In 2021 he submitted his doctoral thesis titled “From Scientific Research to Healthcare Markets – Empirical Essays on the Economics of Pharmaceutical Innovation”. He is now a Junior Engagement Manager at McKinsey & Company.


The American Society of Health Economists is a professional organization dedicated to promoting excellence in health economics research. It aims to enhance individual and societal health by providing evidence and expertise for the development of private and public policies. ASHEcon’s awards honor individuals who have made significant contributions to the field of health economics.


Directly to the publication:
Byrski, Dennis; Wang, Lucy Xiaolu (2024). Marketing Authorization and Strategic Patenting: Evidence from Pharmaceuticals. Available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4638115.


More information:
2024 ASHEcon Award Winners

Study  |  08/07/2024

Improving Health and Well-Being through Data Governance

As part of the project Data Governance in Emerging Economies to Achieve the Sustainable Development Goals, a second report has now been presented – this time a joint effort with researchers from India. The focus is on the United Nations’ Sustainable Development Goal 3 (SDG 3), which aims to ensure good health and well-being for all.

Participants of the workshop in Bengaluru, India
Sustainable Development Goal 3: Good Health and Well-Being

With its findings, the international project Data Governance in Emerging Economies to Achieve the Sustainable Development Goals contributes to the discussion on how data governance can be used to achieve the Sustainable Development Goals of the United Nations, especially in emerging economies. Together with researchers from India, the project examines the data governance landscape in the Indian healthcare sector. The report, which has now been published, presents preliminary findings along with questions for defining a research agenda, building on the insights from the multi-stakeholder workshops organized in Bengaluru in September 2022.


Written in the specific context of SDG 3, the report underscores the role of data governance for good health and well-being in India. A brief introduction to the project is followed by four more sections. The first provides a detailed background on the selection of SDG 3 and India for the study of data governance in growth regions. The second offers an overview of the existing legal framework for data governance in India: in addition to analyzing open data initiatives, it maps the judicial and legislative steps towards safeguarding personal data and highlights the ongoing efforts to legislate the sharing of non-personal data. The findings from the workshop are used in the next section to examine the role of the public sector, the private sector, and start-ups in determining the scope of data governance in the Indian healthcare sector. The final section summarizes the findings and emphasizes the importance of a data governance framework for achieving the Sustainable Development Goals, particularly SDG 3 in India. Finally, the researchers pose a series of questions aimed at developing a research agenda to help build a framework for data governance in emerging economies like India.


The product of doctrinal research and participatory discussion with stakeholders, the report indicates the need for stronger engagement with all stakeholders in evolving appropriate data governance frameworks, including but not limited to measures establishing their legal rights and obligations. The institutional and technical aspects of data governance are considered an essential complement. More broadly, the project's deliberations point towards significant gains from a data governance approach to legal research and policy formulation developed in close relation to the SDG framework.


Arul George Scaria, Vikas Kathuria, Shraddha Kulhari, Vidya Subramanian
Data Governance in Emerging Economies to Achieve the Sustainable Development Goals
India Country Report Based on the Workshop Data Governance for Good Health & Well-Being: India’s Way Forward to Achieving Sustainable Development Goal 3 (Bengaluru, September 8-9, 2022)

Max Planck Institute for Innovation & Competition Research Paper No. 24-08


Mor Bakhoum, Begoña Gonzalez Otero, Jörg Hoffmann, Minata Sarr
Data Governance in Emerging Economies to Achieve the Sustainable Development Goals
Senegal Country Report Based on the Workshop Shaping Data Sharing Policies in the Agricultural and the Financial Services Sector (Dakar, March 16-17, 2022)

Max Planck Institute for Innovation & Competition Research Paper No. 24-05

Study  |  02/28/2024

Sustainable Development in Senegal through Data Governance

Together with Senegalese scholars, researchers from the Institute have investigated how dealing with data can help to achieve sustainable development in emerging economies and have now presented their findings in a report.

Participants of the Dakar workshop, Senegal, 17 March 2022. Photo: Begoña Gonzalez Otero
UN Sustainable Development Goals

The report is based on the results of a workshop that took place in Dakar (Senegal) in March 2022 as part of the broader international project "Data Governance in Emerging Economies to Achieve the Sustainable Development Goals (SDGs)", which examines the opportunities offered by data policy for achieving the United Nations' Sustainable Development Goals.


Structured into four distinct parts, the report provides an exhaustive evaluation of Senegal's regulatory landscape concerning data access and sharing (Part I), laying the groundwork for a detailed examination of the alignment of these regulations with SDGs. It then focuses on the agricultural sector's data-sharing practices and their potential contributions to economic growth and sustainable development (Part II), followed by an exploration of the challenges and opportunities in data governance for financial services in the digital era (Part III). Part IV synthesizes the workshop's discussions, offering valuable insights, conclusions, and forward-looking recommendations.


This scholarly endeavor contributes significantly to the ongoing discourse surrounding data governance and its pivotal role in realizing the SDGs. The nuanced analysis and insights presented herein serve as a valuable resource for policymakers, academics, and practitioners operating at the intersection of data governance, development, and sustainability. Moreover, the outlined recommendations and prospective research agenda provide a roadmap for our future endeavors aimed at advancing data governance in emerging economies, aligning with the vision of the UN AI Advisory Board to govern AI for humanity.


Mor Bakhoum, Begoña Gonzalez Otero, Jörg Hoffmann, Minata Sarr
Data Governance in Emerging Economies to Achieve the Sustainable Development Goals Senegal Country Report Based on the Workshop Shaping Data Sharing Policies in the Agricultural and the Financial Services Sector (Dakar, March 16-17, 2022)
Max Planck Institute for Innovation & Competition Research Paper No. 24-05

Opinion  |  02/07/2024

Position Statement on the Commission’s Proposal for a Regulation on Standard Essential Patents

The Position Statement of the Max Planck Institute for Innovation and Competition of 6 February 2024 on the Commission’s Proposal for a Regulation on Standard Essential Patents assesses the proposal in light of its adequacy to address the challenges of SEP licensing in the context of the Internet of Things and its potential to contribute to balanced global SEP licensing. Preceding these assessments, the Institute elaborates on the legal and economic foundations of innovation-oriented standardisation and outlines the context in which the major problems addressed by the Commission’s Proposal arise.

European Commission, Brussels. Photo: Hella Schuster

On 27 April 2023, the European Commission presented its proposal. The proposed regulation aims to improve the licensing of SEPs by reducing the uncertainty that surrounds licensing negotiations and lowering transaction costs. In order to achieve these objectives, the Commission considered different policy options. Of these, the Proposal implements (1) the setting up of a mandatory register for SEPs with essentiality checks of selected and representative random samples of SEPs, (2) the establishment of a process for determining a non-binding aggregate royalty rate, and (3) a mandatory pre-litigation conciliation procedure for FRAND royalty determination, combined with (4) voluntary guidance on SEP licensing. Institutionally, a new competence centre within the European Union Intellectual Property Office (EUIPO) is to be in charge of managing and performing these tasks.

Opinion  |  08/11/2023

Position Statement on New Genomic Techniques and Intellectual Property Law

A new Position Statement of the Institute addresses concerns related to intellectual property protection for genome-editing technologies and genome-edited plants in the EU. It proposes a set of policy recommendations to facilitate access to and utilisation of IP-protected genome-editing technologies and their products in the plant breeding sector.

Symbolic image of genome editing. Image: vchalup/Adobe Stock

On 5 July 2023, the European Commission issued a proposal for a regulation intended to relax the requirements for marketing authorisation of plants obtained by certain new genomic techniques (NGTs) in the EU. While NGTs are expected to become more appealing to breeders and farmers, the complexity of the intellectual property (IP) landscape surrounding NGTs and the resulting products can have a discouraging effect on innovation. In view of numerous concerns related to IP protection for NGTs and NGT-derived plants, a research group at the Institute has developed a set of policy recommendations to facilitate access to and utilisation of IP-protected NGTs and their products in the plant breeding sector.


To the Position Statement:
Position Statement (8 August 2023) on New Genomic Techniques and Intellectual Property Law: Challenges and Solutions for the Plant Breeding Sector


More on this topic:
CRISPR/Cas Technology and Innovation: Mapping Patent Law Issues

Study  |  07/24/2023

Access to Data versus Exclusive Control over Data in European Data Rules

In a recent article, Valentina Moscon, Senior Research Fellow at the Institute, identifies a trend in European data rules toward the creation of data exclusivity based on copyright and technological protection measures, a trend that contradicts the stated goal of free access to data.

Dr. Valentina Moscon, Senior Research Fellow at the Institute.

Moscon uses case studies – specifically, the text and data mining (TDM) regime in the 2019 EU Copyright Directive and the upcoming EU Data Act laying down rules on access to IoT data – to analyze the ways in which the identified trend is already gaining traction and where it conflicts with both established principles of European and international copyright law and the balanced consideration of stakeholders’ interests.


On the one hand, the case of TDM shows that the scope of copyright is expanding; arguably, private ordering mechanisms such as technological protection measures (TPMs), which allow right holders to wield exclusive rights, extend this scope even further, beyond the realm of works to the realm of data. On the other hand, new legislative initiatives regulating data leave intellectual property rights other than the sui generis database right unaffected, with minor limitations, so that there will likely be a clash between data access rules and the exclusive rights of copyright and related rights holders. Moreover, the Data Act proposal introduces protection of technological protection measures over data, thereby further strengthening exclusive control over data. Finally, in her paper Moscon formulates recommendations for action.


Valentina Moscon
Data Access Rules, Copyright and Protection of Technological Protection Measures in the EU. A Wave of Propertisation of Information
Max Planck Institute for Innovation & Competition Research Paper No. 23-14