
Open-source technology developed in the civilian domain can also be used in military applications, or simply misused. Navigating this dual-use capacity is becoming increasingly important across engineering, because innovation flows in both directions. The openness of open-source technology is part of what drives innovation and allows everyone to access it, but it unfortunately also means the technology is easily accessible to others, including militaries and criminals.
What happens when a rogue state, a non-state militia, or a school shooter displays the same creativity and innovation with open-source technology that engineers do? That is the question we discuss here: How can we uphold our principles of open research and innovation to drive progress while reducing the risks that come with accessible technology?
Rather than discussing open-ended risk, let's examine the specific challenges that dual-use capacity poses for open-source robotics. Understanding these challenges can help engineers recognize what to look for in their own fields.
The power and peril of openness
Open-access publications, software, and educational materials are fundamental to the pursuit of robotics. They have democratized access to knowledge, enabled reproducibility, and fostered a vibrant, collaborative international community of scientists. Platforms and open-source initiatives such as arXiv, GitHub, the Robot Operating System (ROS), and the Open Dynamic Robot Initiative are essential to accelerating robotics research and innovation, and there is no doubt that they should remain openly accessible. Losing access to these resources would be disastrous for the field of robotics.
However, robotics carries inherent dual-use risks, because most robotics techniques can be repurposed for military or otherwise harmful use. A recent example, the rise of custom-made drones in current conflicts, is particularly instructive. The surge in improvised civil drone technology has received worldwide, often admiring, news coverage. This creativity is made possible by the availability of commercial drones, spare parts, 3D printers, and open-source software and hardware, which allows people with modest technical backgrounds and money to easily build, control, and reproduce drones for military applications. One can certainly argue that this has been a powerful aid to Ukrainians defending their country. However, these same conditions also offer opportunities to a wide range of potential bad actors.
Openly available knowledge, designs, and software can be misused to augment existing weapons systems with capabilities such as vision-based navigation, autonomous targeting, or swarming. Additionally, unless proper safety measures are taken, the public nature of open-source code leaves it vulnerable to cyberattack, potentially allowing malicious actors to take control of robotic systems, cause them to malfunction, or turn them to hostile purposes. Many ROS users already recognize that they do not invest enough in cybersecurity for their applications.
Guidance is necessary
Dual-use risks arising from openness in research and innovation are a concern for many engineering sectors. Did you know that engineering was originally an exclusively military activity? The word "engineer" was coined in the Middle Ages to describe "a designer and constructor of fortifications and weapons." Some engineering fields, especially those involving weapons of mass destruction (chemical, biological, radiological, and nuclear), have developed clear guidance on whether and how research and innovation may be conducted and disseminated. They also have community-approved procedures meant to reduce the dual-use risks that come with spreading knowledge. For example, bioRxiv and medRxiv, the preprint servers for biology and the health sciences, screen submissions for material that poses a biosafety or health risk before publishing them.
The field of robotics, by comparison, has no specific regulations and offers little guidance on how roboticists should think about and address the risks associated with openness. Most universities do not teach dual-use risks, yet they are something students will confront in their careers, for example when their work is assessed under export-control rules for dual-use items.
As a result, roboticists may not feel encouraged or equipped to evaluate and mitigate the dual-use risks associated with their work. This is a major problem, because harm from the misuse of open robotics research and innovation is arguably more likely than from nuclear or biological research, both of which require far more resources. Producing a "do-it-yourself" robotic weapon system from open-source designs and software and off-the-shelf commercial components is comparatively easy and accessible. With this in mind, we believe it is high time for the robotics community to discuss how researchers and companies can best navigate the dual-use risks that come with the open dissemination of their work.
A roadmap for responsible robotics
Balancing safety and openness is a complex challenge, but one the robotics community should embrace. We cannot stop innovation, nor can we ignore the potential for harm. Navigating this dual-use dilemma requires an active, multi-dimensional approach. Drawing lessons from other areas of engineering, we propose a roadmap focusing on four major areas: education, incentives, moderation, and red lines.
Education
Integrating responsible research and innovation into robotics education at all levels is paramount. This includes not only dedicated courses but also the systematic inclusion of dual-use and cybersecurity considerations within core robotics courses. We should promote a culture of responsible innovation that empowers roboticists to make informed decisions and address potential risks.
Such educational initiatives could take many forms.
Incentives
Everyone should be encouraged to assess the potential negative consequences of making their work fully or partially open. Funding agencies can make risk assessment mandatory as a condition of project funding, signaling its importance. Professional organizations, such as the IEEE Robotics and Automation Society (RAS), can adopt and promote best practices, providing researchers with tools and frameworks to identify, assess, and mitigate risks. Such tools may include self-assessment checklists for individual researchers and guidance for establishing ethical review boards in faculties and laboratories. Academic journals and conferences can make peer-reviewed risk assessment an integral part of the publishing process, especially for high-risk applications.
Additionally, incentives such as prizes and recognition programs can highlight exemplary contributions to risk assessment and mitigation, promoting a culture of responsibility within the community. Risk assessment can also be encouraged and rewarded in more informal ways. People in leadership positions, such as PhD supervisors and laboratory heads, can create ad hoc opportunities for students and researchers to discuss possible risks. They can hold seminars on the subject and bring in outside experts and stakeholders, such as social scientists and specialists from non-governmental organizations.
Moderation
The robotics community can implement self-regulation mechanisms to moderate the dissemination of high-risk material. This may include:
- Screening work before publication to prevent the dissemination of material that poses severe risks.
- Applying graduated access controls ("gating") to certain source code or data in open-source repositories, potentially requiring users to identify themselves and specify their intended use.
- Establishing clear guidelines and community oversight to ensure transparency and prevent misuse of these moderation mechanisms. For example, organizations such as RAS could define categories of risk levels for robotics research and applications, and create a monitoring committee to track and document real cases of misuse of robotics research, in order to understand the scale of the risks and devise better mitigation strategies.
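To make the gating idea concrete, here is a minimal sketch of a server-side check that an open-source repository could run before issuing a download. Everything in it is hypothetical: the artifact names, the `registry`, and the `grant_download` function are our own illustrations, not part of any existing platform's API.

```python
# Hypothetical sketch of graduated ("gated") access control for an
# open-source repository. All names and policies are invented for
# illustration; they do not describe any existing platform.

HIGH_RISK = {"autonomous-targeting-stack"}   # artifacts requiring manual review
registry = {"alice": {"verified": True}}     # identity-verified user accounts
approved_requests = set()                    # (user, artifact) pairs cleared by reviewers


def grant_download(user, artifact, intended_use):
    """Return True if `user` may download `artifact` for the declared use."""
    account = registry.get(user)
    if account is None or not account["verified"]:
        return False                         # unidentified users are refused
    if not intended_use.strip():
        return False                         # a declared intended use is mandatory
    if artifact in HIGH_RISK:
        # Gated artifact: access only after a reviewer approves the request.
        return (user, artifact) in approved_requests
    return True                              # low-risk artifacts: open to verified users


print(grant_download("alice", "nav-utils", "academic research"))       # True
print(grant_download("bob", "nav-utils", "academic research"))         # False (unknown user)
print(grant_download("alice", "autonomous-targeting-stack", "demo"))   # False (not yet approved)
```

The key design point is the graduation: verified identity and a declared use suffice for low-risk material, while high-risk material additionally requires human approval.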
Red lines
The robotics community should seek to define and enforce red lines for the development and deployment of robotics technologies. Efforts in this direction have already been made, notably in the context of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. Companies including Boston Dynamics, Unitree, Agility Robotics, Clearpath Robotics, ANYbotics, and Open Robotics signed an open letter calling for rules against the weaponization of general-purpose robots. Unfortunately, these efforts have been narrow in scope, and the frontier of robotics includes many more end uses that should be considered off-limits or should demand additional precautions.
It would admittedly be difficult for the community to agree on standard red lines, since what is considered morally acceptable or problematic is highly subjective. To support the process, individuals and companies can consider what they deem unacceptable uses of their own work. This can translate into policies and terms of use that beneficiaries of open research and open-source designs and software must formally accept (for example, specific-use open-source licenses). Such terms would provide a basis for revoking access, denying software updates, and potentially prosecuting those who misuse the technology. Some companies, including Boston Dynamics, have already implemented such measures to some extent. Any person or company conducting open research could follow this example.
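As an illustration of how denying updates after a license revocation might work in software, here is a minimal sketch. It assumes a simple in-memory revocation list; the license identifiers, `revoke`, and `serve_update` are hypothetical names, not any vendor's real mechanism.

```python
# Hypothetical sketch: a vendor withholds software updates from licensees
# whose specific-use license was revoked for misuse. Names are illustrative.
from typing import Optional

revoked_licenses = set()


def revoke(license_id):
    """Revoke a license after a documented terms-of-use violation."""
    revoked_licenses.add(license_id)


def serve_update(license_id, version) -> Optional[str]:
    """Return the update artifact name, or None if the license is revoked."""
    if license_id in revoked_licenses:
        return None                          # enforcement point: no updates
    return f"firmware-{version}.bin"


revoke("ACME-0042")                          # misuse documented for this licensee
print(serve_update("ACME-0042", "2.1"))      # None: updates withheld
print(serve_update("ACME-0099", "2.1"))      # firmware-2.1.bin
```

In practice, the enforcement point would sit in an update server and the revocation list would be maintained under the transparent oversight discussed above, but the control flow is this simple.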
Openness is key to innovation and to the democratization of many engineering fields, including robotics, but it also increases the potential for misuse. It is the responsibility of the engineering community to confront this dual-use dilemma. By embracing responsible practices, from education and risk assessment to moderation and red lines, we can foster an ecosystem in which openness and safety coexist. The challenges are significant, but the stakes are too high to ignore. It is essential to ensure that research and innovation benefit society globally and do not become a driver of instability in the world. This goal, we believe, aligns with the mission of IEEE to "advance technology for the benefit of humanity." The engineering community, and roboticists in particular, need to be proactive on these issues to forestall a backlash from society and potentially restrictive measures or international regulations that could harm open science.

