On Thursday, March 15, a pedestrian bridge at Florida International University collapsed, months before it was set to open. The structure’s 950-ton main span had just been installed using a process called Accelerated Bridge Construction (ABC), designed in part to reduce the time that street traffic would be halted. When it collapsed, the bridge crushed cars below, killing at least six people and leaving investigators to determine how it happened and who would be held responsible. Although many hypotheses have been floated, the lack of final construction plans makes it hard to determine a cause. Many experts say that the FIU bridge collapse will be examined and taught in engineering schools for years to come.
Just a few days later, The Guardian and The New York Times broke the news that data from 50 million Facebook profiles had been harvested for UK-based Cambridge Analytica, which might have influenced the results of the 2016 US presidential election. That number was later revised upward to as many as 87 million profiles.
In some ways, engineering disasters such as the Miami bridge collapse feel more comprehensible — though no less dreadful — than the growing set of engineering disasters happening far above our nation’s bridges, in the interconnected webs of the cloud. Technology companies have created a new normal for how we drive, communicate, shop and learn. And with that brave new world has come a set of complicated and often unwieldy ethical conversations connected to computer engineering.
For instance, who is responsible when a social media company shares information that is then used for far deeper political purposes? How does one deal with the ethics of a driverless car that kills a pedestrian, or with the tangible effects of Artificial Intelligence (AI) on the future of American jobs, transportation or warfare?
These are topics that shouldn’t be left for technology companies alone to figure out; they are questions every STEM program today should be actively exploring with its students.
I read not too long ago that a group of tech titans representing Alphabet, Amazon, Google, Microsoft and others was convening on a regular basis to discuss the tangible effects and implications of Artificial Intelligence on society. Their basic premise: to ensure that AI is designed to benefit people and not harm them, just as we expect of our engineered structures and products.
At ABET, these are questions we ask regularly in our accreditation process, and we have begun incorporating them into some of our revised criteria, because we know the critical importance of embedding forward-thinking ethics conversations and questions into programs of higher education.
Together with our member societies, we have a responsibility to build a better world. That means arming students of applied science, computing, engineering and engineering technology with the real-world skills and the moral courage to step into high-stakes environments, and with the clarity to know when something is right and when it is wrong.
Take, for example, Harvard University and the Massachusetts Institute of Technology, which this semester are offering a joint course on the ethics and regulation of artificial intelligence for the first time. The University of Texas at Austin also recently introduced a course titled “Ethical Foundations of Computer Science,” and is looking to eventually make it a requirement for all computer science majors. And for the last two decades, Olin College has been reimagining the engineering curriculum, not only through project-based learning but also through a strong emphasis on its Arts, Humanities, and Social Sciences (AHS) program, whose curriculum is built on the idea that engineering starts with people and ends with people. AHS courses are meant to foster students’ development as critical and contextual thinkers, broad-thinking creators, persuasive communicators, ethical practitioners and self-reflective individuals.
Today, we are convening with academic leaders from around the globe for our 2018 ABET Symposium. We’ll be hosting conversations on sustainability, diversity and inclusion, ethics and quality improvement — themes that are relevant on campuses around the world. One of our invited speakers is Olin College President Richard Miller, who will talk about his work at this highly innovative institution, how to keep students interested in engineering, and how Olin has managed to attract a large percentage of women to its engineering programs.
When it comes to Facebook or driverless cars, the answers may not all be clear, but one thing we know for sure: a commitment to developing ethical STEM graduates is an investment that will benefit us all.