Wolfgang Fink, PhD
Dr. Wolfgang Fink is an associate professor of electrical and computer engineering and the inaugural Edward & Maria Keonjian Endowed Chair in Microelectronics at the University of Arizona. He holds joint appointments in the departments of electrical and computer engineering, biomedical engineering, systems and industrial engineering, aerospace and mechanical engineering, and ophthalmology and vision science.
Dr. Fink is the founder and director of the Visual and Autonomous Exploration Systems Research Laboratory at Caltech and the University of Arizona and the founder and director of the University of Arizona Center for Informatics and Telehealth in Medicine.
Dr. Fink’s research comprises general smart service systems, autonomous systems, brain-computer interfaces, smart platforms for mobile and telehealth, and computer-optimized design. He was one of the principal investigators of the US Department of Energy’s Artificial Retina consortium—involving six National Laboratories, four universities, and one industry partner—which pioneered the only FDA-approved visual prosthesis to date: the Argus II by Second Sight Medical Products, Inc. Dr. Fink is an AIMBE fellow, a PHMS fellow, an SPIE fellow, an ARVO fellow, and a senior member of IEEE. He holds more than 29 US and foreign patents to date.
Deidra Hodges, PhD
Dr. Deidra R. Hodges is an associate professor and the chair of the Department of Electrical and Computer Engineering at Florida International University. She is a leader in photovoltaics (PV) and solar energy research with extensive experience in PV and in X- and gamma-ray radiation detectors for national security. Her work focuses on advancing renewable energy, sustainability, nuclear materials, and extreme photon sensing.
Dr. Hodges’ contributions include supporting and developing the pixelated cadmium zinc telluride (CZT) gamma detector at Brookhaven National Laboratory and developing highly efficient thin-film mixed-halide perovskite photovoltaics. She has achieved perovskite solar cell power conversion efficiencies greater than 21 percent, approaching the world-record efficiency of 25.2 percent.
Dr. Hodges’ network spans many Department of Energy laboratories, including BNL, Idaho National Laboratory, the Kansas City National Security Campus (Honeywell FM&T), and the National Renewable Energy Laboratory, where she collaborates with scientists and conducts research as a facility user.
The world runs on circuits and chips. But Moore’s Law, which states that the number of transistors in an integrated circuit doubles roughly every two years, is slowing down. What has been a relatively reliable staircase of global economic productivity could be jeopardized as engineers run up against physical and logistical limits to making transistors ever smaller.
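The doubling described above can be written as a simple formula, N(t) = N₀ · 2^(t/2), with t in years. The sketch below is purely illustrative; the starting count and horizon are made-up numbers, not historical data.

```python
# Back-of-the-envelope Moore's Law projection: one doubling per period.
# The inputs here are illustrative, not measured transistor counts.

def projected_transistors(start_count: int, years: float,
                          doubling_period: float = 2.0) -> int:
    """Project a transistor count forward, assuming steady doubling."""
    return int(start_count * 2 ** (years / doubling_period))

# A chip with 1 billion transistors, projected 10 years out
# (five doublings): 32 billion.
print(projected_transistors(1_000_000_000, 10))
```

The slowdown the article describes means the `doubling_period` itself is stretching, which is exactly what breaks projections like this one.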
While Moore’s Law was never really a law, only an observation, the nomenclature reflects how fundamental a consideration it has become. Finding ways to keep making more efficient and capable circuits, and thus more powerful computers, is critical for pushing the technological envelope forward.
“What has come up over the last several years is multicore systems, where you simply pack multiple cores on the same chip area,” Dr. Fink says. “While this is quite well understood, and you now have a formidable number of cores on chips, challenges remain.”
Those challenges include compensating for heat generation, routing data efficiently, and placing chip components and their connecting links effectively. They have, so far, prevented multicore systems from reaching true scalability, Dr. Fink says. But advances in AI are helping.
“What my lab has been using—we just got a patent for it—is a stochastic optimization scheme to have the network-on-chip design, i.e., router placement and link connectivity between the routers, done with evolutionary algorithms, i.e., without a human in the loop,” Dr. Fink says.
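To make the idea concrete, here is a deliberately tiny sketch of evolving a network-on-chip link topology with a stochastic search. Everything in it, the grid of router positions, the fitness function, the mutation rate, is an illustrative assumption, not Dr. Fink's patented scheme.

```python
# Hypothetical sketch: evolve which router pairs get a link, minimizing
# total wire length while keeping the network connected. A (1+1)
# evolutionary strategy stands in for the real, more elaborate algorithms.
import itertools
import random

random.seed(42)

POS = [(x, y) for x in range(3) for y in range(3)]      # 9 routers, 3x3 grid
PAIRS = list(itertools.combinations(range(len(POS)), 2))  # candidate links

def dist(i: int, j: int) -> int:
    (x1, y1), (x2, y2) = POS[i], POS[j]
    return abs(x1 - x2) + abs(y1 - y2)                  # Manhattan wire length

def connected(genome: list) -> bool:
    """Check every router is reachable over the chosen links."""
    adj = {i: set() for i in range(len(POS))}
    for bit, (i, j) in zip(genome, PAIRS):
        if bit:
            adj[i].add(j)
            adj[j].add(i)
    seen, stack = {0}, [0]
    while stack:
        for n in adj[stack.pop()]:
            if n not in seen:
                seen.add(n)
                stack.append(n)
    return len(seen) == len(POS)

def fitness(genome: list) -> int:
    """Lower is better: total wire length, heavy penalty if disconnected."""
    wire = sum(dist(i, j) for bit, (i, j) in zip(genome, PAIRS) if bit)
    return wire + (0 if connected(genome) else 1000)

best = [1] * len(PAIRS)                                 # start fully connected
for _ in range(2000):
    # Mutate each link bit with 5% probability; keep the child if no worse.
    child = [b ^ (random.random() < 0.05) for b in best]
    if fitness(child) <= fitness(best):
        best = child

print(fitness(best))  # evolved wire length (8 is the spanning-tree minimum)
```

No human chooses the topology: the search discovers it by mutation and selection, which is the "without a human in the loop" point Dr. Fink makes, even if the production version optimizes far richer objectives than wire length.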
Revolutionizing chip design would be a major breakthrough, with massive economic and societal benefits. Governments around the world are kicking in to incentivize engineers further. In August 2022, President Joe Biden signed the CHIPS and Science Act, which authorized roughly $280 billion in new funding to boost domestic research and manufacturing of semiconductors in the US.
“We used to dominate the technology, manufacturing, and science of semiconductors, but we outsourced that,” Dr. Hodges says. “With the CHIPS Act, we can regain our leadership in semiconductors and microelectronics. I’m excited about that.”
Human-computer interaction (HCI) is a multidisciplinary field of study with unique applications involving electrical engineering. Implantable devices that merge biological elements with electronic elements could open up a new frontier of human potential.
Already, engineers have developed retinal implants like the Argus II, the only FDA-approved visual prosthesis to date that can help restore partial vision to individuals with visual impairments, even blindness. Further innovations in this and other areas related to implantable human-computer interfaces could greatly increase accessibility, and restore sensory input or output, for those with disabilities.
“The interface where the electronics meet the body still needs to undergo a transformational change,” Dr. Fink says. “But if there are breakthroughs in energy consumption, heat dissipation, resolution and precision of stimulation, and making these things more organic or natural as opposed to just biomimetic, then that’s something attainable.”
Right now, most human-computer connections are done with electrodes, which match crudely to our brain’s neurons. The proximity of those electrodes to the tissue they’re supposed to stimulate remains an issue, as does the corrosion they experience over time. More powerful and long-lasting human-computer links will need to connect more gracefully and more seamlessly. These are the ongoing logistical issues of implantable devices, but their potential is unbounded: Neuralink, which develops implants that it hopes will allow individuals to control computer interfaces with brain waves, opened its first-in-human clinical trials in September 2023.
The world has no future without clean energy. Photovoltaic (PV) cells, designed and fabricated by electrical engineers, convert sunlight into clean electrical energy. However, issues around reliability, scalability, efficiency, and cost remain. Electrical engineers are looking at new materials, structures, and designs for PV cells that could help.
“Much of our energy now [comes from] and will come from solar energy,” Dr. Hodges says. “Right now, silicon is the material that dominates the market for manufacturing PV solar panels. But silicon isn’t the only material. Cadmium telluride is a thin-film material that’s a lot less costly to make, and a little more efficient than silicon.”
The US Department of Energy (DoE) is investing significant time and money in building up the manufacturing capacity of photovoltaics in the US. It’s also funding research into new materials that might be cheaper and more efficient than silicon or cadmium telluride.
“There is the promising and hot field of organic photovoltaics (OPV), with efficiencies becoming comparable to silicon PV,” Dr. Fink adds.
Each resulting innovation could help empower a cleaner, healthier future.
In July 2023, scientists in South Korea claimed that a material known as LK-99 exhibited superconductivity at room temperature and relatively low pressure, which would make it suitable for use in many real-world devices.
Such a breakthrough would’ve been truly monumental, potentially unlocking lossless power transmission, magnetic levitation, hyper-efficient batteries, high-performance quantum computing, and more. Much hype, followed by much debate, resulted in much disappointment: the results of the scientists’ studies were found to be non-reproducible.
“It would, of course, have huge implications were it ever to be accomplished,” Dr. Fink says. “It’s material science. It could happen overnight, but it could also not happen. It’s almost like fishing in the dark.”
This was not the first time room-temperature superconductors had entered public debate. In 2020, scientists claimed to have achieved superconductivity at around 59 degrees Fahrenheit with a material composed of carbon, sulfur, and hydrogen; however, the material had to be kept under extremely high pressure, which would have drastically limited its real-world applicability. The paper was ultimately retracted in 2022. Electrical engineers, however, are not giving up on what a stable room-temperature superconductor could power. Room-temperature superconductors will continue to be warmly, if not hotly, debated for years to come.
“It’s a golden opportunity,” Dr. Hodges says. “There are a lot of researchers looking for that Holy Grail of superconducting materials at room temperature.”
Quantum computers represent and process information in a fundamentally different way than traditional computers. While traditional computers use the bit as the basic unit of information, which can be represented as a zero or a one, quantum computers use the qubit, which can exist as a zero, a one, or both simultaneously, in a state of superposition. As with all things at the quantum level, it only gets more confusing, and more interesting, the closer you look.
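Superposition has a compact mathematical form: a qubit is a two-element complex vector, and applying a Hadamard gate to the zero state puts it in an equal mix of both outcomes. The snippet below is a plain NumPy illustration of that textbook picture, not a simulation of any real quantum hardware.

```python
# A qubit as a 2-element state vector; the Hadamard gate creates an
# equal superposition, and the Born rule gives measurement probabilities.
import numpy as np

ZERO = np.array([1, 0], dtype=complex)            # the |0> state
HADAMARD = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

superposition = HADAMARD @ ZERO                   # (|0> + |1>) / sqrt(2)
probabilities = np.abs(superposition) ** 2        # Born rule: |amplitude|^2

print(probabilities)  # [0.5 0.5] -> equal chance of measuring 0 or 1
```

A classical bit would have to be one row or the other; the qubit's state is the whole vector at once, which is what lets n qubits encode amplitudes over 2^n outcomes simultaneously.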
“Quantum computing would be far more powerful than today’s supercomputers,” Dr. Hodges says.
Quantum computers are best at large, complex, interconnected problems, like those found in finance, logistics, and drug discovery. They can simulate quantum systems themselves, opening up potential chemistry and materials science breakthroughs. And they could lead to a complete rethink of cryptography: on one hand, more powerful quantum computing would be able to break today’s strongest encryption; on the other hand, it would be able to devise a newer, stronger, more secure system of encryption for critical infrastructure.
“We have quantum computers today, but they require large amounts of liquid helium, and to scale that up would be very costly,” Dr. Hodges says. “We have to bring it down to where it’s affordable and scalable.”
Quantum computers aren’t good at everything. They have higher error rates than traditional computers and aren’t suited to storing large amounts of data. For most user-level activities, the zeroes and ones are enough. But that’s not stopping major players like Google and Microsoft from attempting to achieve what’s known as quantum supremacy: the point at which a quantum computer can complete a task that would be impossible for a traditional computer to finish in a reasonable amount of time. The field is still a ways off from a practical, user-friendly, cost-effective quantum computer, but electrical engineers are still dreaming one up.
“Wherever there’s a will, there’s a way,” Dr. Hodges says. “That’s engineering.”