Tech 2020: Advances in Computer Engineering

The New Sabermetrics: Big Data & Algorithms for Social Good

In the early 21st century, Major League Baseball’s low-budget Oakland A’s built a roster capable of competing with their better-financed rivals. They did it with data. Using an approach popularized as sabermetrics, the team’s management made staffing decisions based on then-obscure data points. Instead of focusing on traditional measures (e.g., a player’s pitching speed or stolen-base percentage), they dug into newly recorded statistics and uncovered the ones they thought really mattered. The result wasn’t just a better team for the A’s; it was a revolution in the way baseball franchises do business.

Big Data gives computer engineers more information than ever to power their decisions. With practically infinite data points available, the trick is knowing what questions to ask. But this isn’t just for games and businesses anymore; it’s also for the social good. Innovators are using algorithmic sorting and sabermetrics to tackle inequality, improve hiring practices, and stem the flow of misinformation.

Rediet Abebe, a PhD candidate in computer science at Cornell University, pioneered a new method of algorithmic sorting that seeks to bridge gaps in the delivery of resources to disadvantaged communities. As an intern at Microsoft, she developed an AI project that sought to identify unmet health needs in Africa by scanning people’s search queries. Abebe designed her algorithms to identify which demographics were prone to seek out information about HIV stigma, HIV discrimination, and natural HIV cures, for example. In doing so, she identified segments of the population that needed help but weren’t receiving it. Her project expanded to all 54 African nations, harvesting web-only data to pinpoint those most likely to be in need of support. And now she’s bringing that work to the US, collaborating with an advisory committee at the National Institutes of Health to tackle health disparities in America.

In another example, two computer science and engineering undergraduates at the University of Nebraska, Vy Doan and Eric Le, are unleashing the power of algorithms in the battle against misinformation. Recognizing that humans are prone to confirmation bias, Doan and Le have developed a machine learning algorithm that can identify questionable news on its own. By having a system pore over years’ worth of Twitter posts, Doan and Le were able to discern the data points that mattered in spotting misinformation: location, account age, and frequency of posts. Once a system of detection was in place, they set about programming a browser extension that could caution users about unreliable sources of information.
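Doan and Le’s actual model hasn’t been published in detail, so the following is only a minimal sketch of the general technique: training a classifier on the three account-level signals the article names. Every value, feature encoding, and threshold here is hypothetical.

```python
# Minimal sketch of a feature-based misinformation classifier -- not Doan and
# Le's actual model, just an illustration of training on the account-level
# signals the article mentions: location, account age, and posting frequency.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training rows: [location_risk_score, account_age_days, posts_per_day]
X = np.array([
    [0.9,   12, 140.0],   # new, hyperactive account flagged in past reviews
    [0.2, 2400,   3.5],   # long-lived account with modest activity
    [0.8,   30,  95.0],
    [0.1, 1800,   1.2],
    [0.7,   45,  60.0],
    [0.3, 3000,   4.0],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = previously judged an unreliable source

model = LogisticRegression().fit(X, y)

def caution_score(location_risk, account_age_days, posts_per_day):
    """Probability that an account is an unreliable source, per the toy model."""
    return model.predict_proba([[location_risk, account_age_days, posts_per_day]])[0, 1]

# A browser extension could surface a warning when this score crosses a threshold.
print(round(caution_score(0.85, 20, 120.0), 2))
```

In practice, an extension would compute scores like this for the accounts behind a page’s content and flag only those above a carefully tuned threshold.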

Big Data isn’t new anymore. As it reaches maturity, the question for computer engineers isn’t how much data they have, but which data to pay attention to, and to what end.

Brain-Computer Interfaces: Hacking the Mind

As technology continues to progress, computer engineers are exploring an increasing number of ways to physically connect humans to it. Brain-Computer Interfaces (BCIs) do that in a startlingly literal way. By linking the brain with today’s hardware, BCIs have the potential to super-charge human evolution to its next stage.

It’s estimated that over a quarter of Americans suffer from brain disorders. That may sound like the set-up to a really bad joke, but in reality the punchline is much more dour: these disorders manifest as post-traumatic stress, as restricted mobility, and as memory issues like Alzheimer’s. In 2013, President Obama announced the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) initiative, which aimed to study the way the brain works as a piece of technology and how technology could best interact with it.

So far, the results have been fruitful. Preclinical studies have shown how brain cells combine to process emotions. Non-invasive ultrasonic technology may allow the release of medication into specific areas of the brain. Adaptable electrical stimulation devices can be used to treat movement disorders. This marks a shift in the way we treat disorders of the brain: not just with drugs that diffuse broadly, but with targeted electrical connections.

The public sector isn’t the only one gaining ground in this field: Elon Musk has his own BCI company, Neuralink, a 100-person startup developing systems to transmit data between people and computers. Founded in 2017, it’s only now publicizing some of its progress: recording a rat’s brain activity through thousands of electrodes implanted in its brain. Musk has also alluded to a successful BCI placement in a primate, which allowed the animal to control a computer with its thoughts. The next step, for which Neuralink will seek FDA approval as early as 2020, is clinical trials in humans. The aim of those trials will be to implant the electrodes in paralyzed patients and give them the ability to control electronic devices with their thoughts. Other companies, like Kernel and CTRL-labs, are following suit.

BCI advocates point to applications in the fight against Parkinson’s disease, epilepsy, and blindness. Regulators, and those overly familiar with dystopian science fiction, remain leery. Fortunately for all, the next beneficiaries of BCI technology will likely be limited to those battling brain disorders. The dreaded cyborg army will have to wait.

Neuromorphic Computing: Building Brains

One way to avoid the regulatory hurdles of BCIs is to ditch the human subject and simply build an entirely new brain. Neuromorphic computing seeks to engineer machines that mimic the function of the human brain in both hardware and software, and it’s gaining traction going into 2020.

From humble beginnings in the 1980s, neuromorphic computing took a big step forward in 2017, when Intel unveiled the Loihi neuromorphic processor, a self-learning chip that mimics brain functions by adapting to feedback from the observed environment. The Loihi chip is extremely energy-efficient, and it uses recorded data to draw inferences and get smarter over time. And it’s high-powered, too: neuromorphic hardware excels in what have previously been seen as human-dominant areas like kinesthetics (prosthetic limbs) and visual recognition (pattern sorting).
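Intel’s chip design is proprietary, but the textbook idea behind neuromorphic hardware, a spiking neuron that integrates its inputs and fires when it crosses a threshold, can be sketched in a few lines. The code below is a generic leaky integrate-and-fire toy, not Intel’s Loihi architecture; all parameters are arbitrary.

```python
# Toy leaky integrate-and-fire (LIF) neuron -- a generic illustration of the
# spiking-neuron model that neuromorphic chips implement in silicon.
# This is not Intel's Loihi design, just the textbook idea behind it.

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Integrate input current over time; emit a spike (1) whenever the
    membrane potential crosses threshold, then reset."""
    v = v_rest
    spikes = []
    for i in input_current:
        # Leaky integration: potential decays toward rest, driven by input.
        v += dt * (-(v - v_rest) + i) / tau
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

# Constant drive produces a regular spike train; stronger drive spikes faster.
weak = simulate_lif([1.2] * 200)
strong = simulate_lif([2.0] * 200)
print(sum(weak), sum(strong))  # spike counts over the same window
```

The contrast with conventional processors is that information is carried in the timing and count of discrete spikes, and energy is spent mainly when a neuron actually fires, which is a big part of the efficiency story.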

In 2019, Intel integrated 64 Loihi chips into a single, large-scale neuromorphic system called Pohoiki Beach. That turned the hardware equivalent of 130,000 neuron analogs into roughly 8,000,000. To put that in more graspable terms: a single Loihi chip has half the neural capacity of a fruit fly, while the Pohoiki Beach system has the neural capacity of a zebrafish. The most impressive part isn’t the current state but where it’s going. Loihi chips consume about one-hundredth the power of graphics processing units (GPUs) and one-fifth the power of dedicated IoT inference hardware, meaning that Intel can scale up to about 50 times its current capacity and still outperform its peers on efficiency.
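A quick check of the scaling: each Loihi chip contributes about 130,000 neuron analogs, so 64 of them land on the order of eight million.

```latex
% Scaling from one Loihi chip to the 64-chip Pohoiki Beach system
64 \times 130{,}000 \approx 8.3 \times 10^{6} \ \text{neuron analogs}
```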

Next year, Intel promises the unveiling of an even larger neuromorphic system, nicknamed Pohoiki Springs. Competitors like Samsung and IBM are already taking notice and developing their own projects. It will be a long time before these pseudo-brains match or exceed the size and capacity of the brains of the engineers creating them. But in the interim, expect an uptick in more efficient (and more humanesque) computational power.

Quantum Computing: The Final Frontier

Quantum computing exists, but it also doesn’t. It’s kind of in between. And that apparent paradox is the driving force of one of computer engineering’s most exciting possibilities.

Where traditional computing consists of bits coded as zeroes and ones, quantum computing replaces those with qubits that exist in a state of superposition, meaning they can act as both zeroes and ones simultaneously. If quantum computing can be scaled up, it could quickly solve problems that traditional computing technology would take years or even centuries to process. Paradigm shifts in finance, medicine, and IT would follow. The applications are practically limitless, and, to the uninitiated eye, the results could look like magic.
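As a rough illustration (this is a classical simulation, not how real quantum hardware is programmed), a single qubit can be represented as a two-component complex vector; applying a Hadamard gate to the |0> state produces an equal superposition, and measurement collapses it to a definite 0 or 1. The sketch below uses plain NumPy.

```python
# Classical simulation of a single qubit -- an illustration of superposition,
# not a program for real quantum hardware.
import numpy as np

ket_zero = np.array([1, 0], dtype=complex)         # the |0> state
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = hadamard @ ket_zero                        # equal superposition of |0> and |1>
probabilities = np.abs(state) ** 2                 # Born rule: |amplitude|^2
print(probabilities)                               # -> [0.5 0.5]

# Measuring collapses the superposition to a definite 0 or 1.
rng = np.random.default_rng(seed=7)
samples = rng.choice([0, 1], size=10, p=probabilities)
print(samples)
```

Each additional qubit doubles the length of that state vector, which is why simulating large quantum systems classically quickly becomes intractable, and why scaled-up quantum processors are so tantalizing.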

In March 2018, Google’s Quantum AI Lab unveiled its 72-qubit processor, named Bristlecone. This was a critical step toward quantum supremacy: the moment a quantum computer begins to outperform traditional supercomputers. But this is the quantum world, where nothing is linear and it’s not just about processing power. Quantum computers are prone to errors, and achieving quantum supremacy requires not just raw power but low error rates to go with it. Quantum supremacy has been elusive since the idea was first introduced, with some doubting it was even theoretically possible. But in 2019, Google discovered it was closer than anybody thought.

With each new improvement to Google’s quantum chips, there’s been a growth in power unlike anything else in nature. While traditional computing power has grown at an exponential rate (in accordance with Moore’s Law), Google’s quantum computing power is growing at a doubly-exponential rate. If such a trend continues, practical quantum computing could arrive in the next year. People have mapped out use cases for everything from better drugs, to better batteries, to new forms of AI, to entirely new materials.
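To make “doubly exponential” concrete: after n improvement cycles, exponential growth looks like 2^n, while doubly exponential growth looks like 2 raised to the power 2^n. The figures below are illustrative only, not Google’s published numbers.

```latex
% Exponential vs. doubly exponential growth after n improvement cycles
% (illustrative comparison, not Google's published figures)
\begin{align*}
\text{Exponential (Moore's Law):} \quad & C(n) \sim 2^{n},     & 2^{5}     &= 32 \\
\text{Doubly exponential:}        \quad & C(n) \sim 2^{2^{n}}, & 2^{2^{5}} &= 2^{32} \approx 4.3 \times 10^{9}
\end{align*}
```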

Whether Google’s quantum technology can continue its rapid ascent and scale effectively remains in doubt. But the industry isn’t taking any chances. Researchers are already exploring ways to redesign critical digital infrastructure, such as encryption, for a post-quantum world. Heavyweights like IBM and Intel are charging ahead with their own quantum devices. They might get there in 2020, or they might not. Alternatively, in true quantum fashion, they might both get there and not get there simultaneously. In any event, it’ll be intriguing days ahead for computer engineering.

The Internet of Things: Tying It All Together

Simply put, the internet of things (IoT) allows technological devices to talk to each other. This has some innocuous applications, such as your thermostat regulating itself or your refrigerator detecting you’re out of milk and ordering more from Amazon. But it also makes it possible for businesses to optimize their supply chains, for inanimate objects to be turned into spigots of valuable data, and for cars to drive themselves (not to mention for ships to sail themselves and even planes to fly themselves). Unlocking the full potential of IoT devices has been a Holy Grail for computer engineers for years, but the biggest hurdles to IoT’s growth (less-than-real-time data access, limited bandwidth, and outdated operating systems) are set to be cleared in 2020.

IoT devices produce a massive amount of data, and that data has traditionally been processed in distant data centers. Routing everything through cloud servers adds latency, pushing responses to less than real-time, which is a deal-breaker for the more ambitious applications of IoT. Edge computing solves this problem by bringing computation and data storage closer to where the data is generated, improving response times and preserving bandwidth. With the proliferation of edge nodes at the level of cell-phone towers, edge computing can give the IoT the real-time communications needed for autonomous cars, home automation systems, and smart cities.
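As a minimal sketch of the edge idea, the hypothetical node below aggregates raw sensor readings locally and forwards only a compact summary (or an urgent alert) upstream; the names, fields, and thresholds are invented for illustration.

```python
# Minimal sketch of edge-style preprocessing: aggregate raw sensor readings
# locally and send only compact summaries (or urgent alerts) upstream.
# The node, sensor values, and thresholds are hypothetical, for illustration only.
from statistics import mean
from typing import Dict, List


def summarize_window(readings: List[float]) -> Dict[str, float]:
    """Collapse a window of raw readings into a small summary payload."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
    }


def edge_node_step(readings: List[float], alert_threshold: float) -> Dict:
    """Decide locally what (little) to send to the cloud."""
    summary = summarize_window(readings)
    payload = {"summary": summary}
    if summary["max"] > alert_threshold:
        # Latency-sensitive decision made at the edge, not in a distant data center.
        payload["alert"] = "threshold exceeded"
    return payload


# One second of hypothetical vibration-sensor data: 100 raw samples in,
# one three-field summary out -- that is the bandwidth savings.
raw = [0.41, 0.39, 0.44, 0.87, 0.42] * 20
print(edge_node_step(raw, alert_threshold=0.8))
```

The design choice mirrors real deployments: make latency-sensitive decisions next to the sensor and spend scarce bandwidth only on what the cloud actually needs.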

Bandwidth has long been the bane of IoT developers. Current Wi-Fi and 4G networks simply can’t handle the load of real-time communications necessary for sensors to link together in meaningful ways. But a critical tipping point for IoT development could come with the dawn of 5G telecom networks. Select cities (Denver, Chicago, Minneapolis) have already been hooked up, and a more comprehensive rollout across the nation is coming in 2020.

Operating systems like Windows and iOS were developed long before the internet of things was in focus, and, as a result, newer IoT applications can feel like a square peg in a round hole. That’s why, in April 2019, Microsoft acquired Express Logic, maker of a real-time operating system (RTOS) for the microcontroller units (MCUs) that power IoT and edge devices. At the time of the acquisition, the RTOS already had over six billion deployments.

That’s just the beginning. Other industry heavyweights are looking to acquire or develop their own IoT operating systems. Gartner, a research firm, predicts there will be more than 20 billion connected devices in 2020, with over nine billion MCUs deployed annually. The total market for industrial IoT devices is expected to reach over $120 billion in less than a year. That means more and more people are talking about IoT—and more things in the IoT are talking to each other.
