Last week I made a short trip down to southwest Michigan to visit some of our industrial partners and to catch up with some old friends and new colleagues. It was my first time in that part of the state and I enjoyed it thoroughly.
Upon arrival in Grand Rapids, I picked up my rental car, which turned out to be a new Volvo station wagon, and I got a quick lesson on where things are headed with autonomous vehicles. When I merged onto the highway and turned on the cruise control, I quickly figured out that it had a feature called “adaptive cruise control.” The vehicle measures the distance to the vehicle in front using radar and adjusts the speed as necessary to maintain a safe following distance, which itself depends on the speed. I was familiar with this, having experienced it once in a rental car about a year ago while on vacation in Colorado. At that time I did not know quite what was happening and actually thought the cruise control was broken, as the car kept slowing down on I-70 in heavy traffic. It was only when I got out on a two-lane highway, with just me and the car in front, that I figured out what was going on, and I was amazed at how well it worked.

On this latest trip I felt I was already an old pro at adaptive cruise control, but I was amazed all over again when I realized the Volvo was driving itself! This is another new feature, called “pilot assist,” that uses both video cameras and radar sensors to keep the vehicle in the center of the lane. It was a little spooky at first when I realized the steering wheel was moving on its own, ever so slightly, but again it was remarkable how well it worked. This was at night, in good weather, on a clear highway with bright white stripes reflecting my headlights, so it was not a challenging control scenario. Even so, I was impressed at how smooth and steady the vehicle was, going right down the middle of the lane. I could even take my hands off the wheel entirely! The car would issue a little warning after about 15 seconds, and then I would have to put my hands back on the wheel, or just tap it gently, to keep the system engaged. I guess it just wanted to be reassured that I was still there.
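For the technically curious, here is a minimal sketch in Python of the speed-dependent gap logic that adaptive cruise control relies on. The constant-time-gap rule, the gain values, and the function name are my own illustration, not Volvo’s actual algorithm.

```python
# Illustrative sketch of adaptive-cruise-control logic (not any manufacturer's
# actual algorithm): hold a speed-dependent following distance by making a
# simple proportional adjustment on the gap error.

def acc_speed_command(own_speed, gap_to_lead, set_speed,
                      time_gap=2.0, min_gap=5.0, k_gap=0.1):
    """Return a commanded speed in m/s.

    own_speed   -- current vehicle speed (m/s)
    gap_to_lead -- radar-measured distance to the car ahead (m)
    set_speed   -- the driver's cruise set point (m/s)
    time_gap    -- desired headway in seconds (safe distance grows with speed)
    min_gap     -- minimum standstill gap (m)
    k_gap       -- proportional gain on the gap error (illustrative value)
    """
    desired_gap = min_gap + time_gap * own_speed  # safe distance depends on speed
    gap_error = gap_to_lead - desired_gap         # negative means we are too close
    commanded = own_speed + k_gap * gap_error     # nudge speed toward closing the error
    return min(commanded, set_speed)              # never exceed the driver's set point


if __name__ == "__main__":
    # Closing in on slower traffic: a 30 m gap at 29 m/s (about 65 mph) -> slow down
    print(acc_speed_command(own_speed=29.0, gap_to_lead=30.0, set_speed=31.0))
```

In a real vehicle, a commanded speed like this would then be handed off to lower-level throttle and brake controllers.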
Adaptive cruise control and pilot assist are examples of what the Society of Automotive Engineers (SAE) calls Level 2 autonomy. SAE defines six levels of autonomy, from 0 to 5, with 0 being no autonomy whatsoever (i.e., old-fashioned human driving) and 5 being total autonomy in all conditions. Level 2 covers driver-assist technologies like these, which can partially take over the accelerator, brakes, and steering for relatively simple tasks, with the expectation that the driver is paying attention at all times. I hope that is a realistic expectation. I have to confess, the pilot assist feature really did make it easier for me to eat lunch in the car. We are not close to Level 5 autonomy yet, but auto manufacturers are making progress at a pretty good clip, and there are optimistic projections on when we might see Level 4 cars on the road. A lot of people are pretty nervous about the prospect of autonomous vehicles. My guess is that we will get used to them not all at once, but rather one feature at a time, as in my experience with the rental car. It will come with a pull, not a push: drivers will see how easy it is to use the new-fangled technology and how it makes their lives better, and then they will be demanding more and more.
This is as good a place as any to put in a plug (again) for our Robotics Systems Enterprise. Michigan Tech is one of eight North American universities participating in the GM/SAE AutoDrive Challenge, a collegiate competition in which students will integrate sensors and develop the control algorithms to take an existing vehicle (a Chevy Bolt) and make it autonomous. This is an interesting step for SAE, which has a lot of automotive collegiate competitions; Michigan Tech mechanical engineers participate in several and do very well. In the AutoDrive Challenge, the automotive powertrain is off limits; the students have electronic access only to the accelerator, brakes, and steering, and beyond that it’s all about the sensors and controls. This creates a lot more opportunities for participation by electrical engineers, computer engineers, and computer science students. The Michigan Tech team is hosted in the Robotics Systems Enterprise, led by ECE faculty member Dr. Jeremy Bos and ME-EM faculty member Dr. Darrell Robinette. I am looking forward to witnessing the Year 1 competition in Yuma, Arizona, at the end of this semester, and I am quite certain you will read about it here.
My automobile experience was only one of several times on this trip when I was reminded of the opportunities for electrical engineers in the area of controls. The industrial partners I visited confirmed what I have seen and heard many times before at Career Fair and from our External Advisory Committee: automation is everywhere, and electrical engineers with controls expertise are in high demand. This is one of the reasons we moved the controls course in the EE curriculum to the junior level and made it a required course. This is a good time to be studying electrical engineering and entering the job market, and graduates who can claim some expertise and experience with control systems will find many more doors opening up.
Electrical engineering is a huge field, of course, and even within the sub-field of controls there are several flavors. At one end of the spectrum we have industrial control systems, where the tool of choice is the Programmable Logic Controller, or PLC. Such systems are found in factories and other industrial facilities like steel mills and chemical process plants, and in buildings with elevators and air conditioners. Some electrical engineers find that PLCs lack the mathematical complexity that might make them interesting, but as far as I am concerned, anything worth doing is worth doing well. If engineers and engineering students see a need that is addressed with a certain technology, and see challenges and rewards in working in that field, then they should be encouraged to do so, as long as they do a good job. Michigan Tech has a two-semester PLC course sequence, cross-listed between the ECE Department and the School of Technology, taught in a beautiful new facility that was renovated with gift funds from Nucor Steel.
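To give a flavor of what PLC programming is about, here is a rough Python imitation of the read-inputs / evaluate-logic / write-outputs scan cycle that a PLC runs continuously. The tank-fill interlock and all the signal names are made up for illustration; a real PLC program would express this in ladder logic or IEC 61131-3 structured text.

```python
# A made-up tank-level interlock, written to mimic the PLC scan cycle:
# read inputs, evaluate the logic, write outputs, repeat.

import time

def read_inputs():
    # Placeholder for scanning field devices (level switches, start button, etc.)
    return {"start_pressed": True, "high_level": False, "low_level": True}

def evaluate_logic(inputs, state):
    # Pump runs when started and the tank is not yet full; it latches off at high level.
    if inputs["high_level"]:
        state["pump_run"] = False
    elif inputs["start_pressed"] and inputs["low_level"]:
        state["pump_run"] = True
    return state

def write_outputs(state):
    # Placeholder for energizing output coils / relays.
    print("Pump contactor:", "ON" if state["pump_run"] else "OFF")

if __name__ == "__main__":
    state = {"pump_run": False}
    for _ in range(3):            # three scan cycles, just for illustration
        inputs = read_inputs()
        state = evaluate_logic(inputs, state)
        write_outputs(state)
        time.sleep(0.1)           # real PLC scan times are typically milliseconds
```

The structure is simple, but the engineering value lies in getting interlocks like this exactly right, every scan, for years on end.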
“Traditional” or “classical” control theory often involves the electrical control of mechanical systems, and so it is taught in both EE and ME departments; it shows up in almost all engineering disciplines in one form or another. The typical paradigm involves a “plant” – something to be controlled, like a motor – along with sensors that measure what the plant is doing and actuators that control its behavior. The sensor outputs are fed into a control algorithm, which also has inputs indicating the desired plant behavior; the algorithm in turn determines the actuator signals that tell the plant what to do, creating what is called a “feedback control system.” Understanding how such systems work requires a lot of the mathematical machinery taught in undergraduate EE and ME curricula, such as differential equations, Fourier and Laplace transforms, and complex analysis. Feedback control systems show up all the time in the natural and biological world – think of birds flying or your heart beating – and many of our solutions to technological problems mimic that behavior.
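As a concrete illustration of that paradigm, here is a small Python simulation of a feedback loop: a first-order motor-speed model stands in for the plant, and a proportional-integral controller computes the actuator signal from the measured error. The model and the gain values are my own toy example, not taken from any particular system.

```python
# A minimal feedback-control loop in the plant / sensor / controller / actuator
# pattern: a first-order DC-motor speed model (the plant) driven by a
# proportional-integral controller. Model and gains are illustrative only.

def simulate_speed_control(setpoint=100.0, steps=200, dt=0.01,
                           kp=0.8, ki=2.0, tau=0.5, gain=2.0):
    speed = 0.0        # plant state: motor speed (the "sensor" reads this directly)
    integral = 0.0     # accumulated error, which drives steady-state error to zero
    history = []
    for _ in range(steps):
        error = setpoint - speed              # desired behavior minus measured behavior
        integral += error * dt
        voltage = kp * error + ki * integral  # controller output -> actuator signal
        # First-order plant: d(speed)/dt = (gain * voltage - speed) / tau
        speed += dt * (gain * voltage - speed) / tau
        history.append(speed)
    return history

if __name__ == "__main__":
    trace = simulate_speed_control()
    print(f"final speed: {trace[-1]:.1f} (setpoint 100.0)")
```

Running it shows the speed rising to the setpoint and settling there, which is exactly the behavior the feedback loop is designed to produce.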
More recently we have seen the emergence of control systems driven by computer algorithms, such as those from the worlds of artificial intelligence and machine learning, that are highly complex and cannot be boiled down to a few mathematical equations, as is often the case in classical control theory. Systems that combine cognitive and physical attributes like this are called “cyber-physical systems.” These control systems have seen explosive growth in recent years, due in large part to the speed and power of computing systems, which are just now getting to the point where the algorithms can reasonably be expected to run on practical time scales. The autonomous vehicle is the most prominent example of a cyber-physical system in today’s culture, with all of the cognitive processing that has to take place between the cameras, radars, and lidars (the sensors) and the steering, accelerator, and brakes (the actuators). The emergence of cyber-physical systems has greatly elevated the importance of computing and computer science in engineering applications, a trend that I believe merits close attention at Michigan Tech and similar educational institutions.
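To make the “cyber-physical” idea concrete, the sketch below lays out the sense-perceive-plan-act pipeline of an autonomous vehicle in Python. The data structures and the perceive and plan functions are placeholders of my own invention, standing in for the learned perception models and motion planners of a real stack.

```python
# A schematic of the sense -> perceive -> plan -> act pipeline in a cyber-physical
# system such as an autonomous vehicle. The functions below are placeholders,
# not a real perception or planning stack.

from dataclasses import dataclass

@dataclass
class SensorFrame:
    camera_image: object   # would be pixel data
    radar_tracks: list     # detected objects with range and range-rate
    lidar_points: list     # 3-D point cloud

@dataclass
class ActuatorCommand:
    steering_angle: float  # radians
    throttle: float        # 0..1
    brake: float           # 0..1

def perceive(frame: SensorFrame) -> dict:
    # Stand-in for neural-network perception: fuse sensors into a scene description.
    return {"lane_center_offset": 0.2, "lead_vehicle_gap": 35.0}

def plan(scene: dict) -> ActuatorCommand:
    # Stand-in for planning/control: steer toward lane center, keep a safe gap.
    steer = -0.05 * scene["lane_center_offset"]
    throttle = 0.3 if scene["lead_vehicle_gap"] > 30.0 else 0.0
    brake = 0.0 if scene["lead_vehicle_gap"] > 15.0 else 0.5
    return ActuatorCommand(steering_angle=steer, throttle=throttle, brake=brake)

if __name__ == "__main__":
    frame = SensorFrame(camera_image=None, radar_tracks=[], lidar_points=[])
    print(plan(perceive(frame)))
```

The point of the sketch is the structure, not the content: in a real vehicle each placeholder is replaced by enormously complex software, which is why computing and computer science loom so large here.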
The next time you are out driving, pay attention to all the human processing you are doing to keep your vehicle going where you want, at the speed you want. There is a lot happening there. Now think about how we might mimic that with cameras and computers. It is scary and exciting all at the same time, but there is little doubt that the day is coming when the computer/electrical/mechanical control system will be doing more and the human less, not just in cars but in everything we get our hands on. It is a golden opportunity for today’s students in electrical and computer engineering. I hope they make the most of it.
– Dan
p.s. thanks to ECE faculty member Jeff Burl for bringing Yoda’s sage advice to my attention.
Daniel R. Fuhrmann
Dave House Professor and Chair
Department of Electrical and Computer Engineering
Michigan Technological University