Texas A&M Engineering SoundBytes

Engineer This!: How do you hack hardware security (Featuring Dr. JV Rajendran)

January 14, 2020 Texas A&M Podcast Network Season 1 Episode 20

How is picking a lock similar to breaking into a computer? Dr. Jeyavijayan "JV" Rajendran, assistant professor in the Department of Electrical and Computer Engineering, poses this exact question to his students. Rajendran is researching ways to improve hardware security, which has become a weak link in cybersecurity. He has also found a passion for helping students get excited about cybersecurity through a "capture the flag"-style competition called Hack@DAC.

Hannah Conrad:

Hi there and welcome to the new decade. When talking about cybersecurity, it's easy to imagine a hacker in a hoodie breaking their way into a government network, but that's not the only part of technology that needs protection. This is Texas A&M Engineering presents SoundBytes, the podcast where faculty, students and staff share their passions, experience and expertise. I'm Hannah Conrad and with me, as always, is my cohost Steve Kuhlmann.

Steve Kuhlmann:

Howdy. For this episode we have a conversation with Dr. JV Rajendran, an assistant professor in the Department of Electrical and Computer Engineering, about hardware security and how he's helping train a new generation of security-conscious engineers. Enjoy the show.

Hannah Conrad:

JV, thank you so much again for joining us. We're really excited to get to talk to you. Could we just start off with what got you interested in the hardware side of cybersecurity?

Dr. Rajendran:

So my undergrad was in electrical engineering. I was interested in security because you see all these hacker movies, with people breaking into things, like Mission Impossible.

Steve Kuhlmann:

Right.

Dr. Rajendran:

So you do get excited. Security courses are typically offered in computer science. So when I was applying for graduate school, I was looking for something that met my interests, and I went to NYU (New York University), where they were doing hardware security research. Historically, that's the time hardware security research was just taking off. I was lucky to be at the right place at the right time. NYU does a competition as part of an event called Cybersecurity Awareness Week. I participated in it and we won. That gave me confidence that, okay, I can do security. Right. So that led to hardware security.

Hannah Conrad:

Very cool. It's interesting that you bring up the idea of hacker movies because when I think about cybersecurity, most of the time I think about some shady figure typing away at a computer and hacking into these giant mainframes. How do people hack into quote-unquote hardware?

Dr. Rajendran:

One thing I want to clarify is that when we typically think of hackers, we tend to think of them as bad guys. Yes, there are definitely hackers who are bad guys, but there are also ethical hackers, called white hat hackers, and they help the community in general to find vulnerabilities so that the respective authorities can go and fix them. So it's like they are the dark knights, as you might say. How do you hack the hardware, or typically any system for that matter? Your attacker can range from some bored teenager in a basement, to a group of tech-savvy people, to a small or medium-sized company, or even nation-state actors. And depending on what resources and skill set they have, hardware can be hacked at many different levels, at many different phases in which hardware is designed and used.

Steve Kuhlmann:

From a non-engineer standpoint, you really don't hear a lot about hardware vulnerabilities being a side that people worry about. Usually people are worried about their credit card numbers or digital information. Is it because it's less of a problem, or is it just because people don't know about it as much? They're not as aware.

Dr. Rajendran:

It's definitely not less of a problem. And I would say that people are not informed of it. Unlike software, where you can easily patch if something went bad, hardware is difficult to patch.

Steve Kuhlmann:

I guess it makes sense. You can't just send a software update one day to fix something. If you find it, you'd have to recall the item.

Dr. Rajendran:

Precisely. So sometimes you'll get lucky where even if you have a hardware vulnerability, you can patch it through software.

Hannah Conrad:

We've been talking hardware and software. Could you just quickly... What's the difference between hardware and software?

Dr. Rajendran:

You are going to land me in trouble because this is becoming a territorial issue. So, in layman's terms, think of software as your programs, your operating systems, right? They are flexible. You can go and change them. And for hardware, think of your transistors, your gates, your circuits, antennas and such. And there are layers where the lines are not black and white. It becomes gray, and the territorial dispute pops up.

Hannah Conrad:

Yeah.

Steve Kuhlmann:

So we've talked a little bit now about some of the ways that there are challenges, but why is it so important to make sure that hardware is made more securely?

Dr. Rajendran:

I'm going to give you a very biased answer. If you think of the system, software is closer to the user. Your applications, your software, they are closer to the user than the hardware. But if you think about it, everything runs on the hardware at the end of the day. You need some physical mechanism to do your computations and communication.

Hannah Conrad:

Are there any signs? If my phone hardware, for example, was attacked or in any way corrupted, would I know?

Dr. Rajendran:

You may. Again, it depends on the attack. See, the problem of being a professor is you want to give a very technically correct answer, but at the same time, sometimes it gets confusing, too. Yes, sometimes you may know of the attack. For example, if I do a denial-of-service attack, I just want to irritate you. Like, I just want to keep turning your phone on and off, or I want to drain your battery. Those attacks are visible. But if I do an attack where I leak your passwords, where I steal your credentials, you may not notice it immediately. Maybe eventually you see some crazy credit card statements, and then you may notice it, right? So one thing we need to differentiate: there is something called a bug, or a vulnerability, and then there is something called an exploit. A bug is like a weakness, and that can reside in software or hardware, right? But an exploit is how I'm going to abuse that bug to do what I want.
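[Editor's note: the bug-versus-exploit distinction can be made concrete with a tiny, purely illustrative Python sketch. This is not from the interview; the record store, names and indices are invented.]

```python
# A toy "system" with a vulnerability, to separate bug from exploit.
# (Invented example; not from the interview.)

SECRET_ADMIN_KEY = "hunter2"
records = ["alice-pin", "bob-pin", SECRET_ADMIN_KEY]

def get_user_record(index):
    """Intended only for user indices 0 and 1.
    BUG (the vulnerability): the index is never validated."""
    return records[index]

# Normal, intended use: a user reads their own record.
assert get_user_record(0) == "alice-pin"

# EXPLOIT (abuse of the bug): an attacker passes -1, which Python
# accepts as "last element", leaking the admin secret. The bug is the
# missing check; the exploit is the specific input that weaponizes it.
assert get_user_record(-1) == "hunter2"
```

A patch that rejects indices outside the intended range removes the bug, and with it every exploit built on top of it.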

Steve Kuhlmann:

Okay.

Dr. Rajendran:

Okay.

Hannah Conrad:

What factors can lead to attacks on hardware?

Dr. Rajendran:

So there is a variety of factors. I would probably classify them into three categories. One of them is the supply chain factor. Number two is the way hardware has traditionally been designed, the goals within these design principles. And number three, the understanding of hardware. Right? So let me talk about the supply chain problems. Consider the integrated circuits, or IC chips, that you see in your laptops, phones and so on. Back in the 1980s, a single company could design, manufacture, test and release a product, because the technology then was so simple, right? But nowadays the technology is so advanced and the designs are so complex; there are so many things being packed into the chip. Companies nowadays have multiple design teams located throughout the world. You have different companies that design very specific components that go into the ICs, and then the ICs that are produced get used throughout the world. So you have a very globalized supply chain. And why this happens is because people want to speed up the time to market, and they want to tackle the growth in design complexity. Which makes sense, right? But the problem here is that now you are relying on some team, some company, to do a part of this hardware design. And especially if you consider chips that go into the Department of Defense, military infrastructure or critical infrastructure, how would you trust something that is made by some other country, or by someone else that is not directly under your control? So you have this trust problem. That is the supply chain aspect of it. Traditionally, hardware is designed in such a way that, okay, they want to minimize the power, they want to make the design as small as possible, and they want to make it as fast as possible and more reliable. And when I say reliable, they target faults. Those are not necessarily attacks.
What this traditional mindset leads to is completely forgetting about security. "These are the primary goals of my design, so I will go and do this." That's the mindset that hardware designers have: "Oh, I want to go and make my design better and better and better," without worrying much about security.

Steve Kuhlmann:

You brought up an interesting point just a minute ago about trust. Now, you know, we're so much more globalized.

Hannah Conrad:

How do you address hardware security?

Dr. Rajendran:

I don't think hardware security has one silver bullet. People are worried about someone inserting malicious components into the design; these are called hardware Trojans. And people are worried about others stealing their designs, because I am sending the blueprint of my chip to someone else who will be manufacturing it. So you have to identify: okay, here is my business model. In this business model, which parties can I trust? Which parties can I not trust? From that I can devise my threat model: these are the attacks I'm anticipating, and these are the controls that I have. And with this limited control, what can I do to my design to detect or prevent these attacks? That's what companies as well as government agencies are trying to do. So one idea here is that I don't want the attacker to know what my design does, and we call that technique logic locking. Basically, you embed a lock into the chip such that when one applies the correct key, the design works. If you apply the incorrect key, the design does not work.

Hannah Conrad:

Oh cool.

Dr. Rajendran:

So now the attacker in the foundry can make as many copies as he wants, but none of them is going to work, because he does not know the key to make them functional.
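[Editor's note: the logic-locking idea described above can be sketched in software. The following is a minimal illustration, not from the interview; real logic locking is applied to gate-level netlists during chip design, and the toy circuit and key below are invented.]

```python
# Toy logic locking: splice "key gates" into a circuit so that only the
# correct key restores the original function. (Invented example.)
from itertools import product

def original_circuit(a, b, c):
    """The unprotected design: out = (a AND b) OR c."""
    return (a & b) | c

def locked_circuit(a, b, c, key):
    """Same design with two key gates on internal wires: an XOR key gate
    (transparent when its key bit is 0) and an XNOR key gate
    (transparent when its key bit is 1)."""
    k0, k1 = key
    w = (a & b) ^ k0            # XOR key gate on the AND output
    return 1 - ((w | c) ^ k1)   # XNOR key gate on the OR output

CORRECT_KEY = (0, 1)  # the one key that makes both gates transparent

# With the correct key, the locked chip matches the original everywhere.
assert all(
    locked_circuit(a, b, c, CORRECT_KEY) == original_circuit(a, b, c)
    for a, b, c in product((0, 1), repeat=3)
)

# With any wrong key, the chip misbehaves on at least one input.
for wrong in [(0, 0), (1, 0), (1, 1)]:
    assert any(
        locked_circuit(a, b, c, wrong) != original_circuit(a, b, c)
        for a, b, c in product((0, 1), repeat=3)
    )
```

A foundry attacker who copies the locked netlist still cannot produce working chips: without the key, the copies compute the wrong function on some inputs.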

Steve Kuhlmann:

Do you have any examples of how you demonstrate logic locking concepts in the classroom?

Dr. Rajendran:

So, I see a lot of electrical engineers, and they are very talented, and I want to motivate them to do security. So I need to think of something exciting.

Steve Kuhlmann:

Sure.

Dr. Rajendran:

And, at the same time, something that can give them confidence. I actually teach in my class how to pick locks.

Steve Kuhlmann:

Oh, yeah, let's talk about that.

Dr. Rajendran:

That's what I teach in the first week of my security of embedded systems class. I start them with how to pick locks, and you may think, "Who is this professor teaching our students to pick a lock?" The answer, actually, is that it's a way to talk about the vulnerabilities in commonly used systems. I teach security through these buggy locks; you can talk a lot about security principles, like how you cannot achieve security through obscurity. When I started the class, I would give one lock per student, and by the time I had taught them, they were already able to pick their lock, and they're like, "Hey, can I have one more lock?" That creates an excitement that, I believe, instills confidence in my students: "Oh, I can do security now."

Steve Kuhlmann:

Sure. Yeah. Actually getting to see a concrete outcome from the principles that you're teaching has got to be just so impactful.

Dr. Rajendran:

Yeah, and mind you, even if you don't get anything else from my course, at least you know something now.

Jenn Reiley:

Howdy. It's your producer Jenn coming at you with 2020 vision.

Steve Kuhlmann:

Jenn? No. Why?

Jenn Reiley:

Because it's 2020 and I have to. I'll leave the dad jokes to Steve, then. I'm here to tell y'all about some hardware technology that you've probably seen in your everyday life. Have you heard of these RFID-blocking wallets? If you haven't, I'll give a little bit of an explanation. First off, many of the new credit cards we're getting work through what's called radio frequency identification, or RFID, which enables the contactless purchases we might see at stores or at vending machines. There is a concern about pickpockets being able to walk by you and use a scanner to collect the personal information off of your credit card, which could then lead to fraud, because of this RFID signal coming from the card. So they developed wallets with RFID-blocking technology, so that the signals from your card aren't being emitted. And that's hardware security that fits in your pocket. Now let's head back into the interview to learn more about how students learn about hardware security in and outside of the classroom.

Hannah Conrad:

I think it's really interesting that you said at the beginning that you kind of came into this field at a brilliant time through your own education and now you're sort of inviting other people to come in as well in the student body with like Hack@DAC. Could you talk about Hack@DAC really quick?

Dr. Rajendran:

The DAC stands for Design Automation Conference, which is one of the premier research conferences in the hardware design community. We want to educate, we want to bring in more electrical engineers, more computer engineers, into the hardware security domain. Typically, when people think of security, they think it's more software, more computer science, but there's also an electrical engineering component to it. The motivation is to create an excitement among electrical engineers and computer engineers that they can do security. What we wanted to do is create a hardware equivalent of capture the flag. So we want to create a design that has known vulnerabilities that our students need to find. We called Intel; I have previous labmates who are now researchers at Intel, and they try to find bugs in Intel's designs internally. So we said, "Oh, can we run a competition with the same thing that you do in your company?" They said, "Oh, that'll be great, except that we won't give any Intel designs." Which makes sense from a business perspective, right? They don't want to release their copyrighted designs. So then what we said is, "Okay, you don't need to give your design, but how about we take an open-source design, a design that is public, a design that people use for research purposes, and then let's think about what bugs are interesting to you." Like, if you can detect this bug, if you come up with a technique to detect this class of bugs, you are good enough to work with Intel, right? So, with these pieces in place, we created the Hack@DAC competition, where we create an open-source design riddled with bugs. That's so interesting to me, because when people design something, they usually want to minimize the number of bugs or eliminate them completely. We're like, "Oh, let's maximize that." So... Sorry, I clapped.

Steve Kuhlmann:

You're okay.

Dr. Rajendran:

So we created this buggy design, we actually call it a buggy design. We put it out there in the open and we invite teams around the world, both student teams as well as industry teams, to go and find vulnerabilities, bugs, in the design. And whenever they submit the bugs, "Oh, I think this line of the code, this part of the design, has a bug, and this is how it gets exploited," all their findings get evaluated by Intel engineers. They say, "Oh, is this critical? Is this not critical?" And so on. And what we realized is that, in fact, students, especially students, found more bugs than what we actually inserted.

Steve/Hannah:

Oh.

Dr. Rajendran:

So they found bugs that were originally part of the design. We then informed the original developers of the actual design, and they were able to fix them. The more important component is to make the students excited about it, which we did. More than 50 teams participated in 2018, and more than 40 teams participated this year. There were close to 300 bug submissions across both years. We have sponsors from the National Science Foundation, the Army Research Office, Qualcomm and Mentor Graphics. So companies are also interested in this, because, "Oh, let's go and find which students can find bugs and hire them."

Steve Kuhlmann:

Absolutely.

Dr. Rajendran:

And also, from a research perspective, people were able to identify bugs that would have slipped into products. But motivating someone to pursue what they like is another personal goal of mine.

Steve Kuhlmann:

Sure. Yeah. Having seen that a couple of years now, what's it like to see the excitement in these students as they're trying to forge into this new field?

Dr. Rajendran:

I personally believe that all students are great, right? All students at A&M are smart. But there is this confidence factor; at least for me, that used to be the case. If I can go and solve a problem, I get more confident about it. The more confident I become about something, the more positive feedback I get to explore more, to do more on that particular thing. That's the same thing I'm seeing in many of the students through this Hack@DAC competition. These are typically computer engineers; they know how to design and such. Initially, they were like, "Oh, do you think we can find security bugs? We don't know much about security." And these are students coming from universities that do not have a hardware security course. So of course they will lack confidence. They have this question: "I don't know much about security, so how can I participate in a security competition?" But nowadays it's like, "Well, I know hardware, so I can give it a crack and see what happens." And they are doing much, much better. And now they're like, "Oh, I want to do security now." So I see this positive change and more excitement; students build their confidence in this hardware security domain and they go and pursue it.

Steve Kuhlmann:

So does your work with Hack@DAC connect with what you do in the classroom at all?

Dr. Rajendran:

So we take one of the previous designs; I teach them what security bugs look like, how to identify them, and then actually give one of these designs as a lab exercise. What other people do in two months, these students need to do in two weeks. That may sound hard, but it's not that hard, actually. You can do it. So a lot of students take the class, and they feel more comfortable finding bugs.

Steve Kuhlmann:

Excellent.

Hannah Conrad:

You're kind of paving the way for future researchers, which I think is incredibly impactful and very interesting. Where's the future of your research heading?

Dr. Rajendran:

We have some exciting work coming up. There are two topics that we are pursuing. One is logic locking. We have a DARPA (Defense Advanced Research Projects Agency) project where they have asked us to develop a tool that can lock designs in an automated fashion. So that's one research direction. And for the next one... I talked about security bugs in hardware, and I talked about this Hack@DAC competition. If you look at Hack@DAC, what we realized is that people usually go through the code and find a lot of bugs, of course. They simulate the design and see where things mismatch. But the question we ask is: can we detect these bugs in an automated fashion? Because then you can scale more, and you can explore more complicated systems. Right? So that's where our research is heading: developing automated techniques to detect these bugs.
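[Editor's note: the "simulate the design and see where things mismatch" idea can be sketched as differential testing. This is a purely illustrative example with invented names, not Dr. Rajendran's actual tool: a trusted "golden" model is compared against an implementation over many inputs, and any disagreement automatically flags a bug.]

```python
# Automated bug-finding by differential simulation (invented toy example).
import random

def reference_adder(a, b):
    """Golden model: a correct 8-bit adder."""
    return (a + b) & 0xFF

def buggy_adder(a, b):
    """Implementation under test. Invented bug: the carry out of the
    low nibble is dropped instead of propagating into the high nibble."""
    low = (a & 0xF) + (b & 0xF)
    high = (a >> 4) + (b >> 4)
    return ((high << 4) | (low & 0xF)) & 0xFF

def find_mismatch(trials=10000, seed=0):
    """Simulate both models on random inputs; return a counterexample
    input pair on the first disagreement, or None if none is found."""
    rng = random.Random(seed)
    for _ in range(trials):
        a, b = rng.randrange(256), rng.randrange(256)
        if reference_adder(a, b) != buggy_adder(a, b):
            return (a, b)
    return None

counterexample = find_mismatch()
assert counterexample is not None  # the simulation caught the bug
a, b = counterexample
assert reference_adder(a, b) != buggy_adder(a, b)
```

The appeal of automating this is exactly the scaling argument above: a script can sweep far more inputs, and far larger designs, than manual code review ever could.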

Jenn Reiley:

Hey, it's your producer Jenn, and I'm here to introduce our next segment where our non-engineering cohosts try their best to answer engineering questions. It's Ask an Engineer. Steve and Hannah will have 30 seconds to search online to answer the question and then our guest will tell them how they're wrong. We'll fast forward through the searching part. All right, so y'all's question today is what are some of the most vulnerable pieces of hardware and why?

Hannah Conrad:

This is like the opposite of confidence building.

Jenn Reiley:

That's not my job. Alright. You have 15 seconds; please explain to us what you learned. Go.

Hannah Conrad:

Well, first and foremost, I found out that Googling answers is a terrible idea. Um, but, uh, I learned that some of the most vulnerable pieces of hardware, not part of the question, but older computers are not great because, yup.

Jenn Reiley:

Because yup. Alright, Steve, tell us: what did you learn about vulnerable hardware?

Steve Kuhlmann:

Alright. So basically what I found was just a couple of items that were listed as vulnerable pieces, and that's USB interfaces and RAM; those were the first two that I got to.

Jenn Reiley:

Alright. There you go.

Steve Kuhlmann:

So how did we do?

Dr. Rajendran:

Okay, Hannah gets an A.

Hannah Conrad:

Hey!

Dr. Rajendran:

So, let me explain the reason why. Older computers do not necessarily have security protections to prevent modern-day attacks. That is one dimension of the problem. The other problem is that when older systems get worn out, you want to replace them with systems that have a similar configuration, and typically these systems tend to be at least 10 or 20 years old. When they get worn out, when they get faulty, you may not necessarily find the parts. And that's when counterfeit products come into the picture. So, Hannah gets an A. And, Steve, you also get an A.

Steve Kuhlmann:

Oh wow. I think that's the first time for me.

Jenn Reiley:

So does that mean they're now engineers?

Dr. Rajendran:

Yes. So let me explain Steve's answer. USB interfaces nowadays may come with viruses; that's why you always need to scan your USB drive whenever you are connecting it to your machine. And for RAM, again, counterfeit products may be the problem. But you know what, the question is too simple, because the answer is all of it. So, as long as you give me an answer, you get an A for this question.

Hannah Conrad:

That's really... For some reason I have it in my head, and maybe it's more from films that I get the idea, of, like you said, submarines, planes and all of that running on older equipment. I never actually thought about that.

Dr. Rajendran:

So if you go and look at these smart grids, or oil and gas equipment, they usually tend to have older systems, because those are much more resilient and much more reliable against natural faults, not necessarily attacks.

Steve Kuhlmann:

Interesting. So what I'm hearing is that I need to get a new computer.

Dr. Rajendran:

No, so companies usually know what systems they use and, more importantly, they are also aware of what vulnerabilities exist in the system. So they go and deploy defense mechanisms to protect against these vulnerabilities. And government also informs and helps these companies to ensure that they are secure.

Hannah Conrad:

Well, JV, thank you again for joining us. This was a lot of fun and really interesting to learn about, so I'm glad that we got to talk.

Dr. Rajendran:

Thank you for having me over here. This is my first time doing a podcast, so I feel honored and I made a lot of nervous statements. Hopefully you don't catch them.

Steve Kuhlmann:

Thanks again. Have a great rest of the day.

Dr. Rajendran:

Thank you. You, too.

Hannah Conrad:

(Disclaimer) Thanks so much for tuning in to Texas A&M Engineering presents SoundBytes. What'd you think? Do you have any burning questions you want to ask? Hit us up and let us know at EngineeringSoundBytes@tamu.edu. That's bytes with a Y. And keep an eye out for us in ZACH. We like to wander the building from time to time and we love hearing from you. So come on over and say hello or lend us your voice for a future episode. Finally, just so you know, the views and opinions expressed in this podcast are those of the hosts and guests and do not necessarily reflect the official policy or position of the Texas A&M University System. Make sure to tune in next week. Until then, from everyone on the PodSquad sounding off: Thanks and gig'em.