Lean Software, Power Electronics, and the Return of Optical Storage

Stephen Cass: Hi. I’m Stephen Cass, a senior editor at IEEE Spectrum. And welcome to Fixing The Future, our bi-weekly podcast that focuses on concrete solutions to hard problems. Before we start, I want to tell you that you can get the latest coverage from some of Spectrum’s most important beats, including AI, climate change, and robotics, by signing up for one of our free newsletters. Just go to spectrum.ieee.org/newsletters to subscribe.

Today on Fixing The Future, we’re doing something a little different. Normally, we do a deep dive into one topic, but that means some really interesting things get left out of the podcast simply because they wouldn’t take up a whole episode. So here today to talk about some of those interesting things, I have Spectrum’s Editor in Chief Harry Goldstein. Hi, boss. Welcome to the show.

Harry Goldstein: Hi there, Stephen. Happy to be here.

Cass: You look thrilled.

Goldstein: I mean, I am thrilled. I’m always excited to talk about Spectrum stories.

Cass: No, we’ve tied you down and made you agree to this, but I think it’ll be fun. So first up, I’d like to talk about this guest post we had from Bert Hubert which seemed to really strike a chord with readers. It was called “Why Bloat Is Still Software’s Biggest Vulnerability: A 2024 plea for lean software.” Why do you think this one resonated with readers, and why is it so important?

Goldstein: I think it resonated with readers because software is everywhere. It’s ubiquitous. The entire world is essentially run on software. Just a few days ago, there was a good example of this: the AT&T network going down, likely because of some kind of software misconfiguration. This happens constantly. In fact, software systems going down is kind of like bad weather. You just come to expect it, and we all live with it. But why we live with it, and why we’re forced to live with it, is something that people are interested in finding out more about, I guess.

Cass: So I think, in the past, when we thought of giant, bloated software, we associated it with large projects: big government projects, big airline systems, big, big, big projects. And we’ve written about that a lot at Spectrum before, haven’t we?

Goldstein: We certainly have. And Bob Charette, our longtime contributing editor, who is actually the father of lean software development, took the Toyota Total Quality Management program back in the early ’90s and applied it to software development. And so it was pretty interesting to see Hubert’s piece on this more than 30 years later, where the problems have just proliferated. And think about your average car these days. It’s approaching a couple hundred million lines of code. A glitch in any of those could cause some kind of safety problem. Recalls are pretty common. I think Toyota had one a few months ago. So the problem is everywhere, and it’s just going to get worse.

Cass: Yeah. One of the things that struck me was Bert’s argument that you no longer need an army of programmers to create bloated software— to get all those millions of lines of code. You could just be writing code to open a garage door, a trivial program. But because you’re writing it on frameworks, and those pull in dependencies and so on, you’re pulling in millions of lines of other people’s code. You might not even know you’re doing it. And you kind of don’t notice unless, at the end of the day, you look at your final program file and you’re like, “Oh, why is that megabytes upon megabytes, representing endless lines of source code? Why is it so big?” Because this is how you do software now. You just pull these things together. You glue stuff. You focus on the business logic because that’s your value add, but you’re not paying attention to this enormous sort of—I don’t know; what would you call it?—invisible dark matter that surrounds your software.
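To make that invisible dark matter a little more concrete, here is a minimal sketch (my own illustration, not something from Hubert's article) of the kind of audit he is urging: totaling up how much third-party Python source one interpreter's site-packages directory actually carries, even when the project's own logic is a few hundred lines.

```python
# Rough audit of the dependency "dark matter" behind a small Python project:
# total size and line count of everything installed in site-packages.
# (Illustrative sketch; results vary wildly from machine to machine.)
import sysconfig
from pathlib import Path

site_packages = Path(sysconfig.get_paths()["purelib"])

total_bytes = 0
total_lines = 0
for py_file in site_packages.rglob("*.py"):
    try:
        total_bytes += py_file.stat().st_size
        with py_file.open(encoding="utf-8", errors="ignore") as f:
            total_lines += sum(1 for _ in f)
    except OSError:
        pass  # skip anything unreadable

print(f"Third-party Python source available to this interpreter: "
      f"{total_bytes / 1e6:.1f} MB across roughly {total_lines:,} lines")
```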

Goldstein: Right. It’s kind of like dark matter. Yeah, that’s kind of true. I mean, it actually got me thinking about all of these large language models being applied to software development. Co-piloting, I guess they call it, right, where the coder is sitting with an AI, trying to write better code. Do you think that might solve the problem, or get us closer?

Cass: No, because those systems, if you look at them, reflect modern programming usage. And modern programming usage is often to use the frameworks that are available; it’s not about really getting in and writing something that’s a little bit leaner. Actually, I think the AIs—it’s not their fault—just do what we do. And we write bloaty software. So I don’t think that’s going to get any better necessarily with this AI stuff, because the thing about lean software is it does take extra time to make, and there are no incentives to make lean software. And Bert talks about maybe having to impose some of this legis— legislatively—I speak good. I editor. You hire wise.—but some of these things are going to have to be mandated through standards and regulations, specifically through the lens of cybersecurity requirements and knowing what’s going into your software. And that may help everything get a little bit leaner. But I did actually want to mention another news story that came up this week: Apple closing down its EV division. You mentioned Bob Charette there, and he wrote this great thing for us recently about why EVs themselves are one thing but EV infrastructure is an even bigger problem, and why EVs are proving to be really quite tough. And maybe the problem—again, it’s a dark matter problem—is not so much the car at the center but this sort of infrastructure. Just talk a little bit about Bob’s book, which is, by the way, free to download; we’ll have the link in the show notes.

Goldstein: Everything you need to know about the EV transition can be yours for the low, low price of free. But, yeah. And I think we’re starting to see— I mean, even if you mandate things, you’re going to— you were talking about legislation to regulate software bloat.

Cass: Well, it’s kind of indirect. If you want to have good security, then you’re going to have to do certain things. The White House just came out with this paper, I think yesterday or the day before, saying, “Okay, you need to start using memory-safe languages.” And it’s not quite saying, “You are forbidden from using C, and you must use Rust,” but it’s kind of close to that for certain applications. They exempted certain areas. But you can see that this is the government really weighing in on what has often been a very personal decision for programmers, like, “What language do I use?” and, “I know how to use C. I know how to manage my own memory.” The government is kind of saying, “Yeah, we don’t care how great a programmer you think you are. These languages lead to this class of bugs, and we’d really prefer if you used one of these memory-safe languages.” And that’s, I guess, a push into sort of the private lives of programmers that I think we’re going to see more of as time goes by.

Goldstein: Oh, that’s interesting, because where I was going with that connection to legislation is this: I think what Bob found in the EV transition is that the knowledge base of the people who are charged with making decisions about regulations is pretty small. They don’t really understand the technology. They certainly don’t understand the interdependencies, which are very similar to the software development processes you were just referring to. It’s the same with the infrastructure for electric cars, because the idea, ultimately, is that you’re also revamping your grid to accommodate, whatchamacallit, intermittent renewable energy sources, like wind and solar, because having an electric car that runs off a coal-fired power plant essentially defeats the purpose. In fact, Ozzie Zehner wrote an article for us way back in the mid-teens about how the dirty secret behind your electric car is the coal that fuels it. And—

Cass: Oh, that was quite controversial. Yeah. I think maybe because the cover was a car perched at the top of a giant mountain of coal. I think that—

Goldstein: But it’s true. I mean, China has one of the biggest electric car industries in the world, if not the biggest, and one of the biggest markets that has not been totally saturated by personal vehicles, and a lot of those cars are effectively going to be running on coal. And China is the world’s largest emitter, ahead of the US. But just circling back to the legislative angle and the state of the electric vehicle industry— well, actually, are we just getting way off topic with the electric vehicles?

Cass: No, it is this idea of interdependence, these systems that are all coupled in ways we don’t expect. And with that EV story— so the last time I was home in Ireland, one of the stories was that they had bought this fleet of electric double-decker buses for Dublin, to replace the existing double-deckers and help Ireland hit its carbon targets. This was an official government goal. They bought the buses, at great expense, and then they couldn’t charge the buses, because they hadn’t already gotten the planning permission to add charging stations to the bus depot. It was this staggering level of disconnect where, on one hand, the national government is saying, “Yes, we’re meeting our targets. We’re getting these green buses in. Fantastic advance. Very proud of it,” la la la la, and you can’t plug the things in, because the basic work on the ground, dealing with the local government to put in the charging stations, just hasn’t been done. All of these little disconnects add up. And the bigger, more complex the system you have, the more these things add up, which I think does come back to lean software. Because it’s not so much, “Okay, your software is bloaty, so you don’t win the Turing Award. Boo-hoo.” The problem is that because you are pulling in all of these dependencies that you just do not know, there are all these places where things can break— or the problem of libraries getting hijacked.

So we have to retain the capacity on some level— and this actually is a personal thing with me: I believe in the concept of personal computing. This was the thing back in the 1970s when personal computers first came out. It was very explicitly part of the culture that you would free yourself from the utilities and the centralized systems, that you could have a computer on your desk that would let you do stuff without having to go through, at that stage, university administrators and paperwork. It was a personal computer revolution; that was very much front and center. And nowadays it’s kind of come back full circle, because now we’re increasingly finding things don’t work if they’re not network connected. So I believe it should be possible to have machines that operate independently, truly personal machines. I believe it should be possible to write software to do even complicated things without relying on network servers or vast downloads. Instead, we get the situation where you want something to run independently, okay, but you’ve got to download these Docker images that are 350 megabytes or something, because an entire operating system has to be bundled into them, because it is otherwise impossible to replicate the correct environment in which the software runs. That also undercuts the whole point of open source software. The point of open source is, if I don’t like something, I can change it. But if it’s so hard for me to change something because I have to replicate the exact environment and toolchains that the people on a particular project are using, it really limits my ability to come in and maybe make some small changes, or modify something, or pull it into my project. That I have to bring this whole trail of dependencies with me is really tough. Sorry, that’s my rant.

Goldstein: Right. Yeah. Yeah. Actually, one of the things I learned the most about from the Hubert piece was Docker and the idea that you have to put your program in a container that carries with it an entire operating system or whatever. Can you tell me more about containers?

Cass: Yeah. Yeah. Yeah. I mean, you can put whatever you want into a container, and some containers are very small; each one is distributed as its own self-contained thing. You can get very lean containers that are basically just the program and what it needs to run. But containers basically replace the old idea of installing software, and that was a problem, because every time you installed a bit of software, it scarred your system in some way. There was always scar tissue, because it made changes. It nestled in. If nothing else, it put files onto your disk. And so over time, one of the problems was that your computer would accumulate random files. It was very hard to really uninstall something completely, because it would always put in little hooks and register itself in different places in the operating system, again, because now it’s interoperating with a whole bunch of stuff. Programs are not completely standalone. At the very least, they’re talking to an operating system, and you want them to talk nicely to the other programs on that operating system. And this led to all these kinds of direct-install problems.

And so the idea was, “Oh, we will sandbox this out. We’ll have these little Docker images, basically, to do it.” But that also gives you the freedom to build these huge images, which are essentially whole virtual machines. So, again, it relieves you of having to figure out your install and your configuration, which is one thing Hubert was talking about: when you had to write these installers, it really did make you clarify your thinking very sharply on configuration and so on. So again, containers are great. All these cloud technologies, being able to use libraries, being able to automatically pull in dependencies, they’re all terrific in moderation. They all solve very real problems. I don’t want to be a Luddite and go, “We should go back to writing assembler code as God intended.” That’s not what I’m saying. But it does sometimes enable bad habits. It can incentivize bad habits. And you have to really then think very deliberately about how to combat those problems as they pop up.
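Containers aren't the only way to get that kind of isolation. Purely as an analogy in Python's own tooling (a sketch of mine, not something mentioned in the episode), the standard-library venv module builds a throwaway, self-contained environment, so a project's dependencies never leave scar tissue on the system-wide install:

```python
# Create an isolated, disposable Python environment programmatically, so a
# project's dependencies never touch the system-wide installation.
# (Illustrative sketch; a far lighter-weight cousin of the container idea.)
import venv
from pathlib import Path

env_dir = Path("./sandbox-env")  # hypothetical location for the environment
builder = venv.EnvBuilder(with_pip=True, clear=True)
builder.create(env_dir)

print(f"Isolated environment created at {env_dir.resolve()}")
print("Delete that directory and every trace of it is gone; no scar tissue.")
```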

Goldstein: But from the beginning, right? I mean, it seems to me like you have to commit to a lean methodology at the start of any project. It’s not something that the AI is going to come in and magically solve and slim down at the end.

Cass: No, I agree. You have to commit to it, or you have to commit to decisions like: I’m not necessarily going to use these frameworks; I’m going to try to do some of this myself, or I’m going to be very careful about which frameworks and libraries I use. Maybe I’ll use a library that doesn’t pull in other dependencies: somebody wrote this library that does 80 percent of what I need and doesn’t pull in anything else, unlike the bells-and-whistles thing that actually does 400 percent of what I need. And maybe I’ll write that extra 20 percent myself. And again, that requires skill and it requires time. And like anything else, there are incentives in the world that tend to militate against having the time to do that, which, again, is where we come back to some of these regulatory regimes where it becomes a compliance requirement. I think a lot of people listening will know that the time when things get done is when they become compliance requirements, and then it’s mandatory. That has its own set of issues in terms of losing a certain amount of flexibility and so on, but it sometimes seems to be the only way to get things done in commercial environments, certainly. Not so much for personal projects, but certainly for commercial environments.

Goldstein: So what are the consequences, in a commercial environment, of bloat? Are there things beyond security? Here’s why I’m asking: there’s the idea that you’re going to legislate lean software into the world, as opposed to having it come from the bottom up, where people recognize the need because bloat is costing them something. So what are the commercial costs of bloated software?

Cass: Well, apparently, absolutely none. That really is the issue. Really, none, because software often isn’t maintained. People just really want to get their products out. They want to move very quickly. We see this in how quickly companies like to abandon old software; some abandon old products as soon as the new one comes out. There really is no commercial downside to using this big software, because you can always say, “Well, it’s industry standard. Everybody is doing it.” And because everybody’s doing it, you’re not necessarily losing out to your competitor. We see these massive security breaches, and again, the way you legislate for lean software is by demanding better security. Because currently, we see these huge security breaches, and there are very minimal consequences. Occasionally, yes, a company screws up so badly that it goes down. But even then, sometimes they’ll reemerge in a different form, or they’ll get gobbled up by someone.

There really does not, at the moment, seem to be any commercial downside to this big software. There are a lot of weird incentives in the system, and this certainly is one of them, where the incentive is, “Just use all the frameworks. Bolt everything together. Use Electron. Use all the libraries. It doesn’t matter, because the end user is not really going to notice whether their program is 10 megabytes or 350 megabytes,” especially now that people are completely immune to the size of their software. Back in the days when software came on floppy disks, a piece of software that came on 100 floppy disks would have been considered impractical. But nowadays, people are downloading gigabytes of data just to watch a movie or something like that. If a program is 1 gigabyte versus 100 megabytes, they don’t really notice. The only people who notice are gamers downloading a really big video game, and then you see people going, “Well, it took me three hours to download the 70 gigabytes for this AAA game that I wanted to play.” That’s about the only time you see people complaining about the actual storage size of software anymore. Everybody else just doesn’t care. Yeah, it’s just invisible to them now.
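A quick back-of-envelope check shows why nobody notices. The connection speed here is my own assumption (a fairly ordinary 50-megabit-per-second link), not a figure from the episode:

```python
# Back-of-envelope download times at an assumed 50 Mb/s home connection.
# (The link speed is an illustrative assumption, not a figure from the episode.)
LINK_MBPS = 50  # megabits per second

def download_time(size_megabytes: float) -> str:
    """Return a human-readable download time for a file of the given size."""
    seconds = size_megabytes * 8 / LINK_MBPS
    if seconds < 120:
        return f"{seconds:.0f} s"
    if seconds < 7200:
        return f"{seconds / 60:.0f} min"
    return f"{seconds / 3600:.1f} h"

for label, megabytes in [("lean app, 10 MB", 10),
                         ("bloated app, 350 MB", 350),
                         ("AAA game, 70 GB", 70_000)]:
    print(f"{label:>20}: {download_time(megabytes)}")
```

At that assumed speed, the difference between 10 and 350 megabytes is under a minute, while the 70-gigabyte game works out to roughly the three hours mentioned above.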

Goldstein: And that’s a good thing. I think Charles Choi had a piece for us on that— we’ll have endless storage on disks, right, apparently.

Cass: Oh, I love this story, because it’s another story of a technology that looked like it was headed off into the sunset: “We’ll see you in the museum.” And that’s optical disk technology. We had laser disks. We had CDs. We had CD-ROMs. We had DVDs. We had Blu-ray. And Blu-ray really seemed to be, in many ways, the end of the line for optical disks, that after that, we were just going to use solid-state storage devices and store all our data in those tiny little memory cells. And now we have these researchers coming back. And now my brain has frozen for a second on where they’re from. I think they’re from Shanghai. Is it the Shanghai Institute?

Goldstein: Yes, I think so.

Cass: Yes, Shanghai. There we go. There we go. Very nice subtle check of the website there. And it might let us squeeze a data center into something the size of a room. This is an optical disk technology where you can make a disk that’s about the size of a regular DVD, and you can squeeze an enormous amount of data onto it. I think he’s talking about petabits on a—

Goldstein: Yeah, like 1.6 petabits on—

Cass: Petabits on this optical surface. And the magic key is, as always, a new material. I mean, we do love new materials, because they’re always the wellspring from which so much flows. And at Spectrum we have many times chased down materials that have not necessarily fulfilled their promise. We have a long history— and sometimes materials go away and they come back, like—

Goldstein: They come back, like graphene. It’s gone away. It’s come back.

Cass: —graphene and stuff like this. We’re always looking for the new magic material. But this new magic material, which has this—

Goldstein: Oh, yeah. Oh, I looked this one up, Stephen.

Cass: What is it? What is it? What is it? It is called—

Goldstein: Actually, our story did not even bother to include the translation because it’s so botched. But it is A-I-E, dash, D-D-P-R: AIE-DDPR, or aggregation-induced emission dye-doped photoresist.

Cass: Okay. Well, let’s just call it magic new dye-doped photoresist. The point about this material is that it works at basically four wavelengths. And why do you want a material that responds at four different wavelengths? Because of the limit on optical technologies— and I’m stretching “optical” here to include the boundaries on either side of the visible spectrum. The standard rule is that you can’t really do anything that’s smaller than the wavelength of the light you’re using to read or write, so the wavelength of your laser sets the density of data on your disk. And what these clever clogs have done is work out that by using basically two lasers at once, you can, in a very clever way, write a blob that is smaller than the wavelength of the light, and you can do it in multiple layers. Your standard Blu-ray disks are very limited in the number of layers they have on them; CDs originally had just one layer.

So you have multiple layers on this disk that you can write to, and you can write at resolutions that you wouldn’t think possible from your high school physics or whatever. You write using these two lasers at two wavelengths, and then you read it back using another two lasers at two different wavelengths, and that’s what localizes the spot and makes it all work. And suddenly, as I say, you can hopefully squeeze racks and racks and racks of solid-state storage down to something that is very small. What’s also interesting is that they’re actually closer to commercialization than you normally see with these early-material stories. They also think you could write one of these disks in six minutes, which is pretty impressive. As someone who has sat watching the progress bars on a lot of DVD-ROM burns over the years, six minutes to burn these—and that’s probably for commercial mass production—is still pretty impressive. So you could solve this problem of some of these large data transfers, where currently you do have to ship servers from one side of the world to the other because it’s actually too slow to copy things over the internet. This would increase the bandwidth of the global sneakernet, or station-wagon net, quite dramatically as well.
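To put rough numbers on that sneakernet point, here is a quick back-of-envelope sketch. The 1.6-petabit capacity and the six-minute write time come from the discussion above; the crate size and shipping time are my own assumptions:

```python
# Rough "sneakernet" bandwidth estimate for petabit-class optical disks.
# Capacity and write time are the figures discussed in the episode;
# the crate size and shipping time are assumptions for illustration.
DISK_CAPACITY_PBITS = 1.6      # petabits per disk (reported figure)
WRITE_TIME_MIN = 6             # minutes to write one disk (reported figure)
DISKS_PER_CRATE = 500          # assumption
SHIPPING_DAYS = 2              # assumption: roughly intercontinental air freight

crate_bits = DISK_CAPACITY_PBITS * 1e15 * DISKS_PER_CRATE
shipping_seconds = SHIPPING_DAYS * 24 * 3600

effective_bandwidth_tbps = crate_bits / shipping_seconds / 1e12
write_hours_one_drive = DISKS_PER_CRATE * WRITE_TIME_MIN / 60

print(f"Crate capacity: {crate_bits / 8e15:.0f} petabytes")
print(f"Effective shipping bandwidth: {effective_bandwidth_tbps:.1f} Tb/s")
print(f"Time to write the crate on a single drive: {write_hours_one_drive:.0f} hours")
```

Under those assumptions, one air-freighted crate moves about 100 petabytes, an effective 4 to 5 terabits per second, though filling it on a single drive would take about 50 hours, which is why the write speed mentioned next still matters.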

Goldstein: Yeah. They are super interested in seeing them deployed in big data centers. And in order to do that, they still have to get the writing speed up and the energy consumption down. So the real engineering is just beginning for this. Well, speaking of new materials, there’s a new use for aluminum nitride, according to our colleague Glenn Zorpette, who wrote about the use of the material in power transistors. The material has a much wider band gap than today’s power semiconductors, and apparently, if you can properly dope it, it’ll be able to handle much higher voltages. So what does this mean for the grid, Stephen?

Cass: Yeah. So I actually find power electronics really fascinating, because most of the history of transistors is about making them use ever smaller amounts of electricity—5-volt logic used to be standard; now 3.3 volts is common, and even 1.1 volts is common—and really sipping microamps of power through these circuits. Power electronics gets you back to the origins of being an electrical engineer, when you’re really talking about power and energy, and you are pushing around thousands of volts and huge currents. And power electronics is an attempt to bring some of that smartness that transistors give you into these much higher voltages. We’ve seen some of this with, say, gallium nitride, a material we had talked about in Spectrum for years, speaking of materials that float around for years, and then, really, in the last five years or so, you’ve seen it become a real commercial success. All those wall warts have gotten dramatically smaller and better, which is why you can have a USB-C charger that drives your laptop and a bunch of ancillary peripherals off one little wall wart without worrying about it bringing down the house, because it’s so efficient and so small. Most of those now are these new gallium-nitride-based devices, which is one example where a material really is making some progress.

And so aluminum nitride is another step along that path: being able to handle even higher voltages and bigger currents. We’re not yet at the level where you could put these directly on massive high-voltage transmission lines, but the tide of where you can put this kind of electronics into your systems keeps rising. First off, it means more efficiency. As I say, these power adapters that convert AC to DC get more efficient. The power supplies in your computer get more efficient, and the power supplies in your data center (and we’ve talked about how much power data centers use today) get more efficient. And it all adds up. The whole point of this is that you do want a grid that is as smart as possible. You need something that will be able to handle very intermittent, fluctuating power sources. The current grid is really built around very, very stable, constant power supplies and very stable frequency timing. The frequency of the grid is the key to stability. Everything’s got to be on that 60 hertz in the US, 50 hertz in other places, and every power station has got to be synchronized very precisely with the others. So stability is a problem, and being able to handle fluctuations quickly is the key both to grid stability and to handling some of these intermittent sources, where the power varies as the wind blows stronger or weaker, as the day turns, as clouds move over your solar farm. So it’s very exciting, from that point of view, to see these very esoteric technologies. We’re talking about things like band gaps and how you stick the right dopant into the matrix, but it does bubble up into these very-large-scale impacts when we’re talking about the future of electrical engineering in that old-school, power-and-energy, keeping-the-lights-on-and-the-motors-turning kind of way.
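To see why a few percentage points of converter efficiency matter at that scale, here is a simple illustrative calculation. Every number in it is an assumption made for the sake of the arithmetic, not a figure from the episode or from Zorpette's article:

```python
# Why a few points of converter efficiency matter at scale.
# All numbers are illustrative assumptions, not figures from the episode.
FACILITY_LOAD_MW = 20          # assumed IT load of a mid-size data center
OLD_EFFICIENCY = 0.90          # assumed conventional power-conversion chain
NEW_EFFICIENCY = 0.95          # assumed wide-bandgap (GaN/AlN-class) chain
HOURS_PER_YEAR = 8760

def input_power(load_mw: float, efficiency: float) -> float:
    """Grid power needed to deliver load_mw through a converter chain."""
    return load_mw / efficiency

saved_mw = (input_power(FACILITY_LOAD_MW, OLD_EFFICIENCY)
            - input_power(FACILITY_LOAD_MW, NEW_EFFICIENCY))
saved_gwh_per_year = saved_mw * HOURS_PER_YEAR / 1000

print(f"Continuous saving: {saved_mw:.2f} MW")
print(f"Energy saved per year: {saved_gwh_per_year:.1f} GWh")
```

With those assumed numbers, it works out to roughly a megawatt of continuous load, on the order of 10 gigawatt-hours a year, from a single facility, just by wasting less in conversion.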

Goldstein: Right. And the electrification of everything is just going to put bigger demands on the grid, like you were saying, and on alternative energy sources. “Alternative.” They’re all price-competitive now, solar and wind. But—

Cass: Yeah, not just at the generation end— there’s this idea that you have distributed power, that power can be generated locally, and also that you can switch power around. So you have these smart transformers, so that if you are generating surplus power on your solar panels, you can send it to maybe your neighbor next door who’s charging their electric vehicle, without it having to be mediated by going up to the power company at all. Maybe your local transformer is making some of these balancing decisions much closer to where the power is being used.
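As a purely hypothetical sketch of that idea, here is the sort of decision logic a smart local transformer might run: match surplus rooftop solar against neighboring loads first, and only export what is left upstream. The names, numbers, and thresholds are all invented for illustration:

```python
# Hypothetical sketch of local balancing at a smart transformer:
# route surplus solar to nearby loads before exporting to the wider grid.
from dataclasses import dataclass

@dataclass
class Household:
    name: str
    generation_kw: float   # solar output right now
    demand_kw: float       # local consumption right now

    @property
    def surplus_kw(self) -> float:
        return self.generation_kw - self.demand_kw

def dispatch_surplus(households: list[Household]) -> None:
    """Match local surplus to local deficits before exporting upstream."""
    exporters = [h for h in households if h.surplus_kw > 0]
    importers = [h for h in households if h.surplus_kw < 0]
    for src in exporters:
        available = src.surplus_kw
        for dst in importers:
            if available <= 0:
                break
            transfer = min(available, -dst.surplus_kw)
            if transfer > 0:
                print(f"{src.name} -> {dst.name}: {transfer:.1f} kW shared locally")
                available -= transfer
                dst.demand_kw -= transfer  # neighbor's remaining need shrinks
        if available > 0:
            print(f"{src.name} -> grid: {available:.1f} kW exported upstream")

dispatch_surplus([
    Household("Rooftop-solar house", generation_kw=7.0, demand_kw=1.5),
    Household("Neighbor charging an EV", generation_kw=0.0, demand_kw=7.2),
])
```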

Goldstein: Oh, yeah. Stephen, that reminds me of this other piece we had this week, actually, on how utilities’ profit motives are hampering US grid expansion. It’s by a Harvard scholar named Ari Peskoe, and his first line is, “The United States is not building enough transmission lines to connect regional power networks. The deficit is driving up electricity prices, reducing grid reliability, and hobbling renewable-energy deployment.” And basically, what he does a good job of explaining is not only how these new projects might impact utilities’ bottom lines but also all of the industry alliances they’ve established over the years, which become these embedded interests that need to be disrupted.

Cass: Yeah, the truth is there is a list of things we could do. Not magic things. There are pretty obvious things we could do that would make the US grid better. Even if you don’t care much about renewables, you probably do care about grid resilience and reliability and being able to move power around. The US grid is not great. It is creaky. We know there are things that could be done, and as a byproduct of doing those things, you would also make it much more renewable friendly. But there are political problems. Depending on which administration is in power, there is more or less of an appetite to deal with some of these interests. And then, yeah, these utilities often have incentives to keep things the way they are. They don’t necessarily want a grid where it’s easier to get cheaper or greener electricity from one place into a different market. Everybody loves a captive monopoly market they can sell into; that’s wonderful if you can get it. And then there are many places with rules against competition. So it’s really difficult to break down those barriers.

Goldstein: It is. And if you’re in Texas in a bad winter and the grid goes down and you need power from outside, but you’re an island unto yourself and you can’t import that power, it becomes something that is disruptive to people’s lives, right? People pay attention to it during a disaster, but we have a slow-rolling disaster called climate change, and if we don’t start overturning some of the barriers to electrification and alternative energy sources, we’re kind of digging our own grave.

Cass: It is very tricky, because we do then get into these issues where you build these transmission lines, and there are questions about who ends up paying for them, whose lands they get built over, and what the local impacts are. And it’s hard sometimes to tell: is this a group that genuinely feels there is a sort of justice gap here, that they’re being asked to pay for the sins of higher carbon producers, or is this astroturfing? Sometimes it’s very difficult to tell whether these organizations are being underwritten by people who are invested in the status quo, and it does become a knotty problem. And I think, as things get more and more difficult, we are going to be really forced into making some difficult choices. I am not quite sure how that’s going to play out, but I do know that we will keep tracking it as best we can. And I think, yeah, maybe you just have to come back and see how we keep covering the grid in the pages of Spectrum.

Goldstein: Excellent. Well—

Cass: And so that’s probably a good point where— I think we’re going to have to wrap this roundup here. But thank you so much for coming on the show.

Goldstein: Excellent. Thank you, Stephen. Much fun.

Cass: So today on Fixing The Future, I was talking with Spectrum’s Editor in Chief Harry Goldstein, and we talked about electric vehicles, we talked about software bloat, and we talked about new materials. I’m Stephen Cass, and I hope you join us next time.
