Esteve Almirall

Listen to this podcast via Spotify | Apple Podcasts | Google Podcasts

How can a company build an IT infrastructure today? How can we manage the huge number of APIs available in the market? Is open source the future? In this podcast, Esade professor Esteve Almirall and Steven Willmott, senior director and head of API Infrastructure at Red Hat, discuss how technologies are changing in the IT world and how these developments and infrastructures are evolving faster than ever – and becoming increasingly standard.


Esteve Almirall: Good morning to everybody and welcome to the Esade podcast. Today we have with us Dr. Steven Willmott, a mathematician turned computer scientist first and an entrepreneur later. Now he's the senior director and head of API Infrastructure at Red Hat. Dr. Willmott served as a professor at the Polytechnic University of Catalonia (UPC) for many years and directed numerous European projects, mostly on AI, agents and related technologies. He founded 3scale, a company with offices in San Francisco and Barcelona that rapidly became a world leader in API intermediation.

API intermediation tries to solve a very simple problem: how can we manage the hundreds of thousands of APIs out there across so many companies, and how do we handle payments for them and ensure those payments make sense and are easy for companies? This was a solution that many companies were seeking, and API intermediation delivered it at the time, so it saw a lot of growth. After many years in the market, Red Hat bought 3scale, and then Red Hat was bought by IBM. This is normal; it's part of the industry's consolidation and transformation. It has happened to all of us. I mean, I don't think that any of us are still working in our original companies. They have all disappeared, died or whatever, and this characterises the sector. In addition to that, Dr. Willmott also has a passion for games. Maybe we can dedicate a small part of this podcast to talking about that. Dr. Willmott, it's a pleasure and an honour to have you here. We are so happy.

Steven Willmott: Thank you. It's a pleasure to be here. Thank you for the invitation, I really appreciate it.

Esteve Almirall: Thank you so much. So let's start with this podcast. Probably everything started around 2002, when Jeff Bezos wrote his very famous memo, the API Mandate, transforming the whole IT structure of Amazon from a monolithic company into an API company. This led to great gains for Amazon and made it easy to open its boundaries to serve third parties in logistics, in the marketplace, and so on. But it's no longer 2002. Almost 20 years later, what are APIs today?

Steven Willmott: Good question. Yes, I think Jeff Bezos' email to the company in the early 2000s is a really great read if you've never read it, because you'd think it was technical but it's not. It's actually organisational. It just says you should connect to all other business units in a structured way using APIs, and that's the way to do it. Don't make direct connections with unstructured technologies or just straight to the database or whatever.

So he didn't even specify what type of API to use. He just said to be deliberate and structured about the connections. And that's a lesson that I think Amazon learned, and they evolved many, many times, but the organisational structure created their ability to connect to many other companies. And there were some public APIs at the time.
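The mandate's core rule – depend on a deliberate interface, never on another team's storage – can be illustrated with a toy sketch. All names here are hypothetical, and Python is used purely for illustration:

```python
# Toy illustration of the API-mandate idea: the orders team hides its
# storage behind a deliberate interface, so other teams depend on the
# interface and never touch the database schema directly.

class OrdersAPI:
    """The only sanctioned way for other teams to reach order data."""

    def __init__(self):
        self._db = {}  # internal storage: the owning team may change this freely

    def create_order(self, order_id: str, item: str) -> None:
        self._db[order_id] = {"item": item, "status": "created"}

    def get_status(self, order_id: str) -> str:
        return self._db[order_id]["status"]

# A consuming team (say, logistics) codes against the interface only.
orders = OrdersAPI()
orders.create_order("o-1", "book")
print(orders.get_status("o-1"))  # prints "created"
```

The point is organisational rather than technical: because consumers only ever see `create_order` and `get_status`, the internal `_db` representation can be swapped out without breaking anyone.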

Jeff Bezos
Jeff Bezos transformed the whole structure of Amazon from being a monolithic company in terms of IT to becoming an API company (Photo: James Duncan Davidson/Flickr)

I think eBay had an API in the early 2000s. And there were a couple of others that came out, but it still took a long time for public APIs to become real. But Amazon had already organised internally and a few other companies had. Today, it's really common for companies to have APIs. In fact, if I look at the Fortune 2000 [companies] that we work with, almost everybody has APIs in many places of the business. It's not always so consolidated; it's not always so deliberate.

But it's clear that our IT systems have become hundreds of times more complex than they were in the early 2000s, and these connections are really critical. I could say many things about people doing it well – and not doing it well, but APIs are clearly the glue that's holding many IT systems together today.

Our IT systems have become hundreds of times more complex than they were in the early 2000s, and these connections are really critical

Esteve Almirall: Many things are changing, as you said, in computer architecture in companies. The monolithic application is definitely being broken up, and private data centres are being replaced by clouds or hybrid clouds. Modern system design is based on micro-services and data streams, and micro-services themselves connect through APIs. Was 3scale's API infrastructure too early, or was it part of a slow evolution that took quite a long time to happen? Is today when we are seeing the real API transformation, or is it not even today? Will it be tomorrow? What do you think?

Steven Willmott: I think there are many phases, and there are still some very big ones which have hardly started. For sure, in 2007 or 2008 with 3scale… it was pretty early. Most people did not know what an API was when you talked to them. People were really not familiar with the technology. They had heard of web services and XML and some other technologies which did similar things, but it was rare that people knew what an API was. To some extent, the ten years after that were spent building the usage of APIs in a very high-growth way. But it was still only touching a few companies, and now we see it in almost every company.

So there's been a big transformation. I'll talk about micro-services in a minute, but APIs have really grown and they have become much more mature. They are a very mature enterprise technology now.

The phase that we thought was almost the most exciting in 2008, and which still has not come, I believe, is when essentially every company has all of its major functions exposed as APIs to the public internet. And that's still not true today. Most companies do not have that: they have closed partner systems or very limited functions.

When all major functions are there, you can imagine that the internet becomes a much richer toolbox than it is today. Websites essentially mean that humans click on lots of buttons to do things. But, why would you need that if you are able to automate that? And I believe it will happen, because we now have things like Amazon Alexa and other devices that are voice-activated. Keyboards are disappearing... There are many reasons why the standard website is probably going to need to be replaced by many other interface types. So I think this whole world is still very nascent. I think that API evolution is still, let's say, in the middle.

Website browser
There are many reasons why the standard website is probably going to need to be replaced by many other interface types (Photo: Ban Phote/Twenty20)

Esteve Almirall: That's great. It's true. We were all dreaming of agents doing many things for us. And we are still clicking too many buttons everywhere, all day long, that we don't know what on Earth they do most of the time… And deleting too many emails from lists we subscribed to one day.

Steven Willmott: And accepting too many cookies...

Esteve Almirall: Yeah, accepting too many cookies that we don't even read, never ever. [Clicking] yes, yes, yes… That's absolutely true. And this is closely linked, maybe, to a computer architecture that has evolved tremendously: hardware itself has now become software-defined. You go to a website like Amazon and configure the computer that you want, or you can do it programmatically from a phone or anywhere else.

Not only that. The databases that we were all fighting with now offer single-digit millisecond performance and near-infinite scalability, like DynamoDB, Cassandra and many of the other mainstream options in the market. However, the remote micro-services behind the APIs still have significant latency and are still driven by interpreted languages, like Python, which have this little bit of a cold-start problem...

Modern languages haven't solved that. If you go to Julia, it's the same thing or worse, and it seems that we keep encountering these kinds of problems. Do you think that this dream of completely getting rid of the computer architecture and having functions in the cloud will ever be a reality? Because now we have these Lambda functions and so on, but they are full of restrictions, full of things that you cannot do, full of complicated stuff. And then, many times, in the end, you put in a virtual machine to do the other stuff, because it works… However, it's true that things are evolving. Do you think that this dream of having the micro-services we need for these agents to click for us will ever become reality?
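The cold-start problem mentioned here follows a well-known pattern: a serverless runtime pays an expensive initialisation cost the first time a container handles a request, and later "warm" invocations reuse that state. A minimal, self-contained sketch (a hypothetical handler, not tied to any real Lambda API):

```python
# Sketch of the serverless cold-start pattern: expensive setup runs once
# per container, at "cold start"; warm invocations reuse it, which is
# why the first call is the slow one.

INIT_COUNT = 0  # counts how many times setup actually ran

def _expensive_setup():
    """Stand-in for loading models, opening connections, parsing config."""
    global INIT_COUNT
    INIT_COUNT += 1
    return {"ready": True}

_CONTEXT = None  # populated lazily; survives across warm invocations

def handler(event):
    global _CONTEXT
    if _CONTEXT is None:          # cold start: pay the setup cost once
        _CONTEXT = _expensive_setup()
    return {"echo": event, "init_runs": INIT_COUNT}

print(handler("a"))  # cold call: init_runs == 1
print(handler("b"))  # warm call: setup not repeated, init_runs still 1
```

When the platform tears the container down and starts a fresh one, `_CONTEXT` is empty again and the setup cost recurs, which is the latency the speakers are describing.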

Steven Willmott: I do think technology is getting better all the time. I mean, I'm shocked at what’s possible today compared to five or ten years ago and I do think micro-service architectures are very powerful. The main thing is that it’s not a solution for everything. Like anything, it suits certain scenarios very well – micro-service scenarios are great for certain things and they're bad for others.

Micro-service scenarios are great for certain things and they're bad for others

If you listen to technologists over the years… let's imagine around the year 2000. The message was "decentralise, decentralise, decentralise." That was the web services message. Then a few years later, it was, "well, there is a lot of complexity now, because it's really complicated when you have all these small pieces everywhere." So the message became "centralise, centralise, centralise", and you got ESBs, enterprise integration buses, and central registries and all these things. And then the next cycle was, well, "decentralise, decentralise, decentralise." So now we're in the micro-services world, and I think in each cycle we get a bit better, but we don't solve all the problems.

So an analogy would be if you're going to go on a long journey with 25 people, the question is: do you take a bus, or do you take 25 motorbikes? I know what your answer would be. [Both laugh] But, the point is, it depends. If you're going on motorways to Germany, then the bus is probably a great idea. If you're crossing the Sahara Desert, I’d pick the motorbikes, right? So micro-services are like that. It’s an architecture that only applies to certain cases, and, in other cases, centralising is still better. And you mentioned latency problems.

We see many customers that move to micro-services very fast and then find they have a lot of complexity between the micro-services, and they have to manage this latency. Often, for some applications, it's better just to stay within a certain set of containers managed very tightly, with a tightly-coupled architecture for those executions. If you know how far you have to scale that component, it's better to design it in a very high-performance way.

But if you don't know how many new things you need to add to it or how much capacity, then micro-services are a great idea. So you have this trade-off going on, and the technologies are getting faster all the time. So I think the latencies will go down. What will not go away is the inherent system complexity of distributed systems. The idea that micro-services remove complexity is false, because it just moves a complexity from inside the services to between the services. So you have to be ready for this. That's the real thing to remember, I think.

The idea that micro-services remove complexity is false, because it just moves a complexity from inside the services to between the services

Esteve Almirall: You mentioned containers, and I'm very curious about your point of view. In the last 5 to 10 years, the world has been taken over by Kubernetes. And then the responses and ideas from the big players came very fast; we can see this in Fargate. At first nobody knew what you could do with ECS and Fargate, and nobody even wanted to try them because they were so new and breaking down all day long. And then finally the whole thing got fixed somehow. But now, as you say, the pendulum is swinging between monolithic applications and micro-services. Micro-services didn't work so well, then we wanted to move to containers, and then we tried to go to micro-services again. How do you see the future of containers and the Kubernetes world, the ECS world, and so on? Are they here to stay, or just an interim solution?

Steven Willmott: I think containers are probably more important right now. They’re at a different level from micro-services and function as a service and serverless. I think, fundamentally, if you look at what's driving IT today, containers are just critical and they will continue to be critical for many years, five to ten years at least. The reason is that they provide this level of abstraction to hardware that VMs provided before, but in a much faster and portable way. And so, even if you look at Red Hat’s strategy, Red Hat really is focused on hybrid cloud which means running your workloads anywhere on any of the major clouds or on premises. And what containers allow someone to do is, whatever architecture they're running even if it's monolithic or if it's micro-services, you can spin these containers up and create copies of all these systems all over your entire IT landscape. If you look at the major reasons why IBM acquired Red Hat, Kubernetes and containers are really front and center to build these hybrid cloud systems.

If you look at what's driving IT today, containers are critical and they will continue to be critical for many years

Now, I don't think containers preclude function as a service or micro-services. In fact, Red Hat's function-as-a-service solutions are based on containers. At some point, the dirty secret behind serverless is that there are some servers, right? Someone is managing the server somehow, and, right now, the way to do that is using containers, funnily enough.

So I think these technologies layer on top of each other, and containers are a great abstraction because they finally nailed relatively fast start-up portability with security for workloads. VMs had the security and the safety, but they didn't have this fast start-up portability. I think containers will be with us for a long time and they will be a fundamental building block. But the other things, micro-services and functions, will be up there on top and evolving very fast. So one day the containers will be there, but you will not know that they’re there anymore. That's the way it will go, I think.

Containers are a great abstraction because they finally nailed relatively fast start-up portability with security for workloads

Esteve Almirall: Yeah, that's a great point. I think containers will become completely transparent to the user and will be the implementation of functions at that point. They solve the major problems you have with functions like Lambda and so on: start-up problems, problems with responsiveness, limited resources, and a lack of capacity for things that are a little bit more complex than just running Python. That's fantastic.

Let's look at the other side. In parallel with this fast technological development, we have seen, astonishingly, that the management community mostly lacks this kind of understanding. Companies, large and small, are often absent from this transformation. All these discussions about moving to containers should be at the core of the company, because the benefits in terms of scalability and the potential for zero marginal cost are so great for productivity that now, with Covid, they have become very important. It's so clear, but few companies consider these benefits. Most companies are completely absent from these discussions, and many are still using data centres, or still questioning whether they should transition to the cloud at all; they haven't even sketched a path to the cloud… Why does this happen?

Micro services
Many companies are still using data centres or they're still questioning if they should transition to a cloud or not (Photo: Oleksandr Zozulinskyi/Twenty20) 

Steven Willmott: Well, firstly, I think, I would challenge a little bit that most are absent. I spend a lot of time with customers working on architectures in all sectors from finance to telecoms to media. There are groups in each of these customers working on these transformations, and some of them are very advanced while others are not. There are definitely areas where people are behind, and, obviously, some companies are further ahead than others.

I would say that there are two factors which make it harder for many companies, especially large ones, to transform at this level. The first one is legacy systems. People obviously want to modernise, but the investment in legacy systems is massive. Many of these are hugely mission-critical systems, in many cases built in an era when systems were meant to last, by people who are no longer there. This means you have lots of systems, invested in over 20 or 30 years, whose sudden disappearance or transformation would be an extra cost the company simply had not planned for in any way.

The good news is that very often you can wrap these components in APIs and bring them into a more modern world where you can do your new development that still uses the old components. I think it's very unlikely that we'll see completely transformed companies from an IT level that literally replace every single system they ever built. In fact, that would be economically suicidal.
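The wrapping approach described here can be sketched as a thin facade. All names below are hypothetical; the point is that the legacy routine is left untouched, and new development only ever sees the modern interface:

```python
# Minimal sketch of wrapping a legacy component behind a modern API
# facade (all names hypothetical). The legacy code stays as-is; new
# development codes against the facade only.

def legacy_account_lookup(raw_id):
    """Stand-in for a 20-year-old routine with an awkward contract:
    it expects a zero-padded string and returns a positional tuple."""
    return (raw_id, "ACTIVE", 1042.5)

class AccountsFacade:
    """Modern, structured interface layered on top of the legacy call."""

    def get_account(self, account_id: int) -> dict:
        # Adapt modern input to the legacy calling convention...
        raw = legacy_account_lookup(str(account_id).zfill(8))
        # ...and translate the legacy tuple into a clear, documented shape.
        return {"id": int(raw[0]), "status": raw[1], "balance": raw[2]}

facade = AccountsFacade()
print(facade.get_account(42))
# {'id': 42, 'status': 'ACTIVE', 'balance': 1042.5}
```

In practice the facade would be exposed over HTTP and managed like any other API, but the economics are the same: the 20-year-old component keeps running, and only the translation layer is new.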

It's very unlikely that we'll see completely transformed companies from an IT level that literally replace every single system they ever built

I think what we're seeing much more is people putting an API/micro-services/cloud architecture on top of their existing systems. They will very often keep their on-premises data centres. Especially now with Covid, they want to get the maximum out of the resources they have already invested. But I think they will add cloud, different clouds. They will add new capabilities on top and they will add APIs to connect these systems together.

We cannot just throw the legacy away. That just doesn't happen. And coupled with that, most large companies will want to use multiple clouds; they want to use Google and Azure and Amazon. This is really important, because it's not just lifting and shifting everything to one place.

The second reason, which is a bit more amusing, I think, is that many technology companies are still not good at explaining how the technology solves real business problems. We are still showing them that one flavour of Kubernetes is better than another flavour of Kubernetes, but we don't explain why you need Kubernetes in the first place. This is a very significant problem, and I think the transformation benefits only come if you actually show how teams can reorganise, how data flows can improve, and how new functionality can be created for the customer.

Many of the technology companies are still not good at explaining how that technology solves real business problems

One of the most exciting things that I've seen in the last couple of years is Innovation Labs (many companies do this, but Red Hat does it really quite well), where customer IT teams come and work directly with specialists on all sorts of technologies, and they just build something for a few weeks. Normally this results in really great ideas on how to transform. Most of the time, to be honest, the marketing that IT companies do just isn't good enough to explain why you should be using these technologies.

Esteve Almirall: Yeah, that's absolutely true. I think in business schools it's the same thing. We often see technology as a performance-enhancement function and that's it. Very simple economic growth models see technology as a tool to raise productivity. They don't see technology as a transformational agent that changes the company internally, making it do things in a completely different way. So people say okay to performance enhancement, because problems get solved a bit better, but I have to invest a lot, so who cares? Well, they miss the whole point, because it's not about being 10% better. Your company is going to change and, if it doesn't, it will be dead, and then you will care, which is completely different.

We are very bad at this idea of why one thing is more important than the other… One thing that is very interesting is that APIs always talk about the external world, how companies communicate with the exterior. But, also, APIs change a lot how companies are organised internally. You have been participating in many of these transformations in companies that are big and small. What can you tell us about all this?

Steven Willmott: Right now, at least today, the biggest unseen value of APIs is often internal. They make it possible to reuse things that were just not reusable before. In previous generations of technologies, web services and so on, you would create point-to-point connections, and every time a new business unit wanted to connect to you, you had to create another one, which created not only cost but also new legacy every time. But this generation of APIs is much better: typically, you can create one set of APIs for your business unit and then other people can use it in multiple ways, and it will just evolve slowly over time. You're serving many needs with the same interface.

APIs change a lot how companies are organised internally

Some of the funniest things I've seen… I mean, I won't name the banks, but there are a couple of banks we worked with that had many APIs, and they had servers where calls were coming in and they had no idea where the calls were coming from. They just, literally, didn't know. In the end, sometimes the only way to check was to switch off the server and see what happened. These are crazy situations... And, in many cases, companies have no idea what APIs they have, and when they do the inventory (we help them with this), they find three or four APIs from different groups that do the same thing. Sometimes they even use the same data: maybe you have a database, and another team is copying it onto an FTP server, unpacking it, creating a new database, reformatting it and creating an API for it, because they don't know that you have your own API.

You have these crazy scenarios within companies where, over years, there are data flows which have never been audited, never been checked. And so there’s huge potential in making all this much leaner. And if your industry changes suddenly and you need to transform and produce a new set of functions for your customers and you don't understand your own data infrastructure, you're dead. Because you cannot move, so you need to do this internal transformation to actually be ready to respond to change.
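One low-tech remedy for the "switch it off and see what happens" problem is to record callers before anything else. A sketch with hypothetical names: each API entry point is wrapped so every call is tallied by its declared source, giving you the inventory data the banks above were missing.

```python
# Sketch of a call-inventory wrapper (hypothetical names): before you
# can rationalise your APIs, you need to know who calls them, so each
# entry point tallies calls by caller.

from collections import Counter
from functools import wraps

CALL_LOG = Counter()  # (endpoint, caller) -> number of calls observed

def audited(func):
    """Wrap an API entry point so every call is counted by caller."""
    @wraps(func)
    def wrapper(caller, *args, **kwargs):
        CALL_LOG[(func.__name__, caller)] += 1
        return func(caller, *args, **kwargs)
    return wrapper

@audited
def get_balance(caller, account_id):
    return 100.0  # stand-in for the real work

get_balance("mobile-app", 1)
get_balance("mobile-app", 2)
get_balance("nightly-batch", 3)
print(dict(CALL_LOG))
# {('get_balance', 'mobile-app'): 2, ('get_balance', 'nightly-batch'): 1}
```

In a real deployment the caller identity would come from an API key or gateway header rather than an argument, and the tallies would go to a metrics system, but the principle is the same: audit the data flows before deciding what to consolidate or retire.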

If your industry changes suddenly and you need to transform and produce a new set of functions for your customers and you don't understand your own data infrastructure, you're dead

Esteve Almirall: Can you give us some clues about this internal transformation? How would you start?

Steven Willmott: What I've seen work is that there must be a set of use cases that are business-focused. If it's just the IT organisation saying, "okay, it's time for us to invest in our infrastructure," and trying to do a project to clean stuff up, maybe they even get a budget. But the danger is that, over time, this budget is seen as something to be taken away when there are other emergencies.

It's really important to build a partnership to say, "we are going to build some new capabilities. We're going to wrap some old ones and we're going to work with these two or three business units." This is really the only way. You have to connect to some key business. And, it sounds really obvious, but many people don't do that.

The other reason is that it helps you get the right infrastructure. Just imagine the IT team deciding on its own: "we want to go to Kubernetes and containers and we want to do serverless, so let's do it." That always ends badly, because there are 10 million ways to do this, and why would they pick the one that's going to work? I think that driving towards some new business outcomes, and attaching the transformation project to those outcomes, is a really critical step.

Esteve Almirall: That's great. Fantastic. Let's change gears and talk about open source. You've been working on open source for many years. Open source started as a movement that was kind of revolutionary: "We're going to change IP. We're going to change the way software is built," and so on. It has been very successful, particularly in systems software. After that, big companies came along and embraced the open source movement, and now it's a completely different picture than before, particularly for the big projects. But do you think there is still room for individuals and small companies, or even the public sector, to do things in open source? Or is it just the way big companies produce big software projects?

If you don't have open source implementations of the standards and the APIs, then often an industry cannot get off the ground and get connected

Steven Willmott: I think there's still huge potential and, frankly, a need for open source in many ways. So, in fact, I would distinguish three things: there are open standards; there is open source; and there are open interfaces. And you really need all three of them. If you don't have standards which emerge, then it's hard to keep all the interfaces working. If you don't have open source implementations of the standards and the APIs, then often an industry cannot get off the ground and get connected.

So I think open source is critical. Obviously, every single product that Red Hat makes is open source first; Red Hat's model is to make all the code open. There are other companies that have other models, which are fine, with some of the code open and so on, and I wish all those companies well. Shared code is so critical for us to build our systems safely… We would all be very lost without it, and it's going up the stack: more and more systems are becoming open source.

I would point to one thing that's worrying about open source right now, which is that there are some licensing challenges. When popular open source projects get taken by companies and run as hosted cloud services, some of those companies don't give back the innovation they have created. They monetise the open source without giving back to the community, and this is a little bit dangerous. Well, it's quite dangerous, actually. So you've seen some changes in open source licenses from companies like Redis, which I think have rightly defended themselves against other companies using their software to run a hosted service that competes with them without giving back any innovation.

Cloud services
Sometimes when popular open source projects get taken by companies and run as hosted cloud services, some of those companies don't always give back the innovation that they have created (Photo: Antoine Ménard/Twenty20)

So I think there’s a huge future for open source, and I'm grateful to all the companies including Red Hat, but also especially Google and Microsoft, and many companies that give back very freely to open source. I think that the cumulative benefit to the world is just huge. If we didn't have open source, we could probably roll back at least 10-15 years of innovation in IT right now, which would take us back to 2005 and there was no iPhone in 2005. That was… [Both laugh] You can imagine that, right?

Esteve Almirall: Yeah… In fact, open source has changed a lot. Now many of the products that we use are open source, particularly in systems software, but also things like React and Angular. Even what we were talking about before, Kubernetes and things like Fargate, represent a new vision of open source, a refactoring of open source. The most common databases you have in a system are versions of open source databases. There are also things that were unexpected, like open instruction sets. The reign of the x86 instruction set is probably diminishing, and a new era of open source processors and processor cores is starting. How do you see the future of open source?

Steven Willmott: Well, I think it is more and more challenging for a small company to take a fully open source product from the beginning and build a sizable enough cash flow from it to succeed in the market. This has happened a few times in the past few years; of the companies that have gone public with open-source offerings, there are probably 10-15 of them. It's getting harder to climb that mountain, because there are now many players that will suddenly pop up and support things. So maybe the customers will not buy your support, and it's challenging.

If we didn't have open source, we could probably roll back at least 10-15 years of innovation in IT right now

I hope that what will happen is that large and small companies will continue to invest in open source, not just if they're building it but also in the foundations that help guide the specifications. I think it's a bigger mountain to climb now, and many companies are investing, they’re sponsoring projects. People are using things like Patreon, for example, to sponsor open source projects, and there are even some open-source-specific funding options that help open source projects get off the ground.

So it's crowdfunding for open source, and that's really exciting. I think people are willing to contribute, but they need to find a way. And sometimes they cannot contribute by writing code; they can contribute by helping with marketing or by donating money or whatever it is. We all need to keep investing in those things, and that will keep the technology more democratic. I think that's the key thing.

We all need to keep investing in open source, and that will keep the technology more democratic

Esteve Almirall: Patreon is fantastic. But on the other side, you see that the traditional model of open source is being eroded. The traditional model of "I provide you security by giving you verified and tested versions of Linux that you can run on your systems" is being eroded, in the sense that cloud companies provide the same thing effectively for free: you pay per use, and you don't pay for the software, the testing, the validation and so on; all of that is included in the usage contract. And many more things have been moving to the cloud. What business models do you think can still work in the open source community?

Steven Willmott: Well, I think the business models are evolving. Red Hat, for example, now provides managed versions of most of its products and is aggressively pushing that way, so you can get hosted OpenShift, which is Red Hat's Kubernetes, on demand. So companies like Red Hat that traditionally shipped packaged software are moving into hosted services.

There are other companies that don't build the software and just do the hosting. Some of them behave really well in the sense that they contribute back to the original open source community. For me, the key question is whether you are contributing back to the open source project, not what your business model is. The companies that contribute back are helping the ecosystem, because then more people can support these hosted services, or customers can run them on premises.

It's when people only take the innovation and monetise it for themselves that things become dangerous, because sometimes they become so dominant that no one else can make any money. There are some licence changes that are making it harder for people to just take and not give. But I think open source will easily survive the cloud era. Cloud and open source are very compatible, as long as the people hosting the cloud services are also contributing back to the open source projects.

It's when people only take the innovation and monetise it for themselves that things become dangerous

Esteve Almirall: In addition to this tension, we have another one: China versus the Western ecosystem that comprises the US, Europe, Israel and some other countries. Here you see the tension of one side copying many things, and the other side trying to make some things proprietary to prevent that copying and to counter the Chinese side's advantages: much more data, a captive market and the possibility of raising money easily from the government from the start. And in the middle of all this you have the old organisations like the Apache Foundation, trying to survive this mess of huge players. Do they still have a chance?

Steven Willmott: I think they do, especially because they are non-political, in general. When they get large, large companies become involved, so there can be a lot of politics in that sense, which is sometimes damaging. But they are not geopolitical, and in general they have a more neutral role in the world.

Some of them are maybe very US-centric, so that's not completely true. But I think that creating technology that is reusable by anyone in any geography is quite important, because it means we can push innovation to places on the planet that maybe haven't seen the same rate of it, and the baseline moves up for everybody.

Creating technology that is reusable by anyone in any geography means that we are able to push the rate of innovation

I'm quite optimistic because I see more and more investment in these open source foundations. In some ways, it would be nice if there were more of them, because we seem to have become a little ossified in the options; a bit more diversity in the organisations themselves might help as well. But I think that is a good place to invest. My experience has always been that very smart people contribute to the projects there, and, in general, they're focused on getting the technology right and making it work well rather than on the politics of it, which I think is good.

Esteve Almirall: That's fantastic. There is a huge need for open source in the public sector. When you look at IT in the public sector, it's like going back to the 80s. It's terrible. We need a foundation for the public sector, that's obvious… Switching to another issue: we have been talking about open source systems in general, but now the action, the discussion, is on AI. We are seeing different forces, and one of them is moving AI to the metal, to silicon. AWS, for example, is diminishing the cost of AI in a way that we never thought was possible before, particularly for deep learning but also for other kinds of AI work. How do you see the future of AI and open source?

If we have AI centralised only in a few players, the ethics problem will get bigger

Steven Willmott: I don't really know whether to be optimistic or pessimistic about this. On the one hand, there are a lot of open source AI tools, including some great ones being released now, and that's been really important. But I think one of the lessons of the last five years is that the tools are one thing, and the data is a totally different thing.

The companies that have the biggest leads in certain areas of AI have them because they have the data available, and I don't see that changing rapidly. I am optimistic in the sense that I think there will continue to be open tools. I'm hopeful that some of the data will be made available in forums where people can innovate with it and create new AI systems. But I'm worried that if that doesn't happen, we will end up with only a very few players that are really able to do AI at a significant level. And I think that will be a problem, because, again, you'll have this bootstrap problem that no one else will be able to get enough data in one place to do something meaningful. That might seem beneficial for the big players for a while, and probably will be, but it will cut innovation in the long run.

I think the ethics problem will grow. If we have AI centralised in only a few players, the ethics problem will get bigger, because it will not just be an ethical issue about whether AIs have certain powers, rights or abilities and who uses them; it will also be tied to the identities of the companies that own those AIs, and the whole thing will be much more complicated.

Esteve Almirall: So we face a complicated future but probably a very interesting one.

Steven Willmott: Probably a very interesting one. Yes, that's right.

Esteve Almirall: So, thank you so much for the interview and for all this time. I hope to see you again soon.

Steven Willmott: Just like always, a pleasure. Thank you for the invitation.

The answers by Steven Willmott are personal opinions only and should not be taken as representative of his employer.

All written content is licensed under a Creative Commons Attribution 4.0 International license.