vVv AMA | EigenLayer
EigenLayer: An Ethereum Layer 2 Trust Protocol and Restaking Solution For Open Innovation
October 4, 2022

EigenLayer solves the liquidity issue for new dApps by leveraging staked Ethereum and allowing access to it through restaking on EigenLayer. Through ETH2 restaking, EigenLayer supplies flexible and scalable trust for novel innovations built on its protocol. They address scalability and security issues to help onboard the next 1 billion users. Additionally, they have introduced a dual token staking mechanism that could utilize tokens from other PoS chains and provide economic security for middleware tokens. EigenLayer's core premise is to support open innovation and security through their free marketplace of trust.
Sreeram Kannan is an associate professor at the University of Washington in the Department of Electrical & Computer Engineering. He's been working on blockchain technology for several years and started building EigenLayer in 2020. He joined us for an AMA on October 4th.

vVv: How about we get started with the 30,000-foot overview of what EigenLayer is, and then delve into your role as a founder and your vision for the coming weeks and months?
Sreeram: To give a little bit of background about myself, I've been in the academic blockchain space for the last several years working on consensus, protocols, scalability, etc. One interesting thing I discovered is that the rate of innovation at the core protocol layers is much slower than the rate of innovation we see at the application layers. The usual explanation is that it's difficult to build infrastructure, which is true to some extent. But diving deeper, we found that that is not the sole reason. We see exciting, powerful protocols invented and built in academia, and sometimes even by companies like Facebook, who built extremely powerful protocols and virtual machines that were never deployed on public blockchains. As we looked into these things, it became clear that the reason we have such an insane rate of innovation in the dApp layer is the decoupling of trust and innovation, or that dApp creators do not need to be trusted. They're borrowing decentralized trust from the blockchain. Since they don't need to be trusted, they can be anonymous and still create very valuable products.

This transformation was brought forth by Ethereum (ETH). Bitcoin created the concept of decentralized trust, but this decentralized trust was bundled with its core network, its consensus protocol, and one particular application. You can think of Bitcoin as the first application-specific blockchain. Ethereum modularized this trust and created a marketplace where one can exchange trust for fees, resulting in people building more applications. The application users are essentially paying a fee to Ethereum for the privilege of using, or consuming, the decentralized trust. If decentralized trust is the core value proposition of blockchains, then Ethereum created the first free-market pricing for decentralized trust.

However, Ethereum was limited in the dimensions in which it could supply this trust. It could supply it to dApps written on top of the Ethereum Virtual Machine (EVM), but there were a number of other things that could not consume decentralized trust, and hence the market was rather limited. Examples include: I want to build a consensus protocol; I want to build a middleware like an oracle; I want to build a data availability layer; I want to build a faster consensus protocol with different properties; I want to build an authentication layer; I want to build a layer where zero-knowledge proofs are verified in parallel; whatever the set of things is that you want to do. You cannot simply do that on top of Ethereum. Essentially, the Ethereum protocol coordinates programs written for the EVM and distributes them across its massive network of tens or hundreds of thousands of nodes. This was the friction for creating and deploying new innovations at the deeper layers of the infrastructure stack: they cannot simply consume trust. Each new idea needed to create a new decentralized trust mechanism, which has fragmented the market. It reminds me of Bitcoin developers' situation before the launch of Ethereum. If you wanted to build a domain name system, you had to build your own chain, Namecoin, and so on. The same thing is occurring today as a result of the limitations of the decentralized trust marketplace. EigenLayer is a mechanism to make this market significantly more efficient by supplying decentralized trust, not only in the limited dimensions in which Ethereum supplies it, but in a more generalized fashion.
You can use ETH staking and restaking to reuse the capital you've allocated to Ethereum and start building any of these other decentralized trust services. You can build oracles, data availability layers, fast zero-knowledge proof verifiers, faster consensus protocols that will allow you to settle roll-ups with much lower latency than Ethereum, fast bridges, all of these things on top of EigenLayer. That is our vision for EigenLayer; EigenLayer is a mechanism that allows ETH stakers to provide additional services, thus creating a massive degree of freedom in how trust is offered to services.
”The reason we have such an insane rate of innovation in the dApp layer is the decoupling of trust and innovation.
That is our vision for EigenLayer; EigenLayer is a mechanism that allows ETH stakers to provide additional services, thus creating a massive degree of freedom in how trust is offered to services.
vVv: What’s your personal motivation to build EigenLayer?
Sreeram: I was introduced to blockchain around January 2018, but my interest in peer-to-peer systems dates back to my master's and PhD, from 2006 to 2011. I was interested in peer-to-peer wireless systems, specifically how to build infrastructure-free wireless systems that do not require things like base stations and installed infrastructure, with the primary goal of creating the last mile of wireless coverage where there is no infrastructure. It turned out that there were already massive infrastructure deployments in developing countries around the world. Thus, I switched to work on synthetic and computational biology from 2012 until recently. This piqued my interest because of genomics' ability to touch the root node of who we are and actually start improving ourselves. However, in 2018, when I heard about blockchains, I pondered whether this was a speculative bubble with a short expiry or whether there was something long-standing that could be developed and persist.
One of my core operating principles is understanding things through the lens of evolution. According to Harari, in his famous book Sapiens: A Brief History of Humankind, with respect to competitive advantages, humans are special not necessarily because we are intelligent, but because we can cooperate flexibly in large numbers. The major friction to cooperation in human society is trust. I will cooperate with you if I know I can trust you in terms of what you will do and how the rewards of such cooperation will be distributed. Suppose trust is the major friction of cooperation. How much can decentralized trust accelerate the basic evolutionary advantage of humans, which is the ability to cooperate flexibly in large numbers? Thinking through this core thesis, Bitcoin has already established the mechanisms for trustless cooperation. I find this to be a fascinating idea, that decentralized trust can help us cooperate better.

If you expand this basic premise, that cooperating flexibly in large numbers is the driving value proposition, then what are the connotations of cooperating flexibly? Harari illustrates his definition of cooperating flexibly with the example of army ants. They cooperate in large numbers because of their large colony size, but their cooperation is genetically determined rather than flexibly adaptable through knowledge, memes, and so on. That's the difference I see between Bitcoin and Ethereum. Bitcoin is enabling trustless cooperation and Ethereum is enabling trustless flexible cooperation. Bitcoin's decentralized trust is fixed at birth, whereas Ethereum can not only upgrade itself but also lets others build on top of it, modularizing cooperation.

And then you can ask, what enables cooperation in large numbers? That would be scalability, as in onboarding the next billion users. If you think about it, what are the problems at the root node of blockchain? One major problem is that when you want to onboard the next 1 billion users, they do not want to pay for security. Security is barely perceivable in the short term, and it's not clear users will pay for it. The only way we are going to get 1 billion users to build on top of the blockchain is if security is cheap. There should be no trade-off between security and scalability. Thinking through how to minimize the cost of security, we developed EigenLayer as a mechanism for reusing staked tokens so that the capital cost of security decreases. With that in mind, we also thought very carefully about the operational cost of data availability for EigenDA.

EigenLayer is expanding the scope of flexible cooperation because it supplies trust more flexibly than Ethereum can by just reusing the same Ethereum platform. And you can think of "in large numbers" as the scalability that L2s are addressing; but with data availability and some of our other primitives, we also want to think through how to scale services horizontally on top of EigenLayer. When you build an oracle, do you need everybody to get the price feeds? Can you have only a subset of nodes get the price feeds while maintaining full security? High security without compromising on cost is something that drives us.
”Bitcoin is enabling trustless cooperation and Ethereum is enabling trustless flexible cooperation. EigenLayer is expanding the scope of flexible cooperation because it supplies trust more flexibly than Ethereum can by just reusing the same Ethereum platform.
vVv: That sounds super interesting, and extremely impressive. Can you give us a rough idea of where you currently are with development?
Sreeram: Ultimately, EigenLayer should power 100% of decentralized trust, because it's just a marketplace that enables ETH stakers to provision services. We are very early in this broad picture. How do we build a restaking platform? What do these smart contracts look like? How do payments, staking, and slashing work? We've built all of these things as smart contracts. Second, we've also built the first service on top of EigenLayer, EigenDA, a data availability service utilizing Ethereum's best practices, like danksharding, and incorporating them into an opt-in layer. The two accomplishments we have completed towards building the core aspects of the EigenLayer platform are the restaking mechanism and the general-purpose modular slashing built on top of it. With our first data availability application, we have demonstrated how to use this general-purpose platform and build something on top of it. Currently, we are on internal testnets and hope to be on mainnet in Q2 2023.
vVv: It seems your team is involved in multiple projects: academia, EigenLayer, and EigenDA. Is this a heavy load for them? Do you have immediate plans to expand the team, or do all these different aspects work synergistically?
Sreeram: I've been on leave from the University of Washington, building this startup for the last 15 months. Regarding the team, we're interested in expanding and looking for anybody deep into infrastructure to join us in building this platform. We perceive EigenLayer not only as a core platform but also as a provider of ideas for things that can be built on top of us. We have a long track record of building new consensus protocols, data availability, sharding, and other solutions in the blockchain space. Our guiding vision for EigenLayer is that we should enable as many people as possible to build on top of a common framework in a frictionless manner. And, as I briefly alluded to, the reason we have a pseudonymous economy today is the separation of trust and innovation. We view this as a paradigm shift akin to venture capital, which was the separation of innovation and capital. Blockchains, through Ethereum, have done something similar: decoupled trust and innovation. We want to do the same thing and enable anybody to build on top of a common platform. We think of this as an extremely non-zero-sum game and a net benefit to everybody if good ideas are not only incorporated inside our system but widely adopted through the industry.
”Our guiding vision for EigenLayer is that we should enable as many people as possible to build on top of a common framework in a frictionless manner.
vVv: Can you explain how EigenLayer works on Ethereum in a simplified technical way?
Sreeram: The core premise of EigenLayer is restaking: I want to use the same capital staked on Ethereum to provide additional services. So how would you do that? How would you get the same capital to be reused for other services? At the core of restaking is shared economic security. What is economic security? In proof-of-stake (PoS) systems, economic security comes from putting down some stake and committing to certain actions. For example, in block production, if I fail to do the block-making correctly, I will be slashed, meaning I will lose my funds. This is the root of trust of PoS systems, particularly Ethereum.

In building EigenLayer, we want to expand the suite of services. When you restake, the stake should not only be used as a Sybil-resistance mechanism for identifying who participates in the system; it should also carry powerful positive and negative incentives to participate correctly. Slashing is the mechanism for negative incentives, and without negative incentives it is very difficult to build reliable systems in a pseudonymous world. If everybody participating is pseudonymous, then what is the assurance that they are doing their service correctly?

How do we bring slashing to EigenLayer? If all your stake is in the Ethereum core contracts, EigenLayer has no access to it. To circumvent this, we restake. And what is restaking? There are various ways of restaking. One way is trustless native restaking, where you stake on Ethereum and specify different degrees of freedom to participate in EigenLayer without intermediation. When you stake, you can specify your validator, who will run your operations; the fee recipient, who will receive the fees from your staking; and the withdrawal address, which is able to withdraw funds when you want to exit your staking position. By setting the withdrawal address to the EigenLayer smart contracts, you're giving the power solely to the EigenLayer smart contracts, not to us or anybody in particular. Normally the contracts will let you pass on the withdrawal power from the smart contract to yourself, because you are the rightful owner of that particular stake. But if you opt in to EigenLayer services and behave maliciously, EigenLayer can withdraw your stake, slash a portion of it, and return the remaining portion. Slashing is written entirely in ossified smart contracts, precisely to prevent our involvement in how slashing occurs. As for other modes of restaking, you could take your Lido staked ETH (stETH) or your Rocket Pool staked ETH (rETH), deposit them on EigenLayer, and specify either self-validation or an alternate validator.
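To make the native restaking flow concrete, here is a minimal Python sketch of the opt-in, slash, and withdraw steps described above. Every name here (Validator, RestakingManager, the slash fraction) is a hypothetical illustration, not EigenLayer's actual contract interface.

```python
# Minimal sketch of native restaking and slashing. All names are
# hypothetical illustrations, not EigenLayer's actual contracts.

class Validator:
    def __init__(self, operator, fee_recipient, withdrawal_address, stake_eth):
        self.operator = operator                      # who runs the node
        self.fee_recipient = fee_recipient            # who receives staking fees
        self.withdrawal_address = withdrawal_address  # who can withdraw the stake
        self.stake_eth = stake_eth

class RestakingManager:
    """Stands in for the EigenLayer contracts that hold withdrawal power."""

    def __init__(self):
        self.restaked = {}  # validator -> set of opted-in services

    def opt_in(self, validator, service):
        # Restaking works because the validator's withdrawal address points
        # at this contract, making slashing enforceable.
        assert validator.withdrawal_address is self
        self.restaked.setdefault(validator, set()).add(service)

    def slash(self, validator, service, fraction):
        # Invoked only when a service's slashing condition proves misbehavior;
        # the remaining stake is returned on withdrawal.
        assert service in self.restaked.get(validator, set())
        penalty = validator.stake_eth * fraction
        validator.stake_eth -= penalty
        return penalty

    def withdraw(self, validator):
        # Honest exit: withdrawal power passes back to the rightful owner.
        self.restaked.pop(validator, None)
        return validator.stake_eth

# Usage: point the withdrawal address at the manager, then opt in.
manager = RestakingManager()
v = Validator("op1", "fees1", manager, stake_eth=32.0)
manager.opt_in(v, "EigenDA")
manager.slash(v, "EigenDA", 0.5)  # proven misbehavior: lose half the stake
print(manager.withdraw(v))        # 16.0 returned to the owner
```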
vVv: Could you explain what middleware is?
Sreeram: Middleware is any service that requires distributed validation. For contrast, I'll give an example of something which does not require distributed validation. Uniswap is self-sufficient on the blockchain because it is a set of smart contracts and does not require any specific distributed validation. However, other services, like oracles or data storage services where a set of nodes stores the acquired data, runs a consensus protocol, or runs nodes off-chain, require distributed validation that is not natively provided by a blockchain. We call these middleware. Think of them as services that require active distributed validation.
vVv: Will EigenLayer do any due diligence on middleware developing on top of it? What measures do you have in place to prevent bad actors from deploying on your protocol?
Sreeram: One thing at the heart of blockchains, which is the new value proposition, is objectivity and credible neutrality. That is what a blockchain like Ethereum stands for. But to actually get these systems to work, a layer of subjectivity and social involvement is needed. Otherwise, you cannot get them to work. How do you handle the DAO hack if you say code is law? How do we handle bridge hacks and the other things we're seeing today? How do we handle making EigenLayer a core part of Ethereum? One issue is: what if people build a slashing condition on top which either has a bug or slashes everybody's funds? Or if it was maliciously written to steal people's funds? I think the answer is not a binary one. If we focus only on objectivity and credible neutrality, or if we make the systems fully subjective, the value of decentralized trust is gone. We have to find a path that strikes a balance.

Now, how do we do that in EigenLayer? As I mentioned, our objective at EigenLayer is open innovation, so how do we straddle the trade-off between open innovation and security? Without security, stakers leave, and then we cannot build innovation. The biggest problem is wrongful slashing based on contracts that were written either maliciously or with bugs. To protect against this, we introduce a layer of subjectivity in the middle, on slashing, by assembling reputed Ethereum community members, both those building the core protocol and those building on top, into a committee that can veto slashing. The only rule is that the committee vetoes slashing that has occurred improperly. Initially, we'll elect this committee subjectively, but eventually we'll have a governance process. One example of a governance process is a DAO that is interested in the Eigen ecosystem.

In terms of onboarding new solutions built on top of EigenLayer, we are open to feedback. At the moment, new services built on EigenLayer that want the protection of this veto will be onboarded by the DAO. This gives the committee an understanding of the dApp and the conditions under which they can veto the slashing. However, this is a bit against open innovation, so we can create two categories of services: one which is subject to the veto and another which is not. For services not subject to the slashing veto, stakers must opt in at a much higher trust level. We envision services built on EigenLayer going through a period in which they have the slashing veto, just as current L2 solutions have governance mechanisms, but eventually migrating to an ossified state in which they do not need the protection of the slashing veto and stakers will still feel comfortable opting in. Those are the two levels: one where anybody can build anything, but there is a trust barrier for stakers to opt in; and another controlled by an onboarding process, which includes things like due diligence, smart contract audits, and off-chain code audits that the teams will conduct to trust and onboard new services on EigenLayer.
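As a rough illustration of the two-tier veto design, here is a small Python sketch of a slashing queue in which a committee can cancel improper slashes for veto-protected services before they execute. The queue structure, the veto window, and all names are assumptions made for this example, not EigenLayer's actual mechanism.

```python
# Sketch of a veto-able slashing flow: requests are queued, and a
# committee can cancel improper ones before they execute. All names
# and the window length are illustrative assumptions.
from types import SimpleNamespace

VETO_WINDOW = 7 * 24 * 3600  # assumed veto window, in seconds

class SlashingQueue:
    def __init__(self, veto_committee):
        self.veto_committee = set(veto_committee)
        self.pending = []

    def request_slash(self, now, validator, service, fraction):
        self.pending.append({"time": now, "validator": validator,
                             "service": service, "fraction": fraction,
                             "vetoed": False})

    def veto(self, member, index):
        # Only committee members may veto, and only for services in the
        # veto-protected tier; ossified services skip this protection.
        req = self.pending[index]
        if member in self.veto_committee and req["service"].veto_protected:
            req["vetoed"] = True

    def execute(self, now, slash_fn):
        # Once the window passes, non-vetoed requests are slashed.
        remaining = []
        for req in self.pending:
            if req["vetoed"]:
                continue  # improper slash cancelled by the committee
            if now - req["time"] >= VETO_WINDOW:
                slash_fn(req["validator"], req["service"], req["fraction"])
            else:
                remaining.append(req)
        self.pending = remaining

# Usage: a veto-protected service's improper slash gets cancelled.
svc = SimpleNamespace(veto_protected=True)
q = SlashingQueue(veto_committee=["alice", "bob"])
q.request_slash(now=0, validator="val1", service=svc, fraction=0.5)
q.veto("alice", 0)                           # committee judges it improper
q.execute(now=VETO_WINDOW, slash_fn=print)   # nothing is slashed
```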
vVv: You mentioned that you could potentially become a part of Ethereum itself?
Sreeram: We are very much aligned with the core value proposition of Ethereum, and we wouldn't mind if the protocol becomes completely internalized as a default part of Ethereum. The one caveat to becoming a part of Ethereum is that Ethereum would then have to take some kind of subjective position on slashing vetoes. But some of the best services and ideas could be internalized into the Ethereum protocol. There are two levels to this question: whether a service is built on EigenLayer or internalized. For example, with respect to our data availability layer, I've talked to Ethereum community members to see which aspects of it make sense as part of the Ethereum protocol. Another way of thinking about EigenLayer is as a market for free-market protocol upgrades to Ethereum. Anybody can build new services, and as these services mature and become robust, they can be incorporated into the default Ethereum protocol, but there is always an edge of innovation. If you have a data availability protocol running at 15 megabytes per second for three years, and that is then internalized into Ethereum, this could improve the protocol. The way we think about blockchains and crypto is that we want to minimize rent-seeking and maximize innovation.
”Another way of thinking about EigenLayer is as a market for free-market protocol upgrades to Ethereum.
The way we think about blockchains and crypto is that we want to minimize rent-seeking and maximize innovation.
vVv: Could you give us an example of the cost differential of being on Ethereum versus deploying on EigenLayer?
Sreeram: The cost depends on the service type. Ethereum has a capacity, and as you approach this capacity, the cost increases exponentially. If all of Ethereum's block space were allocated to data writing, no computation, no execution, no Uniswap on L1, no Compound, the bandwidth would be 83 kilobytes per second. With the upcoming EIP-4844, the data bandwidth provisioned in addition to these services will be 80 kilobytes per second. And while 80 kilobytes per second is much higher than the data bandwidth used by roll-ups today, we are envisioning a future with massive amounts of Ethereum activity. The total bandwidth provisioned on EigenDA is 15 megabytes per second. If you apply similar pricing mechanics on top of EigenDA, you will see that data availability, or data bandwidth, is much cheaper on EigenDA than on Ethereum. Of course, this should come as no surprise, since we are incorporating the next Ethereum upgrade into our solution.

The key aspect of the solution is horizontal scaling. When we want data to be available or stored, we need a set of nodes to download and store this data. If we ask every node to download and store all of it, a massive cost is incurred. Through advances in cryptography, like KZG polynomial commitments, we can have a system where each node downloads very little, and even if 90% of the nodes go offline, you can still reconstruct all the data. That's what you can do on EigenDA because of the efficiency of the core protocol design.
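A quick back-of-the-envelope calculation shows why this horizontal scaling works: under erasure coding, each node downloads only a small coded chunk, yet any small fraction of chunks suffices to reconstruct the data. The node count and coding rate below are illustrative assumptions; only the 15 MB/s figure comes from the discussion above.

```python
# Back-of-the-envelope sketch of EigenDA-style horizontal scaling.

total_bandwidth_mb_s = 15.0  # provisioned DA bandwidth (from the text)
n_nodes = 1000               # assumed number of DA nodes
coding_rate = 0.1            # data recoverable from any 10% of chunks

# Data is expanded by 1/coding_rate, then split across all nodes,
# so each node downloads only a sliver of the total.
per_node_mb_s = total_bandwidth_mb_s / (n_nodes * coding_rate)
print(f"per-node download: {per_node_mb_s * 1000:.0f} KB/s")  # 150 KB/s

# Any 10% of the nodes (100 of 1000) jointly hold enough coded chunks
# to reconstruct everything, so the system tolerates 90% of nodes going
# offline, as described above. KZG commitments let each node verify its
# chunk against the committed data without downloading the rest.
```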
vVv: And how would that be notable for the end user? Are there any advantages other than lower fees? Or is there anything by which you would even notice that you're using something on EigenLayer rather than on Ethereum?
Sreeram: The way we expect this to translate is as follows. Look at the three basic use cases of blockchain today: DeFi, NFTs, and DAOs. These three are where two things have aligned: transacting with financial applications and digital assets, or creating a self-regulating DAO, and the willingness to pay a high fee for trust when doing so. These are all places where people are willing to pay a premium for trust. But there are a number of other use cases of digital platforms for which blockchain is the right paradigm but whose users will not be able to pay that kind of premium. Think of things like social networks, ride-sharing, and all these other platforms which have massive rent-seeking at their center and can be displaced through crypto. The problem is that blockchains are neither scalable enough to accommodate them nor sufficiently cost-efficient to handle the massive data throughput. The right architectures to build these new services are not ready. We see the role of EigenLayer as letting other infrastructure builders create some of these pieces, enabling the next 100 use cases of blockchain.
vVv: As a middleware developer, say an indexer or oracle developer, what are the trade-offs I am making by developing on top of EigenLayer compared to what these services do today, apart from the obvious coupling of trust with the base layer?
Sreeram: Let me address one thing that is at the top of middleware developers' minds: if all value goes back to ETH, what is my value capture? Number one, like Uniswap, which can take a cut of the fees paid inside its system without staking the UNI token, a middleware can take a cut of the fees alongside what is paid to the ETH stakers. Number two, some of these middlewares already have tokens, or can tokenize these fees. Number three, the middleware token itself can be used for co-staking along with the ETH restaking. Hence, you can have two quorums: one of ETH stakers and the other of middleware-token stakers. For example, if you wanted a data storage guarantee, you would need a commitment from both the ETH and middleware stakers for given data to be considered stored. That's an example of dual token staking.

Dual token staking enables these middlewares to create their own staking economies in addition to Ethereum staking. This helps protect against some of the negative spirals that may occur in tokens which are endogenously staked. For example, if the middleware token depreciates in value, then the economic security follows; and when the economic security declines, users and dApps pull out. And as dApps pull out value, the token value declines further. You can hedge or insure against these security risks by borrowing economic security from Ethereum, dynamically directing a fraction of the fees to your own token stakers and a fraction to ETH stakers. This is how we see the middleware ecosystem evolving on top of EigenLayer. There will be some projects without a token that only pass on a fraction of the fees, building a classical infrastructure-type business on top of EigenLayer. Then there will be others who use their own tokens for payment, or their own tokens for co-staking. We'll see massive amounts of innovation, not only in algorithmic design, but also in the economic structures attendant to it.
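Here is a small Python sketch of the dual-quorum idea: a storage attestation counts only if both the restaked-ETH quorum and the middleware-token quorum pass their thresholds. The 2/3 threshold and all names are assumptions for illustration.

```python
# Sketch of dual token (dual quorum) staking for a storage guarantee.

def is_stored(attestations, eth_stake, mid_stake, threshold=2/3):
    """attestations: set of node ids that signed the storage commitment.
    eth_stake / mid_stake: dicts mapping node id -> stake in each quorum."""
    def quorum_met(stakes):
        signed = sum(s for node, s in stakes.items() if node in attestations)
        return signed >= threshold * sum(stakes.values())
    # Data counts as stored only if BOTH quorums attest.
    return quorum_met(eth_stake) and quorum_met(mid_stake)

# Example: three nodes restake ETH; two of them also stake the middleware token.
eth_stake = {"a": 32, "b": 32, "c": 32}
mid_stake = {"a": 50, "c": 100}
print(is_stored({"a", "b", "c"}, eth_stake, mid_stake))  # True
print(is_stored({"a", "b"}, eth_stake, mid_stake))       # False: middleware quorum fails
```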
vVv: Is there a reason for using staked ETH other than capital efficiency? From your descriptions of restaking, it sounds like it can work with any ERC20 or plain native ETH.
Sreeram: That is correct. You can restake anything; the capital efficiency accrues if you're restaking. What are the unique uses of ETH restaking? Number one, access to block proposals. There's a lot of maximal extractable value (MEV) management that is attracting interest, and to build an MEV management solution, you want stakers who make Ethereum block proposals to be able to make credible commitments about what they are going to do. If they restake, they can make a statement: if I sign off on an encrypted transaction and I don't include the decrypted version of this transaction in my block, I'm willing to get slashed. This is an example of a credible commitment made by restaking on EigenLayer. The really interesting thing is, as more block proposers natively restake, you start accruing massive superpowers. You can begin proposing very interesting protocol upgrades without actually having to upgrade the protocol. For example, take single-slot finality: Ethereum requires many blocks before finalization, but if all the staked block proposers have opted into EigenLayer, they can make a commitment that they won't fork away from a certain block, and that block becomes final immediately. Thus, you can start doing very interesting things. There are a lot of protocol details to be worked out on things like this. But number one, you can do extremely interesting things because it's ETH stakers that are restaked. Number two, you get alignment with the Ethereum ecosystem by utilizing ETH stakers for your other services as a natural mechanism. Instead of performing an airdrop to the Ethereum community, you could, in addition or in lieu of it, use ETH restaking for securing your services. There is a very clear exchange of value for what you're doing, as well as an alignment with the ecosystem. Those are the additional powers you have in restaking, as opposed to staking any other token. And as I mentioned, we are not opposed to staking other tokens on EigenLayer; we do expect that when a middleware does dual staking, both tokens may be on EigenLayer, and we may even see subsidiary effects, things like people using a Starkware token to restake for a Starkware oracle. These are all emergent effects of the intermodular security that EigenLayer can offer.
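To illustrate the credible-commitment example, here is a Python sketch of a slashing condition for a proposer who pledges to include an encrypted transaction. The hash-based pledge and toy decryption are stand-ins for a real commitment scheme and threshold decryption; every name is illustrative.

```python
# Sketch of a slashable inclusion commitment by a restaked proposer.
import hashlib

def commit(proposer_key, encrypted_tx):
    # The proposer's signed pledge: "I will include this transaction."
    return hashlib.sha256(proposer_key + encrypted_tx).hexdigest()

def violated(commitment, proposer_key, encrypted_tx, block_txs, decrypt):
    """Slashing condition: the pledge is valid but the decrypted
    transaction never appeared in the proposer's block."""
    pledged = commit(proposer_key, encrypted_tx) == commitment
    included = decrypt(encrypted_tx) in block_txs
    return pledged and not included

# A proposer who pledges and then censors the transaction is slashable:
decrypt = lambda e: e[::-1]   # toy "decryption" for illustration only
enc = b"xt_terces"            # toy encrypted transaction
c = commit(b"proposer1", enc)
print(violated(c, b"proposer1", enc, [b"other_tx"], decrypt))  # True -> slash
```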
vVv: What are some of the middleware you would personally like to see developed on top of EigenLayer?
Sreeram: Yes, great question. Number one, I mentioned data availability, which addresses restricted data availability bandwidth. Number two, I briefly mentioned MEV management, but I can dive into a few examples, because I believe MEV management is one of those things where there is no consensus on the correct way to proceed. One proposed mechanism is auctions selling off block space, and that could be a part of the solution. There could be other aspects to the solution; for instance, you may want to protect yourself from getting sandwiched and be willing to pay a premium. Other solutions could include the threshold encryption that Shutter Network is building, or Chainlink's first-come, first-served ordering service. These interesting services can be built on EigenLayer for MEV management.

There are more sophisticated MEV management solutions, such as event-driven actions. These actions are in the category called keepers (currently Chainlink and Gelato have keepers), which are mechanisms for triggering actions based on the occurrence of some event. Why would you want to do some of these things more natively? When a third party triggers actions and sends the transactions, you do not know whether the third party failed to trigger the action, or whether the transaction was not included because of censorship or other things going on at the block proposal level. However, suppose a block proposer is restaked and has opted in to perform event-triggered actions. In that case, very interesting phenomena become possible: liquidations, refilling of collateral, and atomic arbitrage, for example between Uniswap and SushiSwap, with value flowing back to the Uniswap and Sushi ecosystems. There are other solutions, like Root, that are trying to redirect some of the MEV value back to the users, and these can be built on top of EigenLayer. We see a plethora of auction solutions, like Flashbots and Manifold blocks.
Ethereum will be the center of this multi-chain world, and the reason I say this is that you need three things to accomplish this. One, you need very good bridges to all other chains. Two, you need high liquidity. And three, you need the hub to be more secure than all these other chains. Ethereum has all three: every new chain has a bridge to Ethereum; it's the most liquid; and it is also the most secure. There are, however, two problems. The first is: how good are the Ethereum bridges? The gold standard would be light client bridges: you run light clients of these other chains to verify consensus agreement on those chains before bridging state in. You can build light client bridges on top of EigenLayer and certify all of those things on Ethereum. So that's another category that we are quite excited about: light client bridges.

Another category that can be built on top of EigenLayer is a layer of consensus between rollups and Ethereum, like a layer 1.5 on which super-fast consensus happens. Ethereum finalization is 12 minutes today, and if you want finality times competing with Aptos, Solana, or others, you may want to achieve one-second or sub-second confirmation times with high economic security. If you have $10 billion restaked on top of a consensus protocol which settles state really fast and then certifies that state back on Ethereum, that could be something super interesting.
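To make the light-client bridge idea above concrete, here is a Python sketch of the check a restaked operator could run: verify that a supermajority of the source chain's validators signed a block header's state root before attesting it on Ethereum. The quorum rule and all names are assumptions for illustration.

```python
# Sketch of a light-client bridge check run by restaked operators.

def verify_light_client(header, signatures, validator_set, threshold=2/3):
    """header: (height, state_root); signatures: {validator: signed_root};
    validator_set: {validator: stake} on the source chain."""
    _, state_root = header
    signed = sum(stake for v, stake in validator_set.items()
                 if signatures.get(v) == state_root)
    return signed >= threshold * sum(validator_set.values())

# Restaked EigenLayer operators would run this check and be slashable
# for attesting a header that fails it.
validators = {"v1": 10, "v2": 10, "v3": 10}
header = (100, "0xabc")
sigs = {"v1": "0xabc", "v2": "0xabc"}
print(verify_light_client(header, sigs, validators))  # True: 20/30 >= 2/3
```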
To add a bit more on things like settlement layers: one of the throughput limitations of Ethereum comes from state growth, the rate at which the amount of RAM, or state, grows on Ethereum. There are two other limitations, which are non-parallelized execution and non-native execution. For example, if you're running zero-knowledge proof verification, instead of running it natively in C++ on bare metal, you're actually running it on top of an abstraction hierarchy. When you start building these other settlement layers, which are particularly tuned to settling rollups, you can have parallel zero-knowledge proof verification. These layers can offer native execution, in C++ or any default container, as well as zero state growth, because you're just checking a zero-knowledge proof, which is mostly memoryless. Some of these advantages can be containerized, and this is why we think the modular architecture pioneered by Ethereum is actually the right way of building these solutions: a layer of innovation, all of which can compose. When people talk about composability, they're usually just talking about dApps. What we're envisioning at EigenLayer is enabling distributed systems to compose.
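As a closing illustration of the parallel, stateless verification point, here is a Python sketch in which independent rollup proofs are checked concurrently; verify_proof is a stand-in for a real ZK verifier, and all names are assumptions.

```python
# Sketch of a settlement layer checking independent rollup proofs in
# parallel. Verification is memoryless: no growing on-chain state.
from concurrent.futures import ProcessPoolExecutor

def verify_proof(proof):
    # Stand-in for a real ZK verifier checking a proof against a
    # verification key; here we just compare the claimed state root.
    return proof["claimed_root"] == proof["computed_root"]

def settle(proofs):
    # Proofs from different rollups share no state, so they verify
    # concurrently, unlike sequential EVM execution.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(verify_proof, proofs))
    return [p["rollup"] for p, ok in zip(proofs, results) if ok]

if __name__ == "__main__":
    proofs = [{"rollup": "A", "claimed_root": "0x1", "computed_root": "0x1"},
              {"rollup": "B", "claimed_root": "0x2", "computed_root": "0x3"}]
    print(settle(proofs))  # ['A']: only the valid proof settles
```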
vVv: I just want to say thank you Sreeram for joining us today. It’s been an absolute pleasure and has been really eye-opening. Not only do you have incredible knowledge about what you’re building, but also of the space as a whole, which I think is necessary if you want to be a good founder and if you want to lead your project in the right direction. So I’m very much looking forward to staying in touch.
Sreeram: Thank you so much. I really appreciate this conversation and all the interesting questions that have been asked here. Thank you all for taking the time to come and listen.