Syntropy's AMA Recap: New Vision Reveal

By Cryptoray | Synternet | 6 Jun 2023


#Community Updates

Syntropy's latest AMA was arguably the most critical in the project's entire history. After unveiling Syntropy's bold new trajectory as the main Data Availability Layer for blockchain and Web3, CEO Domas Povilauskas and CTO Jonas Simanavicius took to Telegram to address the community, answering any and all questions about Syntropy's upcoming role as the main provider and platform for reliable, streaming, real-time blockchain data.

Set against a backdrop of well-informed, insightful questions and robust discussion with the Syntropy community, the AMA provided a forum for meaningful dialogue and exploration of the project’s new direction. Domas and Jonas dedicated their time, energy, and expertise to demystifying the nuances and intricacies of Syntropy’s new role in enhancing data deliverability in the blockchain space, how DARP will play a part, and potential new utility and value for the $NOIA token.

The session highlighted not only how the shift is indeed the best use case for Syntropy's innovative tech, but also its broader impact on the entire Web3 landscape. In case you missed it, here are all the key topics, discussions, and happenings from the latest AMA on Syntropy's leap into becoming the Data Availability Layer for blockchain and beyond.

Summary of the new focus reveal AMA

Will this affect OBX? Are there any plans to continue with validator staking?

The blockchain roadmap has for a while included our migration to Cosmos SDK, so we're already running a devnet chain and the data layer solution through the chain. We're still planning to launch the testnet and then the mainnet of the chain, which includes validators and staking. There are also more actors in the solution which are planned to have staking. As for OBX, the partners we already onboarded to our network to provide nodes for latency optimization are fully operational and optimize the network used by Stack users. The main technology parts of OBX to develop were the chain with the validator set and token, and then the marketplace smart contracts and the application running on top. As mentioned, we have migrated the chain, but the focus of development of the fully permissionless version of OBX has now shifted to the data layer solution, which together with our chain acts as the core of the Syntropy protocol. Our data layer solution is blockchain native, with accounting and payments going through the chain using $NOIA.

How far away is the final product?

Because we had already developed networking technologies, Stack, and DARP, we were extremely quick to deliver data layer streaming solutions. We already have a devnet running the solution, serving data from several chains, and we're giving developer access to Aptos Hackathon participants in a few days. They will be able to stream Aptos blockchain data live to their applications with lower latency than the available competition. The next phases are the testnet for the chain with faucet tokens, more chains added, different ways to access and stream data, mainnet, and data availability features. I'm expecting larger numbers of developers to already be testing streams from several chains in the testnet phase.

How simple and easy is the new product for blockchain projects to use?

Extremely simple, as it is a blockchain-native experience. A developer goes to our developer portal, logs in with a wallet, selects or creates streams, pays in NOIA tokens, gets an API key, and deploys to their applications / dApps.
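
To give a rough feel for that flow, here is a minimal sketch of what the "subscribe and consume" step could look like, assuming a NATS-style publish/subscribe client. The broker address, subject name, and token field are hypothetical placeholders, not the actual portal API.

```typescript
// Hypothetical sketch of the developer flow described above: the broker URL,
// subject name, and credential field are illustrative, not the real API.
import { connect, JSONCodec } from "nats";

async function main() {
  const nc = await connect({
    servers: "broker.example.syntropy.net",                  // hypothetical broker endpoint
    token: process.env.SYNTROPY_API_KEY ?? "PORTAL_API_KEY", // API key obtained in the portal
  });

  const jc = JSONCodec();
  // Subscribe to a stream the developer selected in the portal (illustrative subject).
  const sub = nc.subscribe("aptos.transactions");
  for await (const msg of sub) {
    const tx = jc.decode(msg.data);
    console.log("new transaction", tx);
  }
}

main().catch(console.error);
```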

Will the Web2 product still be available, i.e., can you still use NOIA to speed up and reduce the latency and packet loss of your internet connection?

Syntropy Stack, with the DARP network running optimizations, is fully operational, and we welcome further growth and user adoption. It can serve all Web2 industries that face internet latency optimization problems. When adoption grows further, we're ready to implement a token payment option to bring value to the token. The fastest way to start sending expensive data through our protocol, however, is the blockchain data streamed using Syntropy, which is a fully token-native experience. Low-latency, easy-to-consume blockchain data is in high demand today and projected to grow significantly over the next few years, while decentralized communities and developers struggle to get any access to such data.

What phases of the previous roadmap will be abandoned?

The Data Layer solution took priority over the fully permissionless OBX application on our chain. As I mentioned, we migrated the chain to Cosmos SDK and ran a devnet with our Data Layer solution built on top.

Syntropy Stack, with the network and onboarded partners, is fully operational for Web2 clients for latency optimization; we urge developers to use it for their network management needs too.

Can blockchain projects go ahead and use the new product themselves or do they need to contact you first?

It is a fully permissionless, blockchain-native experience, using the token and logging in with a wallet as a developer. One of the core principles of our product is to ensure seamless access and autonomy. Blockchain projects can indeed use our new product without any prerequisite to contact us. Our protocol has been designed to be completely permissionless and decentralized, thereby bypassing the traditional Web2 sales cycle.

Will OBX remain the main target for future use?

The move to provide this low-latency data layer for blockchain data is because we believe we can get faster adoption and send a lot of data through the protocol and token economy. Hundreds of chains added to the layer and millions of applications streaming and interacting with blockchain data will be the main driver of data usage growth for the data layer and the token in the near future. Latency optimization traffic usage is growing in a more organic way, while Web3 is now facing unprecedented scalability challenges and growth.

Are there conversations with any Web3 company that is already interested in your new product to be an early adopter?

Yes, in the process of developing our new product, we have engaged in numerous strategic discussions with key players within the Web3 ecosystem, including founders, investors, and builders.

Over the last eight months, we've built a solid network in the Web3 industry, and a number of these connections are actively using elements of our early-stage protocol, validating our work and providing valuable feedback. Our technical team has done a commendable job in ensuring the protocol's readiness and ability to handle the high demand for real-time on-chain data.

Looking ahead, we are now going after Layer 1s to integrate their data into the network and working closely on ecosystem partnerships so our technology becomes easily available to all developers building on these chains. The Aptos hackathon is just the first step in this extensive business development plan.

We're planning to build a strong ecosystem of all the Layer 1s and Layer 2s integrated into the network. It's the only protocol in the space today that allows any full node to start publishing data openly, so the possibility of building the biggest ecosystem for blockchain data is very real.

How is it going with PCCW and Entain? Are they still on board?

Yes, 100%. Syntropy Stack with the relay network is fully operational and will continue to be used by our users and supported by the partners. Nothing changes here.

Why should a client use your data and not data from an oracle that is already in the space?

Our data streams use a decentralized broker system, so they have higher availability, as decentralized systems do when they mature, and they are lower latency than most centralized providers. Anyone can add another data source or publisher because it's an open protocol, so you don't need to wait for a centralized provider to add a new chain or stream. We also provide unique functionality to have the data filtered, and historical data streamed as well, which is better technology than what most indexers offer through APIs. It also provides for communication and spreading messages, so you don't need to make hundreds of requests to get full metadata, plus many other features. Finally, our protocol, with many chains on it, will have incentives for nodes to publish data, thus creating their own consensus layers. Data Availability products and proofs can be built on Syntropy as a protocol to provide scalability and data for millions of apps, instead of going to centralized providers or using native blockchain networks.
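
To make the "anyone can add another publisher" point concrete, here is a minimal, hypothetical sketch of a full node pushing block events into a broker network. The broker address, subject naming, and event shape are assumptions, not the protocol's actual interfaces.

```typescript
// Illustrative publisher sketch: a full node forwards each new block it sees
// to a broker so subscribers can consume it. All identifiers are placeholders.
import { connect, JSONCodec } from "nats";

interface BlockEvent {
  chain: string;
  height: number;
  hash: string;
  timestamp: number;
}

async function publishBlock(block: BlockEvent) {
  const nc = await connect({ servers: "broker.example.syntropy.net" }); // hypothetical
  const jc = JSONCodec<BlockEvent>();
  // Publish under a chain-scoped subject so subscribers can filter by chain.
  nc.publish(`${block.chain}.blocks`, jc.encode(block));
  await nc.drain(); // flush pending messages and close the connection
}

publishBlock({
  chain: "aptos",
  height: 12_345_678,
  hash: "0xabc...",
  timestamp: Date.now(),
}).catch(console.error);
```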

Messaging protocols like Layer 0 would be a subset of what Syntropy Data Layer is capable of?

In general many solutions can be built on such raw data streams:

  • Data availability solutions
  • Historical publishers
  • Different bridging solutions
  • ETL data publishers
  • Oracles
  • High availability publishers

Could it be said that Aptos will adopt the technology?

We are going to the Aptos Hackathon to have a workshop with their developers - fingers crossed.

How is Syntropy different from Celestia DA or is it the same thing?

Celestia is building a modular blockchain that uses DA technology to be a new blockchain, while other DA-specific technologies focus on specific blockchain scalability solutions. Our data layer is a full one-stop shop for on-chain data. Someone mentioned the Bloomberg analogy; I like it. First, it's low-latency streams of data. Most indexers first pull data to their servers and then serve it, which introduces latency. Our real-time data streams are truly real time, going straight from blockchain node publishers to applications through our DARP-optimized broker network. Second, the ability to stream and get historical data, filtered to only your needs, saving a lot on infrastructure. Third, the DA layer you can build on Syntropy is unique because we'll have many publishers of blockchain data from several sources. Features like encoding, sampling, and other DA-related transformations of the data can be done and stored on top of our layer and readily delivered to applications, so apps can trust data from us like light clients of their respective blockchains.
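
As a hedged sketch of the "historical data delivered in the stream" idea, here is what a request for a block range could look like, assuming a request/reply pattern on the same messaging fabric. The subject name, payload shape, and mechanism are assumptions for illustration, not the documented API.

```typescript
// Hypothetical historical query: ask a publisher for a filtered block range
// and consume the reply like any other stream message.
import { connect, JSONCodec } from "nats";

async function fetchHistoricalRange(fromHeight: number, toHeight: number) {
  const nc = await connect({ servers: "broker.example.syntropy.net" }); // hypothetical
  const jc = JSONCodec();

  // Illustrative request/reply against a historical publisher.
  const reply = await nc.request(
    "aptos.history.blocks",              // illustrative subject
    jc.encode({ fromHeight, toHeight }), // illustrative payload
    { timeout: 5_000 },
  );

  console.log("historical blocks", jc.decode(reply.data));
  await nc.close();
}

fetchHistoricalRange(1_000_000, 1_000_100).catch(console.error);
```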

Can we expect $NOIA in Cosmos Network?

Yes, our chain is running as a devnet on Cosmos SDK now; we're planning the testnet and mainnet later, and our token will be native to the Cosmos ecosystem too.

Can you explain the new vision like I’m five?

You know how you can watch cartoons on TV anytime you want, right? Syntropy is like that, but for people who build stuff on the internet using something called blockchain. It lets them see what's happening on the blockchain right now and also what happened before, kind of like being able to watch a new cartoon or an old one whenever they want. And just like you can trust that your favorite cartoon will always be there in new televisions, Syntropy makes sure that the important stuff on the blockchain is always available for millions of people who need it.

Where are the tests against competitors that have the lowest latency?

Yes. We did the research and are going to show it during the Aptos Hackathon and later release it publicly. What is very significant with this shift to Web3 is that our routing protocol allows the creation of a service which can be the most competitive in the market. We were always searching for where optimizations can actually make a huge difference, and we now see from all the results that it makes Syntropy the fastest data provider, including for mempool data, which is very expensive and inaccessible on most blockchains, whereas Syntropy makes it very fast and accessible to any developer.

Is it like an rpc layer? I'm guessing all of this data from other blockchains isn't gonna be stored on-chain. Also how does DARP play into this? With this pivot it sounds like there won't be a focus on node relayers, or will there? Will you still be able to run a relay node and earn rewards that way?

DARP was our initial technology, which Stack spawned from. DARP finds the fastest paths in the network, and our decentralized version of DARP is exclusively focused on grouping nodes using latency information for lowest-latency data delivery. The protocol we use for broker networks and to deliver data streams also has grouping inside. Publishers also need to select the brokers through which to deliver data to the application. So all of our network is optimized for lowest latency; even now, once publishers are connected, data is streamed from the blockchain to the app in a low-latency way.
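
For illustration only, here is a toy version of the kind of selection such latency-based grouping automates: pick the broker with the smallest measured round-trip time. Broker addresses and the probing method are placeholders, and the real protocol presumably does this continuously and at the network level rather than in application code.

```typescript
// Toy "pick the lowest-latency broker" selection, for illustration.
interface Broker {
  address: string;
  rttMs: number; // measured round-trip time to this broker, in milliseconds
}

function selectFastestBroker(brokers: Broker[]): Broker {
  if (brokers.length === 0) throw new Error("no brokers available");
  // Choose the broker with the smallest measured round-trip time.
  return brokers.reduce((best, b) => (b.rttMs < best.rttMs ? b : best));
}

const chosen = selectFastestBroker([
  { address: "broker-eu.example", rttMs: 42 },
  { address: "broker-us.example", rttMs: 87 },
  { address: "broker-asia.example", rttMs: 19 },
]);
console.log(`publishing through ${chosen.address}`);
```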

RPCs vs APIs

Someone asked about RPCs vs APIs. Millions of apps use RPC, and thus it's not scalable: everyone needs to run their own node, use public endpoints, pay providers, or use decentralized RPC providers if there will be any. In our case, applications can subscribe to the streams they need and still get them through a decentralized system. So you can monitor only the data or smart contracts you need, or build event-driven architecture apps. RPCs are request/response, which is a fundamentally different way of communicating; streams are push and one-to-many. We have showcase applications which can be fully decentralized, get data from several blockchains, and show prices or NFTs or whatever with zero backend infrastructure. This can actually enable truly decentralized applications with heavy event-driven blockchain needs, from DeFi, NFTs, and DEXes to standard applications that need to interact with blockchains and either pay millions for centralized data providers or run their own RPC or blockchain node infrastructure.
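
To picture the contrast, here is a side-by-side sketch of the two access patterns: polling a JSON-RPC endpoint versus subscribing once to a push stream. The RPC method shown is the standard Ethereum `eth_blockNumber`; the broker address and stream subject are hypothetical.

```typescript
// Request/response vs push: a minimal illustration of the two patterns above.
import { connect, JSONCodec } from "nats";

// Pattern 1: request/response — the app must poll the RPC node repeatedly.
async function pollLatestBlock(rpcUrl: string): Promise<string> {
  const res = await fetch(rpcUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "eth_blockNumber", params: [] }),
  });
  const { result } = await res.json();
  return result; // hex-encoded block number
}

// Pattern 2: push — the app subscribes once and events arrive as they happen.
async function streamBlocks() {
  const nc = await connect({ servers: "broker.example.syntropy.net" }); // hypothetical
  const jc = JSONCodec();
  for await (const msg of nc.subscribe("ethereum.blocks")) {            // illustrative subject
    console.log("pushed block event", jc.decode(msg.data));
  }
}
```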

Will Syntropy Stack be deprecated in the future?

I mentioned it's fully operational and growing adoption more organically. We just expect on-chain data to be a much hotter commodity and to bring traffic through our protocol sooner than we expected Stack users to deliver it in the immediate term.

Near-term vision

This is the biggest opportunity to grow our ecosystem. We're planning to onboard many chains, as well as open the protocol for the public to onboard any publisher with their own chain or several sources, to later provide cross-validation of data, which is a huge topic. People can build things like validation proofs, etc. Just by implementing several chains which have sub-ecosystems, like parachains or app chains, you can onboard hundreds of projects' data to the layer, and thus thousands of developers building for those projects get data access to their chains, which is unavailable at centralized providers or requires running their own expensive infrastructure. I'd like to see hundreds of chains and parachains and thousands of streams connected, so that any developer can use Syntropy with a single click to get data and start developing instead of running an RPC or full node, while dApps can just stream data and be live and dynamic without separate off-chain backend infrastructures.

So, we’ve got DARP and the stack now with this new layer. Do we not think that the best form of advertising would be to get listed on Binance and Coinbase? Everyone who works in tech has heard of those but not everyone who works in tech has heard of Syntropy.

This is literally why we need this protocol which is blockchain native, using the token, with thousands of developers using the protocol in a native way through the wallet, etc. This gives true token utility and opens easier access to listing on exchanges and true adoption by the Web3 ecosystem. The surest path to listings, as well as DEX volume, is actually making a lot of data available and having the Web3 community use the protocol; this gives the wallet count and adoption required to get approval from the biggest exchanges.

Are we going to push to be listed in the big exchanges then?

Protocols get listed when they are used and the community needs them to be available; that's the root of it. Our plan has always had one goal, which is to deliver value to the token, and this is by far the best action for the project. As we are super lucky to be in the networking space and had developed the technologies before, we could deliver this in several months, and blockchain data is now the hottest area in the blockchain ecosystem in nominal usage.

So the nodes of the DARP Relay Network will function as the blockchain node runners?

Technical details will come out; blockchain validators will be separate nodes, and there will be other nodes to run, like brokers, publishers, and more. This AMA is about the announcement, but the details will emerge on new pages on the website and in blog posts.

This new vision sounds like The Graph and Chainlink under one token.

Include substreams.io and you're about there - that's the Pillar II.

Will it also be possible to integrate data streaming related to DARP, including node lists, relay paths, usage paths, rewards, potential relays, etc. ?

Interesting idea, though for now we're focused on adding the first Layer 1s to the layer, because there is already huge demand for that data.

When will the test network be launched? And when is the mainnet?

The devnet is running now, and devs will already be using and testing the protocol in a few days at the Aptos Hackathon. The testnet with faucet tokens for more public use is coming later this year. More details and information will come out during the next month as we produce content and go live with information and documentation. I'm sure things will get much clearer as the website and information grow; still, I wanted to give as much context as possible during the initial website launch to introduce the protocol, as the Hackathon is just a few days away.

Blockchain data is always available anyways isn’t it? Like I can go on ether explorer and see all transactions on ether. What am I missing here, what is the value add?

We'll need to do much more education on use cases, etc., which we're doing, so keep checking around, but I'll try quickly: where do you think the explorer gets its data from? Any explorer or dashboard, or any application that shows prices, or even Uniswap when you enter a token and it needs to give a price? These apps/dApps use either centralized providers or indexers to buy that data, which is actually very expensive, so that they can use quick APIs or RPC requests to get it and don't need to run that infrastructure themselves. It's also not always real time, or it has lower performance. Other companies like Infura or Ankr offer to host RPC nodes for you so you don't need to run your own infrastructure. Still, you either run your own or use a third party with some features (indexed, filtered).

Public RPC nodes and public APIs are just not suitable for high-scale use and can't guarantee performance, or even that they'll work at all. For example, when you go to MetaMask to make a transaction now, you connect to one of Infura's gateways; if it goes down, Ethereum still works, but if you enter a non-working gateway in your MetaMask, you don't get data and can't send transactions. With Syntropy, if you're a dApp and, let's say, only need to display specific NFT data from Solana or something: normally you need a centralized provider endpoint to stream data from, or you run your own backend, and running a Solana node can be 3,000 EUR/month, plus you need DevOps; to monitor 3-4 chains it can cost 1M+ a year. Whereas if you subscribe only to the specific topic you need from Syntropy, you get those events and data delivered to your app, only what you need, without running all this blockchain infrastructure or using centralized indexers. And because we want to build a full data availability layer, you can get both the real-time events you need streamed to your dApp/app and query historical data to be delivered in the stream. So the benefits for apps are: no backend needed, cheaper, more scalable, decentralized, open (so many more streams available than with centralized, curated solutions), more convenient, and FAST. That's the protocol, and in the future, scaling blockchains.
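
The "only what you need" point can be sketched as a narrowly scoped subscription: instead of running a Solana node, the dApp subscribes to one collection's events. The subject naming, wildcard layout, and credentials below are assumptions for illustration only.

```typescript
// Hypothetical narrowly scoped subscription: all event types for a single
// NFT collection, consumed directly by the dApp with no backend indexer.
import { connect, JSONCodec } from "nats";

async function watchCollection(collection: string) {
  const nc = await connect({
    servers: "broker.example.syntropy.net",                  // hypothetical broker endpoint
    token: process.env.SYNTROPY_API_KEY ?? "PORTAL_API_KEY", // placeholder credential
  });
  const jc = JSONCodec();
  // NATS-style single-token wildcard: every event type under this collection only.
  const sub = nc.subscribe(`solana.nft.${collection}.*`);
  for await (const msg of sub) {
    // Render the event directly in the dApp; nothing else to host or operate.
    console.log("collection event", jc.decode(msg.data));
  }
}

watchCollection("example-collection").catch(console.error);
```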

What services cost that much?

Running servers, if you do it manually. Some blockchain nodes are huge; Solana, for example, is famous for needing bigger, more powerful machines. If you run full blockchain infrastructure with redundancy, that's many machines and a DevOps team, which for a company can be millions a year.

Devnet Update

The devnet is running, we're already giving access to devs at the Aptos Hackathon, and we have several chains and data streams onboarded. The timeline is a testnet of the chain running this, with more devs using faucet tokens, later this year. The protocol has been developed over at least the last six months, using pieces of technology we already had, for this launch. At this point we're going for the testnet phase to already give access to developers, hence the timing of the website launch. Many more details will come out in the upcoming weeks.

Would be nice to also see the revenue of the “old” Web2 services go towards the token as promised either by FIAT gateway or market buys.

We keep the commitment to bring the value generated by our technologies to the token. This priority shift is a necessary move in these market conditions to grow our ecosystem within Web3 and kickstart token utility, which we saw would take too long if we relied on Web2 companies and our single product to provide it to the protocol and ecosystem.

Should we expect an update on tokenomics sometime in the next 6 months or so?

Yes.

To which projects would you compare NOIA as of now to have a somewhat similar use case?

The Graph, substreams.io, and centralized ones like Alchemy, Moralis, etc.; everyone in the blockchain data space can be seen as providing the same use cases. We are unique in our low-latency, real-time data streams, which indexers don't do, and we're not an indexing API; even our historical data will be in a different format, so we have a unique implementation to deliver data fast and in a convenient way. But the use cases are scaling blockchain data for all apps and dApps which need price feeds, history, analysis, real-time events, security, etc. from blockchains.

Is the primary selling point the speed at which data is delivered and that Syntropy is likely to be the fastest one?

From the testing we've done so far, for the chains we added we were mostly the fastest compared to other providers, with one centralized provider being somewhat on par for some data. Our head of engineering will give a presentation at the hackathon and present the performance metrics.

A comparison to Chainlink has been made, what do you think of that one more specifically?

Chainlink delivers off-chain data on-chain and also does data consensus. We deliver on-chain data off-chain. But we will also be able to have many publishers of the same data, and people can build cross-validation of blockchain data on our layer. Also, Chainlink does guarantee consensus over the data; it's slower, but the data is guaranteed to be correct, at lower performance. We first deliver raw on-chain data, as soon as it happens, from your chosen trusted publisher.
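
A toy illustration of the cross-validation idea mentioned above: compare the same block as reported by two independent publishers and flag any mismatch. The publisher names and event shape are hypothetical.

```typescript
// Toy cross-validation of one block reported by two independent publishers.
interface PublishedBlock {
  publisher: string;
  height: number;
  hash: string;
}

function crossValidate(a: PublishedBlock, b: PublishedBlock): boolean {
  if (a.height !== b.height) {
    throw new Error("cannot compare blocks at different heights");
  }
  return a.hash === b.hash; // agreement between independent sources
}

const ok = crossValidate(
  { publisher: "node-operator-1", height: 1000, hash: "0xaaa" },
  { publisher: "node-operator-2", height: 1000, hash: "0xaaa" },
);
console.log(ok ? "publishers agree" : "publishers disagree - investigate");
```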

Will there be any educational data for the community? It would be easier to understand the concept.

Education info is coming in general and wasn't meant for today; however, we had limited time to prepare, and we're going to give our tech for testing at the Hackathon already, thus going public with what we created. Now it's time for the tech team to deliver that and start onboarding new chains, and that's happening soon. As I said, we developed for quite some time, and the protocol is actually working on the website if you click to stream Aptos data. The rest of the team will now focus on partnerships with the ecosystems, reaching their devs, and educating about what we do and why. I'm looking forward to all the educational material going forward, and to trying myself to explain the new protocol as best as we can over the next weeks on all possible mediums.

But is 100ms a big factor in a data stream? Are there use-cases where the absolute fastest speeds are needed?

I'll just give an example: in our testing, Aptos data from our stream was 1.7 seconds, or 5 blocks, ahead of a competitor. 5 blocks is 5 blocks. For some it doesn't matter; for some it's absolutely crucial.

How does the Web2 use case TAM compare to this new Web3 use case TAM in your perspective?

The TAM for blockchain data is growing with the blockchain space; the question of which is bigger is really market-share specific, not about TAM. Let's just say blockchain data is now in demand, expensive, and will increase exponentially with any blockchain adoption, because it used to be enough to query an explorer just to check balances, whereas now any app that interacts with a blockchain in any significant way basically needs to monitor the chain and the activity inside it. The TAM for latency optimization really depends on access mechanisms. The first product we did (Stack) uses WireGuard tunnels to optimize connections, so you're looking at use cases where people can manage their infrastructure that way. So if the TAM for DARP or latency optimization alone is the whole internet, the market share you get depends on how the protocol can be used. We see that by providing low-latency on-chain data we can now get a significantly bigger market share and nominal traffic numbers for our protocol.

Could you maybe elaborate on the use case for DEXes to sparkle our empty imagination? Simple example how it could be noticed by users.

DEXes need asset prices and balance movements from all chains as soon as changes happen, and if you can deliver that in a decentralized way, it actually makes them more of a DEX. For example, the biggest DEXes you use and know today actually use centralized providers or, if you inspect the console, data coming from their own backends or S3.

Summary / End of the AMA

The magnitude of this is much bigger than it sounds. I'm now just looking forward to adding as many chains to the protocol as possible, with teams working on adoption, meaning going through the dev communities of the respective added chains as well as businesses consuming blockchain data, while the protocol and access to data remain permissionless. It is also just another beautiful piece of technology we built using our expertise, the network software we already had, and the DARP protocol, which was a major piece of work and has patented technology. Stack was delivered and works, but we want more usage on the network. The Web2 world is too slow for us to survive on, so we made sure to add technology for the Web3 community, because it's decentralized and fast and they need data now. Ever since we found out how to cut latency, the search for who needs it most started, and we always wanted to be a decentralized token economy. This is by far the best opportunity now to get even more traffic into the Syntropy economy, which we couldn't yet have with Stack alone, where growth isn't as fast as we'd like, unlike so many projects now which missed the narrative or delivered technology that isn't relevant and can no longer be used to provide value to their ecosystems. While we can, we quickly delivered another protocol. That's the rationale I stand by, and as I said, we will now focus our energy on onboarding chains and providing data, as well as the other ecosystem growth strategies we've chosen. Kudos to the team who worked so hard through the hard times, not giving up and delivering another piece of fabulous technology in such a short period of time and in such market conditions.

Final word

The takeaway from the AMA is clear: Syntropy is not just responding to the evolving Web3 landscape, it's actively shaping it.

By introducing the Data Availability Layer, Syntropy is aiming to make blockchain data not only more accessible but also a critical part of the entire Internet's infrastructure. Through partnerships, educational initiatives, and a commitment to its mission, Domas and Jonas made clear that the project is facilitating the transition from the Internet of today to a more decentralized future. The Syntropy story continues to be one of ongoing innovation, dedicated effort, and proactive progression.

This “a-ha” moment of recognizing how Syntropy’s tech stack can make the broadest possible impact offers a preview of how the Data Availability Layer will positively impact dApps, both Web2 and Web3 organizations, and most importantly – the Syntropy community.
