Chainbinders Development Insights

Preamble

It’s the end of March as I write this, and I am finally able to put together a coherent post on everything that’s been happening in my professional life. For those who might be living under a boulder ten thousand leagues under the sea, I wrote some time ago about Doki Doki, how it would be the next big thing and how the NFT world would change due to their unique value proposition and inspired design ideas. The asset back then was trading at a 1.5 million dollar market cap, a value which has since ballooned to over 20 million as of writing this post.

Of course, making a correct call in a bull market isn’t something that’s particularly difficult – I don’t really care for the accolades, but I do note it here because most readers do. No, in this post I wanted to get into the specifics of my personal involvement in the project over the last two months: the reason for no post in March, and what I think will be the next big thing to hit NFTs – Chainbinders.

What is it anyway?

NFTs piss me off. I mean, they really piss me off. When I see “generative” art that is made by a computer and had no effort whatsoever put into its creation (aside from setting up some initial values), I get upset. When I see said generative “art” raise tens of millions of dollars, I get very upset. When I see clones do the same thing and achieve the same results with an even less original idea, I get beyond fucking upset, I get VERY fucking upset.

I think the NFT space is due for a shake-up. We’re not really legitimizing the space by shilling endless computer-generated art for hundreds of thousands of dollars (in some cases millions), so I decided to do it myself. Naturally, the Doki Doki team was in reach around the time of my first article on how bullish I was on their token, so we decided to get together for a little collab: a one-time project under the Doki brand that would change things forever. And so Chainbinders was born.

Right, but what is it?

Chainbinders is part game theory, part art collecting, part gambling, and part anime. It’s got elements of things ranging from Final Fantasy to NieR:Automata. It’s a combination of everything I love about the various mediums I’ve enjoyed in my life, thrown into a pot and shipped on the blockchain. It’s hard to explain what exactly it is, because it encompasses so many things. I suppose I’ll put to bed the theories about this being a game, however – seriously guys, would you play a game developed in 6 weeks? It would be awful!

At its core though, Chainbinders is an NFT experience the likes of which has never been seen before, and likely never will be again. We’re creating original IP and characters (a massive cast of 15 of them!), giving each their own backgrounds, lore, stories, and motivations, and then gamifying them on the marketplace with some clever token mechanics to make sure these NFTs have instant liquidity and actual value.

That’s the long and short of it – crypto degens will be able to fully realize their NFT gains, collectors will be able to amass a huge set (over 100) of original cards, and gacha lovers will be able to enjoy one of the most memorable entries in the genre to date.

I can’t go over the exact details until launch of course (just 1 week away!), but these were the general aims of what Chainbinders sought to accomplish in its development cycle. This post is mostly to discuss more about that dev cycle and the sorts of challenges we faced during development.

Art, Music, & Lore

It comes as somewhat of a shock to me that the veritable mass of NFT projects all feature absolutely horrible art. I don’t mean that as an overstatement; I personally find nouveau art like Hashmasks to be utterly appalling, the blockchain equivalent of an Andy Warhol painting. Sure, it might sell for a lot of money, but that’s not really compelling to me.

Getting the art right for Chainbinders was the first step we took in delivering a real experience for folks, unlike anything seen before. It was a measured approach, something we had to personally guide from beginning to end, whether with our NFT designers, animators, or artists. When creative direction is this closely involved with the general process, you end up with amazing cohesion, a level of quality that you simply can’t achieve by hiring talent and setting them loose.

On the musical side, we had several composers from different backgrounds come together to create pieces for this project, which was a challenge in and of itself. Artistic variation is generally more accepted than musical variation, and for this reason it can sometimes be risky to bring independent music talent together.

The way we dealt with this issue was quite simple, really – each of the Chainbinders has a very specific thing they do very well thematically, and rather than keeping the musical talent chained (ha) to one particular style, I had them explore many subthemes even within their own work. You’ll hear tracks from the same producer that are wildly different, and this is by design – when the variation is thematic in nature, it becomes seamless as a whole, a sort of contrapposto of musical notes and elements.

As for the lore, this was where I could strut my stuff and create a believable world. Writing and lore is a department practically every creative project likes to skimp on. In a sense this was true for Chainbinders too – I was the only writer on a staff of over 40 – but since I was also leading creative decisions from start to finish, I could apply my skillset to every part of the experience.

Which, by the way, was a horrible idea, at least for my free time – writing what practically amounted to a novel in about 30 days, start to end, was not the easiest task, I will admit, on top of all of the art prompts, direction, and lore-driven guidance necessary to make the product whole. Still, if you plan on doing something, you have to do it right. In the case of Chainbinders, we wanted to deliver a triple-A experience, and that involved deep lore exploring many aspects of these characters and their lives – who they are and what makes them tick. It helped that I had already written a fantasy novel, as I had to use every trick up my sleeve to get this story out properly. The final synthesis of that process is the world of Chainbinders, a near-universe of story and lore to explore, each character with twisting paths and arcs that are hopefully compelling enough to get an audience to really buy into these NFTs – not just with their hard-earned money, but emotionally.

Innovating in Blockchain

As for the crypto elements, gacha is an already-amazing innovation that we’ve seen absolutely blow up on the Doki side. They have released two machines as of the writing of this post: the first, Cryptochibis, sold out in under 3 hours; Miguel Garest’s Sushi machine sold out in under 2.

The future is DeGacha.

But no, that wasn’t enough. We couldn’t simply create the world’s most detailed NFTs and sell them through the world’s only Ethereum gacha machine (now with sub-penny transaction costs on Matic). We had to do something more. Much more.

Details on exactly the mechanics we’ve implemented for Chainbinders are on a strict need-to-know basis until we actually release the product, but needless to say, you’ll be awed and fomoing your life savings into these things with the fervor of a retail normie learning about GME.

Blitzscaling

Running projects is hard. Putting together a team of over 40 people with their own schedules and actually delivering on something is even harder. Doing it in 6 weeks? Well, now you’re on Dante Must Die mode. One of the annoying things I find about crypto is just how long things take to get out – in reality, if you have a decent project manager, you can cut down a lot of that time and turn it into productivity. This sort of development sprint is not something for the faint of heart, but it does put other projects on notice.

In total, Chainbinders had over 30 artists, 3 front-end developers, 3 independent auditors, 3 music producers, 2 animators, 2 translators, and a writer – not to mention the additional resources we required: voice talent, talent scouts, and so on. Keeping everything on schedule was a gargantuan undertaking, and growing from zero to the final product was certainly no easy task!

Conclusion

Chainbinders takes everything we love about crypto, anime, and games and turns it on its head. It’s a wonderful world of strange characters, superpowered villains, and dangerous weather all wrapped into a blockchain format that is sure to both awe and inspire.

Look forward to Chainbinders on April 8th!

NFTs Enter The Golden Age

Preamble

Non-Fungible Tokens have been around the crypto ecosystem as far back as 2012. The very first papers discussing their potential even had some familiar names attached to them, Vitalik Buterin being one (though this was before Ethereum was truly conceived). At its core, an NFT is a token representing some non-crypto-specific thing on-chain. The avid reader of this site should have a general understanding of what NFTs are, so we will skip the simple things, but the most common application of NFTs is artwork. Variant NFTs have been made that contain things like videos and even puzzles.
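
To make the “non-fungible” part concrete, here is a minimal sketch of the core data structure involved – our own illustration, not any particular standard’s interface (real contracts such as ERC-721 carry far more machinery):

```typescript
// Minimal sketch of fungible vs. non-fungible token bookkeeping.
// Illustrative only; real contracts add approvals, events, safety checks.

// Fungible: every unit is interchangeable, so one balance per address suffices.
const fungibleBalances = new Map<string, number>([
  ["0xalice", 100],
  ["0xbob", 100], // Alice's 100 tokens and Bob's are indistinguishable
]);

// Non-fungible: each token ID is unique and individually owned.
const nftOwners = new Map<number, string>([
  [1, "0xalice"], // token #1 might be a rare artwork...
  [2, "0xbob"],   // ...while token #2 is a different piece entirely
]);

function transferNft(tokenId: number, from: string, to: string): void {
  if (nftOwners.get(tokenId) !== from) throw new Error("not the owner");
  nftOwners.set(tokenId, to); // the specific token moves, not a quantity
}

transferNft(1, "0xalice", "0xbob");
console.log(nftOwners.get(1)); // "0xbob"
```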

It’s an interesting use of blockchain, to be sure – and though the value proposition might not seem obvious, one need only spend a bit of time on various crypto-specific forums and Twitter to understand just how compelling these ideas are: how quickly new NFT platforms can grow and spread, how they can give artists exponential revenue potential, and how transformative the whole concept can be. Of course, dear reader, you don’t come here to learn about the metaphysical elements of universal art tokens. No, you’re here to make money, and that’s the thread we will be following today: how these NFTs are tokenizing talent, and how to profit from them massively.

Growing Interest

It’s no small statement to say that NFTs are currently the latest craze hitting the markets, with new projects and platforms hitting Ethereum nodes near you at a pace that would overwhelm even the most studious of market researchers. Tokens minted for just the cost of gas (ha!) go on to sell for six figures or more, all in the space of months, weeks, and in some rare cases, days. The reason for these outsized returns has a lot to do with the relatively small sliver of people who can actually tell the wheat from the chaff, and who take that informational edge to later sell to someone else at a healthy premium. NFTs are a niche within a niche, even smaller than the DeFi sector, which is why it becomes trivial to find and generate alpha for those intrepid enough to look.

If anyone here remembers CryptoKitties, you’ll recall how it was the first true NFT platform – and how it completely bottlenecked the Ethereum network for weeks on end as people crafted their digital cats on-chain. That hype quickly fizzled out, partially because Ethereum couldn’t support the app, but also partially because it was too early to market. Three years later, what has changed? Plenty – and this is the core of our investment thesis and why we think NFTs are quickly entering their Golden Age.

These things clogged Ethereum for weeks on end.

Look at your typical crypto investor/trader. What are they putting their money into? Where are they diverting resources, and what sorts of products are they after? In 2017, platforms were the hype – Ethereum was the rage, and if you slapped a shoddy GitHub together and got Ian Balina to talk about your project, you too could be an “ETH competitor” and rake in tens of millions. It was an easy cash grab, and that was where all the focus was. Fat protocols were the only game in town. Fast forward to 2021, and the environment is a lot more varied. DeFi in particular has led the charge, and the myriad bootstrapped projects in the ecosystem vie in fierce competition for the liquidity of each actor in the system. Though many of these are simple APY draws (earn $X by staking your tokens), many others are becoming increasingly interesting, and this never-before-seen attention to utility and fundamentals sets the stage for the rise of NFTs.

The Golden Age

In many ways, NFTs are really the final synthesis of crypto economics, game theory, and killer features that only blockchain can provide. Take a look at the field, and it seems the market agrees:

[Image: headlines about CryptoPunks NFT sales]

With headlines like these, it is no wonder that people are flocking by the hundreds to try and find the next rare NFT or speculative investment. Of course, we’re in a bull market – so let us temper your silly ideas of buying just any trash NFT and making it big (remind anyone of 2017?). We are of the opinion that very few of these platforms are actually worth their salt, and even fewer will stand the test of time. What’s appealing as an investor is more than just a quick cash grab: it’s longer-term fundamentals. That’s where you really make the big bucks. A few of these are part of our portfolio, of course, but none are as interesting as our investment in the DOKI project.

Pivots


When DOKI first started, it was just a typical DeFi shitcoin, the type you would expect from the deluge of YFI clones after Andre Cronje famously made them a mainstream topic. There was absolutely nothing special about the token other than the people involved – they might have been anonymous, but they were brilliant. Those brilliant founders ultimately decided to switch from a prototypical DeFi project to one focusing exclusively on NFTs – and what a change it has been. While other NFT projects focus on stupid things like unsustainable yields and hype from getting Twitter influencers on board, this team has been quietly working on delivering their decentralized gacha experiment – one that has been returning huge numbers. On the first day of their product release, Doki outperformed the volume of Rarible (the leading NFT platform) by over 10x. You heard that right.

Of course, crypto traders are as generally unintelligent as they are laden with capital to hand over, and the fact that DOKI was no longer doing DeFi made them collectively shit their pants and sell en masse. Damn the fundamentals, right? Smarter investors will pick up their bags at a wonderful discount – smarter investors such as the ones reading this very article.

But what makes gacha special anyway?

For those who don’t know, gacha games were first introduced to mainstream users a decade ago through various mobile games. The concept involves taking people’s hard-earned cash and trading it in for 2D waifus, power-up items, or other nerdy things. It’s an absolutely massive market, particularly because it is just about as close as you can get to gambling without actually calling it that (same with lootbox mania). The global gacha market has generated billions in revenue, and it’s a tried and true formula that people continue to enjoy.

The interesting thing here is that Doki’s product is the only one of its type on the market. They have a beautiful eye for design, the user experience is fun and engaging, and the product truly speaks for itself. Needless to say, the artwork is also a tier above anything else out there right now. Go try and roll their machine if you want to be convinced; that experience will do more than this article ever could. This unique value proposition, combined with good fundamentals, ultimately makes it an attractive purchase for the long term.

So what do we have here? We have a few elements in our pot that are starting to come together: pseudo-gambling (degenerate crypto anons are more than happy to tell you about their own streaks), NFTs entering their Golden Age, all topped off with the world’s most insane crypto bull run. What do you get when you combine all of these? We won’t do all your homework here, but you do the math.

Conclusion

NFTs are transformative in the way they tokenize talent, lucrative in the way their mechanics reward early investors, and interesting beyond typical speculation. It’s a concept that crypto believers keep coming back to over the years – and it may prove in the long run to be one of crypto’s killer features. We’re heavy proponents of the NFT Golden Age, and we are investing heavily in select projects – which we’ll be selling at a hefty premium when the herd catches up.

Halo out.

DFINITY – A New Beginning


If you have been in crypto for almost a decade, as most of us have, you will have noticed an unfortunate trend that continues unabated: projects operating in the space claiming paradigm-shifting breakthroughs and novel technical solutions to heretofore unsolved problems spanning the realms of game theory, cryptography, and distributed systems.

Yes, crypto-fatigue is real, and the constant stream of upstart competitors offering dubious claims about their unique approaches to scaling, security, UX, and crypto-economics can quickly turn even the most starry-eyed newcomer into a cynical, seasoned investor ready to discount the latest entry in the space as just another self-styled Ethereum Killer – one whose lofty claims are only made more improbable by a conspicuous lack of clearly stated tradeoffs or by implausible and reckless security assumptions.

And yet.

While the constant barrage of questionable projects touting their tech as the latest and greatest, appending increasingly meaningless version numbers along the way (blockchain 2.0! 3rd-gen network!), shows no sign of slowing down, this shouldn’t preclude the discerning investor from spotting the proverbial diamond in the rough.

Dfinity is, in our opinion, certainly one such diamond.

The origins of Dfinity can be traced all the way back to 2016 when Dominic Williams and Timo Hanke set out to radically change how the internet operates after having witnessed the potential offered by unstoppable, distributed, and autonomous code in the form of smart contracts on the then-nascent Ethereum platform.

So, what is Dfinity?

We think the best way to start answering this question is to plainly tell you upfront what Dfinity ISN’T:

  • It’s not permissionless, meaning not just anybody can run a node and provide computing power to the Internet Computer.
  • It’s not aiming to supplant existing blockchain networks (well, not entirely anyway) but rather to compete with legacy web 2.0 computing platforms.

The crucial difference between Dfinity and all other blockchain projects lies in the fact that the protocol was engineered specifically to store massive amounts of data on-chain and to serve as a fast global storage and execution environment for code that can in some cases rival existing cloud computing providers, with no external services needed to access it.

The desperate need for decentralization

It’s no secret that the current situation of the technology industry is untenable.

The status quo is a bunch of monolithic for-profit companies that shape and control every facet of what was once supposed to be a pluralistic, distributed, and egalitarian network model.

The tech arena is utterly dominated by the incumbents – Google, Amazon, Facebook, Apple, Twitter – whose tendrils extend well beyond their actual web properties and infrastructure, into the realm of technical committees drafting interoperability standards, public opinion, and policy.

This is not what Sir Tim Berners-Lee envisioned when he worked tirelessly to lay the foundations of what would become the web as we now know it.

Platform risk is not only real but ubiquitous and oftentimes impossible to avoid, with increasingly costly and deleterious consequences for businesses.

Walled gardens such as proprietary and exclusive software distribution channels in the form of monopolistic App Stores are the norm rather than the exception.

All of this has attracted a great deal of scrutiny from regulators in various countries, and some have taken drastic measures to try and limit the power these behemoths wield over public discourse, or to deter anti-competitive practices – with very little success, if any.

The appeal of a truly decentralized Internet Computer is immense and immediately apparent, and it resonates with people the world over.

The Cambridge Analytica fiasco and the recent Twitter data breach – which exposed powerful internal administrative tools and led to the takeover of prominent accounts on an unprecedented scale – are just reminders of how the centralization of data can have serious social repercussions.

People in the know are also acutely aware of another often underestimated but crucial pain point plaguing our current version of the internet: the staggering amount of technical debt and the immense burden posed by the complexity of the tech stack that underpins it.

Having to manage, secure, extend and improve an aging foundational layer that was never designed to properly support many of the use cases that have now come to fruition is an endless and monstrous task that requires an exorbitant amount of skilled and knowledgeable caretakers.

A difficult task, the right people to tackle it

The good news is that Dfinity is seeking to change all of this and we believe they have a real shot at making it a reality and are poised to succeed where previous efforts have failed. Here’s why.

It all starts with an exceedingly ambitious vision: creating a universal protocol to coordinate, deploy, replicate, validate, secure, store, and execute code and data between a multitude of participants who supply computational resources and storage space to a global resource pool.

They have been toiling away in relative secrecy for almost half a decade now and have amassed an impressive number of incredibly talented people at the forefront of their respective fields, some widely regarded as luminaries.

These are the kind of distinguished system engineers, cryptographers, security researchers, and programmers that you can’t simply entice to work on a mediocre project with a large paycheck.

These people are drawn to technical excellence and a cohesive, daring vision like moths to a flame, and their willingness to be associated with a particular project is the best indicator of its legitimacy. There is simply no way that so many illustrious academics would risk tarnishing their carefully cultivated reputations and standing in academia by being involved in a sham.

We will just mention a couple of key people in technical and operational roles but we highly encourage you to peruse the team page on the Dfinity website as it’s a veritable who’s who of technologists.

Jan Camenisch is a remarkable and prolific cryptographer who, over the last three decades, has authored countless widely cited research papers in the cryptography field. Dfinity poached him from his previous employer, IBM, where he spent 20 years as a Principal Research Staff Member.

Ben Lynn is another superstar name in the cryptography world and one of very few people who can claim the indisputable honor of having the initial of his last name immortalized in a novel cryptography scheme he co-authored – BLS – which is seeing broad adoption in the crypto space.

If you wanted to create a blazing-fast, scalable, and interoperable execution environment, who would you want to design it? Andreas Rossberg certainly fits the bill: as one of the creators of the WebAssembly specification during his tenure at Google, he would be pretty high on the (very short) list of people up to the task.

Honestly, with research centers located in Zurich, San Francisco, and Palo Alto, we could have dedicated a few more pages just to extolling the virtues of the many incredibly talented people Dfinity has managed to band together. Don’t worry, we won’t bore you to death, but if there’s one thing you should take away from this, it’s the bona fide technical pedigree of their R&D and engineering teams.

We keep stressing this point because, had they not managed to amass such a critical mass of talent and shown impressive progress, we too would be skeptical of their ability to bring to fruition the moonshot that is the Internet Computer as they envision it.

The Internet Computer

Ethereum initially billed itself as the “World Computer”, and while the debate still rages on whether it has fully lived up to its name, no one can say it wasn’t a great branding strategy. And since, as we all know, good artists copy and great artists steal, it was only the next logical step for Dfinity to name their labor of love the Internet Computer. It may be derivative, but it’s an apt description of what they are trying to achieve.

The first pillar on which they intend to rebuild the internet is called “Canisters”. What is a Canister, you say? Well, it’s an isolated, single-threaded, deterministic, cryptographically secure execution environment that is deployed, orchestrated, and interacted with over the Internet Computer Protocol. Sounds like word salad? Then think of it as a smart contract on steroids. And yes, we know it’s a trite comparison that has been used to death in misleading ways by all kinds of Ethereum wannabes to bolster interest, but it really is the best approximation we can come up with while referencing existing crypto projects. If you are familiar with more traditional IT lingo, you can think of a Canister as being similar to a process.

It too executes code, but this is where the similarity ends, as a Canister differs from a traditional process in a few key areas. First of all, a Canister can’t terminate due to invalid inputs or errors caused by faulty logic, because there is no way for it to abort/panic or otherwise stop. If a Canister crashes, it automatically reverts to the state it had before it received the input that broke it. This is a nifty failsafe that would, in many cases, prevent a Canister from entering an inoperable state. Canisters are also replicated across all nodes spanning a subnet of the Internet Computer. They can be deployed, controlled, and decommissioned only by a user or by another Canister that has administrative privileges.
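
To make that failure semantics concrete, here is a toy simulation in TypeScript – our own sketch based on Dfinity’s public descriptions, not actual Internet Computer code:

```typescript
// Toy model of a Canister's revert-on-trap behavior, as we understand it
// from Dfinity's public materials. Not real Internet Computer code.
type State = { counter: number };

class ToyCanister {
  private state: State = { counter: 0 };

  // Each update message runs against the state; if the handler throws
  // ("traps"), the state reverts to the pre-message snapshot.
  update(handler: (s: State) => void): void {
    const snapshot: State = { ...this.state }; // shallow copy suffices here
    try {
      handler(this.state);
    } catch (err) {
      this.state = snapshot; // revert: the bad input leaves no trace
      console.log("trapped, state reverted:", (err as Error).message);
    }
  }

  get counter(): number {
    return this.state.counter;
  }
}

const c = new ToyCanister();
c.update((s) => { s.counter += 1; });                                   // commits
c.update((s) => { s.counter += 100; throw new Error("faulty logic"); }); // reverts
console.log(c.counter); // 1 — the failed message's changes were rolled back
```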

The technology that makes all of this possible leans heavily on WebAssembly, and as a result Canisters benefit from one of WebAssembly’s greatest strengths: the ability to run code written in any of the multitude of programming languages that can be compiled to WebAssembly. Canisters can interoperate with each other even if they were written in different languages. WebAssembly also supports formal semantics, opening the way for a formally verified ecosystem to develop in the long term, further increasing the robustness, predictability, and security of code running on the Internet Computer.

This is where things take an interesting turn, as the next revolutionary property of the Internet Computer is how it stores and retrieves data. Canisters have a memory limit of a couple of gigabytes and no disk storage of their own. So where do you store data? Well, the ICP is tasked with keeping state on behalf of Canisters and abstracting the data storage part of the equation away from developers.

How exactly it achieves that is still a matter of speculation, as few concrete details have been offered so far. We do know, at a high level, that it will work much like the traditional object storage offered by legacy cloud computing platforms, and that Canisters will be able to perform standard operations such as GET, PUT, APPEND, DELETE, LIST, STATUS, and so on.

This will be a massive boon to ease of use and will significantly improve the developer experience. These standard functions come together to form one of the first services to be previewed on the Internet Computer, BigMap, which will be accessible to developers through APIs. Keep in mind that this data storage is highly resilient and accessible by all of the nodes forming a single subnet. This is unheard of in the blockchain space and is much more akin to an Amazon S3 instance – and yet it is distributed and cryptographically signed.

The blockchain is slowly but surely becoming the Cloud, and Dfinity is at the forefront of this shift, blurring the line between traditional cloud architectures designed to be operated by a monolithic entity and distributed systems composed of independent data centers across the globe.

But that’s not all: Dapps built on Dfinity will need powerful search capabilities to sift through all the data they can now store. That precise capability is offered by another fundamental building block called BigSearch, an indexing and search framework that operates much like Elasticsearch and is able to perform advanced searches using stemming and other techniques to detect similar keywords and return appropriate matches.
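
Since Dfinity has published few concrete details, the following is a speculative sketch of what a BigMap-style storage interface might look like, inferred purely from the operation names above (BigSearch’s query API is even less specified, so we leave it out). The in-memory stand-in exists only to show how a caller might use it:

```typescript
// Hypothetical shape of a BigMap-style object store, inferred from the
// operations named above (GET, PUT, APPEND, DELETE, LIST, STATUS).
// Treat every name here as an assumption, not Dfinity's actual API.
interface ObjectStatus {
  exists: boolean;
  sizeBytes: number;
}

interface BigMapLike {
  get(key: string): Promise<Uint8Array | null>;
  put(key: string, value: Uint8Array): Promise<void>;
  append(key: string, value: Uint8Array): Promise<void>;
  delete(key: string): Promise<void>;
  list(prefix: string): Promise<string[]>;
  status(key: string): Promise<ObjectStatus>;
}

// An in-memory stand-in, useful only to show how a canister might call it.
class InMemoryBigMap implements BigMapLike {
  private store = new Map<string, Uint8Array>();

  async get(key: string) { return this.store.get(key) ?? null; }
  async put(key: string, value: Uint8Array) { this.store.set(key, value); }
  async append(key: string, value: Uint8Array) {
    const prev = this.store.get(key) ?? new Uint8Array(0);
    const merged = new Uint8Array(prev.length + value.length);
    merged.set(prev);
    merged.set(value, prev.length);
    this.store.set(key, merged);
  }
  async delete(key: string) { this.store.delete(key); }
  async list(prefix: string) {
    return [...this.store.keys()].filter((k) => k.startsWith(prefix));
  }
  async status(key: string) {
    const v = this.store.get(key);
    return { exists: v !== undefined, sizeBytes: v?.length ?? 0 };
  }
}
```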

The Network Nervous System

The last piece of the puzzle that sits at the core of the Internet Computer Protocol is its on-chain governance system dubbed the Network Nervous System.

The Network Nervous System, or NNS, is responsible for many tasks crucial to the health, security, and performance of the network, such as dynamically allocating resources from participating data centers, performing routine operations needed to guarantee data availability, and ensuring the security and authenticity of data traveling across the network.

The NNS, much like its human namesake, is formed by tens of thousands of Neurons – single governance cells that come together to collectively perform governance actions on the ICP.

To create a Neuron, you need to lock ICPs inside a timelock contract. But wait, what are ICPs?

Yes, as you may have surmised when you started reading this article a few minutes ago, the Internet Computer has its own native token, used to reward network participants and to keep them secure and honest through economic incentives and penalties. In fact, it has not one but two tokens: ICP and Cycles. But let’s focus on the former for now.

ICP is primarily a governance token, but unlike the vast majority of ERC20-based governance tokens used by DeFi projects living on Ethereum, this one has actual utility.

Its primary function, as we said, is to provide collateral value by being locked inside a Neuron to participate in the governance process. Neurons are not all born equal: many variables affect the voting power of each, such as the amount of value locked; the age of the Neuron, defined as the amount of time it has been participating in the governance process; and a third parameter known as the Dissolve Delay, a user-defined amount of time before the locked value in a Neuron is returned to the user after he or she has invoked the Dissolve function to effectively stop the Neuron. The Dissolve Delay can be thought of as the crypto equivalent of a time-deposit bank account, where you are free to request a withdrawal at any time but your capital is subject to a previously agreed-upon period of time before it is disbursed.

All other things being equal, a Neuron with the same balance of locked ICP as another but with a higher Dissolve Delay will have proportionally higher voting power, to reward the long-term commitment of the user who created it and to ensure that stakeholders’ interests align with those of the network as a whole. Of course, with higher responsibility come higher rewards, and our Neurons with a high Dissolve Delay will earn more participation rewards than a less invested Neuron.
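
A back-of-the-envelope model of how those variables might combine – to be clear, the multipliers and caps below are invented for illustration; the real NNS formula and coefficients are Dfinity’s:

```typescript
// Invented illustration of how a Neuron's voting power might scale with
// stake, age, and Dissolve Delay. The numbers are ours, purely to show
// the shape of the incentive, not Dfinity's actual formula.
interface Neuron {
  lockedIcp: number;          // value locked as collateral
  ageDays: number;            // time spent participating in governance
  dissolveDelayDays: number;  // user-chosen notice period before unlock
}

function votingPower(n: Neuron): number {
  const ageBonus = 1 + Math.min(n.ageDays / 1460, 1) * 0.25;            // up to +25%
  const delayBonus = 1 + Math.min(n.dissolveDelayDays / 2920, 1) * 1.0; // up to +100%
  return n.lockedIcp * ageBonus * delayBonus;
}

// Same stake, longer Dissolve Delay => proportionally more voting power.
const shortTerm: Neuron = { lockedIcp: 100, ageDays: 0, dissolveDelayDays: 180 };
const longTerm: Neuron = { lockedIcp: 100, ageDays: 0, dissolveDelayDays: 2920 };
console.log(votingPower(shortTerm).toFixed(1)); // ~106.2
console.log(votingPower(longTerm).toFixed(1));  // 200.0
```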

In a brilliant move to incentivize stakeholders to create Neurons, the Dfinity team has announced that it will distribute all ICP tokens to financial contributors pre-locked inside Neurons with an already-set age, encouraging backers to avoid immediately dissolving their Neurons to cash out and instead take part in the governance process long term. We think this is a great way to pre-seed Neurons, and it allows a strong community centered around the governance process to grow organically.

This brings us to the second use case for ICP tokens: their ability to be converted into Cycles.

Cycles are the other type of token powering the Internet Computer and are Dfinity’s counterpart to ETH, in the sense that they are used to pay for computing resources – network fees, CPU cycles, RAM, and storage used by Canisters and, ultimately, applications. They also serve to prevent DDoS attacks: attaching a small monetary cost to transactions acts as a rate limiter, ensuring an attacker can’t effortlessly and freely spam the network.

An important aspect of the conversion of ICPs into Cycles is that the exchange rate is not fixed but dynamically adjusted by the Network Nervous System itself in response to external stimuli.

This peculiar token economics model allows approximately 1 CHF (that’s a Swiss franc) worth of ICP to always be exchanged for a trillion Cycles, called a T.

Canisters continually burn Cycles in order to operate, and since Cycles can only be obtained by acquiring ICP tokens, this means that as long as the network sees sustained utilization, the number of available Cycles will constantly decrease. This is a nifty deflationary model!

Not only that, but since the price of Cycles is fixed, developers building applications on the Internet Computer benefit from stable and predictable costs to access computing resources, and Cycles act as a sort of stablecoin that can be used as a store of value. By tying the recurring computing costs and the operation of the Internet Computer as a whole to Cycles as its sole payment method, the Dfinity Foundation ensures that any fluctuations in the market price of Cycles quickly self-correct: market participants rush in to scoop up cheap gas that is always in demand, stabilizing the price relative to the available supply.
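
A quick worked example of the peg described above (the ICP market prices here are assumptions, purely for illustration):

```typescript
// Worked example of the ICP -> Cycles peg: the NNS adjusts the conversion
// rate so ~1 CHF worth of ICP always buys a trillion Cycles (1 T).
// The market prices below are invented for illustration.
const CYCLES_PER_CHF = 1e12; // 1 T Cycles per Swiss franc, by design

function icpToCycles(icpAmount: number, icpPriceChf: number): number {
  return icpAmount * icpPriceChf * CYCLES_PER_CHF;
}

// If ICP trades at 30 CHF, converting 2 ICP yields 60 T Cycles...
console.log(icpToCycles(2, 30) / 1e12); // 60
// ...and if the price doubles, the same franc amount still buys the same
// compute: the NNS rate adjusts, so costs stay stable for developers.
console.log(icpToCycles(1, 60) / 1e12); // 60
```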

Participating data centers that choose to make computing resources available to the Internet Computer will be paid in ICP, but since ICP’s value is volatile by design and DCs need a predictable revenue stream, the amount of ICP paid for a given set of computing capacity will also be dynamically determined by the NNS, so as to reach a pre-agreed value denominated in USD.

This remuneration scheme is very ingenious and should allow a robust token ecosystem and markets to develop and flourish while providing substantial economic rewards for early network participants.

The Dfinity network has undergone several iterations, the most recent being Sodium, released only 2 months ago. Sodium is still only a preview of the network but now with token economics fully baked in for developers who are looking to experiment and build the next great thing on the Internet Computer.

The public launch of the network in its nearly final incarnation is expected with the Mercury release in Q4 2020. We are eagerly awaiting this momentous occasion and will endeavor to snag a hefty allocation of ICP tokens as soon as trading opens.

Halo out.

How Ethereum 2.0 is Redefining Blockchain Security

So you may have heard some news recently about ETH 2.0’s long-awaited and much-bandied-about official multi-client testnet, Medalla, crashing and burning, only to re-emerge from the flames after a few days of downtime. The event didn’t leave the network unscathed – far from it. Don’t listen to the noise, however; we’ll tell you why it’s the best thing that could have happened and why Ethereum 2.0 is on track to be the most decentralized and resilient blockchain yet, thanks to a razor-sharp focus on security.

What Happened With Medalla Anyway?

A majority of the Medalla beacon nodes were running the Prysm client developed by Prysmatic Labs. This set the stage for the actual issue to have such an outsized impact and cascading effects, and it is a powerful reminder of the perils posed by a lack of client diversity and the over-representation of a single client in a blockchain network.

The ETH 2.0 beacon chain relies heavily on the assumption that all network participants have the same clock time to properly propose/validate blocks and perform other duties. Clock skew is a real issue and the Prysm developers had baked in a system that used Cloudflare’s Roughtime protocol to get digitally signed and reliable clock time to the benefit of users.

Unfortunately, it turns out that Roughtime uses a pool of servers to determine the time, and if one of these servers misbehaves and reports a time far in the past or future, it gets averaged in and the resulting time is way off from the real time. This is a clear case where using a median instead of an average would have almost completely mitigated the issue.

This large clock skew trickled down to ETH 2.0 nodes running Prysm because the developers had designed the Roughtime sync to automatically adjust the local system clock if it detected a deviation from the time reported by Roughtime. This is obviously a bit haphazard and should only be done when the discrepancy is minimal, e.g. on the order of a minute or less. Since the Roughtime servers were reporting a time more than 4 hours in the future, this got propagated to anyone running a Prysm instance and effectively made all the Prysm beacon nodes unable to work with the other clients.

In the ensuing chaos and the mad scramble that followed to get everything back on track, additional mistakes were made that led to validators being slashed and network participation dropping even further; in the end it took several days for the network to reach finality again. This catastrophic scenario was an invaluable opportunity for client teams to gather data as it was unfolding, to hone their incident response playbooks, and to uncover a lot of edge cases that just would not have emerged had the network not been in such a degraded and fragmented state. All clients suffered from massive resource usage spikes due to the number of different forks and the strenuous loads they imposed. These conditions would have been almost impossible to create in a synthetic, controlled environment.

The Medalla incident led to a multitude of improvements in all the clients (and Prysm now only uses Roughtime to warn users of possible clock skew – it does NOT alter the system clock). If you want to learn more about this specific incident, we highly recommend the very thorough and informative posts by Benjamin Edgington and the excellent postmortem authored by the Prysmatic Labs team.
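
To see why a median would have blunted the bug, run the numbers yourself – the timestamps below are invented for illustration, with one rogue server reporting a time four hours ahead:

```typescript
// One server reporting a time 4+ hours in the future drags the average
// way off, while the median barely moves. Sample values are invented.
const samples = [1000, 1001, 999, 1002, 1000 + 4 * 3600]; // seconds; last one is rogue

const mean = samples.reduce((a, b) => a + b, 0) / samples.length;

const median = (xs: number[]): number => {
  const s = [...xs].sort((a, b) => a - b);
  const mid = Math.floor(s.length / 2);
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
};

console.log(mean - 1000);            // ~2880.4s (~48 min) of skew from one bad server
console.log(median(samples) - 1000); // 1s — the outlier is simply ignored
```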

Client Diversity Is Paramount

It all starts with a simple yet unavoidable question: how do you actually design a blockchain to be as resilient and hardened as possible against a multitude of highly skilled adversaries and black swan events?

An immediate, binary choice with far reaching implications is how many client implementations of the protocol will be available. A single one or many.

Having a blockchain network comprised of more than one protocol implementation (a client) means that if something goes horribly wrong with one client – be it a bug actively exploited in the wild or a legitimate code update with unforeseen consequences – the network does not necessarily come to a standstill, thanks to the other, unaffected clients.

This begs the question – why don’t all blockchains have multiple clients? Well, developing (and maintaining!) a blockchain client is a complex and time-consuming undertaking that requires a diverse skillset. It demands a profound grasp of multidisciplinary subjects such as networking, security, cryptography, distributed systems, economics, and more!

Now, for those of you looking for the fly in the ointment, there is a notable downside: different client implementations, which are often coded in different programming languages, can have subtle divergences in how they interpret and implement the protocol spec. These differences can lead to consensus bugs. A consensus bug can be described as two (or more) clients being unable to agree on a certain thing – in a blockchain context, a block, a transaction, or another object. This can have severe consequences for the health of the network and lead to forks, downtime, and eventually, when the dust settles, even reverted transactions.
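
Here is a contrived example of how such a divergence creeps in – two clients that read the same (hypothetical) spec sentence, “blocks may contain up to 100 transactions”, and implement the bound differently:

```typescript
// Contrived consensus bug: two client teams implement the same rule with
// different boundary semantics. They agree on almost every block, then
// fork on the edge case. Purely illustrative, not real client code.
const clientA = (txCount: number): boolean => txCount <= 100; // inclusive
const clientB = (txCount: number): boolean => txCount < 100;  // exclusive

for (const txCount of [50, 99, 100]) {
  const a = clientA(txCount);
  const b = clientB(txCount);
  const note = a !== b ? "  <-- consensus split!" : "";
  console.log(`block with ${txCount} txs: A=${a} B=${b}${note}`);
}
// A block with exactly 100 transactions is valid to A and invalid to B:
// each client follows its own chain, and the network forks.
```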

As with everything else in the blockchain space, security and decentralization are a tough balancing act. There are various schools of thought on this, but we believe monocultures to be dangerous and are firmly in the camp that holds that multiple clients offer better overall security and resiliency. Another benefit is that the ecosystem, and ultimately its users, are not beholden to a single entity (the client developers) able to hold the network captive, exert disproportionate influence, and dictate the overall direction of the project.

With that out of the way, let’s get back to Ethereum 2.0 and how it’s reshaping the blockchain security landscape. To start with, it has FIVE (5!) clients under active development as of now. While we fully expect this number to decrease over time as users converge on the more mature and robust implementations, this is nonetheless an amazing evolutionary process. The clients best able to thrive in the real world will emerge as the winners – it’s comparable to natural selection and will ultimately greatly benefit the network. We think the inevitable consolidation phase will leave us with battle-hardened clients that have withstood a barrage of adversarial conditions and, more importantly, dev teams with an intimate understanding of the security landscape.

Audit All The Things

So we have client diversity covered – now what? The next logical step is ensuring that these clients are as secure, performant, stable, and robust as possible. As is standard practice for complex, mission-critical systems expected to hold significant value, audits are involved. All of the ETH 2.0 clients have engaged, or are in the process of engaging, third-party security auditors to perform a thorough review of their codebases, simulate adversarial scenarios, uncover potential bugs and security vulnerabilities, and suggest appropriate fixes and mitigations.

Selecting and working with a company to audit your codebase is a deeply involved endeavor that requires tight collaboration and a strong commitment, from the start of the process (usually an RfP, or Request for Proposals, where you delineate the scope of the audit and solicit bids from various qualified security vendors) to the end of the relationship, which usually concludes with an audit report and a request for comments.

It’s important to stress that an audit is not a fire-and-forget tool, but rather an ongoing undertaking and only one layer in a defense-in-depth approach to blockchain security.

It’s not only the ETH 2.0 clients that have undergone comprehensive security audits. The reputable firm Trail of Bits has been engaged to assess the security of the CLI tool prospective stakers will use to generate the cryptographic keys controlling their validators. The audit uncovered 2 high-severity issues and suggested several less critical improvements, but also noted the general high quality and maturity of the code reviewed.

Another encouraging sign that the EF is deeply conscious of all the critical moving parts that will have to interact to bootstrap the beacon chain and allow validators to deposit ETH for staking duties: they went a step further than a simple audit for a piece of code that is arguably one of the most vital puzzle pieces in ETH 2.0, the Deposit Smart Contract.

This is a smart contract deployed on the Ethereum mainnet that will act as a one-way bridge for people to transfer their ETH from the current ETH 1.0 chain to the beacon chain to be staked. Runtime Verification has conducted a formal verification audit of this smart contract at the behest of the EF. What is a formal verification audit? A thorough answer would necessitate a separate post, but the gist of it is that it’s a process that allows one to prove (or refute) the correctness of a specific algorithm in a mathematical way. This is one of the most rigorous and challenging vetting procedures available in the software realm, and it speaks volumes about the EF’s extraordinary dedication to shipping secure code.
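
To give a flavor of what “proving correctness in a mathematical way” means, here is a toy example of our own in the Lean proof assistant – emphatically not the actual Deposit Contract specification, which is far more involved. We state a trivial safety property and have the machine check the proof:

```lean
-- Toy illustration of a machine-checked proof, NOT the real Deposit
-- Contract spec: appending a deposit to a log never shrinks the log.
theorem deposit_log_never_shrinks (log : List Nat) (d : Nat) :
    log.length ≤ (log.concat d).length := by
  rw [List.length_concat]  -- (log.concat d).length = log.length + 1
  exact Nat.le_succ _      -- n ≤ n + 1
```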

We believe there are no silver bullets in the security field, but gaining actionable insights from a high-quality audit report compiled by a reputable firm is a great and necessary first step.

Bug Bounties and Attacknets

As befits a project as ambitious and complex as Ethereum, especially one securing billions in value, the EF is building a world-class, dedicated in-house security team whose sole focus will be ETH 2.0.

They have received numerous applications from highly qualified people, and we have no doubt they will succeed in amassing a sizeable amount of InfoSec talent.

But as part of a multi-pronged approach to security, the EF has also spearheaded a Bug Bounty program covering the Phase 0 part of the ETH 2.0 spec. The pieces of the project considered in scope are well detailed, and the bounties are very generous, ranging from a minimum of $1,000 for bugs rated low severity/no impact up to $20,000 for critical bugs with the potential to severely impact the network.

This is in addition to the rewards offered for successfully breaking the ad hoc testnets bootstrapped and maintained by the EF, cheekily and aptly named attacknets.

The Ethereum Foundation initially deployed multiple attacknets dubbed beta-0, each formed solely of nodes running a single client, in order to purposefully lower the overall security of the network and the barrier to entry for whitehat hackers and security professionals looking to probe and exploit specific client vulnerabilities.

These attacknets have since been deprecated and decommissioned, but not before they fulfilled their role and yielded some successful exploits, mainly targeting the networking layer, that were able to prevent finality and thus earn bounties, as summarized in the Trophies section.

The single-client beta-0 attacknets were retired in favor of a multi client attacknet dubbed beta-1. This attacknet is still operational and so far no one has successfully claimed a bounty on it so if you have a penchant for breaking things this could be a nice way to help ETH 2.0, gain lasting fame and net some cash.

Look for more parts of the spec and client implementations to be covered under the program as the EF transitions to a dedicated ETH2 bug bounty portal with a public leaderboard, following in the footsteps of the pioneering initiative that has led to countless ETH1 security issues being reported and fixed.

The bug bounties on offer reflect the openness of the Ethereum ecosystem and the deep-rooted commitment by the EF to fostering an open and collaborative environment, one that extends to all facets of the project’s security footprint. We think this approach is likely to pay off big time in the long term.

The Secret Weapon: Community Fuzzing

All of this brings us to the final and most disruptive initiative that has emerged from the security related efforts in the Ethereum 2.0 ecosystem.

The EF generously funded a grant to develop a comprehensive fuzzing framework targeting most of the Ethereum 2.0 clients. We consider this to be the magic arrow in the EF’s quiver.

But what exactly is fuzzing?

We’re glad you asked. Fuzzing is a process in which an automated program overwhelms a target piece of software with a deluge of random (and not-so-random) inputs in the hope of inducing a crash or unexpected behavior that could otherwise manifest in the real world when certain conditions arise. Still not sure you understand? Well, this very vivid analogy, courtesy of Afri Schoedon, comes to the rescue:

“Imagine you have an unlimited number of kids of all ages asking you seemingly random questions non-stop. The moment you have a mental breakdown, the psychologist will write down the question that caused it and try to repair you, so you will withstand it next time. “

Makes it a lot clearer, right?
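
In code, the “kids asking random questions” loop is about this simple – a toy fuzzer of our own; the target function and its planted bug are invented for illustration:

```typescript
// Toy fuzzer in the spirit of the analogy above. The target and its
// planted bug are invented purely for illustration.
function parseHeader(buf: number[]): string {
  // Planted bug: a rare combination of inputs crashes the parser.
  if (buf[0] === 0x42 && buf.length > 3) {
    throw new Error("unhandled edge case"); // the "mental breakdown"
  }
  return "ok";
}

function randomInput(maxLen: number): number[] {
  const len = 1 + Math.floor(Math.random() * maxLen);
  return Array.from({ length: len }, () => Math.floor(Math.random() * 256));
}

for (let i = 0; i < 100_000; i++) {
  const input = randomInput(8);
  try {
    parseHeader(input);
  } catch (err) {
    // The "psychologist" writes down the exact question that broke us,
    // so the bug can be reproduced and fixed.
    console.log(`crash after ${i} runs, input: [${input.join(", ")}]`);
    break;
  }
}
```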

Having a fuzzing framework to uncover bugs that would be basically impossible for a human to detect nicely complements the already well-rounded approach to a robust security posture in ETH 2.0. But what is absolutely unprecedented is the revolutionary way in which the EF decided to engage stakeholders as part of the process.

Traditionally, fuzzing is performed by security firms contracted by a client, or done in-house by large entities that have a dedicated security team and are well versed in shipping secure code, as it is fairly onerous and time-consuming. In most cases it is closely guarded, and while the tools and techniques are sometimes open-sourced, they are very rarely disseminated together with custom-tailored code targeting a specific piece of software or property, because of the high potential for attackers to abuse them for nefarious purposes, e.g. finding a security vulnerability and exploiting it instead of reporting it.

The Sigma Prime devs took a very different approach in publishing the Beacon-Fuzz tool, one opposite to the security-through-obscurity credo often ingrained in and embraced by Fortune 500 companies and even by some less open-minded blockchain projects.

Not only did they release the tool in the public domain under a very permissive license, but they also pre-populated it with corpora (sets of predefined inputs upon which to perform mutations) designed to kickstart the fuzzing of existing ETH 2.0 clients.

They are actively encouraging members of the Ethereum community to run the tools, going as far as providing assistance in setting up the local fuzzing environment and troubleshooting issues (you can reach them on the #fuzzing channel in their Discord should you want to join the fuzzing ranks).

This effort is notable because their reasoning is that the benefits gained by having a diverse set of stakeholders run the fuzzing software far outweigh the risk of an attacker using the tool to uncover and exploit bugs in the clients, especially now that the network has not yet reached mainnet status and is not securing real value.

So far their intuition has been proven largely correct: there was an enthusiastic response from the community at large, and some dedicated community members who recognized the value on offer started using the tool and eventually managed to find bugs affecting quite a few clients, as detailed in this blog post.

The number of stakeholders with a significant vested interest in a successful launch of the beacon chain, its ongoing security, and pristine uptime – thanks to the economic incentives that underpin the network, namely the staking rewards for proposing and validating blocks and the desire to avoid the stiff penalties that could result from security incidents – means there is a big and ever-growing pool of users who are highly motivated to run the tool.

The Sigma Prime crew is not resting on its laurels, though, and has recently added new capabilities to the fuzzing framework, extending its already powerful features to not only find bugs affecting a single client but also compare how each client performs the various state transitions, in order to find discrepancies from the canonical reference spec and potential consensus issues between the different implementations!
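
Conceptually, that differential mode boils down to something like the following sketch – the state transition functions here are trivial stand-ins we made up, not real ETH 2.0 logic:

```typescript
// Sketch of differential fuzzing as described above: feed the same input
// to several implementations of the "same" state transition and flag any
// disagreement. The transitions are stand-ins, not real client code.
type StateTransition = (balance: number, reward: number) => number;

const impls: Record<string, StateTransition> = {
  clientA: (balance, reward) => balance + reward,
  clientB: (balance, reward) => balance + reward,
  clientC: (balance, reward) => Math.min(balance + reward, 2 ** 31), // divergent cap
};

function differentialFuzz(iterations: number): void {
  for (let i = 0; i < iterations; i++) {
    const balance = Math.floor(Math.random() * 2 ** 32);
    const reward = Math.floor(Math.random() * 1000);
    const results = Object.entries(impls).map(
      ([name, f]) => [name, f(balance, reward)] as const,
    );
    const first = results[0][1];
    if (results.some(([, r]) => r !== first)) {
      console.log(`divergence on (${balance}, ${reward}):`, results);
      return; // a consensus bug candidate, caught before it hits mainnet
    }
  }
  console.log("no divergence found");
}

differentialFuzz(10_000);
```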

They are also in the process of adding more fuzzing targets covering the portions of the codebase that handle networking in the various clients. A more in-depth look at how structural differential fuzzing operates, and a sneak peek of their exciting roadmap, can be gleaned by visiting their blog.

Even once the whole attack surface of the various endpoints has been exhaustively covered, the tool will still prove useful for detecting potential bugs introduced by successive modifications to the protocol and by client updates.

Based on our interactions with the Sigma Prime folks and other client devs, we have gained a deep respect for their steadfast commitment to strictly adhering to security best practices and their resolve to continually assess, challenge, and improve the security posture of the ecosystem as a whole.

The teams have some amazingly talented sec leads working in an incredibly collaborative environment and projects such as Beacon-Fuzz are testament to this. They will have a lasting beneficial impact as they are refined and maintained well past phase 0.

To conclude, we believe that Ethereum 2.0 has a great shot at shipping a highly resilient and hardened network with a best-in-class overall security posture, thanks to the strenuous efforts of multiple teams and the amount of expertise they bring to the table. Other projects in the space will be hard-pressed to replicate these enviable achievements, this pervasive security culture, and the virtuous cycle brought to bear by Ethereum’s massive, energetic community and the grassroots movement involved in the fuzzing efforts. It’s a tall ask for any project, but especially difficult to mimic for the multitude of self-styled VC-funded “Ethereum killers” that have been kept in carefully controlled and monitored incubators since their inception and don’t have a set of well-established, documented protocols and guidelines for responding to security SNAFUs. Ethereum has weathered countless attacks, and the expertise and insight gained by its battle-hardened client developers is invaluable here – its importance can’t be overstated.

Does this mean bugs won’t happen come mainnet time, or that all of this is enough to ward off attackers and guarantee there will be no security incidents? Certainly not – but it highlights how uniquely positioned the ETH 2.0 project is to respond quickly and effectively to such issues if and when they arise. You’d be a fool to bet against ETH 2.0, and you won’t find us on the sell side of the order book or among the ranks of traders opening short positions anytime soon.

Halo out.


API3 and The Future of Oracles

The last few years in crypto certainly have been interesting, haven’t they? We’ve seen the rise and fall of the ICO bubble, years of bearish sentiment, and the near-permanent hiatus of the very site you’re reading now – and all the while we’ve had steady progress throughout. We’re not talking about the platitudes that CZ and company love to tout regarding BUIDL – yes, that has been happening, but on the surface level things would seem stagnant. After all, when prices are down over 90% across the board for months and then years, it really can seem like doomsday. In many cases it was – weak hands got washed out, lots of companies got sued, and many more went defunct. Yet despite all of this, innovation and development have truly continued on. And so it is that we come to API3 – what it’s building, why it’s doing it, and whether we really do have the next Chainlink on our hands.

The Oracle Problem

The infamous oracle problem has been one of the largest and most well-known issues facing smart contracts and blockchain development, and it’s been that way for years now. You have a smart contract on-chain with enforceable rules and functions, but they are really only useful with data that is available inside the Ethereum network itself.

You can’t make a contract on the price of gold if such an input has to come in from meatspace – and therein lies the oracle problem. Just how do you get this kind of data on-chain, and in a decentralized manner? Moreover, how do you ensure that this data is verifiably true, and how do you secure against an attack on such a data source? Certainly this increases the attack surface of a product dependent on a) the smart contract and b) the oracle provider itself.

Since the heady days of crypto, we’ve tried to resolve this oracle problem in several different ways, the most circuitous of which are prediction markets such as Augur or Gnosis. But the real money has always been with an oracle provider that can deliver this data anonymously, without third-party intervention, and in a cost-effective manner.

Enter Chainlink.


It would be poor form to discuss oracles and current solutions without mentioning Chainlink and all of the progress it has brought to the crypto ecosystem. In fact, this very site was a strong proponent of Chainlink back when it held its ICO in 2017. Of course, investing in an ICO is one thing – money was easy to make back then. What shows your strength as an investor, however, is the ability to hold through a bear market and reap the rewards once Chainlink was finally appreciated for what it is.


Chainlink is great. It’s one of the only true oracle projects that delivers on its promises, has a vibrant community of holders (known as the LINKmarines) and is poised to be one of the eminent blue-chip crypto tokens to hold in the future.

…So why am I writing this article?

Because Chainlink has problems. Problems that API3 solves.

The API Problem

So, we've discussed what the oracle problem is. In reality, it's a problem we created for ourselves by thinking too small about how we actually want smart contracts to function on Ethereum. The goal was never really to decentralize the nodes that deliver oracle data, or to overcomplicate things so that "anyone" can deliver this kind of data. Even describing that setup here is a tad complex, is it not?

Actually, we have a much simpler problem. Really, what we want is the ability to hook onto off-chain data and use it in our contracts. As blockchain middleware, oracles have often been compared to the APIs of the web, in the sense that they deliver data to the consumer. So rather than thinking of oracles as an abstraction over APIs, why don't we just apply the design philosophy of the API itself to the blockchain?

Wouldn’t it be cool if instead of making an oracle call that costs you three dollars (quite expensive in the long run), you could make an API call that delivers the same data?

Wouldn’t it be cool to know who is actually delivering that data, rather than having to trust an anonymous node?

Wouldn't it be cool to avoid all of the attack surface that multiple node operators introduce, and simply have that data delivered in a seamless integration?

Exit Chainlink. Enter API3.

The API Solution

So how does API3 work, and why are we so bullish on it? In short, it takes all of the value that Chainlink nodes currently aggregate (you know, the nodes that are only incentivized by that value) and delivers it to the providers of the data themselves. I mean this in a direct sense: you don't need some intermediary set of Chainlink nodes to hook onto an API provider and transmit its data on-chain. You can just have the API provider serve that data themselves and reap the rewards. This solves several key problems that Chainlink will have to contend with over time, and it's why we think API3 is such a bullish product.
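
To illustrate the first-party idea – and to be clear, this is a hedged sketch of the concept, not API3's actual Airnode protocol – here's what it looks like for an API provider to serve its own data under its own key, so a consumer can verify exactly who produced it. The signing key and endpoint are placeholders:

```typescript
// A sketch of the first-party oracle concept (not API3's actual Airnode
// protocol): the API provider signs its own response with a well-known
// key, so anyone consuming the data can verify exactly who produced it.
// The signing key and price endpoint are hypothetical placeholders.
import { ethers } from "ethers";

// The provider's publicly known signing key (placeholder).
const providerWallet = new ethers.Wallet(process.env.PROVIDER_KEY!);

async function signedResponse(pair: string) {
  // Serve the same data the provider's existing HTTP API already returns.
  const res = await fetch(`https://example.com/api/price?pair=${pair}`);
  const { value } = (await res.json()) as { value: number };
  const timestamp = Math.floor(Date.now() / 1000);

  // Hash (pair, scaled value, timestamp) and sign it. An on-chain
  // verifier can recover the signer from the signature and check it
  // against the provider's published address - no intermediary nodes.
  const digest = ethers.utils.solidityKeccak256(
    ["string", "uint256", "uint256"],
    [pair, Math.round(value * 1e8), timestamp],
  );
  const signature = await providerWallet.signMessage(ethers.utils.arrayify(digest));

  return { value, timestamp, signature, signer: providerWallet.address };
}

signedResponse("XAU/USD").then(console.log).catch(console.error);
```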

Firstly, you now have a reputation to uphold when you are directly providing API data to consumers. Report bad or wrong data on Chainlink? There's a monetary repercussion, but that's about it – thanks to the anonymous nature of the nodes, no one really knows who controls the one that did the damage. As the API provider yourself, you have a direct stake in the veracity of your data, which removes a lot of the "oracle bribing" exposure you get with Chainlink.

Bribing is a huge problem – one Chainlink has solved, by the way – only, they've made that solution prohibitively expensive. Chainlink accepts that individual oracles can be bribed, so part of its design safeguards against this by using multiple nodes to establish the ground truth of whatever data you're calling up. Multiple nodes cost money. Lots of money: if a single oracle response costs around three dollars in gas, a feed that aggregates answers from, say, 21 independent nodes is burning over sixty dollars on every single update.

API3 has a neat solution they call Airnode, which API providers deploy themselves with very little onboarding (and the team will help with that directly). Once you set it up, you can forget about it. There's your data, live on-chain, and anyone can make a call and request it. No intermediary nodes required. No upkeep. No added attack surface.

It’s elegant. Extremely elegant.

The Money

That's what it's all ultimately about, right? In the end, we need to ask ourselves what the actual advantages are here for the data provider as well as the consumer. Aside from the aforementioned ease of onboarding (try getting any legacy company to set up a blockchain node), API3 is just… cheaper. It's cheaper to set up, cheaper to manage, and cheaper to make oracle calls on-chain. Nearly every aspect of API3 has been built with the data consumer in mind, and we think this is one area where API3 simply beats out the competition.

API3 is mostly focused on creating decentralized data feeds like feeds.chain.link, composed of multiple Airnodes and governed transparently and in a decentralized way. While you can call an Airnode directly – and that's as robust as calling the underlying API itself – most of DeFi runs on decentralized feeds, and the ethos of the space is that single-source oracles are suboptimal (though they will remain useful for things like prediction markets).
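
For comparison, this is roughly what consuming one of today's aggregated feeds looks like from off-chain, using Chainlink's published AggregatorV3Interface and the widely documented ETH/USD proxy on Ethereum mainnet (the RPC URL is a placeholder):

```typescript
// Reading Chainlink's aggregated ETH/USD feed off-chain through the
// published AggregatorV3Interface. The RPC URL is a placeholder; the
// feed address is the widely documented ETH/USD proxy on mainnet.
import { ethers } from "ethers";

const provider = new ethers.providers.JsonRpcProvider("https://mainnet.infura.io/v3/<key>");

const aggregatorAbi = [
  "function decimals() view returns (uint8)",
  "function latestRoundData() view returns (uint80 roundId, int256 answer, uint256 startedAt, uint256 updatedAt, uint80 answeredInRound)",
];

const feed = new ethers.Contract(
  "0x5f4eC3Df9cbd43714FE2740f5E3616155c5b8419", // ETH/USD proxy, Ethereum mainnet
  aggregatorAbi,
  provider,
);

async function main(): Promise<void> {
  const [decimals, round] = await Promise.all([feed.decimals(), feed.latestRoundData()]);
  // `answer` is an integer scaled by `decimals`; divide to get the quote.
  const price = round.answer.toNumber() / 10 ** decimals;
  const updated = new Date(round.updatedAt.toNumber() * 1000).toISOString();
  console.log(`ETH/USD: ${price} (updated ${updated})`);
}

main().catch(console.error);
```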

Of course, we try to be unbiased here – in our personal opinion, Chainlink will still provide a solid level of security with its node architecture. However, with the reputational element that each API provider now carries, we think API3 offers an alternative that can plug into existing systems and reduce gas costs – to the tune of 50% or more per call – for very little downside.

Solid savings, solid returns.

Governance Hype

Every project these days has a governance mechanic built into its token, and API3 is no different. There is value to be acquired here, in dollars and cents, when it comes to owning API3 supply. You get to vote on governance changes and fee-structure updates, and channel many of those fees to yourself, the holder. For a burgeoning data marketplace we think this is extremely bullish, and once again we're willing to make the play that API3 is less a bet on this specific project succeeding (it will) and more a bet on blockchain itself becoming even larger and more mainstream.

There's also a good staking mechanic built into the token, which rewards those willing to put their tokens up as insurance against malfunctions and errors. We expect these to happen, but they will be few and far between. They happen on other existing systems too, and we're glad API3 is taking the initiative here and being realistic about things rather than pretending these problems don't exist.

Plus, staked tokens mean reduced circulating supply and fewer sellers – you know the deal.

There's a lot more to API3 in their whitepaper and on their website, so we urge any intelligent investor to dig in. Make your own decision on whether you see value here (we do) and whether you're interested in their upcoming sale in October.

Token Metrics? They exist

API3 has some really great things going for it. There are plenty of pre-existing customers looking to use it right out of the gate, and the team has worked very closely with Chainlink for some time now, so they know exactly where the pain points are.

There's no reason for us to include things like token metrics, sale numbers, and team sections, as you are free to do your own research on that kind of thing. Nor do we really exist in that kind of market anymore – attractive terms and numbers are no longer enough to earn real appreciation. This time around you need an actual value proposition and a real long-term view to make a splash in this market.

Is API3 the end of Chainlink? Are we bearish on Chainlink? Is this a hit piece on Chainlink? Certainly not. If you think so, remember that you’re accusing individuals who invested in Chainlink big time and have held it for years of…being bearish on Chainlink.

No, we think these two solutions are great for different things – it will ultimately be up to the data consumer whether they want a cheaper, hands-off solution or Chainlink's more expensive but more robust offering. There's definitely room in the data market for more oracle options – and existing services can only be improved by quality projects entering this sector.

We’re convinced API3 is one of them. And we’re investing. Heavily.

Watch this space.

Halo out.