What are stable coins? Cryptocurrency Q&A with Rich Lyons

Cryptocurrencies are not investments for the faint of heart. As anyone who has followed the Bitcoin saga knows, the rollercoaster price movements of these digital assets are only for those with strong stomachs (or who want to conceal their transactions). In recent years, however, a new form of cryptocurrency has emerged with the promise of much less volatility. So-called stable coins, such as Tether, the stable coin market leader, are pegged one-to-one to the U.S. dollar or another asset, in theory making them safer.

Berkeley Haas News spoke to Rich Lyons, professor of finance and economics who served as Haas dean from 2008 to 2018 and is now UC Berkeley’s first chief innovation and entrepreneurship officer, about this new wrinkle in cryptocurrencies. Lyons, an expert in currency exchange rates who holds the William & Janet Cronk Chair in Innovative Leadership, recently co-authored a paper with Ganesh Viswanath-Natraj of England’s Warwick Business School examining what keeps stable coins stable.

Among their conclusions: Stable coins could open the door to the wider crypto world without the wild price swings of free-floating cryptocurrencies like Bitcoin. Even so, as Lyons stresses, stable coins are not necessarily the safe havens they are advertised to be.

If you look at a price chart of Bitcoin over the past few years, it looks like a trek through the Himalayas, with enormous peaks and valleys. Why are cryptocurrencies so much more volatile than traditional currencies?

We can answer that question by thinking about the dollar-euro exchange rate, which is more volatile than people originally thought it would be. The issue is that the euro’s fundamental value is a difficult thing to pin down, leaving a lot of room for speculation. Instability like that gets magnified in the world of cryptocurrency. At the end of the day, the Bitcoin-dollar exchange rate is just another exchange rate, and a lot of those same speculative dynamics are there.

But why are Bitcoin’s price movements so much greater than those of traditional currencies?

The big issue is that the fundamental value of Bitcoin is even more nebulous than that of the euro. We can at least start to think about the fundamentals of the dollar-euro exchange rate, like the growth rate in Europe versus the U.S. With a cryptocurrency like Bitcoin, the fundamental picture is much harder to pin down. You have the same speculative dynamics as in a regular currency market, but with much fuzzier fundamentals.

What exactly are cryptocurrencies?

Over the past five to ten years, what some people are calling the digital asset economy has emerged. The digital asset economy lies outside the traditional banking system and is generally housed on a blockchain, which is a secure, decentralized electronic ledger used to record transactions. The digital asset economy includes cryptocurrencies like Bitcoin and so-called initial coin offerings. These assets serve multiple purposes. For example, I could issue 100 tokens, and by buying one, you could own one one-hundredth of a work of art. We can break up lumpy assets and give people ownership of small slices. In addition, this digital asset economy gives people access to more of the world’s assets, even in countries where capital controls or other restrictions limit what they can hold.
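
To make the fractional-ownership idea concrete, here is a minimal, hypothetical Python sketch of a 100-token asset ledger. The class and names are invented for illustration; real tokens are typically issued as smart contracts on a blockchain, but the underlying accounting is the same: each token represents a fixed fraction of the asset.

```python
# Minimal, hypothetical sketch of fractional ownership via tokens.
# Not any specific blockchain or token standard; the ledger is a plain dict.
class TokenizedAsset:
    def __init__(self, name: str, total_tokens: int, issuer: str):
        self.name = name
        self.total_tokens = total_tokens
        self.ledger = {issuer: total_tokens}  # owner -> tokens held

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        if self.ledger.get(sender, 0) < amount:
            raise ValueError("insufficient tokens")
        self.ledger[sender] -= amount
        self.ledger[receiver] = self.ledger.get(receiver, 0) + amount

    def ownership_share(self, holder: str) -> float:
        return self.ledger.get(holder, 0) / self.total_tokens


artwork = TokenizedAsset("Artwork #1", total_tokens=100, issuer="issuer")
artwork.transfer("issuer", "alice", 1)
print(artwork.ownership_share("alice"))  # 0.01, i.e., one one-hundredth of the work
```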

What’s the purpose of stable coins?

Because this digital asset economy is largely outside the traditional banking system, the issuers and traders of these assets aren’t like regulated financial institutions. They don’t have “know-your-customer” rules or anti-money-laundering regulations. At first, this digital asset economy lacked a store of value, that is, assets with relatively low volatility that people could hold knowing the value wouldn’t change drastically. Because Tether and other stable coins are pegged to traditional currencies, they have become stores of value in an alternative financial world that otherwise lacks one.

Prof. Rich Lyons (Photo Copyright Noah Berger)

Haven’t stable coins been controversial?

Yes. For example, there was a question of whether the issuers of Tether were manipulating the price of Bitcoin. Part of the reason that scenario is possible is that Tether is used as the medium of exchange in over 50% of Bitcoin transactions. When people are buying and selling bitcoins, more often than not they are trading tether for bitcoins. One reason is that when you go from dollars to bitcoins, you are also going from inside to outside the banking system. That has high transaction costs. Tether is already outside the banking system, which makes it a much cheaper and more frictionless way to go in and out of Bitcoin.

Most people see the cryptocurrency world as pretty wild and woolly. Are stable coins as safe as claimed?

Tether is pegged to the dollar at one-to-one, and its price has generally traded within 1% of one-to-one. But about a year-and-a-half ago, there was some concern in the market that Tether was not backed one-to-one with assets; i.e., if there were a mass redemption of Tether, the collateral would not be sufficient to cover the full amount. This concern led the price to fall as low as 95 cents to the dollar. There was an audit, which was not 100% transparent, but it did restore confidence in the marketplace.

What kinds of questions should we be asking about stable coins?

Stable coins come in a number of different flavors. Some purport to be 100% backed by redeemable collateral that’s in escrow, collateral that can’t be captured and run away with. But part of the question, even with Tether, is whether it really is 100% collateralized. And is all that collateral really liquid? If you have to sell in fire-sale conditions, even a “100% collateralized” asset may not turn out genuinely to be 100% collateralized.
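
As a purely illustrative back-of-the-envelope check (hypothetical numbers, not any issuer’s actual books), the sketch below shows how a fire-sale haircut can turn a nominally “100% collateralized” coin into something less than that.

```python
# Hypothetical sketch: value recoverable per token if collateral must be
# liquidated quickly. All numbers are invented for illustration only.
def effective_backing_per_token(tokens_outstanding: float,
                                collateral_value: float,
                                fire_sale_haircut: float) -> float:
    recoverable = collateral_value * (1 - fire_sale_haircut)
    return recoverable / tokens_outstanding

# Nominally fully backed: $1 of collateral per token, no haircut.
print(effective_backing_per_token(1_000_000, 1_000_000, 0.00))  # 1.0
# If a rushed sale recovers only 90 cents on the dollar, holders can redeem
# at most about $0.90 per token.
print(effective_backing_per_token(1_000_000, 1_000_000, 0.10))  # 0.9
```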

What are the long-term prospects for stable coins and cryptocurrencies generally?

There will be a lot of shakeout. The stable coins that have the greatest market confidence concerning the legitimacy and liquidity of their collateral will win out. Meanwhile, if you think about the literally thousands of initial coin offerings, all the tokens, all the cryptocurrencies—90% of them will be valueless in 10 years, in my judgment.

In a shakeout scenario, do stable coins have an advantage?

Most stable coins have collateral. So, if a stable coin fails, it won’t be a complete cataclysm. Whatever collateral is left after liquidation costs will go to the holders. But, when you talk about cryptocurrencies that don’t have any collateral—the Bitcoins and ICOs that don’t have any fundamental value backing them—when those go away, their value goes to zero. I’m not predicting that Bitcoin will necessarily go to zero, but certainly there are a lot of assets in the digital economy that will go to zero over the next 10 years. At the same time, you’re seeing assets in the digital economy that are getting 10 times the valuation they had two years ago. You’ve just got to be in the right place. And it’s anybody’s guess what the right place looks like.

How are cryptocurrencies in general and stable coins in particular evolving?

This idea of inside the banking system versus outside the banking system—that’s a pretty bright line right now. But when central banks move into the digital asset world, the line won’t be as clear. A well-functioning stable coin adds a lot of value, and all of the big central banks are doing a lot of research on cryptocurrencies. Many of them are saying they will launch a digital currency in the next five years. My prediction is in 10 years we will have three or four important stable-coin digital currencies, based in blockchain, and issued by central banks. They will live more in the traditional regulated banking system. That will fill in the continuum.

You and Ganesh Viswanath-Natraj just released a paper titled “What Keeps Stable Coins Stable?” What questions were you looking at?

We wanted to look at how tightly the price of Tether was pegged to the dollar. What we found was somewhat surprising. Tether trades at both a discount and a premium to the dollar. You might think a stable coin would trade like the Argentine peso in the early 2000s, when the peso was pegged to the dollar. But people didn’t have full confidence that the Argentine central bank would support the peso, so the peso consistently traded at a discount, sometimes substantially so.

What might explain Tether trading at a premium to the dollar?

There is this vehicle-currency demand that can cause Tether to trade at a premium. If I as an investor can get into Bitcoin using either dollars or Tether, but it is expensive to get into Bitcoin using dollars because transaction costs are higher, then I’d much rather buy Bitcoin using Tether because it gives me a near-costless option for getting into Bitcoin whenever I want. That “vehicle-currency demand” for Tether is what pushes its price above one U.S. dollar.
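
A simple worked example (with made-up transaction costs, purely for intuition) shows why that option value can push Tether above $1: the premium a trader is willing to pay is bounded by the cost saved relative to routing through dollars.

```python
# Hypothetical numbers for intuition only; not actual market costs.
cost_via_dollars = 0.010  # e.g., 1.0% to move dollars into and out of the banking system
cost_via_tether = 0.001   # e.g., 0.1% to trade Tether, which already sits outside it

# The saving from holding Tether instead of dollars caps the premium a trader
# would rationally pay over the $1.00 peg.
max_premium = cost_via_dollars - cost_via_tether
print(f"Willing to pay up to ${1 + max_premium:.3f} per Tether")  # about $1.009
```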

 

In ambitious new book, Henry Chesbrough shows how to get results from open innovation

When Adj. Prof. Henry Chesbrough, PhD 97, was researching open innovation in the pharmaceutical industry, he found one pharmaceutical company that had 7,000 scientists working on tens of thousands of compounds. Yet the company licensed out fewer than one compound a year, shelving the others.

Although some of those shelved compounds may have succeeded in the marketplace, companies may fear they’ll look bad if a product they passed on thrives externally—a phenomenon he calls “Fear of Looking Foolish” or FOLF. “Our interview subjects admitted to us that FOLF was a major constraint to overcoming this,” Chesbrough writes.

It’s been 16 years since the publication of Chesbrough’s Open Innovation launched a new paradigm for bringing new technologies to market, spurring companies to embrace the power of collaborative business models.

Chesbrough is back to close the loop with his most ambitious work to date. Open Innovation Results: Going Beyond the Hype and Getting Down to Business (Nov. 2019, Oxford University Press) offers a clear-eyed view of the challenges that limit organizations’ ability to create and profit from innovation, along with practical tools for overcoming them.

The book also provides a roadmap to restore productivity and economic growth for society as a whole—in the U.S. and globally.

David Teece, the Thomas W. Tusher Professor in Global Business, says Open Innovation Results breaks new ground. “It links open innovation not only to enterprise performance but to national economic growth as well,” he says. “There are important insights into the difference between ‘open’ and ‘free’ innovation, along with insightful characterizations of China’s use of open innovation practices and policies.”

Open innovation centers on the idea that companies stand to gain more from making use of external ideas and sharing their own innovations through licensing, sales, partnerships, and spinoffs than from trying to do it all themselves. A famous example: IBM’s development of the PC.

“We wanted to do something small and fast…so it was critical to IBM’s success that we partnered with Intel and Microsoft and created the PC industry together,” said Jim Spohrer, Director of Cognitive OpenTech at IBM and a member of the Berkeley Innovation Forum, a group created by Chesbrough to help corporate managers involved in innovation.

Has the promise of innovation been overhyped?

Chesbrough opens the book with an “exponential paradox” that’s at the heart of our current global economic situation: While new technologies are emerging faster and faster—some say exponentially—economic productivity is slowing. Has the promise of innovation been overhyped?

The real problem, Chesbrough argues, is that promoters of innovation too often chase after “bright and shiny objects,” focusing on the initial stage of development and neglecting the rest of the process. Innovation results depend on what you finish, not on what you start, he says.

“In order to advance prosperity, we must not only create new technologies, but we must also disseminate them broadly and absorb them, which means having the knowledge and skills to put them to work in our business,” Chesbrough says. “Only then do we really see the social benefit of these new technologies, and only then will these measures of economic productivity catch up again.”

Chesbrough shapes these three facets of innovation—generation, dissemination, and absorption—into a new paradigm for managing R&D and bringing new technologies to market. Rooted in two decades of extensive field research, the book is packed with real examples of successes and failures from companies such as Procter & Gamble, IBM, Intel, General Electric, Bayer, and Huawei.

Carlos Moedas, the European Union’s Commissioner for Research, Science, and Innovation, says the book’s complex concepts are easily relatable. “[It’s] a must-read for politicians, policy-makers, and business leaders who want to make a difference by designing the right policies that drive not only the generation of new ideas, but…their broad dissemination and adoption by society,” he says.

About Henry Chesbrough

Henry Chesbrough is widely known as “the father of open innovation”. He has built an international reputation for his insights into the innovation process. The author of six books (translated into 12 languages) and numerous articles, he has received 70,000 citations to his work on Google Scholar. He has appointments at both UC Berkeley’s Haas School of Business and at Esade Business School in Barcelona.

Prof. Chesbrough founded and organizes two external groups of companies that each meet twice a year to discuss challenges in managing innovation: the Berkeley Innovation Forum (32 member companies) and the European Innovation Forum (20 member companies). He has taught at the Haas School of Business for the past 14 years, at Esade Business School for the past 7 years, and taught previously at Harvard Business School for 6 years. He also serves as the Faculty Director of the Garwood Center for Corporate Innovation at Berkeley Haas.

Open Innovation Results is available for pre-order on Amazon.

Should the U.S. “decouple” from China? International experts debate at Haas

Nearly a half-century after President Richard Nixon’s historic visit to Beijing, should the U.S. disengage from its deeply entwined economic relationship with China?

That was the topic for a panel of international experts and journalists at “The Great Decoupling and Sino-US Race for Technological Supremacy” held at Berkeley Haas on Oct. 17. Co-hosted by the Financial Times, the Asia Society, and Haas’ Institute for Business Innovation, the event included a panel discussion followed by a spirited Oxford-style debate. 

“This is the most important topic of our day. It flows into not just economics, not just technology, but into national security, and into the long-run performance and survival of liberal democracies,” said Prof. David Teece, faculty director of the Tusher Initiative for Management of Intellectual Capital, in introducing the event. “While the debate tonight is framed as ‘should there be a decoupling?’ it’s already starting to happen. Whether it’s the appropriate course or not is something we’ll need to think about.”

UC Berkeley Chancellor Carol Christ pointed out the important role universities play in this complex landscape.

“The Chinese government has made investment in higher education an essential element of its competitive edge, even as the objective is complicated by state policies aimed at controlling inquiry and expression. Our university, meanwhile, long a proponent of international engagement and with deep, numerous, and varied academic collaborations in China, is now grappling with how we show support for Chinese students, scholars, and institutional partners suddenly cast as objects of suspicion.”

The panelists included:

  • Robert Atkinson, President, Information Technology and Innovation Foundation
  • Professor Li Chenjian, Neuroscientist, Peking University
  • Dan Wang, Tech Analyst, Gavekal
  • Orville Schell, Director of the Center on U.S.-China Relations at the Asia Society and former dean of the Graduate School of Journalism

“It’s fair to say the world is at an important moment,” Schell said in his introduction. “When the two largest economies are in a state of changed grace, everything that goes on between us—whether it’s business and trade, but also things like scientific research, academic life, and cultural exchange—is under reconsideration, and that’s what’s known as decoupling.”

Li, who lived in the U.S. for 25 years as a neuroscientist at Cornell and Mt. Sinai before taking a post at Peking University, brought a different perspective, pointing out that China is 20% of humanity. “When we talk about decoupling, we are too much into the zero sum game between the U.S. and China…Instead of decoupling, I am more on the side of principled engagement,” he said. “I question the notion that after 40 years engagement has failed. My experience…is that liberal democracy is winning in Chinese society, quietly, irreversibly, at the grassroots level.”

Were the results of that experiment really so predictable? These researchers aim to find out.

Researchers have launched a beta version of a new platform to predict research study results.

They say that hindsight is 20-20, and perhaps nowhere is that more true than in academic research.

“We’ve all had the experience of standing up to present a novel set of findings, often building on years of work, and having someone in the audience blurt out ‘But we knew this already!,’” says Prof. Stefano DellaVigna, a behavioral economist with joint appointments in the Department of Economics and Berkeley Haas. “But in most of these cases, someone would have said the same thing had we found the opposite result. We’re all 20-20, after the fact.”

Prof. Stefano DellaVigna

DellaVigna has a cure for this type of academic Monday morning quarterbacking: a prediction platform to capture the conventional wisdom before studies are run.

Along with colleagues Devin Pope of the University of Chicago’s Booth School of Business and Eva Vivalt of the Research School of Economics at Australian National University, he’s launched a beta website that will allow researchers, PhD students, and even members of the general public to review proposed research projects and make predictions on the outcome. 

Making research more transparent

Their proposal, laid out in a new article in Science’s Policy Forum, is part of a wave of efforts to improve the rigor and credibility of social science research. These reforms were sparked by the replication crisis—the failure to reproduce the results of many published studies—and include mass efforts to replicate studies as well as platforms for pre-registering research designs and hypotheses.

“We thought there was something important to be gained by having a record of what people believed before the results were known, and social scientists have never done that in a systematic way,” says DellaVigna, who co-directs the Berkeley Initiative for Behavioral Economics and Finance. “This will not only help us better identify results that are truly surprising, but will also help improve experimental design and the accuracy of forecasts.”

Identifying truly surprising results

Because science builds on itself, people interpret new results based on what they already know. An advantage of the prediction platform is that it would help better identify truly surprising results, even in the case of null findings—which rarely get published because they typically aren’t seen as significant, the researchers argue.

“The collection of advance forecasts of research results could combat this bias by making null results more interesting, as they may indicate a departure from accepted wisdom,” Vivalt wrote in an article on the proposal in The Conversation.
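
As a rough illustration of that point (with invented forecasts, not data from the platform), one could quantify how far an observed result, even a null one, sits from the prior consensus:

```python
from statistics import mean, stdev

# Hypothetical forecasts of a treatment effect, collected before the study ran.
forecasts = [0.20, 0.25, 0.30, 0.22, 0.28, 0.24]
observed = 0.0  # the study finds a null result

# Distance of the observed result from the consensus, in forecast standard deviations.
surprise = abs(observed - mean(forecasts)) / stdev(forecasts)
print(f"Observed result sits {surprise:.1f} forecast SDs from the consensus")
```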

A research prediction platform will also help gauge how accurate experts actually are in certain areas. For example, DellaVigna and Pope gathered predictions from academic experts on 18 different experiments to determine the effectiveness of “nudges” versus monetary incentives in motivating workers to do an online task. They found that the experts were fairly accurate, that there was no difference between highly cited faculty and other faculty, and that PhD students did the best.
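
One simple way to score forecasts of the kind DellaVigna and Pope collected is the mean absolute error between predicted and realized effects. The sketch below uses invented numbers purely to illustrate the group comparison, not their actual data or scoring method.

```python
from statistics import mean

# Hypothetical (predicted_effect, observed_effect) pairs for each group;
# the values are invented purely to illustrate the comparison.
forecasts = {
    "highly cited faculty": [(0.30, 0.25), (0.10, 0.22), (0.50, 0.41)],
    "other faculty":        [(0.28, 0.25), (0.15, 0.22), (0.45, 0.41)],
    "PhD students":         [(0.26, 0.25), (0.20, 0.22), (0.43, 0.41)],
}

for group, pairs in forecasts.items():
    mae = mean(abs(pred - obs) for pred, obs in pairs)
    print(f"{group}: mean absolute forecast error = {mae:.3f}")
```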

Understanding where there is a general consensus can also help researchers design better research questions, to get at less-well-understood phenomena, the authors point out. Collecting a critical mass of predictions will also open up a new potential research area on whether people update their beliefs after new results are known. 

Making a prediction on the platform would require a simple 5-to-15-minute survey, DellaVigna says. The forecasts would be distributed to the researcher after data are gathered, and the study results would be sent to the forecasters at the end of the study.

Berkeley Haas Prof. Don Moore, who has been a leader in advocating for more transparent, rigorous research methods and training the next generation of researchers, says the prediction platform “could bring powerful and constructive change to the way we think about research results. One of its great strengths is that it capitalizes on the wisdom of the crowd, potentially tapping the collective knowledge of a field to help establish a scientific consensus on which new research results can build.”  

Haas energy experts weigh in on PG&E power shutoffs

Are mass power shutdowns the new normal for California, and what does this mean for the state’s businesses and the economy? What is the path forward for the state’s aging grid, and the thousands of miles of power lines strung across fire-prone wildlands?

Professors Severin Borenstein and Catherine Wolfram of the Energy Institute at Haas (See Q&A below) have been fielding a stream of questions from journalists all week after Pacific Gas & Electric determined it could not guarantee the safety of its lines and shut down power to hundreds of thousands of people, including the entire UC Berkeley campus. SoCal Edison also cut power Thursday.

Severin Borenstein

Borenstein on Tuesday banged out a blog post with pro tips for getting through a power shutoff and then spent the day getting ready himself, packing the freezer with ice to add thermal mass. “This is why we live in a modern economy, so we don’t have to spend most of our lives doing these things,” Borenstein, the E.T. Grether Chair in Business Administration and faculty director for the Energy Institute at Haas, told the Los Angeles Times.

Wolfram, whose research centers on energy in the developing world, pointed out that in places with unreliable power such as sub-Saharan Africa, businesses have no choice but to incorporate the cost of backup power into their budgets. Commercial generators can run upwards of $10,000, she told NPR’s Marketplace.

“That becomes essentially like a tax on the economy,” said Wolfram, the Cora Jane Flood Professor of Business Administration, who also serves as Associate Dean for Academic Affairs and Chair of the Faculty.

Catherine Wolfram

For California’s energy economy, the effects of climate change are throwing the system off balance to the point where some things will have to change. Concrete poles may be necessary in windy areas; another possibility is that the state require batteries in new developments that are in high-risk fire areas so power shutoffs to them aren’t so disruptive, Borenstein suggested in another Los Angeles Times story.

Utilities have long sought to balance the cost of preventing wildfires with the need to sell cheap power, said Borenstein, who serves on the board of governors for the California Independent System Operator, which oversees the state grid. “It used to be that balance was viewed as pretty reasonable. With climate change, I think it’s not any more.”

We asked a few more questions of both professors.

Is it possible to gauge the economic impact of this week’s power shutoffs?

Catherine Wolfram: Yes and no. I’ve seen estimates that run all the way from $65M to $2.6B. That’s a pretty wide range, but even the high side is less than 1/1000th of the state’s annual GDP. So, while the outages are troubling, there shouldn’t be any concern that this will set us into a recession. (Update 10/14: Read Wolfram’s blog post estimating economic loss to be about $1B).

I’ve been pointing out, though, that these outages are very unusual because they last a really long time—not your typical two-minute to two-hour outage—but they’re not associated with a natural disaster. So, if we try to extrapolate from other multi-day outages, we’re likely conflating the effects of the thing that led to that outage, like Superstorm Sandy, say. With a hurricane, it’s likely that the economic losses reflect both the costs of the natural disaster and the costs of the lost power.

At the same time, there are likely going to be losses that are hard to measure. For instance, I got an email from a colleague who was worried that months of work on an experiment could be lost. Electricity is used for many, many different purposes in modern economies, so it’s really hard to put a precise number on the losses.

People were already angry with PG&E before this shutdown, and now they’re even angrier. Do you think the shutoffs were justified? 

CW: It’s really hard to know. The fires last year were devastating, so no one wants to repeat that experience. PG&E has pretty strong incentives right now to be extremely cautious.

Severin Borenstein: Effectively PG&E has a huge financial liability if their equipment starts a fire. On the other hand, the large financial and other costs of shutting off power are not borne by PG&E, but by customers. They do take a reputation hit, but don’t face the same sort of financial downside that would fall on them if their equipment started another fire.

On balance, were they effective?

CW: This involves proving a negative, which is practically impossible. We’ll never really know whether the outages prevented a fire from starting.

As we begin to feel the effects of climate change, what is the path forward for the state’s aging grid?

CW: I’d say that other utilities have been better than PG&E at investing in modern technologies that help prevent fires and give them more visibility into their electricity systems. For example, I’ve heard that San Diego Gas & Electric has technologies that de-energize lines when they sense that they’re falling. Hopefully, PG&E will start investing in things like this.

Berkeley Haas a key player in Blockchain Week

Top blockchain technology researchers and industry leaders from around the world will gather in Berkeley for the Crypto Economics Security Conference (CESC) this month, part of the massive San Francisco Blockchain Week taking place Oct. 28 to Nov. 3 on both sides of the bay.

About 5,000 people are expected to attend the full week of events, with Berkeley Haas co-hosting the conference and a Blockchain Career Fair during the first half of the week, followed by the San Francisco Blockchain Week Epicenter event at the San Francisco Marriott Marquis on Thursday and Friday and the DeFi Hackathon over the weekend.

Haas professors to share research

Haas Profs. Christine Parlour and Steven Tadelis are among the researchers speaking at the Crypto Economics Security Conference on Oct. 28 and 29 in UC Berkeley’s Pauley Ballroom. The conference will explore the economic security aspects of blockchain technology, including game theory, incentive design, mechanism design, and market design.

Parlour, the Sylvan C. Coleman Chair in Finance and Accounting, will present her research on the investment characteristics of a sample of 64 initial coin offerings, and discuss asset pricing properties of cryptocurrencies. (Read a Q&A with Parlour here). Tadelis, the Sarin Chair in Leadership and Strategy, will outline how feedback and reputation systems work for online marketplaces like eBay and Uber, highlighting some of the bias in feedback and reputation systems.

Speakers at the conference, now in its third year, will hold sessions simultaneously on two ballroom stages. “It will be much more intimate than many tech conferences,” said UC Berkeley undergraduate student Liam DiGregorio, BS 21, who is a SF Blockchain Week co-organizer and head of external partnerships and business development for Blockchain at Berkeley, which is also a host of the week’s events.

Kate Tomlinson, MBA 20 and a business consultant at Blockchain at Berkeley, said she’s looking forward to a few days of immersing herself in blockchain. “One of the things I liked about the conference last year was the opportunity to meet people working on projects that we’ve been working on at Blockchain at Berkeley,” she said. “The conference also connects you to people who are really building this stuff and are at the forefront of what’s going on. The conference helps me to understand what’s really happening.”

Along with the security conference, Haas is co-hosting the Blockchain Career Fair at International House on Oct. 30 from 3:00 to 7:30pm, where more than 40 companies will be looking for blockchain talent. Last year, about 1,500 people submitted resumes online and 170 people landed jobs, DiGregorio said.

Epicenter Event and Hackathon

Following the conference and career fair at Berkeley, the week’s events shift to San Francisco for the Epicenter Event, bringing together companies, developers, investors, and researchers. Among the speakers throughout the week are Ethereum founder and industry leader Vitalik Buterin and the founders of NuCypher, Forte, and Kabam. UC Berkeley Prof. Shafrira “Shafi” Goldwasser, director of the Simons Institute at UC Berkeley and the winner of the ACM Turing Award in 2012, will speak at the Epicenter Conference at San Francisco’s Marriott Marquis.

Researchers from Vanderbilt, Carnegie Mellon, Cornell, UC Santa Barbara, the University of Edinburgh, Northeastern University, New York University, and McGill will also attend.

The week wraps up with the DeFi Hackathon, open to developers of all ages, on Nov. 1-3 at San Francisco’s Terra Gallery. “We’ll have mentors onsite to jump-start the learning experience,” DiGregorio said.

Haas’ participation in blockchain week is part of the school’s ongoing commitment to blockchain research. Haas is one of  34 universities that participate in the University Blockchain Research Initiative,  created by Ripple to support academic research; technical development; and innovation in blockchain, cryptocurrency, and digital payments. Through the initiative, Haas received a five-year, multi-million dollar grant to support faculty and student research, along with events like the conference and job fair, said Karin Bauer, program manager for the Berkeley Haas Blockchain Initiative. “We are able to support independent research in areas of innovation in blockchain, cryptocurrency, and finance that might not otherwise be funded,” she said.

Use the promo code Haas20  to receive 20% off tickets for the CESC or Epicenter conference. The Career Fair and DeFi Hackathon are both free and open to the public upon completion of the applications on SFBlockchainWeek.io.