ARK • Disrupt
Your weekly innovation newsletter
It's Tuesday, February 20, 2024. Please enjoy ARK's weekly newsletter curated by our thematic research team and designed to keep you engaged with disruptive innovation.
Spot Bitcoin ETFs Continue To Gain Momentum
By Yassine Elmandjra | @yassineARK Director of Digital Assets
One month after their approval, spot bitcoin ETFs are proving transformative for institutional investors and asset allocators. Bitcoin’s market cap has eclipsed $1 trillion[i] for the first time since November 2021 and, according to Bloomberg, spot bitcoin ETF volumes and flows in aggregate are higher at this point in their lifecycle than those of any ETF asset launch in history.[ii] As measured by volume and flows since their January 11 inception, spot bitcoin ETFs already rank as the second-largest commodity ETF category,[iii] behind gold and ahead of silver. They have generated $43.9 billion in volume,[iv] or $18.6 billion excluding the Grayscale Bitcoin Trust ETF (GBTC), which has experienced net outflows. Net inflows have totaled $11.5 billion excluding GBTC, or $4.7 billion including GBTC.
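For readers who want to reconcile those figures, here is a minimal back-of-the-envelope sketch. The dollar amounts are taken from the paragraph above; the GBTC lines are derived by simple subtraction, so treat this as an illustration rather than a data source.

```python
# Reconciling the spot bitcoin ETF volume and flow figures cited above.
# All inputs (USD billions) come from the newsletter; the GBTC figures
# are derived by subtraction and are illustrative, not reported data.
total_volume = 43.9            # aggregate volume since the January 11 launch
volume_ex_gbtc = 18.6          # volume excluding the Grayscale Bitcoin Trust ETF
gbtc_volume = total_volume - volume_ex_gbtc                      # -> 25.3

net_inflows_ex_gbtc = 11.5     # inflows into the newly launched spot ETFs
net_inflows_incl_gbtc = 4.7    # aggregate net inflows after GBTC redemptions
gbtc_net_outflows = net_inflows_ex_gbtc - net_inflows_incl_gbtc  # -> 6.8

print(f"Implied GBTC volume:       ${gbtc_volume:.1f}B")
print(f"Implied GBTC net outflows: ${gbtc_net_outflows:.1f}B")
```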
Bound by mandates that restrict their exposure to bitcoin without extensive due diligence, some of the largest full-service brokerage firms and financial advisors are waiting for the green light. During the next three to six months, as due diligence teams come to understand the potential for a new asset class to increase the risk-adjusted returns of well-diversified portfolios, flows into bitcoin ETFs should continue to increase.
With our partners at 21Shares and Resolute Investment Managers, ARK is excited about the opportunity to continue bridging the gap between Bitcoin and the traditional financial world order.
OpenAI’s “Sora” Text-To-Video AI Model Is A Landmark Achievement
Last week, OpenAI released its text-to-video AI model, Sora, a landmark achievement in the field of AI video production. With its Pixar-quality animation and photorealistic videos of landscapes, Sora is outperforming models offered by Google, Runway, and Pika. In addition to text-to-video generation, Sora can[v] animate image inputs, extend video inputs forward or backward in time, transform video inputs, connect two video inputs seamlessly, generate images, and simulate physical and virtual worlds.
According to our research, large language models and text-to-image diffusion models have pushed the cost of text and image production to nil. Sora is lowering the cost of video production even further, increasing access to studio-grade, AI-enabled video content production. In our view, generative AI tools are a boon to the creator community, including platforms that aggregate user-generated content.
Big Ideas 2024
Disrupting The Norm, Defining The Future | Annual Research Report
Sora’s Content-Generation Capabilities Could Have Important Applications In Robotics
By Frank Downing & Tasha Keeney | @ARKInvest Director of Research, Next Gen Internet & Director of Investment Analysis
OpenAI’s new generative AI video creation model, Sora,[vi] generates content with quality and detail that users need to see to believe. According to its technical report,[vii] OpenAI combined diffusion model technology—DALL-E style models that use text prompts to generate images and video—with the transformer architecture that powers ChatGPT. Notably, Sora trained on videos of varying durations, resolutions, and sizes, unlike prior text-to-video models that trained on a standardized resolution and aspect ratio. OpenAI’s success suggests that the diversity of Sora’s training data has enabled it to frame and compose scenes more effectively and to accommodate a more diverse array of input and output modalities than other models. By leveraging its expertise in both diffusion and transformer models, and by training on vast amounts of raw video and image data, OpenAI appears to have raised the state of the art to a new level.
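To make the diffusion-plus-transformer combination concrete, here is a hypothetical sketch of the “spacetime patch” idea OpenAI describes in its report: a video tensor is cut into patches that a transformer processes as tokens during denoising. The class name, dimensions, and patch sizes below are illustrative assumptions, not Sora’s actual architecture.

```python
# A minimal sketch of a diffusion-transformer over spacetime patches.
# Illustrative only: shapes, sizes, and names are assumptions, not Sora's.
import torch
import torch.nn as nn

class SpacetimePatchDenoiser(nn.Module):
    def __init__(self, patch=(4, 16, 16), channels=3, dim=256, heads=8, layers=4):
        super().__init__()
        t, h, w = patch
        patch_dim = channels * t * h * w
        self.patch = patch
        self.embed = nn.Linear(patch_dim, dim)       # patch -> token
        block = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(block, layers)
        self.unembed = nn.Linear(dim, patch_dim)     # token -> denoised patch

    def forward(self, video):                        # video: (B, C, T, H, W)
        b, c, T, H, W = video.shape
        t, h, w = self.patch
        # Cut the video into non-overlapping spacetime patches ("tokens"),
        # so clips of any duration or resolution yield a variable-length
        # token sequence the transformer can handle.
        tokens = (video
                  .unfold(2, t, t).unfold(3, h, h).unfold(4, w, w)
                  .permute(0, 2, 3, 4, 1, 5, 6, 7)   # (B, nT, nH, nW, C, t, h, w)
                  .reshape(b, -1, c * t * h * w))
        # The transformer predicts the denoised content of every patch;
        # a real model would fold the output back into a video tensor.
        return self.unembed(self.encoder(self.embed(tokens)))

# One denoising pass on random "noisy video": 8 frames of 64x64 RGB.
noisy = torch.randn(1, 3, 8, 64, 64)
pred = SpacetimePatchDenoiser()(noisy)
print(pred.shape)  # torch.Size([1, 32, 3072]): 32 patches of 3x4x16x16
```

Training on variable-size token sequences, rather than a fixed resolution and aspect ratio, is what lets such a model ingest diverse raw video, as the paragraph above notes.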
Given a video, Sora can extend the scene forward or backward in time, potentially predicting what happened before or what will happen after any scene—a capability that could help predict the movements of pedestrians and vehicles in autonomous driving applications. In short, Sora appears to demonstrate simulation capabilities that could have broader use cases, specifically in robotics.
Although never explicitly trained on the laws of physics, Sora generates videos that accurately visualize the movement of people and objects, even when they are occluded or out of frame, which potentially could be useful in simulation-based training for robots. While its understanding of the physical world has yet to be perfected, Sora seems to be a leap forward for multimodal models, which already have proven useful[viii] in autonomous driving.
[i] Dale, B. 2024. “Bitcoin's market cap breaks $1T, taking overall crypto market to $2T.” Axios.
[ii] Balchunas, E. 2024. “Here’s the updated version…” X.
This Newsletter is for informational purposes only and does not constitute, either explicitly or implicitly, any provision of services or products by ARK Investment Management LLC (“ARK”). Investors should determine for themselves whether a particular service or product is suitable for their investment needs or should seek such professional advice for their particular situation. All content is original and has been researched and produced by ARK unless otherwise stated therein. No part of the content may be reproduced in any form, or referred to in any other publication, without the express written permission of ARK. All statements made regarding companies, securities or other financial information contained in the content or articles relating to ARK are strictly beliefs and points of view held by ARK and are not endorsements of any company or security or recommendations to buy or sell any security. By visiting and/or otherwise using the ARK website in any way, you indicate that you understand and accept the terms of use as set forth on the website and agree to be bound by them. If you do not agree to the terms of use of the website, please do not access the website or any pages thereof. Any descriptions of, references to, or links to other products, publications or services do not constitute an endorsement, authorization, sponsorship by or affiliation with ARK with respect to any linked site or its sponsor, unless expressly stated by ARK. Any such information, products or sites have not necessarily been reviewed by ARK and are provided or maintained by third parties over whom ARK exercises no control. ARK expressly disclaims any responsibility for the content, the accuracy of the information, and/or quality of products or services provided by or advertised on these third-party sites. For full disclosures, click here.