Snowflake Summit Takeaways


Missed this year’s Snowflake Summit? Michael Caplan, Client Solutions Architect Manager, and Ross Stuart, Senior Solutions Architect for AHEAD, are here to break down the latest innovations and happenings in the world of data. 

Key Takeaways   

There’s plenty of excitement around the opportunities that large language models (LLMs) and generative AI present, but AI adoption is still in its infancy in most industries. Many organizations are at the beginning stages of identifying and prioritizing business use cases and experimenting with AI. So it’s no surprise that the major themes of this year’s Snowflake Summit conference sessions revolved around bringing or building AI applications on data in Snowflake. Whether or not you’re a Snowflake user, if you’re looking to get into the AI space, you need to start centralizing your data assets in one location and building applications on top of that repository. Eliminating data silos, expensive duplicate environments, and operational complexity is the only path to using AI effectively. 

Along those lines, most of the Snowflake enhancements announced at the conference are building blocks for larger AI capabilities. New Snowflake native apps in the marketplace, product releases for Snowpark ML and ML-powered Functions, and Document AI, which uses large language models (LLMs) to extract context from unstructured data, all make the case for keeping your data in Snowflake and using some of the most advanced tools currently on the market on your path to generative AI. Many generative AI capabilities are still on the roadmap, but we expect them to arrive first as pretrained models tuned on customer data.  
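To make the ML-powered Functions idea concrete, here is a hedged sketch of training and querying a forecasting model entirely in SQL, based on the SNOWFLAKE.ML.FORECAST interface Snowflake has described. The table and column names (daily_sales, sale_date, revenue) are hypothetical, and the exact syntax may shift as the feature matures.

```sql
-- Hypothetical example: create a forecasting model over a daily_sales
-- table. Table and column names are illustrative only; syntax follows
-- Snowflake's announced ML-powered Functions and may change.
CREATE SNOWFLAKE.ML.FORECAST sales_forecast(
    INPUT_DATA => SYSTEM$REFERENCE('TABLE', 'daily_sales'),
    TIMESTAMP_COLNAME => 'sale_date',
    TARGET_COLNAME => 'revenue'
);

-- Ask the trained model for a 14-day forecast.
CALL sales_forecast!FORECAST(FORECASTING_PERIODS => 14);
```

The appeal is that the whole workflow stays inside Snowflake: no data leaves the platform, and analysts who know SQL can produce forecasts without standing up a separate ML pipeline.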

Big Announcements 

Although Snowflake announced plenty of smaller enhancements, these six larger announcements caught our eye as especially impactful and indicative of where innovation in the Snowflake space is headed. 

  • Partnership with Nvidia – As announced in the dedicated first-day keynote between Snowflake CEO Frank Slootman and Nvidia CEO Jensen Huang, you can now use GPU processing with Snowflake workloads. This new area of focus reflects the rising interest among organizations in taking advantage of large language models (LLMs). But even beyond AI, centralizing your data footprint gives you a single place to generate data insights or build apps on top of the platform. With the Nvidia partnership, data scientists can build, train, and run AI/ML workloads on top of Snowflake without the burden of copying data out of a central data lake to run those jobs, or the complexity of building that pipeline. Instead, you can run those workloads on top of Snowflake with a GPU-as-a-Service model, providing the simplicity that businesses and data science teams demand. 
  • Document AI – Solving the problem of unstructured data is one of Snowflake’s key focus areas, and this new offering gives non-data scientists a simplified way to improve process automation. Using LLMs, you can point Document AI at your unstructured data (PDFs, Docs, etc.), and it will scan the documents so you can conversationally ask questions about their content. From there, you can further train the model to improve accuracy. This tool comes from Snowflake’s Applica acquisition, and some particularly cool demos came out of this space.  
  • Snowpark Container Services – In app development, the need for certain programming languages or processing capabilities has sometimes forced developers to move data out of Snowflake and into another platform for data processing, ML model building, and so on. With Snowpark Container Services, users can bring their own containers to Snowflake and build apps on the platform in their preferred language beyond the native Python, Java, and Scala, meaning massive time savings and easier data management. 
  • Iceberg Tables – You’ll now be able to leverage the industry-standard Apache Iceberg table format. Whether a table is managed internally by Snowflake or externally, the data still benefits from the ease of use of the Snowflake platform. 
  • Performance and Cost Management Improvements – Snowflake is finally dipping its toes into the FinOps space by providing better visibility into performance and spend. With the Performance Index, new performance-utilization views help you better understand your usage from an observability perspective. 
  • Marketplace and Apps Powered by Snowflake – With more and more applications being brought to Snowflake and built on Snowflake data, Snowflake is expanding its marketplace model, much like the public cloud providers’ marketplace offerings. Monetizing data is a compelling trend across industries, and the Snowflake Marketplace lets organizations publish applications that run on top of the Snowflake data consumers bring; consumers can then purchase marketplace apps and data against their Remaining Performance Obligation (RPO).  
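To illustrate the Iceberg Tables announcement above, here is a hedged sketch of creating a Snowflake-managed Iceberg table that lives in the open Apache Iceberg format on your own cloud storage while still being queried like a native table. The external volume and table names are hypothetical, and the exact options may differ before general availability.

```sql
-- Hedged sketch: a Snowflake-managed Apache Iceberg table.
-- Volume and table names are hypothetical; syntax may shift at release.
CREATE ICEBERG TABLE orders_iceberg (
    order_id    NUMBER,
    order_total NUMBER(10, 2)
)
CATALOG = 'SNOWFLAKE'
EXTERNAL_VOLUME = 'my_cloud_storage_volume'
BASE_LOCATION = 'orders/';

-- Once created, it can be queried like any other Snowflake table.
SELECT COUNT(*) FROM orders_iceberg;
```

The design point is interoperability: because the underlying files use the open Iceberg format in your own storage, other engines can read the same data, while Snowflake users keep the platform’s familiar SQL experience.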

AHEAD engineers are incredibly well-versed in the Snowflake platform, and these new innovations are an exciting opportunity to keep building toward fully realized generative AI for our clients. To learn more about how AHEAD can help your organization leverage the advanced data capabilities of Snowflake, get in touch with us today. 

