Snowflake is one of the most popular cloud data platforms. It enables data storage, processing, and analytics at massive scale, with minimal administrative effort. One of the key tasks on any data platform is building data pipelines. In this course, Moving Data with Snowflake, you’ll learn how to quickly and reliably ingest data into Snowflake tables by building batch and streaming pipelines. First, you’ll explore the different data loading options in Snowflake. Then, you’ll see how the batch loading process works in Snowflake: how to connect to external data stores, like Azure Storage, and how to query files directly in Snowflake. Next, you’ll work with structured and semi-structured file formats, like CSV, JSON, and Parquet. Finally, you’ll see how to process streaming data in Snowflake: how to continuously load data using Snowpipe, and how to automate this loading process. By the end of this course, you’ll have the knowledge and skills to move data with Snowflake by building batch and streaming pipelines.
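To give a flavor of the techniques the course covers, here is a minimal SQL sketch of the batch and streaming paths described above. All names (the stage, table, file format, and Azure container) are hypothetical placeholders, and the SAS token is elided; auto-ingest on Azure additionally requires cloud notification setup not shown here.

```sql
-- Batch path: define an external stage over Azure Storage (names are illustrative).
CREATE STAGE my_azure_stage
  URL = 'azure://myaccount.blob.core.windows.net/mycontainer'
  CREDENTIALS = (AZURE_SAS_TOKEN = '...');

-- A file format for structured CSV data; JSON and Parquet formats work similarly.
CREATE FILE FORMAT my_csv_format TYPE = CSV SKIP_HEADER = 1;

-- Batch-load staged files into a table.
COPY INTO my_table
  FROM @my_azure_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');

-- Query files directly on the stage, without loading them first.
SELECT $1, $2
FROM @my_azure_stage (FILE_FORMAT => 'my_csv_format');

-- Streaming path: a Snowpipe that continuously loads new files as they arrive.
CREATE PIPE my_pipe AUTO_INGEST = TRUE AS
  COPY INTO my_table
  FROM @my_azure_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');
```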