4 Matching Annotations
- Mar 2023
docs.databricks.com

configs = {
  "fs.azure.account.auth.type": "OAuth",
  "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
  "fs.azure.account.oauth2.client.id": "<application-id>",
  "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope-name>", key="<service-credential-key-name>"),
  "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<directory-id>/oauth2/token"}

# Optionally, you can add <directory-name> to the source URI of your mount point.
dbutils.fs.mount(
  source = "abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
  mount_point = "/mnt/<mount-name>",
  extra_configs = configs)
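
A quick follow-up sketch, not part of the original annotation: in a Databricks notebook (Scala here, where dbutils and spark are predefined) the mount can be confirmed and read from roughly like this; the Parquet path under the mount is hypothetical.

// List the root of the new mount to confirm it is reachable
display(dbutils.fs.ls("/mnt/<mount-name>"))

// Read a hypothetical Parquet dataset stored under the mount point
val sampleDf = spark.read.format("parquet").load("/mnt/<mount-name>/path/to/data")
sampleDf.show(5)
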
- Feb 2023
stackoverflow.com

import org.apache.spark.sql._
import org.apache.spark.sql.types._

val rows1 = Seq(
  Row("1", Row("a", "b"), "8.00", Seq(Row("1", "2"), Row("12", "22"))),
  Row("2", Row("c", "d"), "9.00", Seq(Row("3", "4"), Row("33", "44")))
)
val rows1Rdd = spark.sparkContext.parallelize(rows1, 4)

val schema1 = StructType(
  Seq(
    StructField("id", StringType, true),
    StructField("s1", StructType(Seq(
      StructField("x", StringType, true),
      StructField("y", StringType, true)
    )), true),
    StructField("d", StringType, true),
    StructField("s2", ArrayType(StructType(Seq(
      StructField("u", StringType, true),
      StructField("v", StringType, true)
    ))), true)
  )
)

val df1 = spark.createDataFrame(rows1Rdd, schema1)
Create a DataFrame with an explicitly defined nested schema
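
A small usage sketch of my own, reusing df1 from the snippet above, showing how the nested schema can be inspected and queried (assumes a spark-shell or notebook session where spark is available):

import org.apache.spark.sql.functions._
import spark.implicits._

// Print the nested schema that was just defined
df1.printSchema()

// Dot notation reads struct fields; explode flattens the array-of-structs column
df1.select($"id", $"s1.x", $"s1.y", explode($"s2").as("s2_elem"))
  .select($"id", $"x", $"y", $"s2_elem.u", $"s2_elem.v")
  .show()
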
stackoverflow.com

import org.apache.spark.sql.functions._
import spark.implicits._

// Assumes df is declared as a var and already has a struct column named "person"
df = df.withColumn(
  "person",
  struct(
    $"person.*",
    struct(
      lit("value_1").as("person_field_1"),
      lit("value_2").as("person_field_2")
    ).as("nested_column_within_person")
  )
)
Example code for adding a nested struct inside an existing struct column
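
A brief check of my own, assuming the df from the snippet above already had a struct column person: after the withColumn call, the new nested struct and its fields can be read back like this.

// The new struct appears alongside the original person fields
df.printSchema()

// The added fields are reachable with dot notation
df.select(
  $"person.nested_column_within_person.person_field_1",
  $"person.nested_column_within_person.person_field_2"
).show()
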
- Sep 2021
www.datagrom.com

From data warehouses, to data lakes, to Snowflake/Databricks, plus the strengths and weaknesses of both services