...
- Release 0.9.0 [Michael R.]
- We added:
- Spark: Column-level lineage introduced for Spark integration (#698, #645) @pawel-big-lebowski
- Java: Spark integration changed to use the Java client directly (#774) @mobuchowski
- Clients: Add OPENLINEAGE_DISABLED environment variable which overrides config to NoopTransport (#780) @mobuchowski
- For bug fixes and more information, see the GitHub repo.
- Shout out to new contributor Jakub Dardziński, who contributed a bug fix to this release!
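A minimal sketch of the new kill switch mentioned above: setting OPENLINEAGE_DISABLED overrides any configured transport with a NoopTransport, so no events are emitted. Assumes a value of "true" is what the clients check for; the pipeline command is illustrative.

```shell
# Disable all OpenLineage event emission without editing config files (#780).
# Assumption: the clients recognize the value "true".
export OPENLINEAGE_DISABLED=true

# Then run your pipeline as usual; no lineage events are sent, e.g.:
#   airflow dags trigger my_dag    (my_dag is a hypothetical DAG id)
echo "$OPENLINEAGE_DISABLED"
```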
- Snowflake Blog Post [Ross]
- topic: a new integration between OL and Snowflake
- integration is the first OL extractor to process query logs
- design:
- an Airflow pipeline processes queries against Snowflake
- separate job: pulls access history and assembles lineage metadata
- two angles: Airflow sees it, Snowflake records it
- the meat of the integration: a view that uses complex SQL to assemble the JSON lineage events sent to OL
- result: you can study the transformation by asking Snowflake AND Airflow about it
- required: having access history enabled in your Snowflake account (which requires special access level)
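To make the design above concrete, here is a hedged sketch of the shape of the OpenLineage run event that the Snowflake view could assemble from one access-history row. The namespace, account, and table names are illustrative assumptions, not the integration's actual values; only the top-level event fields follow the OpenLineage spec.

```python
import json
from datetime import datetime, timezone
from uuid import uuid4

def event_from_access_history(query_id, source_table, target_table):
    """Assemble a spec-shaped OpenLineage RunEvent dict from one
    access-history row (illustrative; names are assumptions)."""
    now = datetime.now(timezone.utc).isoformat()
    return {
        "eventType": "COMPLETE",
        "eventTime": now,
        "run": {"runId": str(uuid4())},
        "job": {"namespace": "snowflake://my_account", "name": query_id},
        "inputs": [{"namespace": "snowflake://my_account", "name": source_table}],
        "outputs": [{"namespace": "snowflake://my_account", "name": target_table}],
        "producer": "https://github.com/OpenLineage/OpenLineage",
    }

event = event_from_access_history("query-123", "DB.PUBLIC.RAW", "DB.PUBLIC.CLEAN")
print(json.dumps(event, indent=2))
```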
- Q & A
- Howard: is the access history task part of the DAG?
- Ross: yes, there's a separate DAG that pulls the view and emits the events
- Howard: what's the scope of the metadata?
- Ross: the account level
- Michael C: in Airflow integration, there's a parent/child relationship; is this captured?
- Ross: there are 2 jobs/runs, and there's work ongoing to emit metadata from Airflow (task name)
- Great Expectations integration [Michael C.]
- validation actions in GE execute after validation code does
- metadata extracted from these and transformed into facets
- recent update: the integration now supports version 3 of the GE API
- some configuration is still manual: currently you need to set up validation actions in GE yourself
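One possible shape of that manual setup, as a hedged sketch of a V3-API checkpoint action list registering the OpenLineage validation action. The class and module names follow the OpenLineage repo's docs; the host, port, and job name are illustrative assumptions and may differ across versions.

```yaml
# Sketch of a Great Expectations (V3 API) checkpoint action_list entry.
action_list:
  - name: openlineage
    action:
      class_name: OpenLineageValidationAction
      module_name: openlineage.common.provider.great_expectations
      openlineage_host: http://localhost:5000   # e.g. a local Marquez instance
      job_name: validate_my_dataset             # illustrative job name
```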
- Q & A
- Willy: is the metadata emitted as facets?
- Michael C.: yes, two facets
- dbt integration [Willy]
- a demo on getting started with the OL-dbt library
- pip install the integration library and dbt
- configure the dbt profile
- run seed command and run command in dbt
- the integration extracts metadata from the different views
- in Marquez, the UI displays the input/output datasets, job history, and the SQL
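The demo steps above can be sketched as a short shell session; the Marquez URL is an illustrative assumption, and `dbt-ol` is the wrapper script the integration installs around the `dbt` command.

```shell
# 1. Install the OpenLineage dbt integration and dbt itself
pip install openlineage-dbt dbt-core

# 2. Configure your dbt profile (~/.dbt/profiles.yml) for your warehouse,
#    and point OpenLineage at your collector (e.g. a local Marquez):
export OPENLINEAGE_URL=http://localhost:5000

# 3. Load seed data, then run the models through the OL wrapper,
#    which extracts metadata and emits lineage events
dbt seed
dbt-ol run
```

After the run, the Marquez UI shows the input/output datasets, job history, and the SQL, as noted above.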
- Open discussion
- Howard: what is the process for becoming a committer?
- Maciej: nomination by a committer then a vote
- Sheeri: is coding beforehand recommended?
- Maciej: contribution to the project is expected
- Willy: no timeline on the process, but we are going to try to hold a regular vote
- Ross: project documentation covers this but is incomplete
- Michael C.: is this process defined by the LFAI?
- Ross: contributions to the website, workshops are welcome!
- Michael R.: we're in the process of moving the meeting recordings to our YouTube channel
May 19th, 2022 (10am PT)
Agenda:
...