...

  1. Announcements [Julien]:
    1. Upcoming meetups
      1. Boston Data Lineage Meetup (tentatively scheduled for June)
      2. San Francisco OpenLineage Meetup at Astronomer (tentatively scheduled for June 27)
    2. Upcoming talks
      1. Paweł Leszczyński and Maciej Obuchowski, “Column Lineage is Coming to the Rescue,” Berlin Buzzwords, June 18-20, 2023
      2. Julien Le Dem and Willy Lulciuc, “Cross-platform Data Lineage with OpenLineage,” Data+AI Summit, June 28-29, 2023
      3. Maciej Obuchowski, “OpenLineage in Airflow: A Comprehensive Guide,” Airflow Summit, September 19-21, 2023
  2. Recent releases [Michael R.]
    1. OpenLineage 0.24.0
      1. Additions
        1. Support custom transport types (#1795, @nataliezeller1)
        2. Airflow: dbt Cloud integration (#1418, @howardyoo, @JDarDagran)
        3. Spark: support dataset name modification using regex (#1796, @pawel-big-lebowski)
      2. https://github.com/OpenLineage/OpenLineage/releases/tag/0.24.0
      3. https://github.com/OpenLineage/OpenLineage/compare/0.23.0...0.24.0
  3. Custom transport types support [Natalie]
    1. OpenLineage supports a set of predefined transport types (HTTP, Kafka, others)
    2. Previously, adding a new or custom type required changing the transport config and transport factory to recognize the new type
    3. This change allows for extending functionality without having to change anything in the OpenLineage codebase
    4. Example: my company, where we work with an OpenMetadata backend
      1. This required a custom transport type
      2. With this change, I can add it without changing anything in the OpenLineage codebase
    5. Implementation
      1. New interface: TransportBuilder
      2. Methods to implement:
        1. String getType() // returns the value set in the transport.type config param
        2. TransportConfig getConfig() // returns an extension of TransportConfig containing the required configuration
        3. Transport build(TransportConfig config) // builds a custom Transport instance based on the custom configuration
      3. Additionally, a file (META-INF/services/io.openlineage.client.transports.TransportBuilder) containing the fully qualified name of the implementing class must be included in a jar on the classpath
      4. Using the service loader pattern, implementations of TransportBuilder are discovered and loaded at runtime (see the sketch following the Q&A below)
    6. Q&A
      1. What are some use cases for other cool transport mechanisms?
        1. Native cloud services, or your queue system, for sending events
        2. Preferred way: the provider (a data catalog or similar consumer) implements the transport for the lineage
        3. Maybe someone wants to do MSMQ or MQSeries
        4. You can also apply transformation logic as part of your transport provider, giving you your own ways of transporting the data
      2. Should we have a single repository where people can share the custom transport types they're building?
        1. They can put them in the repo; I don't think we need a separate place, at least right now
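    7. A minimal sketch of a custom TransportBuilder, based only on the interface described above. The package, the class names (MyTransport, MyTransportConfig), the "my-transport" type string, and the endpoint field are hypothetical, and the sketch assumes the Java client's Transport base class exposes a no-arg constructor and an abstract emit(OpenLineage.RunEvent) method:

      package com.example.lineage;

      import io.openlineage.client.OpenLineage;
      import io.openlineage.client.transports.Transport;
      import io.openlineage.client.transports.TransportBuilder;
      import io.openlineage.client.transports.TransportConfig;

      // Hypothetical custom configuration: an extension of TransportConfig
      // holding whatever settings the custom backend needs.
      class MyTransportConfig implements TransportConfig {
        String endpoint;
      }

      // Hypothetical custom transport: receives each OpenLineage run event
      // and forwards it to the custom backend.
      class MyTransport extends Transport {
        private final MyTransportConfig config;

        MyTransport(MyTransportConfig config) {
          this.config = config;
        }

        @Override
        public void emit(OpenLineage.RunEvent event) {
          // Serialize the event and send it to config.endpoint here.
        }
      }

      // The builder that the client discovers and invokes at runtime.
      public class MyTransportBuilder implements TransportBuilder {
        @Override
        public String getType() {
          return "my-transport"; // matched against the transport.type config param
        }

        @Override
        public TransportConfig getConfig() {
          return new MyTransportConfig();
        }

        @Override
        public Transport build(TransportConfig config) {
          return new MyTransport((MyTransportConfig) config);
        }
      }

    For discovery, the jar would also contain META-INF/services/io.openlineage.client.transports.TransportBuilder with the single line com.example.lineage.MyTransportBuilder; setting transport.type to "my-transport" in the client configuration would then select this transport.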
  4. dbt Cloud integration [Jakub]
    1. Previously:
      1. The dbt-ol script invoked dbt metadata processing and sent OpenLineage events
      2. Worked only with a local dbt project
      3. How events were created:
        1. Each run was a separate supported dbt node
        2. The parent run reflected the dbt-ol command call
    2. New dbt Cloud integration:
      1. Each run in dbt Cloud can have multiple steps, each producing separate JSON files
      2. Each step is considered a parent run
      3. DbtArtifactProcessor was separated as a parent for DbtCloudArtifactProcessor and DbtLocalArtifactProcessor classes; the naming convention stays the same
      4. Used with the DbtCloudRunJobOperator and DbtCloudJobRunSensor operators in the Airflow integration; also uses DbtCloudHook to retrieve metadata from the dbt Cloud API
    3. Artifact retrieval and processing
      1. Due to a 10-second thread timeout in the OpenLineage-Airflow integration, the process for fetching dbt metadata is as follows:
        1. Each run is a separate supported dbt node (models, tests, sources, snapshots)
        2. The parent run reflects the dbt-ol command call
      2. The issue will be resolved with the Airflow OpenLineage provider release (see AIP-53 for more)
  5. Discussion items
    1. Can we help ensure efficiency by narrowing the scope in some pragmatic ways? For example, is validation necessary when an OpenLineage client is being used to send events? Are there other similar cases where validation might not be necessary?
      1. Work on adding validation to the project is ongoing, e.g., in the proxy where there is some schema validation happening
      2. It would be useful to have some testing facility, e.g., for people consuming OpenLineage and potential implementers
      3. From a producer's point of view, we could check if the consumer consumes them; this would have to be specific to each consumer
      4. We could have a dataset of events that contain all the assets, which would be useful for anyone who wants to do their own testing – like examples of all the facets that exist (instead of having to create them by hand for internal teams)
      5. Maybe just pump demo payloads out to disk and keep them somewhere
    2. Improving column lineage: there are lots of other elements that would be useful
      1. People want to add selected rules and filters
        1. Is there an anticipated traffic level or typical volume to plan for in the lineage design?
      2. Column metadata is well covered by other standards in the industry, but there are some lineage-specific needs related to expected performance, flags people want (such as for PII data being managed on that edge), etc.
      3. One question: are those properties of the transformation itself, or just properties of the resulting column?
        1. In some cases, the transformation; in others, the actual edge, which is interesting. Option: add the ability to define the kinds of edges
        2. For PII, there is a tagging facet we were discussing that is still in progress
        3. Action item: get feedback on this and complete it
    3. Spark integration: MERGE INTO and aggregate functions don't provide column lineage
      1. A fix has recently been made, but when will this be released?
      2. Anyone can request a release in the #general Slack channel. You're encouraged to do this if you'd like a fix before the next regularly scheduled release (on the first work day of the month).

April 20, 2023 (10am PT)

Attendees:

  • TSC:
    • Julien Le Dem, OpenLineage project lead
    • Paweł Leszczyński, Software Engineer, GetInData
    • Maciej Obuchowski, Software Engineer, GetInData, OpenLineage committer
    • Michael Robinson, Community team, Astronomer
  • And:
    • Sheeri Cabral, Technical Product Manager, Lineage, Collibra
    • Julian LaNeve, Senior Product Manager, Astronomer
    • John Montroy, Big data/backend engineer
    • Anirudh Shrinivason, Data Engineer, Grab

...