
  • Open Business and Artificial Intelligence Connectivity (OBAIC) borrows its concept from Open Database Connectivity (ODBC), an interface that makes it possible for applications to access data from a variety of database management systems (DBMSs). The aim of OBAIC is to be the interface that makes it possible for BI tools to access machine learning models from a variety of AI platforms: "AI ODBC for BI"
  • Once OBAIC is defined and developed, BI vendors can connect to any AI platform (TensorFlow, PyTorch, etc.) freely without worrying about the underlying implementation, just as ODBC did for databases
  • The committee has decided this standard will only define the protocol for how AI and BI communicate. The actual implementation, such as whether it should be server-based or serverless, is left up to the vendor.
  • There are 3 key aspects when designing this standard:
    • BI - As a BI platform, what specific calls do I need this standard to provide so that I can better leverage any underlying AI/ML framework?
    • AI - What should be the common denominator an AI framework should provide to support this standard?
    • Data - Should data be moved in the communication between AI and BI (passed by value), or kept in the same location (passed by reference)?

Decisions to be made

  • Data file type: what types of data are we supporting? E.g. Delta requires Parquet; what about RDBMS? Jeffrey's initial cut below can be modified to support multiple data types, depending on the use case.
    • Do we upload the data to the AI platform (pass by value) or keep the data in the same location (pass by reference)?
    • Inference: pass by value should be good enough if it is only for prediction
    • Train: not immediate, maybe later in Phase 2
  • Metadata structure: what kind of JSON schema do we need?
  • Do we support training or just inference?
  • Do we only support a specific model type (ONNX) or an arbitrary number of frameworks?
  • Decouple model operations (asking the model to predict and train) from data operations (listing, upload, download)
  • Tableau version of OBAIC
  • Qlik version of OBAIC
  • Finalize Logo
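The model/data decoupling decision above can be made concrete with a sketch. Assuming hypothetical Python names (nothing here is defined by the spec), the model-facing calls would form one interface and the data-facing calls another:

```python
from typing import Protocol

class ModelOps(Protocol):
    """Model-facing calls: the BI tool asks the AI platform to train or predict."""
    def train_model(self, config: dict) -> str: ...          # returns a model UUID
    def predict_with_model(self, model_uuid: str, data_config: dict) -> list: ...

class DataOps(Protocol):
    """Data-facing calls: listing, uploading, and downloading datasets
    are handled separately from any model lifecycle concern."""
    def list_datasets(self) -> list: ...
    def upload(self, name: str, payload: bytes) -> None: ...
    def download(self, name: str) -> bytes: ...
```

Keeping the two groups separate means a vendor could implement only inference (ModelOps) against data that never moves, or only data transfer (DataOps), without breaking conformance.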

Question to clarify


Why should I share my model with you?


  1. BI vendor has some data on which predictive analytics would be valuable. 
  2. BI vendor requests AI vendor (through OBAIC) to train/prepare a model that accepts features of a certain type (numeric, categorical, text, etc.)
  3. BI vendor gives AI vendor a token to allow access to the training data with the above features. A SQL statement is a natural way to specify how to retrieve data from the datastore.
  4. When model is trained, BI vendor can see the results of training (e.g., accuracy).
  5. AI vendor provides predictions on data shared by BI vendor, again using an access token.
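The pass-by-reference handoff in steps 3 and 5 can be sketched as a small config object. The field names below are illustrative assumptions, not part of the spec:

```python
import json

# Hypothetical pass-by-reference dataConfig: instead of shipping rows,
# the BI vendor sends a scoped access token plus a SQL statement that
# tells the AI vendor how to retrieve the training data itself.
data_config = {
    "accessToken": "tok_abc123",   # short-lived token granting access to the data
    "query": "SELECT customerAge, activeInLastMonth, canceledMembership FROM myData",
    "datastore": "jdbc:postgresql://warehouse.example.com/analytics",  # assumed field
}

payload = json.dumps(data_config)  # what would travel over the wire
```

Passing a token plus SQL keeps the data in place and lets the BI vendor revoke access after training completes.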




Train a New Model

function TrainModel(inputs, outputs, modelOptions, dataConfig) -> UUID


  customerAge WITH ENCODING (…),
  activeInLastMonth WITH ENCODING (…),
  canceledMembership WITH DECODING (…)
FROM myData (…)
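The truncated configuration above maps onto the TrainModel signature roughly as follows. The function body, field names, and option values are illustrative assumptions, not the spec:

```python
import uuid

def train_model(inputs, outputs, model_options, data_config):
    """Hypothetical stand-in for TrainModel: validates the request shape
    and returns a model UUID, as the signature above specifies."""
    assert inputs and outputs, "need at least one input and one output column"
    return str(uuid.uuid4())

model_uuid = train_model(
    inputs=["customerAge", "activeInLastMonth"],   # columns WITH ENCODING
    outputs=["canceledMembership"],                # column WITH DECODING
    model_options={"task": "classification"},      # assumed option
    data_config={"table": "myData"},               # FROM myData
)
```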


List Models

function ListModels() -> List[UUID, Status]


[
  { "modelUUID": "abcdef0123", "status": "deployed" },
  { "modelUUID": "1234567890", "status": "training" }
]

Show Model Config

function GetModelConfig(UUID) -> Config


The response here is essentially a pared-down version of the original training configuration.
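As an illustration (not from the spec), a pared-down response might echo only the fields the caller originally supplied, dropping server-side defaults and internal bookkeeping:

```python
# Hypothetical GetModelConfig response for the training request above;
# field names mirror the TrainModel parameters and are assumptions.
config = {
    "modelUUID": "abcdef0123",
    "inputs": ["customerAge", "activeInLastMonth"],
    "outputs": ["canceledMembership"],
    "modelOptions": {"task": "classification"},
}
```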

Get Model Status

function GetModelStatus(UUID) -> Status


{
  "status": "errored",
  "message": "Failed to train"
}
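Since training is asynchronous, a caller would typically poll GetModelStatus until a terminal state. A minimal sketch, assuming the status strings shown in this document ("training", "deployed", "errored"):

```python
import time

def poll_until_done(get_model_status, model_uuid, interval_s=0.01):
    """Poll GetModelStatus until the model leaves the 'training' state.
    Returns the final status dict (e.g. 'deployed' or 'errored')."""
    while True:
        status = get_model_status(model_uuid)
        if status["status"] != "training":
            return status
        time.sleep(interval_s)

# Usage with a fake status source that finishes on the third call:
responses = iter([{"status": "training"}, {"status": "training"}, {"status": "deployed"}])
final = poll_until_done(lambda _uuid: next(responses), "abcdef0123")
```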

Get Model Metrics

Get core evaluation metrics for a trained model.
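The source omits the call signature here; by analogy with the neighbouring calls it would plausibly be GetModelMetrics(UUID) -> Metrics. A hedged sketch of what the response could contain (the metric names are illustrative, not defined by the source):

```python
# Hypothetical GetModelMetrics stand-in for a classification model.
def get_model_metrics(model_uuid):
    """Return canned evaluation metrics for a trained model, keyed by UUID."""
    return {
        "modelUUID": model_uuid,
        "metrics": {"accuracy": 0.91, "auc": 0.95},  # assumed metric names
    }

metrics = get_model_metrics("abcdef0123")
```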



Predict Using Trained Model

function PredictWithModel(UUID, dataConfig) -> Predictions
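Putting it together, a BI tool's prediction round-trip might look like this sketch. The function body is a stand-in, not the spec's behaviour, and the inline-data shape is an assumption:

```python
def predict_with_model(model_uuid, data_config):
    """Stand-in for PredictWithModel: in a real implementation the AI
    platform would pull rows via data_config (pass by reference) and
    return one prediction per input row."""
    rows = data_config["rows"]          # here the rows are passed by value instead
    return [{"canceledMembership": age > 40} for age in rows]

predictions = predict_with_model(
    "abcdef0123",                       # UUID returned earlier by TrainModel
    {"rows": [35, 52]},                 # assumed inline customerAge values
)
```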