
Oracle Select AI and APEX

Yes, yes, I know, it's my first blog post in 10 years.  As Oracle is doing some really interesting stuff with AI, I thought I'd break cover and do a post showing a little something I've been working on.

It probably hasn't escaped your attention that Oracle recently announced the release of Oracle Database 23ai (available on OCI and Azure now, but not yet on-premises), with a host of new features that allow developers to incorporate generative AI (genAI) into their work and ultimately help end users be more productive.

One AI feature that was released in 2023, prior to this announcement, is Select AI (or Autonomous Database Select AI, to give it its proper title).  This is a feature of the Autonomous Database that enables data to be queried using natural language, generating (ANSI) SQL via large language models (LLMs) from providers such as OpenAI, Cohere and Azure OpenAI.
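If you haven't seen Select AI before, the basic flow looks roughly like this. This is a minimal sketch: the profile name is a placeholder, and it assumes an AI profile has already been created against your chosen provider.

```sql
-- Point the current session at an existing AI profile
-- ('OPENAI_PROFILE' is an illustrative name).
BEGIN
  DBMS_CLOUD_AI.SET_PROFILE('OPENAI_PROFILE');
END;
/

-- Ask a question in natural language; Select AI generates and runs the SQL:
SELECT AI how many customers bought products last year;

-- Or inspect the generated SQL without running it:
SELECT AI showsql how many customers bought products last year;
```

Other actions such as `narrate`, `chat` and `explainsql` are also available, which is what makes the conversational demo below possible.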

There's a great demo put together by the Oracle APEX team demonstrating Select AI working with APEX.  In the demo app the user can create conversations that mix generating and running SQL queries against the user's database with general genAI 'chat'.  It also shows how the generated SQL can be used in Interactive Grids to create reports dynamically on a separate page, accessed via a link under each response section within a conversation.

I'd really recommend installing that demo app, although you'll need a working OCI Autonomous Database instance.  If you haven't already got one, you can have an Always Free Autonomous Database up and running in no time.

As I started playing with the demo, which is pretty cool on its own, I thought it would be interesting to see whether I could extend its capabilities: use Select AI to ingest external data sets and then instantly create Interactive Grids on that data, in a region sitting beside the conversation region.  Select AI could then be used to create queries on that data in conjunction with data that already exists in the database.

I thought it would be useful to create a video showing this extended demo app in action.  The link to the video on YouTube is below, but here's a summary of what is happening:

  • In my database I have some pre-existing product sales data for a fictitious company mainly supplying healthcare equipment and I'd like to supplement it with some external data to generate reports.
  • I create a new conversation and the first question I fire at Select AI (utilising the OpenAI API) is:
Show US healthcare spending in whole billions by year from 1992 to 2018 and also show the amounts adjusted for inflation and return in JSON format

  • Next, I'd also like a list of sitting US Presidents, so I ask another question:
Show all US Presidents between 1990 and 2017 and return in JSON format

  • Both these data sets are now available in my database, so I direct my next question towards it:
Show US Presidents with total healthcare spending and total adjusted for inflation healthcare spending during their presidency and show the years it covers and the total number of years it covers

  • Lastly, I want to use the pre-existing product sales data along with the 2 new data sets, so I ask:
Show the total price of products sold to ‘Healthcare’ industry customers for each year of a presidential term and show the total healthcare spending for each President

  • As you can see, I can go back and forth between the four generated Interactive Grid reports, view the raw JSON that was returned, and get an explanation from Select AI about how the underlying SQL was generated.



How was this done?

  • Firstly, an Interactive Grid was added to the same AI conversation page so we can instantly see the results of any queries.  A new PL/SQL package was also created to handle the following steps.
  • When making the request for data in the genAI conversation, it's important that the prompt includes a request that the data be returned as JSON.
  • The JSON is extracted from the response and parsed to check it's valid.
  • Select AI is called in 'chat' mode to generate suitable table and view names for our JSON response.
  • A new table is created with a single JSON column to store the data, and the JSON is then inserted into that table.
  • A JSON search index is created on the table with the 'dataguide on' parameter.
  • The JSON data guide (via DBMS_JSON.RENAME_COLUMN) is used to rename the virtual columns on the table.
  • DBMS_JSON.CREATE_VIEW_ON_PATH is then used to create a view on the table.
  • A simple Select statement is then created on that view, which we store in the database for use in generating the Interactive Grid.
  • The Interactive Grid is then refreshed based on the generated SQL.
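The core ingestion steps above can be sketched in PL/SQL roughly as follows. This is an illustrative outline, not the demo's actual package code: the table, view and index names are placeholders, and the sample JSON stands in for whatever the LLM returned.

```sql
-- Rough sketch: validate the JSON, store it, enable the data guide,
-- and expose it relationally for the Interactive Grid.
DECLARE
  l_json   CLOB := '{"rows":[{"year":1992,"spend_bn":849}]}';  -- placeholder LLM response
  l_parsed JSON_ELEMENT_T;
BEGIN
  -- 1. Validate: parse raises an error if the response isn't well-formed JSON.
  l_parsed := JSON_ELEMENT_T.parse(l_json);

  -- 2. Create a table with a single JSON column and insert the data.
  EXECUTE IMMEDIATE
    'CREATE TABLE ai_data_1 (data CLOB CHECK (data IS JSON))';
  EXECUTE IMMEDIATE
    'INSERT INTO ai_data_1 (data) VALUES (:1)' USING l_json;

  -- 3. JSON search index with the data guide switched on.
  EXECUTE IMMEDIATE
    'CREATE SEARCH INDEX ai_data_1_ix ON ai_data_1 (data) ' ||
    'FOR JSON PARAMETERS (''DATAGUIDE ON'')';

  -- 4. Create a relational view over the structure the data guide discovered.
  DBMS_JSON.CREATE_VIEW_ON_PATH('AI_DATA_1_V', 'AI_DATA_1', 'DATA', '$');
  COMMIT;
END;
/

-- The generated view can now back the Interactive Grid:
-- SELECT * FROM ai_data_1_v;
```

In the real package the table and view names come from the Select AI 'chat' call, and the final SELECT statement is stored for the IG region source.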


There are a few limitations, and some improvements that could help.  Some notes on those:

  • When creating an AI profile during the demo setup (using DBMS_CLOUD_AI.CREATE_PROFILE), I found the default OpenAI model has too small a context window for the amount of data passed in the prompt and the JSON data that could be returned.  Instead, I created the profile using the 'gpt-4-turbo' model, which allows a 128k-token context window, whereas the default 'gpt-4' model allows only 8k tokens.
  • The results from the LLMs are not always perfect and can be inconsistent.  Also, I notice that sometimes it will not return JSON at all, saying it's not set up to do that!
  • The Interactive Grids that are generated are in fact a single IG report region whose source is a PL/SQL function that returns a SQL query.  It defaults to being based on an APEX collection, so a 'set' of columns can be created in advance at design time.  Each time a report is run (from the Run Report link under each response in a conversation), the relevant SQL query is passed as the IG source.  This means any formatting, aggregations or sorting remains in place when running any of the individual reports.  It would be ideal if it were possible to create actual Interactive Grids (or Interactive Reports) dynamically, meaning created at runtime and not tied to any particular page or region.  I'm sure this isn't currently possible, but I can see a place for it in the future of APEX and genAI.
  • I'd love to publish a link to the demo app I've created for you to use, but it would cost me a fortune (slight exaggeration) in OpenAI API costs.  I'll look to create another blog post with the code and app, with instructions on how to install it.  Hopefully, that won't take another 10 years!
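For reference, creating a profile with the larger model mentioned above would look roughly like this. The credential, profile and object names here are placeholders, not those used in the demo.

```sql
-- Illustrative profile creation with an explicit model override.
BEGIN
  DBMS_CLOUD_AI.CREATE_PROFILE(
    profile_name => 'OPENAI_GPT4_TURBO',
    attributes   => '{"provider":        "openai",
                      "credential_name": "OPENAI_CRED",
                      "model":           "gpt-4-turbo",
                      "object_list":     [{"owner": "DEMO", "name": "PRODUCTS"}]}'
  );
END;
/
```

The `object_list` attribute controls which tables and views Select AI may include in the prompt metadata, so keeping it tight also helps with the context-window limits discussed above.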

