Consider that you have a list of support articles that you want to suggest to users when they search, and you want the suggestions to be the best fit possible. With the availability of tools like Large Language Models (LLMs) and vector databases, the approach to suggestion and recommendation systems has evolved significantly.
Initialize a Node.js project so that you have a `package.json`. Since we want to store the list of articles in the database, we have to read them from a file. Create `articles.txt` and copy the following:
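Any newline-separated list of article titles will do; the titles below are purely illustrative placeholders:

```
How to reset your password
Updating your billing information
Troubleshooting login issues
Exporting your account data
```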
Create `index.js` and you are ready. Let's start writing code.
We import `fs` to help us read the list of articles from the `articles.txt` file, and `USER_QUERY` is the query we will use for the similarity search.
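As a rough sketch of that setup in `index.js` (the credentials, client names, embedding model, and the example `USER_QUERY` value below are assumptions, not taken from this guide):

```js
import fs from "fs";
import Portkey from "portkey-ai";
import { createClient } from "@supabase/supabase-js";

// Portkey mirrors the OpenAI SDK surface; depending on your Portkey setup you
// may also need a virtual key or provider config. Key names are placeholders.
const portkey = new Portkey({ apiKey: process.env.PORTKEY_API_KEY });

// Assumes Supabase as the vector store (the Table Editor step below suggests it).
const supabase = createClient(
  process.env.SUPABASE_URL,
  process.env.SUPABASE_SERVICE_ROLE_KEY
);

// Example query; replace with whatever the user actually searched for.
const USER_QUERY = "How do I reset my password?";
```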
Next, create a `support_articles` table. It will store the title of each article along with its embedding. Feel free to add more fields of your choice, such as a description or tags.
For simplicity, create a table with columns for `id`, `content`, and `embedding`.
Use the `fs` library to read `articles.txt` and convert every title on the list into embeddings. With Portkey, generating embeddings is straightforward: it works the same as the OpenAI SDK, with no additional code changes required.
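A minimal sketch of what `storeSupportArticles` could look like, assuming the `portkey` and `supabase` clients from the setup above (the embedding model name is a placeholder; the `content` and `embedding` columns match the table created earlier):

```js
// Read titles from articles.txt, embed each one via Portkey,
// and insert them into the support_articles table.
async function storeSupportArticles() {
  const titles = fs
    .readFileSync("articles.txt", "utf-8")
    .split("\n")
    .map((line) => line.trim())
    .filter(Boolean);

  for (const content of titles) {
    // Same call shape as the OpenAI SDK; the model name is an assumption.
    const response = await portkey.embeddings.create({
      model: "text-embedding-3-small",
      input: content,
    });

    const embedding = response.data[0].embedding;

    // id is auto-generated by the database; we only supply content + embedding.
    const { error } = await supabase
      .from("support_articles")
      .insert({ content, embedding });

    if (error) throw error;
  }
}
```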
Now call the function: `await storeSupportArticles();`
You should now see the rows that were created in the Table Editor.
The search will return the `id`, the `content`, and the `similarity` score of the best-matching rows against the user query in the database.
The `support_articles` table is now set up to support vector similarity search operations.
No more waiting! Let's run a search query.
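Here is a sketch of that search, assuming the vector similarity search is exposed as a Postgres function callable through `supabase.rpc` (the function name `match_support_articles`, its parameters, and the threshold below are assumptions; adjust them to match your database function):

```js
// Embed the user query, then ask the database for the closest articles.
async function searchSupportArticles(query) {
  const response = await portkey.embeddings.create({
    model: "text-embedding-3-small",
    input: query,
  });

  const { data, error } = await supabase.rpc("match_support_articles", {
    query_embedding: response.data[0].embedding,
    match_threshold: 0.75,
    match_count: 3,
  });

  if (error) throw error;
  return data; // e.g. [{ id, content, similarity }, ...]
}

console.log(await searchSupportArticles(USER_QUERY));
```

Each returned row carries the `id`, `content`, and `similarity` described above, with the highest-scoring rows being the closest matches to the user's query.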