1. Input: PDF of study material
⬇️ 2. Process documents
⬇️ 3. Generate questions based on study material and exam guidelines
⬇️ 4. Answer the questions based on the study material
You could combine steps 3 & 4, but separating them = better results👇
2/9 Let's dive into the code:
First, load and process data for question generation.
I'm using the new gpt-3.5-turbo-16k model for a larger context window.
Result: fewer calls to the model and higher context awareness
3/9 Process the data for question answering.
We'll store the data in a vector database for easy retrieval and search later.
Standard gpt-3.5-turbo model for lower cost.
Lower chunk size = lower token usage when retrieving.
4/9 The part where you can get creative. Prompting.
Prompting is key to getting great results. Play around with this and get as creative as you want.
You could let it generate multiple-choice questions, in any language you'd like, include your exam criteria and much more.
5/9 Generating the actual questions.
Initialize the large language model. You can use the one you like most.
Use the refine summarization chain to move through all the documents and generate questions based on the content provided.
6/9 Create the vector database and initialize the retrieval Q&A chain.
Work with the embeddings of your choice. Set up your preferred vector database and initialize the Retrieval chain with the database as the retriever.
We're now all set for answering the questions.
7/9 Run the answering chain and print the results.
Tada... You've just created personalized practice questions for your exam!
As I said, prompting determines the quality of your results. Play around with this.
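The answering loop itself is plain Python. A sketch, assuming `qa_chain` is the retrieval chain from the previous step and `generated_questions` is the newline-separated output of the question chain:

```python
# Sketch: split the generated questions on newlines, answer each one with
# the retrieval chain, and print the Q/A pairs. `qa_chain` is any object
# with a .run(question) method, e.g. the RetrievalQA chain built earlier.
def answer_questions(qa_chain, generated_questions: str):
    pairs = []
    for line in generated_questions.split("\n"):
        question = line.strip()
        if not question:
            continue
        answer = qa_chain.run(question)
        pairs.append((question, answer))
        print(f"Q: {question}\nA: {answer}\n")
    return pairs
```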
But did you know you can create AI-tools like ChatGPT yourself?
Unleash the potential by merging @LangChainAI and @streamlit.
Today's mission: craft a 4-hour workday blueprint based on @thedankoe's video.
Let me show you how 🧵
Before we dive in, this is day 1 of my LangChain Unchained series.
Each day, I'll implement a small LangChain project in Streamlit.
Follow @JorisTechTalk to stay up-to-date.
If there's anything you'd like to see, let me know!
Let's dive in:
A high-level overview:
1️⃣ Load YouTube transcript
2️⃣ Split transcript into chunks
3️⃣ Use a summarization chain to create a strategy based on the content of the video.
4️⃣ Use a simple LLM Chain to create a detailed plan based on the strategy.
ChatGPT can give you a kick-start when learning new skills.
But I like to learn through YouTube videos.
With the power of @LangChainAI, you can generate a personalized YouTube study schedule based on a skill you'd like to learn.
Let me show you how: 🧵
#AI
Before we dive in, this is day 7 of my '7 days of LangChain'.
Every day, I've introduced you to a simple project that will guide you through the basics of LangChain.
Today's a longer one.
Follow @JorisTechTalk to stay up-to-date on my next series.
Let's dive in:
High level overview of what's happening:
1️⃣ Generate a list of video IDs from favorite YT channels
2️⃣ Load all transcripts
3️⃣ Split the transcript
4️⃣ Extract skills
5️⃣ Vectorize
6️⃣ Generate skillset
7️⃣ Find relevant videos.