Connor Shorten
Research Scientist @weaviate_io! Interested in Database Agents and Approximate Nearest Neighbor Search! Host of the Weaviate podcast, link below!
Feb 4
Hey everyone! I am super excited to share that our new research report is live on ArXiv! 🎉

Querying Databases with Function Calling!

Thread with more details! 🧵(1/8)

OpenAI's Deep Research has been amazing and is the latest example of the effectiveness of Compound AI Systems. As impressive as this has already been, imagine how much further it will go when it's connected to your personal data, tailoring these reports and research to your particular life!

Connecting AI systems to custom data sources has been wildly successful with RAG. However, RAG is a fairly naive agentic architecture compared to newer advances such as Function Calling and the dawn of Agentic RAG.

Whereas RAG describes a hard-coded flow of retrieve-then-generate, Agentic RAG iteratively searches, reflects on results, and controls whether it is ready to respond to the user or needs to make another search request. 🧵(2/8)
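To make that loop concrete, here is a minimal Python sketch of the search → reflect → decide cycle. All names here (search, reflect, Decision, agentic_rag) are hypothetical stand-ins, not the report's implementation or Weaviate's API:

```python
# A minimal sketch of the Agentic RAG loop described above, using hypothetical
# stand-ins rather than the paper's implementation or any real Weaviate API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Decision:
    answer: Optional[str]      # final answer, if the agent is ready to respond
    next_query: Optional[str]  # follow-up search request, if more context is needed

def search(query: str) -> list[str]:
    """Stand-in retriever; a real system would query a vector database here."""
    return [f"document matching '{query}'"]

def reflect(question: str, context: list[str]) -> Decision:
    """Stand-in for the LLM call that inspects retrieved context and either
    drafts a final answer or proposes another search query."""
    if context:  # toy logic: answer as soon as any context has been gathered
        return Decision(answer=f"Answer to '{question}' using {len(context)} doc(s)",
                        next_query=None)
    return Decision(answer=None, next_query=question)

def agentic_rag(question: str, max_steps: int = 3) -> str:
    """Iteratively search, reflect, and decide whether to respond or search again,
    in contrast to a single hard-coded retrieve-then-generate pass."""
    context: list[str] = []
    query = question
    for _ in range(max_steps):
        context.extend(search(query))           # 1. search
        decision = reflect(question, context)   # 2. reflect on results
        if decision.answer is not None:         # 3. respond, or loop again
            return decision.answer
        query = decision.next_query or question
    return "No answer within the step budget."

print(agentic_rag("Which Weaviate podcast episodes discuss HNSW?"))
```

The point of the sketch is the control flow: the model, not the pipeline author, decides when enough context has been retrieved.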
Jan 8, 2023
Sunday morning reading thread ☕️🧵

1- InPars-v2
2- What are you Token About?
3- Demonstrate-Search-Predict
4- Attributed Question Answering
5- Instructor
6- In Defense of Cross-Encoders for Zero-Shot Retrieval
7- MonoQA
8- HF x Neural Magic

Quote and links for each below: 👇

1- InPars-v2: Large Language Models as Efficient Dataset Generators for Information Retrieval

"For each dataset in the BEIR benchmark, we sample 100k documents from its corpus and generate one synthetic query per document using GPT-J prompted with 3 examples from MS MARCO"
Mar 7, 2021
Sunday Morning Reading Thread ☕️

- Self-Supervised Learning: The Dark Matter of Intelligence 🧠
- SEER ⚙️
- Multimodal Neurons 👁️📚
- Do Transformer Modifications Transfer? ⚔️
- Ultra Data-Efficient GAN 🤯

Quote from each below: 👇

Self-Supervised Learning: The Dark Matter of Intelligence 🧠

"As babies, we learn how the world works largely by observation. We form generalized predictive models about objects in the world by learning concepts such as object permanence and gravity"

ai.facebook.com/blog/self-supe…
Mar 5, 2021
Bringing back AI Weekly Update! 🎉
Here is a preview/curation for March 8th (#27):

- Multimodal Neurons
- SSL: The Dark Matter of Intelligence
- SEER (x2)
- Generative Adversarial Transformers
- Ultra Data-Efficient GAN Training

Links Below: 👇

Multimodal Neurons

openai.com/blog/multimoda…