Paul Couvert
Nov 9, 2023
It's great that ChatGPT Plus can now use GPT-4, web search, and Dall-E 3 in the same conversation.

But you can do the same thing without paying $20/month.

I'll show you how to do it for free and without limitations:
1. Go to Bing Chat

Start by navigating to bing.com/chat.

I advise you to use the Edge browser to avoid limitations.

Select the "Creative" mode which uses the full power of GPT-4.
2. Search on the web

Enter your prompt by giving the AI the URL of a web page or something to search for on the internet.

Prompt (e.g.):

"Research the latest announcements from OpenAI and make a summary in 5 short bullet points with the most important ones."

Bing crawls the web and responds to you with GPT-4.

You can also check the sources used.

But that's just the beginning!
3. Dall-E 3 in the same conversation

You can now generate an image related to a previous answer.

Prompt (e.g.):

"Create an image to illustrate the last bullet point that I could put in the introduction of my blog post."

Bing will understand the context and your purpose and create a tailor-made image with Dall-E 3.

All you have to do is download it!
Bonus: "GPT-4 Vision"

You can also upload an image and analyze it with GPT-4 for free.

Just click on the icon at the bottom left and upload your image.

Type your prompt (e.g.): "Who currently occupies this building and where is it located?"

Tip:

It also works perfectly for diagrams or drawings in addition to photos.
It's a good alternative for those who don't yet have access to the ChatGPT update (like me).

Feel free to share the first post and follow me if this guide was useful to you:

More from @itsPaulAi

Jul 29
Qwen has just released a model on par with GPT-4o...

And you can run it locally easily 🤯

Yep. GPT-4o level AI running offline on a laptop.

- Fully open source
- Only 3B active parameters
- 262k context length natively

Quick steps to run it on your machine and details below
1. Download LM Studio and the model

- Install LM Studio for your OS (macOS, Windows, Linux)
- In the search tab type "Qwen3 30B A3B 2507"

I highly recommend the versions quantized by Unsloth, especially those marked "UD".

They're REALLY efficient. E.g.: Q4_K_XL Qwen3 30B A3B Instruct 2507 UD

Then just click on the download button.
2. Load the model

- Open the chat tab
- Select the model you've downloaded in the list
- Adjust the context length if you want
- Change the default parameters in the sidebar to:

* Temperature=0.7
* TopP=0.8
* TopK=20
* MinP=0

And you're good to go!
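If you'd rather use the downloaded model from your own scripts, LM Studio can also expose it through a local OpenAI-compatible server (enable it in the app). Here's a minimal Python sketch, assuming the server runs on LM Studio's default port 1234, that it accepts the extra top_k/min_p sampling fields per request, and that the model identifier matches whatever name LM Studio shows for your download:

```python
import json
import urllib.request

# Recommended sampling settings from the thread for Qwen3 30B A3B 2507.
SAMPLING = {"temperature": 0.7, "top_p": 0.8, "top_k": 20, "min_p": 0}

def build_request(prompt, model="qwen3-30b-a3b-instruct-2507"):
    """Build a chat-completions payload with the recommended settings."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        **SAMPLING,
    }

def ask(prompt, url="http://localhost:1234/v1/chat/completions"):
    """Send the prompt to the local server (requires LM Studio running)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Since everything stays on localhost, this keeps the "fully offline" property of the setup.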
Jul 22
Wait so Alibaba Qwen has just released ANOTHER model??

Qwen3-Coder is simply one of the best coding models we've ever seen.

→ Still 100% open source
→ Up to 1M context window 🔥
→ 35B active parameters
→ Same performance as Sonnet 4

They're releasing a CLI tool as well ↓
You can use the model for free on Qwen Chat:

1. Create a free account
2. Select Qwen3-Coder in the list
3. You can also upload files (codebase)
Qwen-code is a CLI tool for agentic coding:

→ Forked from Gemini CLI
→ Includes custom prompts and proper function call protocols
→ GitHub link: github.com/QwenLM/qwen-co…
Jul 19
Wait NVIDIA has just released new SOTA open source models?!

Available in 4 sizes (1.5B, 7B, 14B, and 32B) that you can run 100% locally.

- OpenReasoning-Nemotron
- SOTA scores across many benchmarks
- Tailored for math, science, code

How to run it on your laptop and details below
You can run them using LM Studio for free:

1. Download LM Studio for macOS, Windows or Linux
2. In the search tab, type "openreasoning"
3. Install the version you want

I suggest the 7B (very good for its size) from Bartowski in the Q4_0 quantization if you're using an ARM processor like me.
Then you can start using it right away!

1. Load it from the top bar
2. Just type your prompt!
Jul 13
Grok 4 is excellent for vibe-coding

You can give it your entire codebase in 5s to add features or debug.

→ No need for Cursor
→ Works directly on the Grok site

Quick steps below:
1. Generate command

To export your codebase, Grok 4 can generate a command that you can reuse over and over again.

Prompt template:

"Give me a shell command I can use in my [IDE Name], with my [Project Type, e.g. Node.js], to download the entire codebase in a single text file but without the node_modules, package-lock.json, or hidden files/folders, just the really useful ones."
2. Export codebase

- Open your terminal in your IDE
- Paste the command generated above

A new .txt file with your entire codebase is automatically created.

All you have to do is download/save it.
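To give a rough idea of what the generated command does, here is a hypothetical Python equivalent. The names to skip (node_modules, package-lock.json, hidden files/folders) come from the prompt template above; everything else is illustrative, not the command Grok will actually produce:

```python
import os

# Entries the prompt template asks to exclude.
SKIP_DIRS = {"node_modules"}
SKIP_FILES = {"package-lock.json"}

def export_codebase(root, out_path):
    """Concatenate all useful text files under `root` into one file."""
    with open(out_path, "w", encoding="utf-8") as out:
        for dirpath, dirnames, filenames in os.walk(root):
            # Prune excluded and hidden directories in place so os.walk
            # never descends into them.
            dirnames[:] = [d for d in dirnames
                           if d not in SKIP_DIRS and not d.startswith(".")]
            for name in sorted(filenames):
                if name in SKIP_FILES or name.startswith("."):
                    continue
                path = os.path.join(dirpath, name)
                # Header line so the model knows which file follows.
                out.write(f"\n===== {os.path.relpath(path, root)} =====\n")
                try:
                    with open(path, encoding="utf-8") as f:
                        out.write(f.read())
                except UnicodeDecodeError:
                    continue  # skip binary files
```

The per-file header lines matter: they let the model map each chunk of the combined file back to a path when it suggests edits.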
Jul 3
Gemini CLI can automate your computer using MCP 🔥

Add Windows MCP (or macOS MCP) to Gemini CLI and you can tell it what to do autonomously.

Gemini then takes control of your entire system to achieve the goal you've set.

Links below
As a reminder, you have 1,000 free requests PER DAY using Gemini CLI!

Windows MCP used → github.com/CursorTouch/Wi…

MacOS MCP (never tried) → github.com/baryhuang/mcp-…
Tutorial to install Gemini CLI ↓

Note: I advise you to use WSL if you are using Windows
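Gemini CLI discovers MCP servers from its settings file (an "mcpServers" map in ~/.gemini/settings.json). A sketch of registering one from Python, where the server name and its command/args are placeholders — the real launch command comes from the Windows-MCP (or macOS MCP) README:

```python
import json
from pathlib import Path

# Hypothetical MCP entry; replace command/args with what the
# Windows-MCP repo's README actually specifies.
NEW_SERVERS = {
    "mcpServers": {
        "windows-mcp": {
            "command": "uv",
            "args": ["run", "main.py"],
        }
    }
}

def write_settings(path, new_settings):
    """Merge MCP servers into an existing Gemini CLI settings file."""
    existing = json.loads(path.read_text()) if path.exists() else {}
    existing.setdefault("mcpServers", {}).update(new_settings["mcpServers"])
    path.write_text(json.dumps(existing, indent=2))
```

Merging (rather than overwriting) the settings file keeps any MCP servers you've already registered.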

Jul 1
You can add a local "=AI" formula in Excel

Excel can then process data using a free and open-source AI model like Gemma.

This means that it understands what's in the cells and returns a tailored response based on your prompt, even offline.

Steps to set it up and examples below
Keep in mind that everything runs locally, even the AI model (LLM): no remote servers or APIs!

EXAMPLE 1 - Categorize data

Formula:
=AI("Is this a basketball or baseball team? Just write one word: Basketball or Baseball. Team name: ",A1)
EXAMPLE 2 - Sentiment analysis

Formula:
=AI("Is this sentence negative or positive? Just write one word: Positive or Negative. The sentence is: '",A1,"'")

And you can do much more like data extraction, summarizing, etc.!
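Conceptually, =AI just concatenates its arguments (text fragments and cell values) into one prompt and sends it to the local model. A minimal Python sketch of that behavior — the ai() helper and the stubbed query function are illustrative assumptions, not the add-in's actual code:

```python
def build_prompt(*parts):
    """Concatenate formula arguments (literals and cell values) into one prompt."""
    return "".join(str(p) for p in parts)

def ai(*parts, query=lambda prompt: "[local model reply]"):
    """Mimic =AI(...): build the prompt, then ask the (local) model.

    `query` stands in for a call to a locally running LLM such as Gemma.
    """
    return query(build_prompt(*parts))

# The sentiment-analysis formula from EXAMPLE 2, with A1 = "I love this":
prompt = build_prompt(
    "Is this sentence negative or positive? "
    "Just write one word: Positive or Negative. The sentence is: '",
    "I love this",
    "'",
)
```

This is also why the example prompts ask for "just one word": constraining the output keeps the cell value clean and machine-comparable.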

Tutorial ↓
