The @browsercompany's Arc browser has some amazing UX features. These are some of my favorites (thread):
1. Hybrid Bookmarks + Tabs ("Close and Keep Pinned")
This is a brilliant feature for tab/bookmark junkies.
You can pin bookmarks in the sidebar. When you click them, they behave like tabs. When you're done, you select "Close and keep pinned" and the tab returns to the original pinned page.
This suits my workflow so well, because I have a few aggregator pages bookmarked, and I like to browse for a bit and reset each to its home view when I'm done. I have tons of bookmarks and tabs, and blurring the lines between them is a great idea.
2. Peek Links
I like the way it handles external links. It shows them as a popup taking up most of the screen. You can click outside the popup to go back, or hit the expand button to open the link in a new tab.
Eliminates a lot of tab closing and back-button usage
3. Drag Links to Split View
Tab split view is well done: you can drag links to create a horizontal or vertical split
A UI popup that lets you use "prompt templates" or write freeform text as the prompt.
You can ask the AI to perform any task on the current block, use a built-in command, or define your own templates
You can write any command in the popup, and it will run that command using text in the current block.
For example, to create flash cards for studying, you could write "create flash cards based on the following text:"
There are also a bunch of built-in prompts for common tasks like summarization, creating outlines, asking follow-up questions, and identifying common objections to an idea.
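As a rough mental model of what a feature like this does, here's a minimal Python sketch: it just combines a built-in template (or a freeform command) with the current block's text to build the prompt. The template names and the build_prompt helper are hypothetical, for illustration only; the note doesn't describe the actual implementation or the model call.

```python
# Sketch: prompt templates applied to the text of the current block.
# Template names and build_prompt are hypothetical, not from the product.

BUILT_IN_TEMPLATES = {
    "summarize": "Summarize the following text:",
    "outline": "Create an outline of the following text:",
    "follow_up": "Suggest follow-up questions about the following text:",
    "objections": "List common objections to the idea in the following text:",
}

def build_prompt(block_text: str, command: str) -> str:
    """Combine a built-in template or a freeform command with the current block."""
    instruction = BUILT_IN_TEMPLATES.get(command, command)  # freeform text passes through as-is
    return f"{instruction}\n\n{block_text}"

block = "Photosynthesis converts light into chemical energy."

# Freeform command, like the flash-card example above:
print(build_prompt(block, "create flash cards based on the following text:"))

# Built-in template:
print(build_prompt(block, "summarize"))
```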
Hard-Boiled Wonderland and the End of the World by Haruki Murakami
A human data processor trained to encrypt data with his mind to protect it from criminal groups.
Two intertwined stories, one set in a dystopian Tokyo and the other in a strange, isolated town
The Picture of Dorian Gray by Oscar Wilde
The beautiful Dorian Gray makes an impulsive wish to stay young while his portrait ages in his place, absorbing the damage of his wild life.
Genius writing and dialogue that explore our obsession with youth and beauty.
These terms refer to how many examples a language model needs to perform a task
"Fine-Tuning" is the most common and traditional approach.
100s to 100k+ labeled examples are used. It has strong performance on many benchmarks but needs a new large dataset for every task.
It's better for specific use cases than general tasks.
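For a concrete picture of what fine-tuning involves, here's a hedged sketch using the Hugging Face Trainer API. The model, dataset, and hyperparameters are illustrative choices, not anything specified in these notes; the point is that the model's weights are updated on a labeled dataset for one task, and a different task would need its own dataset and a new run.

```python
# Sketch: fine-tuning a small pretrained model on a labeled dataset (illustrative choices).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("imdb")  # thousands of labeled examples for one specific task
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

args = TrainingArguments(
    output_dir="out",
    num_train_epochs=1,
    per_device_train_batch_size=16,
)

# Trainer updates the model's weights on this one task;
# switching tasks means collecting a new labeled dataset and training again.
Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=dataset["test"].select(range(500)),
).train()
```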
"Zero-shot" is where no examples are allowed. Only instruction in natural language is given. This method provides maximum convenience but is most challenging for AI. Performing a task without examples is hard for humans too