That website is generated automatically from the notebooks in this repo.
Take a look around -- all the things you'd hope to see in a high-quality project are there. It's all done for you by nbdev. For instance, see the nice README? Built from a notebook! github.com/fastai/nbdev/
nbdev v1 is already recommended by experts, and v2 is a big step up again.
"From my point of view it is close to a Pareto improvement over traditional Python library development." Thank you @erikgaas 😀
Here's an example of the beautiful and useful docs that are auto-generated by nbdev+Quarto. nbdev.fast.ai/merge.html
Here's an example of an exported function in a notebook cell. This is automatically added to the python module, and the documentation you see on the right is auto-generated. nbdev.fast.ai/merge.html#nbd…
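To give a feel for what such a cell looks like, here's a minimal sketch of an nbdev-exported function (the function name and body are illustrative, not taken from the thread): the `#| export` directive at the top of the cell tells nbdev to copy the cell into the generated Python module, and the docstring feeds the auto-generated docs.

```python
#| export
def say_hello(to: str) -> str:
    "Say hello to somebody"
    return f"Hello {to}!"
```

Everything else in the notebook -- prose, examples, tests -- stays out of the module but shows up in the rendered documentation.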
Every time we update a notebook to change the docs, library, or tests, everything is checked by @github Actions automatically
Here's the @pypi pip installer that's auto-generated. See the description? That's created for you from the notebook you use for your documentation homepage (just like the README, and the description for your conda package) pypi.org/project/nbdev/
I've barely scratched the surface in this brief tweet thread! For much more information, take a look at the blog post authored with @HamelHusain fast.ai/2022/07/28/nbd…
This launch wouldn't have been possible without some amazing people. I'd especially like to highlight Hamel & @wasimlorgat, who made nbdev2 a far better product than it would have been without them, JJ Allaire @fly_upside_down & the @quarto_pub team, & the @fastdotai community
Today @answerdotai is proposing `/llms.txt`. This is a file you can use to tell models where to find LLM-friendly content for your website.
It provides background information, along with links to markdown files providing more detailed information. answer.ai/posts/2024-09-…
We're providing a website with details of the proposal, & JavaScript and Python parsers. There's also an example of how to incorporate llms.txt into an editor--rather than wade into the emacs vs vim vs vscode wars, we picked ed, the standard text editor llmstxt.org
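As a rough illustration of what a parser for this kind of file might do (this is a stdlib-only sketch, not the official answer.ai parser, and the sample file below is entirely made up), an llms.txt file in the proposed format has an H1 title, a blockquote summary, and H2 sections containing markdown link lists:

```python
import re

# Hypothetical sample file, for illustration only.
SAMPLE = """\
# ExampleProject

> A hypothetical project used here purely for illustration.

## Docs

- [Quick start](https://example.com/quickstart.md): how to get going
- [API reference](https://example.com/api.md): full details
"""

def parse_llms_txt(text: str) -> dict:
    """Extract the H1 title, blockquote summary, and per-section link lists."""
    title = re.search(r"^# (.+)$", text, re.M)
    summary = re.search(r"^> (.+)$", text, re.M)
    sections, current = {}, None
    for line in text.splitlines():
        if m := re.match(r"^## (.+)$", line):
            current = m[1]
            sections[current] = []
        elif current and (m := re.match(r"^- \[(.+?)\]\((.+?)\)(?:: (.*))?$", line)):
            sections[current].append({"title": m[1], "url": m[2], "desc": m[3]})
    return {"title": title and title[1],
            "summary": summary and summary[1],
            "sections": sections}

parsed = parse_llms_txt(SAMPLE)
```

A model (or tool) can then follow the extracted URLs to fetch the markdown files with the detailed content.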
Today websites are not just used to provide information to people, but they are also used to provide information to large language models.
For instance, language models are often used to enhance development environments used by coders.
Announcing FastHTML. A new way to create modern interactive web apps.
Scales down to a 6-line python file; scales up to complex production apps.
Auth, DBs, caching, styling, etc built-in & replaceable and extensible. 1-click deploy to @Railway, @vercel, @huggingface, & more.
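To make the "6-line python file" claim concrete, here's a rough sketch of a minimal FastHTML hello-world app (a sketch based on the launch examples -- check the home page for the current canonical version):

```python
from fasthtml.common import *

# fast_app() returns the app plus a route decorator with sensible defaults.
app, rt = fast_app()

@rt('/')
def get():
    # Components like Div and P render directly to HTML.
    return Div(P('Hello, world!'))

if __name__ == '__main__':
    serve()  # start the dev server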
To get started, head over to the home page: fastht.ml
The whole site, designed by the @tinloof gang, is itself a running FastHTML app, and includes live code examples running inside that page.
I started FastHTML because during 25+ years of web development, I realized that web programming could be easier & more powerful. I feel that recent trends move away from the power of the web’s foundations, resulting in a fractured ecosystem of over-complex frameworks and tools.
For those that hope (or worry) that LLMs will do breakthrough scientific research, I've got good (or bad) news:
LLMs are particularly, exceedingly, marvellously ill-suited to this task. (if you're a researcher, you'll have noticed this already)
Here's why🧵
Breakthrough research requires either:
1. Going in a totally new and unexpected direction that everyone decided long ago was stupid, or
2. Finding some extraordinary new experimental data that means we have to change our theories
LLMs can't run experiments, so we'll focus on 1
It's not helpful to just say "LLMs can't reason", since clearly they do some things which humans would use reasoning for.
But LLMs are, first and foremost, fuzzy subgraph matching machines (at *massive* scale).