[AI, RAG, Docs] generate and expose llms.txt and llms-full.txt files on the docs website #60434
Comments
I'm very interested in this feature and would love to contribute if you need any help! 😺
Keeping my fingers crossed for this one - great idea!
This is an interesting idea. Which use cases would be addressed by providing that file?
@JeanMeche llms.txt files are becoming more and more of a standard for RAG. I've tried some online crawlers and the results are, to say the least, not satisfactory IMHO. Resources:
Hi @JeanMeche, any conclusions on this matter? 😺
Yet another reference :) geromegrignon/angular-caniuse#16
Thank you @geromegrignon - that's a great article!
Another possible way: https://repomix.com/
Nice, @geromegrignon, thanks. That is kinda similar to https://gitingest.com/, but Repomix seems to be dedicated to AI use.
Still, neither of them provides the llms files for the Angular docs website.
FYI: https://github.com/google/A2A/blob/main/llms.txt. Used with https://gitmcp.io/, it allows models to grok the codebase much more quickly, and lets answers about API usage, best practices, and source code come from a tool rather than from the model's base knowledge or its limited understanding of how you might have used the framework in your own codebase.
@markgoho thanks! https://gitmcp.io/ seems really cool!
Hi there, I'm back on the topic. Tbh, I'm having a hard time finding credible sources on the value of llms.txt files. Cursor, for example, suggests directly crawling the doc pages: https://docs.cursor.com/context/@-symbols/@-docs
@JeanMeche Cursor has an llms file: https://docs.cursor.com/llms-full.txt. You can find a list of llms.txt files on this website: https://llmstxt.site/. The advantage of having this, especially the full version that contains all the documentation, is that we could give it to an LLM as a source so that its answers are based on the Angular documentation. Many LLM chat applications now offer the option of providing a list of files and answering based only on those documents, so you can be sure of getting reliable documentation. This is the case with Mistral AI and Google NotebookLM. I would like to include all of the Angular documentation in these tools, but I don't have a complete file (text or PDF) of it.
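To illustrate the workflow described in the comment above, here is a minimal sketch of how a script could pull the full file and break it into chunks for use as grounding context. The URL is an assumed publishing location, and the heading-based chunking is a naive illustration, not part of any official tooling:

```ts
// rag-context-sketch.ts — illustrative only; the URL below is an assumed
// location for the published file, not a confirmed path.
const LLMS_FULL_URL = 'https://angular.dev/llms-full.txt';

async function loadAngularDocsChunks(): Promise<string[]> {
  const res = await fetch(LLMS_FULL_URL);
  if (!res.ok) {
    throw new Error(`Failed to fetch ${LLMS_FULL_URL}: ${res.status}`);
  }
  const text = await res.text();
  // Naive chunking: split on top-level markdown headings so each chunk
  // roughly corresponds to one guide/page.
  return text
    .split(/\n(?=# )/)
    .map((chunk) => chunk.trim())
    .filter(Boolean);
}

async function main() {
  const chunks = await loadAngularDocsChunks();
  console.log(`Loaded ${chunks.length} documentation chunks`);
  // From here, the chunks could be embedded into a vector store, or selected
  // chunks pasted into an LLM prompt as grounding context.
}

main().catch(console.error);
```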
Hello everyone! Quick update: we've added a couple of files this week (llms.txt and llms-full.txt) that should help with providing context to LLMs (thanks @JeanMeche!).
We'll be adding links to those files to the relevant angular.dev page(s) shortly. As the next step, we plan to look into having more focused files (e.g. a file that collects the content about the template expression syntax, a file with signal APIs, etc.), but we don't have any ETAs at this moment. I will close this ticket as complete; please create new tickets if you have feedback or proposals on how we can improve the content of those files. Thank you.
@AndrewKushnir thank you! I was waiting for that! <3 |
Which @angular/* package(s) are relevant/related to the feature request?
No response
Description
AI models are typically not up-to-date with the latest information, as they are periodically trained on snapshots of data.
As an Angular developer who uses AI tools for programming, I want the AI to be aware of the latest Angular releases so that I can generate modern code that leverages the newest features available in Angular.
Concept: https://llmstxt.org/
Example: https://docs.stripe.com/llms.txt
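For reference, the llms.txt format described at llmstxt.org is plain markdown: an H1 with the project name, an optional blockquote summary, and H2 sections containing link lists. A hypothetical Angular-flavored sample (the entries are illustrative, not the actual published file):

```markdown
# Angular

> Angular is a web framework for building fast, scalable applications.
> This file lists key documentation pages in an LLM-friendly format.

## Guides

- [Components](https://angular.dev/guide/components): Building blocks of Angular applications
- [Signals](https://angular.dev/guide/signals): Reactive state management with signals
- [Templates](https://angular.dev/guide/templates): Template syntax and data binding

## Optional

- [Roadmap](https://angular.dev/roadmap): Upcoming framework work
```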
This feature would greatly improve DX for Angular developers.
Proposed solution
Generate and expose the llms.txt and llms-full.txt files on the documentation website so that developers can use them as context for AI tool prompts.
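A rough sketch of how the generation step could look, assuming the guides exist as markdown sources under a `docs/` directory; the directory name, output paths, and base URL are assumptions for illustration, not the actual angular.dev build setup:

```ts
// generate-llms-files.ts — illustrative sketch, not the actual adev build step.
// Walks an assumed `docs/` tree of markdown guides and emits:
//   llms.txt      — an index of links, following the llmstxt.org layout
//   llms-full.txt — the concatenated content of every guide
import { readdir, readFile, writeFile } from 'node:fs/promises';
import { join, relative } from 'node:path';

const DOCS_DIR = 'docs';                // assumed source directory
const BASE_URL = 'https://angular.dev'; // assumed published base URL

async function collectMarkdownFiles(dir: string): Promise<string[]> {
  const entries = await readdir(dir, { withFileTypes: true });
  const files: string[] = [];
  for (const entry of entries) {
    const path = join(dir, entry.name);
    if (entry.isDirectory()) {
      files.push(...(await collectMarkdownFiles(path)));
    } else if (entry.name.endsWith('.md')) {
      files.push(path);
    }
  }
  return files.sort();
}

async function main() {
  const files = await collectMarkdownFiles(DOCS_DIR);

  // llms.txt: one link per guide, grouped under a single "Docs" section.
  const links = files.map((file) => {
    const slug = relative(DOCS_DIR, file).replace(/\.md$/, '');
    return `- [${slug}](${BASE_URL}/${slug})`;
  });
  const index = ['# Angular', '', '## Docs', '', ...links, ''].join('\n');

  // llms-full.txt: the full markdown content of every guide, concatenated.
  const bodies = await Promise.all(files.map((file) => readFile(file, 'utf8')));
  const full = bodies.join('\n\n---\n\n');

  await writeFile('llms.txt', index);
  await writeFile('llms-full.txt', full);
  console.log(`Wrote llms.txt (${files.length} links) and llms-full.txt`);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

In practice this would run as part of the docs build so both files stay in sync with the published guides.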
Alternatives considered
Provide simple crawler-readable markdown (.md) files for each topic within the documentation.