Imagine selecting a paragraph and having ChatGPT automatically correct grammar and spelling errors, then seamlessly replacing it in your buffer. Well, imagine no more - it’s now a reality!
Here’s what it does:
- Selects the current paragraph in Emacs.
- Sends it to ChatGPT for a grammar and spelling check.
- Replaces the original text with the corrected version, all within Emacs.
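The three steps above can be sketched in Emacs Lisp. This is a minimal sketch, not the gist's actual code: `my/chatgpt-query` is a hypothetical helper standing in for whatever sends a prompt to ChatGPT (e.g. via chatgpt-shell) and returns the reply as a string.

```elisp
(defun my/proofread-paragraph ()
  "Send the current paragraph to ChatGPT and replace it with the corrected text."
  (interactive)
  (save-excursion
    ;; Step 1: select the current paragraph.
    (mark-paragraph)
    (let* ((beg (region-beginning))
           (end (region-end))
           (text (buffer-substring-no-properties beg end))
           ;; Step 2: send it off for a grammar/spelling check.
           ;; `my/chatgpt-query' is hypothetical: prompt in, corrected string out.
           (fixed (my/chatgpt-query
                   (concat "Fix grammar and spelling; reply with only "
                           "the corrected text:\n\n" text))))
      ;; Step 3: replace the original text in place.
      (delete-region beg end)
      (insert fixed))))
```

Binding it to a key (e.g. `(keymap-global-set "C-c g" #'my/proofread-paragraph)`) makes the whole fix a single keystroke.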
Inception:
The other night I read a post on X that said LLMs would be used to enhance word prediction for texting on phones. I thought another interesting application would be to easily bring spelling and grammar fixes to whatever I'm editing in Emacs.
It’s not flawless, but in my experience, it’s all I need.
Here’s a video example: https://youtu.be/hrhoNE2M9Qw
Here’s the gist: https://gist.github.com/ckopsa/c55bf8cc25df8a4a87c6993bdce3573e
Leverages chatgpt-shell found here: https://github.com/xenodium/chatgpt-shell
Nice, I’ve been using gptel for interacting with LLMs via Emacs; I didn’t know about chatgpt-shell.
This and other things are also possible with ellama. It also works with local models.
Yeah, I’d be eager to try and see if it makes the response faster without sacrificing quality. Are there models right now that have decent output running on something like a Chromebook?
Mini orca 3b maybe?
Most Chromebooks have very weak hardware. I don’t think it will run fast enough to be useful.
But you can use it with the OpenAI or Google API.
See also [this reply](https://github.com/s-kostyaev/ellama/issues/13#issuecomment-1807954046) about using ellama on weak hardware:
You can try:
- lower quantization of zephyr (like `7b-beta-q2_K`, `7b-beta-q3_K_S`, `7b-beta-q3_K_M`, `7b-beta-q3_K_L`)
- lighter models, like `orca-mini:3b-q4_0`, or `starcoder:1b` / `starcoder:3b` for coding
- deploy ollama on a more powerful machine and connect ellama to it (on a cloud provider, for example)
- use the OpenAI or Google API and connect ellama to it
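For the "deploy ollama on a more powerful machine" option, a minimal config sketch, assuming ellama's ollama provider from the `llm` package; the hostname is a placeholder and the model tag is one of those suggested above:

```elisp
;; Sketch: point ellama at an ollama instance running on a beefier machine.
;; "gpu-box.example.com" is a placeholder for your own remote host.
(use-package ellama
  :init
  (require 'llm-ollama)
  (setopt ellama-provider
          (make-llm-ollama
           :host "gpu-box.example.com"       ; remote ollama instance (assumption)
           :chat-model "orca-mini:3b-q4_0"))) ; lighter model from the list above
```

Dropping the `:host` argument should fall back to a local ollama, if you do want to try it on the Chromebook itself.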
Think of all the thoughts you can think while not thinking!
And now you have to double check everything twice!