Note: there is an updated blog post on Continue.
With all the hype surrounding generative AI and LLMs, and all the hallucinations mentioned in the news, what are these models actually good for?
As it turns out, LLMs trained for code generation are helpful. But what if you don't want your code going to some cloud provider? Running a model locally with Ollama and Continue is a great solution for that.
Here is the plan:
- Install Ollama and load the model
- Install Continue
- Try it out
- Conclusion
Install Ollama and load the model
Ollama allows you to run models locally:
- Install Ollama from https://ollama.com/
- Open a terminal window (these commands were run on an M1 MacBook Pro)
- Run ollama list. This should start Ollama and show that no models are available yet
- Run ollama pull g1ibby/deepseek:6.7b (that is a one after the g)
- When this completes, run ollama list again and you should see:
NAME                   ID            SIZE    MODIFIED
g1ibby/deepseek:6.7b   f7f889d53789  3.8 GB  15 seconds ago
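This step isn't required, but before wiring up the IDE you can confirm the model responds by calling Ollama's local REST API directly. Here is a minimal TypeScript sketch (run with Node 18+ so fetch is available); it assumes Ollama's default address of http://localhost:11434, and the file name and prompt are just placeholders:

// check-ollama.ts: quick sanity check against the locally running Ollama server
async function checkModel(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "g1ibby/deepseek:6.7b",  // the model pulled above
      prompt: "Write one line of SQL that selects the current date.",  // any short prompt works
      stream: false,  // return a single JSON object instead of a stream
    }),
  });
  const data = await res.json();
  console.log(data.response); // the generated text
}

checkModel().catch(console.error);

If this prints some generated text, the model is being served locally and Continue will be able to reach it.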
Install Continue
Continue allows an IDE to access a model running in Ollama.
- In VSCode, install the Continue plugin from the Extensions sidebar (https://continue.dev/)
- Open the Continue sidebar
- Select the Config gear
- Replace the existing "models" section with:
"allowAnonymousTelemetry": false,
"models": [
{
"title": "DeepSeek 6.7B",
"provider": "ollama",
"model": "g1ibby/deepseek:6.7b"
}
],
- Note: Setting allowAnonymousTelemetry to false stops Continue from sending anonymous usage data, so everything stays completely local
- Note: The deepseek model is a reasonably accurate code generator on the tasks I tried
- Select DeepSeek 6.7B in the dropdown to the left of the Config gear
- You may also want to move Continue to the right sidebar
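For reference, once the snippet above is pasted in, the relevant part of Continue's config file looks something like this (other fields omitted; the exact file layout can vary between Continue versions):

{
  "allowAnonymousTelemetry": false,
  "models": [
    {
      "title": "DeepSeek 6.7B",
      "provider": "ollama",
      "model": "g1ibby/deepseek:6.7b"
    }
  ]
}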
Try it out
Let's say you want to store some data in PostgreSQL.
- Press Command+L (Control+L on Linux). This opens a new Continue session. (The extension also has an IDE-specific tutorial, for example for IntelliJ, showing keyboard shortcuts.)
- Type Add SQL for the PostgreSQL database to create a table with the following 4 columns: a uuid primary key, a required first name, a required last name, and an optional note
This will generate something like:
Here is a simple example of how you could create such a table in PostgreSQL using SQL commands:
CREATE TABLE users (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(), -- Generate a new unique UUID for each row
    first_name VARCHAR(50) NOT NULL, -- Required first name
    last_name VARCHAR(50) NOT NULL, -- Required last name
    note TEXT -- Optional note
);
In this script:
* UUID PRIMARY KEY DEFAULT uuid_generate_v4() creates a column of UUID type as the primary key. The function uuid_generate_v4() is used to generate a new unique UUID for each row.
* VARCHAR(50) NOT NULL defines a string field that can't be null and has a maximum length of 50 characters.
* TEXT is just the data type for text fields, which can store large amounts of text data (up to ~1GB).
Please replace the column names and types with your preferred ones if needed. The above script assumes you have the UUID extension installed in PostgreSQL. If not, you would need to install it first using CREATE EXTENSION IF NOT EXISTS "uuid-ossp";.
Usually there is something wrong with the answer (for example, gen_random_uuid() has been built into PostgreSQL since version 13, so no extension is needed), but it tends to be close enough to help experienced developers and save time. The response time on an M1 Mac varies from almost instant to a few seconds.
A function can also be put directly into the code by prefixing the query with /edit. For example, inside a TypeScript file:
- Select a blank line and press Command+Shift+L
- Type /edit create a function to remove all spaces from a string
And about a second later you have this (again not quite correct: the variable name is generic because there is no context, and the parameter type is missing; a typed version is shown after the snippet)
function removeSpaces(str) {
  return str.replace(/\s/g, '');
}
This lands directly in your TypeScript file.
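For comparison, a cleaned-up version with an explicit parameter type and return type might look like the sketch below; the names are only illustrative, since the model had no context to pick better ones:

// A typed version of the generated helper; names are illustrative.
function removeAllSpaces(input: string): string {
  // \s matches any whitespace (spaces, tabs, newlines), same as the generated regex.
  return input.replace(/\s/g, "");
}

console.log(removeAllSpaces("a b  c")); // prints "abc"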
Conclusion
Quick code generation using LLMs seems like a useful helper for any developer, although everything it generates needs to be verified and tested. I need to spend a lot more time with this setup to see how it performs on different coding tasks, and I plan to share those results in future posts.