Compare commits


22 Commits

Author SHA1 Message Date
Michael Yang a6d03dd510 Merge pull request #110 from jmorganca/fix-pull-0-bytes (fix pull 0 bytes on completed layer) 2023-07-18 19:38:59 -07:00
Michael Yang 68df36ae50 fix pull 0 bytes on completed layer 2023-07-18 19:38:11 -07:00
Michael Yang 5540305293 Merge pull request #112 from jmorganca/fix-relative-modelfile (resolve modelfile before passing to server) 2023-07-18 19:36:24 -07:00
Michael Yang d4cfee79d5 resolve modelfile before passing to server 2023-07-18 19:34:05 -07:00
Michael Yang 6e36f948df Merge pull request #109 from jmorganca/fix-create-memory (fix memory leak in create) 2023-07-18 17:25:19 -07:00
Michael Yang 553fa39fe8 fix memory leak in create 2023-07-18 17:14:17 -07:00
Jeffrey Morgan 820e581ad8 web: fix typos and add link to discord 2023-07-18 17:03:40 -07:00
Isaac McFadyen d14785738e README typo fix (#106) (Fixed typo in README) 2023-07-18 16:24:57 -07:00
Patrick Devine 9e15635c2d attempt two for skipping files in the file walk (#105) 2023-07-18 15:37:01 -07:00
Jeffrey Morgan 3e10f902f5 add mario example 2023-07-18 14:27:36 -07:00
Jeffrey Morgan aa6714f25c fix typo in README.md 2023-07-18 14:03:11 -07:00
Jeffrey Morgan 7f3a37aed4 fix typo 2023-07-18 13:32:06 -07:00
Jeffrey Morgan 7b08280355 move download to the top of README.md 2023-07-18 13:31:25 -07:00
Jeffrey Morgan e3cc4d5eac update README.md with new syntax 2023-07-18 13:22:46 -07:00
Jeffrey Morgan 8c85dfb735 Add README.md for examples 2023-07-18 13:22:46 -07:00
hoyyeva ac62a413e5 Merge pull request #103 from jmorganca/web-update (website content and design update) 2023-07-18 16:18:04 -04:00
Eva Ho d1f89778e9 fix css on smaller screen 2023-07-18 16:17:42 -04:00
Eva Ho df67a90e64 fix css 2023-07-18 16:02:45 -04:00
Eva Ho 576ae644de enable downloader 2023-07-18 15:57:39 -04:00
Eva Ho 7e52e51db1 update website text and design 2023-07-18 15:56:43 -04:00
Michael Chiang f12df8d79a Merge pull request #101 from jmorganca/adding-logo (add logo) 2023-07-18 12:47:20 -07:00
Michael Chiang 65de730bdb Update README.md (add logo) 2023-07-18 12:45:38 -07:00
12 changed files with 241 additions and 206 deletions

README.md — 108 changed lines

@@ -1,75 +1,65 @@
-![ollama](https://github.com/jmorganca/ollama/assets/251292/961f99bb-251a-4eec-897d-1ba99997ad0f)
+<div align="center">
+  <picture>
+    <source media="(prefers-color-scheme: dark)" height="200px" srcset="https://github.com/jmorganca/ollama/assets/3325447/318048d2-b2dd-459c-925a-ac8449d5f02c">
+    <img alt="logo" height="200px" src="https://github.com/jmorganca/ollama/assets/3325447/c7d6e15f-7f4d-4776-b568-c084afa297c2">
+  </picture>
+</div>
 # Ollama
-Run large language models with `llama.cpp`.
+Create, run, and share self-contained large language models (LLMs). Ollama bundles a model's weights, configuration, prompts, and more into self-contained packages that run anywhere.
-> Note: certain models that can be run with Ollama are intended for research and/or non-commercial use only.
+> Note: Ollama is in early preview. Please report any issues you find.
-### Features
+## Download
-- Download and run popular large language models
-- Switch between multiple models on the fly
-- Hardware acceleration where available (Metal, CUDA)
-- Fast inference server written in Go, powered by [llama.cpp](https://github.com/ggerganov/llama.cpp)
-- REST API to use with your application (python, typescript SDKs coming soon)
+- [Download](https://ollama.ai/download) for macOS on Apple Silicon (Intel coming soon)
+- Download for Windows and Linux (coming soon)
+- Build [from source](#building)
-## Install
+## Examples
-- [Download](https://ollama.ai/download) for macOS with Apple Silicon (Intel coming soon)
-- Download for Windows (coming soon)
-You can also build the [binary from source](#building).
-## Quickstart
-Run a fast and simple model.
+### Quickstart
 ```
-ollama run orca
+ollama run llama2
+>>> hi
+Hello! How can I help you today?
 ```
-## Example models
+### Creating a custom model
-### 💬 Chat
-Have a conversation.
+Create a `Modelfile`:
 ```
-ollama run vicuna "Why is the sky blue?"
+FROM llama2
+PROMPT """
+You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.
+User: {{ .Prompt }}
+Mario:
+"""
 ```
-### 🗺️ Instructions
-Get a helping hand.
+Next, create and run the model:
 ```
-ollama run orca "Write an email to my boss."
+ollama create mario -f ./Modelfile
+ollama run mario
+>>> hi
+Hello! It's your friend Mario.
 ```
-### 🔎 Ask questions about documents
+## Model library
-Send the contents of a document and ask questions about it.
+Ollama includes a library of open-source, pre-trained models. More models are coming soon.
-```
-ollama run nous-hermes "$(cat input.txt)", please summarize this story
-```
-### 📖 Storytelling
-Venture into the unknown.
-```
-ollama run nous-hermes "Once upon a time"
-```
-## Advanced usage
-### Run a local model
-```
-ollama run ~/Downloads/vicuna-7b-v1.3.ggmlv3.q4_1.bin
-```
+| Model | Parameters | Size | Download |
+| ----------- | ---------- | ----- | ------------------------- |
+| Llama2 | 7B | 3.8GB | `ollama pull llama2` |
+| Orca Mini | 3B | 1.9GB | `ollama pull orca` |
+| Vicuna | 7B | 3.8GB | `ollama pull vicuna` |
+| Nous-Hermes | 13B | 7.3GB | `ollama pull nous-hermes` |
 ## Building
@@ -86,23 +76,5 @@ To run it start the server:
 Finally, run a model!
 ```
-./ollama run ~/Downloads/vicuna-7b-v1.3.ggmlv3.q4_1.bin
+./ollama run llama2
 ```
-## API Reference
-### `POST /api/pull`
-Download a model
-```
-curl -X POST http://localhost:11343/api/pull -d '{"model": "orca"}'
-```
-### `POST /api/generate`
-Complete a prompt
-```
-curl -X POST http://localhost:11434/api/generate -d '{"model": "orca", "prompt": "hello!"}'
-```


@@ -160,11 +160,11 @@ func (c *Client) Generate(ctx context.Context, req *GenerateRequest, fn Generate
 	})
 }
-type PullProgressFunc func(PullProgress) error
+type PullProgressFunc func(ProgressResponse) error
 func (c *Client) Pull(ctx context.Context, req *PullRequest, fn PullProgressFunc) error {
 	return c.stream(ctx, http.MethodPost, "/api/pull", req, func(bts []byte) error {
-		var resp PullProgress
+		var resp ProgressResponse
 		if err := json.Unmarshal(bts, &resp); err != nil {
 			return err
 		}
@@ -173,11 +173,11 @@ func (c *Client) Pull(ctx context.Context, req *PullRequest, fn PullProgressFunc
 	})
 }
-type PushProgressFunc func(PushProgress) error
+type PushProgressFunc func(ProgressResponse) error
 func (c *Client) Push(ctx context.Context, req *PushRequest, fn PushProgressFunc) error {
 	return c.stream(ctx, http.MethodPost, "/api/push", req, func(bts []byte) error {
-		var resp PushProgress
+		var resp ProgressResponse
 		if err := json.Unmarshal(bts, &resp); err != nil {
 			return err
 		}


@@ -43,12 +43,11 @@ type PullRequest struct {
 	Password string `json:"password"`
 }
-type PullProgress struct {
+type ProgressResponse struct {
 	Status string `json:"status"`
 	Digest string `json:"digest,omitempty"`
 	Total int `json:"total,omitempty"`
 	Completed int `json:"completed,omitempty"`
-	Percent float64 `json:"percent,omitempty"`
 }
 type PushRequest struct {
@@ -57,14 +56,6 @@ type PushRequest struct {
 	Password string `json:"password"`
 }
-type PushProgress struct {
-	Status string `json:"status"`
-	Digest string `json:"digest,omitempty"`
-	Total int `json:"total,omitempty"`
-	Completed int `json:"completed,omitempty"`
-	Percent float64 `json:"percent,omitempty"`
-}
 type ListResponse struct {
 	Models []ListResponseModel `json:"models"`
 }


@@ -9,6 +9,7 @@ import (
 	"net"
 	"net/http"
 	"os"
+	"path/filepath"
 	"strings"
 	"time"
@@ -25,6 +26,11 @@ import (
 func create(cmd *cobra.Command, args []string) error {
 	filename, _ := cmd.Flags().GetString("file")
+	filename, err := filepath.Abs(filename)
+	if err != nil {
+		return err
+	}
 	client := api.NewClient()
 	var spinner *Spinner
@@ -83,7 +89,7 @@ func push(cmd *cobra.Command, args []string) error {
 	client := api.NewClient()
 	request := api.PushRequest{Name: args[0]}
-	fn := func(resp api.PushProgress) error {
+	fn := func(resp api.ProgressResponse) error {
 		fmt.Println(resp.Status)
 		return nil
 	}
@@ -129,25 +135,23 @@ func RunPull(cmd *cobra.Command, args []string) error {
 func pull(model string) error {
 	client := api.NewClient()
+	var currentDigest string
 	var bar *progressbar.ProgressBar
-	currentLayer := ""
 	request := api.PullRequest{Name: model}
-	fn := func(resp api.PullProgress) error {
-		if resp.Digest != currentLayer && resp.Digest != "" {
-			if currentLayer != "" {
-				fmt.Println()
-			}
-			currentLayer = resp.Digest
-			layerStr := resp.Digest[7:23] + "..."
+	fn := func(resp api.ProgressResponse) error {
+		if resp.Digest != currentDigest && resp.Digest != "" {
+			currentDigest = resp.Digest
 			bar = progressbar.DefaultBytes(
 				int64(resp.Total),
-				"pulling "+layerStr,
+				fmt.Sprintf("pulling %s...", resp.Digest[7:19]),
 			)
-		} else if resp.Digest == currentLayer && resp.Digest != "" {
-			bar.Set(resp.Completed)
+		} else if resp.Digest == currentDigest && resp.Digest != "" {
+			bar.Set(resp.Completed)
 		} else {
-			currentLayer = ""
+			currentDigest = ""
 			fmt.Println(resp.Status)
 		}
 		return nil
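The rewritten pull callback above drives a single progress bar keyed on the current digest: a response with a new digest starts a fresh bar, a repeated digest advances it, and a digest-less response falls through to a plain status line. That branch logic can be distilled into a small sketch (`classify` is a hypothetical helper for illustration, not code from the repo):

```go
package main

import "fmt"

// ProgressResponse mirrors the api type from this changeset (fields only).
type ProgressResponse struct {
	Status    string
	Digest    string
	Total     int
	Completed int
}

// classify reproduces the three-way branch in the new pull callback:
// a fresh digest starts a new progress bar, a repeated digest updates it,
// and a digest-less update is a plain status line.
func classify(current *string, resp ProgressResponse) string {
	switch {
	case resp.Digest != "" && resp.Digest != *current:
		*current = resp.Digest
		return "new-bar"
	case resp.Digest != "" && resp.Digest == *current:
		return "update-bar"
	default:
		*current = ""
		return "status"
	}
}

func main() {
	var current string
	fmt.Println(classify(&current, ProgressResponse{Digest: "sha256:aaaa"})) // new-bar
	fmt.Println(classify(&current, ProgressResponse{Digest: "sha256:aaaa"})) // update-bar
	fmt.Println(classify(&current, ProgressResponse{Status: "verifying"}))   // status
}
```

In the real command, the first branch also constructs the bar with `progressbar.DefaultBytes(int64(resp.Total), ...)` and the second calls `bar.Set(resp.Completed)`.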

examples/README.md — new file, 15 lines

@@ -0,0 +1,15 @@
+# Examples
+This directory contains examples that can be created and run with `ollama`.
+To create a model:
+```
+ollama create example -f <example file>
+```
+To run a model:
+```
+ollama run example
+```

examples/mario — new file, 7 lines

@@ -0,0 +1,7 @@
+FROM llama2
+PARAMETER temperature 1
+PROMPT """
+System: You are Mario from super mario bros, acting as an assistant.
+User: {{ .Prompt }}
+Assistant:
+"""


@@ -3,7 +3,6 @@ package server
 import (
 	"bytes"
 	"crypto/sha256"
-	"encoding/hex"
 	"encoding/json"
 	"errors"
 	"fmt"
@@ -42,10 +41,9 @@ type Layer struct {
 	Size int `json:"size"`
 }
-type LayerWithBuffer struct {
+type LayerReader struct {
 	Layer
-	Buffer *bytes.Buffer
+	io.Reader
 }
 type ConfigV2 struct {
@@ -161,7 +159,7 @@ func CreateModel(name string, mf io.Reader, fn func(status string)) error {
 		return err
 	}
-	var layers []*LayerWithBuffer
+	var layers []*LayerReader
 	params := make(map[string]string)
 	for _, c := range commands {
@@ -274,7 +272,7 @@ func CreateModel(name string, mf io.Reader, fn func(status string)) error {
 	return nil
 }
-func removeLayerFromLayers(layers []*LayerWithBuffer, mediaType string) []*LayerWithBuffer {
+func removeLayerFromLayers(layers []*LayerReader, mediaType string) []*LayerReader {
 	j := 0
 	for _, l := range layers {
 		if l.MediaType != mediaType {
@@ -285,7 +283,7 @@ func removeLayerFromLayers(layers []*LayerWithBuffer, mediaType string) []*Layer
 	return layers[:j]
 }
-func SaveLayers(layers []*LayerWithBuffer, fn func(status string), force bool) error {
+func SaveLayers(layers []*LayerReader, fn func(status string), force bool) error {
 	// Write each of the layers to disk
 	for _, layer := range layers {
 		fp, err := GetBlobsPath(layer.Digest)
@@ -303,10 +301,10 @@ func SaveLayers(layers []*LayerWithBuffer, fn func(status string), force bool) e
 			}
 			defer out.Close()
-			_, err = io.Copy(out, layer.Buffer)
-			if err != nil {
+			if _, err = io.Copy(out, layer.Reader); err != nil {
 				return err
 			}
 		} else {
 			fn(fmt.Sprintf("using already created layer %s", layer.Digest))
 		}
@@ -315,7 +313,7 @@ func SaveLayers(layers []*LayerWithBuffer, fn func(status string), force bool) e
 	return nil
 }
-func CreateManifest(name string, cfg *LayerWithBuffer, layers []*Layer) error {
+func CreateManifest(name string, cfg *LayerReader, layers []*Layer) error {
 	mp := ParseModelPath(name)
 	manifest := ManifestV2{
@@ -341,7 +339,7 @@ func CreateManifest(name string, cfg *LayerWithBuffer, layers []*Layer) error {
 	return os.WriteFile(fp, manifestJSON, 0o644)
 }
-func GetLayerWithBufferFromLayer(layer *Layer) (*LayerWithBuffer, error) {
+func GetLayerWithBufferFromLayer(layer *Layer) (*LayerReader, error) {
 	fp, err := GetBlobsPath(layer.Digest)
 	if err != nil {
 		return nil, err
@@ -361,7 +359,7 @@ func GetLayerWithBufferFromLayer(layer *Layer) (*LayerWithBuffer, error) {
 	return newLayer, nil
 }
-func paramsToReader(params map[string]string) (io.Reader, error) {
+func paramsToReader(params map[string]string) (io.ReadSeeker, error) {
 	opts := api.DefaultOptions()
 	typeOpts := reflect.TypeOf(opts)
@@ -419,7 +417,7 @@ func paramsToReader(params map[string]string) (io.Reader, error) {
 	return bytes.NewReader(bts), nil
 }
-func getLayerDigests(layers []*LayerWithBuffer) ([]string, error) {
+func getLayerDigests(layers []*LayerReader) ([]string, error) {
 	var digests []string
 	for _, l := range layers {
 		if l.Digest == "" {
@@ -431,34 +429,30 @@ func getLayerDigests(layers []*LayerWithBuffer) ([]string, error) {
 }
 // CreateLayer creates a Layer object from a given file
-func CreateLayer(f io.Reader) (*LayerWithBuffer, error) {
-	buf := new(bytes.Buffer)
-	_, err := io.Copy(buf, f)
-	if err != nil {
-		return nil, err
-	}
-	digest, size := GetSHA256Digest(buf)
-	layer := &LayerWithBuffer{
+func CreateLayer(f io.ReadSeeker) (*LayerReader, error) {
+	digest, size := GetSHA256Digest(f)
+	f.Seek(0, 0)
+	layer := &LayerReader{
 		Layer: Layer{
 			MediaType: "application/vnd.docker.image.rootfs.diff.tar",
 			Digest: digest,
 			Size: size,
 		},
-		Buffer: buf,
+		Reader: f,
 	}
 	return layer, nil
 }
-func PushModel(name, username, password string, fn func(status, digest string, Total, Completed int, Percent float64)) error {
+func PushModel(name, username, password string, fn func(api.ProgressResponse)) error {
 	mp := ParseModelPath(name)
-	fn("retrieving manifest", "", 0, 0, 0)
+	fn(api.ProgressResponse{Status: "retrieving manifest"})
 	manifest, err := GetManifest(mp)
 	if err != nil {
-		fn("couldn't retrieve manifest", "", 0, 0, 0)
+		fn(api.ProgressResponse{Status: "couldn't retrieve manifest"})
 		return err
 	}
@@ -480,11 +474,21 @@ func PushModel(name, username, password string, fn func(status, digest string, T
 		if exists {
 			completed += layer.Size
-			fn("using existing layer", layer.Digest, total, completed, float64(completed)/float64(total))
+			fn(api.ProgressResponse{
+				Status: "using existing layer",
+				Digest: layer.Digest,
+				Total: total,
+				Completed: completed,
+			})
 			continue
 		}
-		fn("starting upload", layer.Digest, total, completed, float64(completed)/float64(total))
+		fn(api.ProgressResponse{
+			Status: "starting upload",
+			Digest: layer.Digest,
+			Total: total,
+			Completed: completed,
+		})
 		location, err := startUpload(mp, username, password)
 		if err != nil {
@@ -498,10 +502,19 @@ func PushModel(name, username, password string, fn func(status, digest string, T
 			return err
 		}
 		completed += layer.Size
-		fn("upload complete", layer.Digest, total, completed, float64(completed)/float64(total))
+		fn(api.ProgressResponse{
+			Status: "upload complete",
+			Digest: layer.Digest,
+			Total: total,
+			Completed: completed,
+		})
 	}
-	fn("pushing manifest", "", total, completed, float64(completed/total))
+	fn(api.ProgressResponse{
+		Status: "pushing manifest",
+		Total: total,
+		Completed: completed,
+	})
 	url := fmt.Sprintf("%s://%s/v2/%s/manifests/%s", mp.ProtocolScheme, mp.Registry, mp.GetNamespaceRepository(), mp.Tag)
 	headers := map[string]string{
 		"Content-Type": "application/vnd.docker.distribution.manifest.v2+json",
@@ -524,15 +537,19 @@ func PushModel(name, username, password string, fn func(status, digest string, T
 		return fmt.Errorf("registry responded with code %d: %v", resp.StatusCode, string(body))
 	}
-	fn("success", "", total, completed, 1.0)
+	fn(api.ProgressResponse{
+		Status: "success",
+		Total: total,
+		Completed: completed,
+	})
 	return nil
 }
-func PullModel(name, username, password string, fn func(status, digest string, Total, Completed int, Percent float64)) error {
+func PullModel(name, username, password string, fn func(api.ProgressResponse)) error {
 	mp := ParseModelPath(name)
-	fn("pulling manifest", "", 0, 0, 0)
+	fn(api.ProgressResponse{Status: "pulling manifest"})
 	manifest, err := pullModelManifest(mp, username, password)
 	if err != nil {
@@ -550,16 +567,15 @@ func PullModel(name, username, password string, fn func(status, digest string, T
 	total += manifest.Config.Size
 	for _, layer := range layers {
-		fn("starting download", layer.Digest, total, completed, float64(completed)/float64(total))
 		if err := downloadBlob(mp, layer.Digest, username, password, fn); err != nil {
-			fn(fmt.Sprintf("error downloading: %v", err), layer.Digest, 0, 0, 0)
+			fn(api.ProgressResponse{Status: fmt.Sprintf("error downloading: %v", err), Digest: layer.Digest})
			return err
 		}
 		completed += layer.Size
-		fn("download complete", layer.Digest, total, completed, float64(completed)/float64(total))
 	}
-	fn("writing manifest", "", total, completed, 1.0)
+	fn(api.ProgressResponse{Status: "writing manifest"})
 	manifestJSON, err := json.Marshal(manifest)
 	if err != nil {
@@ -577,7 +593,7 @@ func PullModel(name, username, password string, fn func(status, digest string, T
 		return err
 	}
-	fn("success", "", total, completed, 1.0)
+	fn(api.ProgressResponse{Status: "success"})
 	return nil
 }
@@ -609,7 +625,7 @@ func pullModelManifest(mp ModelPath, username, password string) (*ManifestV2, er
 	return m, err
 }
-func createConfigLayer(layers []string) (*LayerWithBuffer, error) {
+func createConfigLayer(layers []string) (*LayerReader, error) {
 	// TODO change architecture and OS
 	config := ConfigV2{
 		Architecture: "arm64",
@@ -628,22 +644,26 @@ func createConfigLayer(layers []string) (*LayerWithBuffer, error) {
 	buf := bytes.NewBuffer(configJSON)
 	digest, size := GetSHA256Digest(buf)
-	layer := &LayerWithBuffer{
+	layer := &LayerReader{
 		Layer: Layer{
 			MediaType: "application/vnd.docker.container.image.v1+json",
 			Digest: digest,
 			Size: size,
 		},
-		Buffer: buf,
+		Reader: buf,
 	}
 	return layer, nil
 }
 // GetSHA256Digest returns the SHA256 hash of a given buffer and returns it, and the size of buffer
-func GetSHA256Digest(data *bytes.Buffer) (string, int) {
-	layerBytes := data.Bytes()
-	hash := sha256.Sum256(layerBytes)
-	return "sha256:" + hex.EncodeToString(hash[:]), len(layerBytes)
+func GetSHA256Digest(r io.Reader) (string, int) {
+	h := sha256.New()
+	n, err := io.Copy(h, r)
+	if err != nil {
+		log.Fatal(err)
+	}
+	return fmt.Sprintf("sha256:%x", h.Sum(nil)), int(n)
 }
 func startUpload(mp ModelPath, username string, password string) (string, error) {
@@ -725,16 +745,20 @@ func uploadBlob(location string, layer *Layer, username string, password string)
 	return nil
 }
-func downloadBlob(mp ModelPath, digest string, username, password string, fn func(status, digest string, Total, Completed int, Percent float64)) error {
+func downloadBlob(mp ModelPath, digest string, username, password string, fn func(api.ProgressResponse)) error {
 	fp, err := GetBlobsPath(digest)
 	if err != nil {
 		return err
 	}
-	_, err = os.Stat(fp)
-	if !os.IsNotExist(err) {
+	if fi, _ := os.Stat(fp); fi != nil {
 		// we already have the file, so return
 		log.Printf("already have %s\n", digest)
+		fn(api.ProgressResponse{
+			Digest: digest,
+			Total: int(fi.Size()),
+			Completed: int(fi.Size()),
+		})
 		return nil
 	}
@@ -783,10 +807,21 @@ func downloadBlob(mp ModelPath, digest string, username, password string, fn fun
 	total := remaining + completed
 	for {
-		fn(fmt.Sprintf("Downloading %s", digest), digest, int(total), int(completed), float64(completed)/float64(total))
+		fn(api.ProgressResponse{
+			Status: fmt.Sprintf("downloading %s", digest),
+			Digest: digest,
+			Total: int(total),
+			Completed: int(completed),
+		})
 		if completed >= total {
 			if err := os.Rename(fp+"-partial", fp); err != nil {
-				fn(fmt.Sprintf("error renaming file: %v", err), digest, int(total), int(completed), 1)
+				fn(api.ProgressResponse{
+					Status: fmt.Sprintf("error renaming file: %v", err),
+					Digest: digest,
+					Total: int(total),
+					Completed: int(completed),
+				})
 				return err
 			}


@@ -101,15 +101,10 @@ func pull(c *gin.Context) {
 	ch := make(chan any)
 	go func() {
 		defer close(ch)
-		fn := func(status, digest string, total, completed int, percent float64) {
-			ch <- api.PullProgress{
-				Status: status,
-				Digest: digest,
-				Total: total,
-				Completed: completed,
-				Percent: percent,
-			}
+		fn := func(r api.ProgressResponse) {
+			ch <- r
 		}
 		if err := PullModel(req.Name, req.Username, req.Password, fn); err != nil {
 			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
 			return
@@ -129,15 +124,10 @@ func push(c *gin.Context) {
 	ch := make(chan any)
 	go func() {
 		defer close(ch)
-		fn := func(status, digest string, total, completed int, percent float64) {
-			ch <- api.PushProgress{
-				Status: status,
-				Digest: digest,
-				Total: total,
-				Completed: completed,
-				Percent: percent,
-			}
+		fn := func(r api.ProgressResponse) {
+			ch <- r
 		}
 		if err := PushModel(req.Name, req.Username, req.Password, fn); err != nil {
 			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
 			return
@@ -195,7 +185,8 @@ func list(c *gin.Context) {
 		if !info.IsDir() {
 			fi, err := os.Stat(path)
 			if err != nil {
-				return err
+				log.Printf("skipping file: %s", fp)
+				return nil
 			}
 			path := path[len(fp)+1:]
 			slashIndex := strings.LastIndex(path, "/")
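With a single shared `ProgressResponse` type, the pull and push handlers above no longer rebuild the progress struct field by field: the callback handed to `PullModel`/`PushModel` simply forwards each response into the channel the handler streams from. A reduced sketch of that producer/consumer shape (gin and HTTP streaming omitted; `run` is a hypothetical stand-in for the handler body):

```go
package main

import "fmt"

// ProgressResponse mirrors the api type from this changeset (fields trimmed).
type ProgressResponse struct {
	Status string
}

// run shows the handler shape after this change: the callback passed to
// PullModel/PushModel forwards each ProgressResponse into a channel,
// which the HTTP handler drains and streams to the client.
func run(work func(fn func(ProgressResponse))) []string {
	ch := make(chan any)
	go func() {
		defer close(ch)
		work(func(r ProgressResponse) {
			ch <- r // no per-field copying needed anymore
		})
	}()

	var statuses []string
	for msg := range ch {
		statuses = append(statuses, msg.(ProgressResponse).Status)
	}
	return statuses
}

func main() {
	// A stand-in for PullModel that emits two updates.
	fmt.Println(run(func(fn func(ProgressResponse)) {
		fn(ProgressResponse{Status: "pulling manifest"})
		fn(ProgressResponse{Status: "success"})
	}))
}
```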


@@ -1,3 +1,4 @@
+import Header from '../header'
 import Downloader from './downloader'
 import Signup from './signup'
@@ -26,22 +27,19 @@ export default async function Download() {
 	}
 	return (
-    <main className='flex min-h-screen max-w-2xl flex-col p-4 lg:p-24 items-center mx-auto'>
-      <img src='/ollama.png' className='w-16 h-auto' />
-      <section className='my-12 text-center'>
-        <h2 className='my-2 max-w-md text-3xl tracking-tight'>Downloading Ollama</h2>
-        <h3 className='text-sm text-neutral-500'>
-          Problems downloading?{' '}
-          <a href={asset.browser_download_url} className='underline'>
-            Try again
-          </a>
-        </h3>
-        <Downloader url={asset.browser_download_url} />
-      </section>
-      <section className='max-w-sm flex flex-col w-full items-center border border-neutral-200 rounded-xl px-8 pt-8 pb-2'>
-        <p className='text-lg leading-tight text-center mb-6 max-w-[260px]'>Sign up for updates</p>
+    <>
+      <Header />
+      <main className='flex min-h-screen max-w-6xl flex-col py-20 px-16 lg:p-32 items-center mx-auto'>
+        <img src='/ollama.png' className='w-16 h-auto' />
+        <section className='mt-12 mb-8 text-center'>
+          <h2 className='my-2 max-w-md text-3xl tracking-tight'>Downloading...</h2>
+          <h3 className='text-base text-neutral-500 mt-12 max-w-[16rem]'>
+            While Ollama downloads, sign up to get notified of new updates.
+          </h3>
+          <Downloader url={asset.browser_download_url} />
+        </section>
         <Signup />
-      </section>
-    </main>
+      </main>
+    </>
   )
 }


@@ -28,7 +28,7 @@ export default function Signup() {
         return false
       }}
-      className='flex self-stretch flex-col gap-3 h-32'
+      className='flex self-stretch flex-col gap-3 h-32 md:mx-40 lg:mx-72'
     >
       <input
         required
@@ -37,13 +37,13 @@ export default function Signup() {
         onChange={e => setEmail(e.target.value)}
         type='email'
         placeholder='your@email.com'
-        className='bg-neutral-100 rounded-lg px-4 py-2 focus:outline-none placeholder-neutral-500'
+        className='border border-neutral-200 rounded-lg px-4 py-2 focus:outline-none placeholder-neutral-300'
       />
       <input
         type='submit'
         value='Get updates'
         disabled={submitting}
-        className='bg-black text-white disabled:text-neutral-200 disabled:bg-neutral-700 rounded-lg px-4 py-2 focus:outline-none cursor-pointer'
+        className='bg-black text-white disabled:text-neutral-200 disabled:bg-neutral-700 rounded-full px-4 py-2 focus:outline-none cursor-pointer'
       />
       {success && <p className='text-center text-sm'>You&apos;re signed up for updates</p>}
     </form>

web/app/header.tsx — new file, 24 lines

@@ -0,0 +1,24 @@
+const navigation = [
+  { name: 'Discord', href: 'https://discord.gg/MrfB5FbNWN' },
+  { name: 'GitHub', href: 'https://github.com/jmorganca/ollama' },
+  { name: 'Download', href: '/download' },
+]
+export default function Header() {
+  return (
+    <header className='absolute inset-x-0 top-0 z-50'>
+      <nav className='mx-auto flex items-center justify-between px-10 py-4'>
+        <a className='flex-1 font-bold' href='/'>
+          Ollama
+        </a>
+        <div className='flex space-x-8'>
+          {navigation.map(item => (
+            <a key={item.name} href={item.href} className='text-sm leading-6 text-gray-900'>
+              {item.name}
+            </a>
+          ))}
+        </div>
+      </nav>
+    </header>
+  )
+}


@@ -1,34 +1,32 @@
-import { AiFillApple } from 'react-icons/ai'
-import models from '../../models.json'
+import Header from './header'
 export default async function Home() {
   return (
-    <main className='flex min-h-screen max-w-2xl flex-col p-4 lg:p-24'>
-      <img src='/ollama.png' className='w-16 h-auto' />
-      <section className='my-4'>
-        <p className='my-3 max-w-md'>
-          <a className='underline' href='https://github.com/jmorganca/ollama'>
-            Ollama
-          </a>{' '}
-          is a tool for running large language models, currently for macOS with Windows and Linux coming soon.
-          <br />
-          <br />
-          <a href='/download'>
-            <button className='bg-black text-white text-sm py-2 px-3 rounded-lg flex items-center gap-2'>
-              <AiFillApple className='h-auto w-5 relative -top-px' /> Download for macOS
-            </button>
-          </a>
-        </p>
-      </section>
-      <section className='my-4'>
-        <h2 className='mb-4 text-lg'>Example models you can try running:</h2>
-        {models.map(m => (
-          <div className='my-2 grid font-mono' key={m.name}>
-            <code className='py-0.5'>ollama run {m.name}</code>
+    <>
+      <Header />
+      <main className='flex min-h-screen max-w-6xl flex-col py-20 px-16 md:p-32 items-center mx-auto'>
+        <img src='/ollama.png' className='w-16 h-auto' />
+        <section className='my-12 text-center'>
+          <div className='flex flex-col space-y-2'>
+            <h2 className='md:max-w-[18rem] mx-auto my-2 text-3xl tracking-tight'>Portable large language models</h2>
+            <h3 className='md:max-w-xs mx-auto text-base text-neutral-500'>
+              Bundle a model&apos;s weights, configuration, prompts, data and more into self-contained packages that run anywhere.
+            </h3>
           </div>
-        ))}
-      </section>
-    </main>
+          <div className='mx-auto flex flex-col space-y-4 mt-12'>
+            <a href='/download' className='md:mx-10 lg:mx-14 bg-black text-white rounded-full px-4 py-2 focus:outline-none cursor-pointer'>
+              Download
+            </a>
+            <p className='text-neutral-500 text-sm '>
+              Available for macOS with Apple Silicon <br />
+              Windows & Linux support coming soon.
+            </p>
+          </div>
+        </section>
+      </main>
+    </>
   )
 }