
Commit cf49ddc

Merge branch 'main' into feat/db-sharing
2 parents c47a434 + e4016cd · commit cf49ddc

31 files changed (+3142, -3148 lines)

.npmrc

Lines changed: 1 addition & 0 deletions
```diff
@@ -0,0 +1 @@
+@jsr:registry=https://npm.jsr.io
```

README.md

Lines changed: 9 additions & 4 deletions
```diff
@@ -1,10 +1,10 @@
-# postgres.new
+# database.build ([formerly postgres.new](#why-rename-postgresnew))
 
 In-browser Postgres sandbox with AI assistance.
 
-![github-repo-hero](https://github.com/user-attachments/assets/e55f7c0d-a817-4aeb-838e-728aabda3a5d)
+![github-repo-hero](https://github.com/user-attachments/assets/1ace0688-dfa7-4ddb-86bc-c976fa5b2f42)
 
-With [postgres.new](https://postgres.new), you can instantly spin up an unlimited number of Postgres databases that run directly in your browser (and soon, deploy them to S3).
+With [database.build](https://database.build), you can instantly spin up an unlimited number of Postgres databases that run directly in your browser (and soon, deploy them to S3).
 
 Each database is paired with a large language model (LLM) which opens the door to some interesting use cases:
 
@@ -14,7 +14,8 @@ Each database is paired with a large language model (LLM) which opens the door t
 - Build database diagrams
 
 ## How it works
-All queries in postgres.new run directly in your browser. There’s no remote Postgres container or WebSocket proxy.
+
+All queries in database.build run directly in your browser. There’s no remote Postgres container or WebSocket proxy.
 
 How is this possible? [PGlite](https://pglite.dev/), a WASM version of Postgres that can run directly in your browser. Every database that you create spins up a new instance of PGlite that exposes a fully-functional Postgres database. Data is stored in IndexedDB so that changes persist after refresh.
 
@@ -25,6 +26,10 @@ This is a monorepo split into the following projects:
 - [Frontend (Next.js)](./apps/postgres-new/): This contains the primary web app built with Next.js
 - [Backend (pg-gateway)](./apps/db-service/): This serves S3-backed PGlite databases over the PG wire protocol using [pg-gateway](https://github.com/supabase-community/pg-gateway)
 
+## Why rename postgres.new?
+
+This project is not an official Postgres project and we don’t want to mislead anyone! We’re renaming to database.build because, well, that’s what this does. This will still be 100% Postgres-focused, just with a different URL.
+
 ## Video
 
 [![image](https://github.com/user-attachments/assets/9da04785-d813-4e9c-a400-4e00c63381a1)](https://youtu.be/ooWaPVvljlU)
```
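
For context on the "How it works" paragraph in the README diff above: each in-browser database is a PGlite instance backed by IndexedDB. The snippet below is a minimal sketch of that mechanism, not code from this repo, assuming the published `@electric-sql/pglite` package and its `idb://` data-dir prefix:

```ts
// Minimal sketch (not from this repo): one PGlite instance per database,
// persisted to IndexedDB so data survives a page refresh.
import { PGlite } from '@electric-sql/pglite'

// The `idb://` prefix stores the Postgres data directory in IndexedDB.
const db = new PGlite('idb://my-database')

// A fully-functional Postgres running in the browser via WASM.
const result = await db.query("select 'hello from pglite' as greeting")
console.log(result.rows)
```
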

apps/postgres-new/.env.example

Lines changed: 4 additions & 0 deletions
```diff
@@ -11,3 +11,7 @@ OPENAI_API_KEY="<openai-api-key>"
 # Vercel KV (local Docker available)
 KV_REST_API_URL="http://localhost:8080"
 KV_REST_API_TOKEN="local_token"
+
+NEXT_PUBLIC_LEGACY_DOMAIN=https://postgres.new
+NEXT_PUBLIC_CURRENT_DOMAIN=https://database.build
+REDIRECT_LEGACY_DOMAIN=false
```
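
The two `NEXT_PUBLIC_` variables are exposed to the browser (standard Next.js behavior) and name the legacy and current domains, while `REDIRECT_LEGACY_DOMAIN` is a server-side toggle. How the app consumes them is not shown in this section of the diff; the following is a hedged sketch of one plausible use, a Next.js middleware that redirects legacy-domain traffic when the flag is enabled. The file path and logic are assumptions, not the repo's actual implementation:

```ts
// Hypothetical middleware.ts sketch — not the repo's actual code.
import { NextRequest, NextResponse } from 'next/server'

export function middleware(request: NextRequest) {
  const legacyHost = new URL(process.env.NEXT_PUBLIC_LEGACY_DOMAIN!).hostname
  const currentDomain = process.env.NEXT_PUBLIC_CURRENT_DOMAIN!

  // Only redirect when the flag is explicitly enabled.
  if (process.env.REDIRECT_LEGACY_DOMAIN === 'true' && request.nextUrl.hostname === legacyHost) {
    const target = new URL(request.nextUrl.pathname + request.nextUrl.search, currentDomain)
    return NextResponse.redirect(target, 308)
  }

  return NextResponse.next()
}
```
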
File renamed without changes.
Lines changed: 9 additions & 0 deletions
```diff
@@ -0,0 +1,9 @@
+import Layout from '~/components/layout'
+
+export default function MainLayout({
+  children,
+}: Readonly<{
+  children: React.ReactNode
+}>) {
+  return <Layout>{children}</Layout>
+}
```
Lines changed: 222 additions & 0 deletions
```diff
@@ -0,0 +1,222 @@
+'use client'
+
+import { TarStream, TarStreamInput } from '@std/tar/tar-stream'
+import { chunk } from 'lodash'
+import Link from 'next/link'
+import { useState } from 'react'
+import { useApp } from '~/components/app-provider'
+import {
+  Accordion,
+  AccordionContent,
+  AccordionItem,
+  AccordionTrigger,
+} from '~/components/ui/accordion'
+import { Button } from '~/components/ui/button'
+import { Dialog, DialogContent, DialogHeader, DialogTitle } from '~/components/ui/dialog'
+import { Progress } from '~/components/ui/progress'
+import { DbManager } from '~/lib/db'
+import { countFiles, listFiles } from '~/lib/files'
+import {
+  fileFromStream,
+  fileToTarStreamFile,
+  mergeIterables,
+  readableStreamFromIterable,
+  transformStreamFromFn,
+} from '~/lib/streams'
+import {
+  currentDomainHostname,
+  currentDomainUrl,
+  downloadFile,
+  legacyDomainHostname,
+} from '~/lib/util'
+
+export default function Page() {
+  const { dbManager } = useApp()
+  const [progress, setProgress] = useState<number>()
+
+  return (
+    <>
+      <Dialog open>
+        <DialogContent className="max-w-2xl" showCloseButton={false}>
+          <DialogHeader>
+            <DialogTitle>Export your databases</DialogTitle>
+            <div className="py-2 border-b" />
+          </DialogHeader>
+          <p>
+            {legacyDomainHostname} is renaming to {currentDomainHostname}, which means you need to
+            transfer your databases if you wish to continue using them.
+          </p>
+
+          <Accordion type="single" collapsible>
+            <AccordionItem value="item-1" className="border rounded-md">
+              <AccordionTrigger className="p-0 gap-2 px-3 py-2">
+                <div className="flex gap-2 items-center font-normal text-lighter text-sm">
+                  <span>
+                    Why is {legacyDomainHostname} renaming to {currentDomainHostname}?
+                  </span>
+                </div>
+              </AccordionTrigger>
+              <AccordionContent className="p-3 prose prose-sm">
+                This project is not an official Postgres project and we don&apos;t want to mislead
+                anyone! We&apos;re renaming to{' '}
+                <Link href={currentDomainUrl} className="underline">
+                  {currentDomainHostname}
+                </Link>{' '}
+                because, well, that&apos;s what this does. This will still be 100% Postgres-focused,
+                just with a different URL.
+              </AccordionContent>
+            </AccordionItem>
+          </Accordion>
+          <Accordion type="single" collapsible>
+            <AccordionItem value="item-1" className="border rounded-md">
+              <AccordionTrigger className="p-0 gap-2 px-3 py-2">
+                <div className="flex gap-2 items-center font-normal text-lighter text-sm">
+                  <span>Why do I need to export my databases?</span>
+                </div>
+              </AccordionTrigger>
+              <AccordionContent className="p-3 prose prose-sm">
+                <p>
+                  Since PGlite databases are stored in your browser&apos;s IndexedDB storage, other
+                  domains like{' '}
+                  <Link href={currentDomainUrl} className="underline">
+                    {currentDomainHostname}
+                  </Link>{' '}
+                  cannot access them directly (this is a security restriction built into every
+                  browser).
+                </p>
+                <p>
+                  If you&apos;d like to continue using your previous databases and conversations:
+                  <ol>
+                    <li>Export them from {legacyDomainHostname}</li>
+                    <li>Import them to {currentDomainHostname}</li>
+                  </ol>
+                </p>
+              </AccordionContent>
+            </AccordionItem>
+          </Accordion>
+          <div className="my-2 border-b" />
+          <div className="prose">
+            <h4 className="mb-4">How to transfer your databases to {currentDomainHostname}</h4>
+            <ol>
+              <li>
+                Click <strong>Export</strong> to download all of your databases into a single
+                tarball.
+                <br />
+                {progress === undefined ? (
+                  <Button
+                    className="my-2"
+                    onClick={async () => {
+                      if (!dbManager) {
+                        throw new Error('dbManager is not available')
+                      }
+
+                      setProgress(0)
+
+                      const dbCount = await dbManager.countDatabases()
+                      const fileCount = await countFiles()
+
+                      // Plus 1 for the meta DB
+                      const totalFiles = 1 + dbCount + fileCount
+
+                      // Passthrough stream to increment progress bar
+                      const progressPassthrough = transformStreamFromFn<
+                        TarStreamInput,
+                        TarStreamInput
+                      >((chunk) => {
+                        if (chunk.type === 'file') {
+                          setProgress((progress) => (progress ?? 0) + 100 / totalFiles)
+                        }
+                        return chunk
+                      })
+
+                      const fileStream = mergeIterables([
+                        createDumpStream(dbManager),
+                        createStorageStream(),
+                      ])
+
+                      const tarGzStream = readableStreamFromIterable(fileStream)
+                        .pipeThrough(progressPassthrough)
+                        .pipeThrough(new TarStream())
+                        .pipeThrough<Uint8Array>(new CompressionStream('gzip'))
+
+                      const file = await fileFromStream(
+                        tarGzStream,
+                        `${location.hostname}.tar.gz`,
+                        { type: 'application/x-gzip' }
+                      )
+
+                      downloadFile(file)
+                    }}
+                  >
+                    Export
+                  </Button>
+                ) : (
+                  <div className="flex gap-2 text-xs items-center">
+                    <Progress className="my-2 w-[60%]" value={Math.round(progress)} />
+                    {Math.round(progress)}%
+                  </div>
+                )}
+                <br />
+                This tarball will contain every PGlite database&apos;s <code>pgdata</code> dump
+                along with any files you imported or exported from {legacyDomainHostname}.
+              </li>
+              <li>
+                Navigate to{' '}
+                <Link href={`${currentDomainUrl}/import`}>{currentDomainHostname}/import</Link> and
+                click <strong>Import</strong>.
+              </li>
+            </ol>
+          </div>
+        </DialogContent>
+      </Dialog>
+    </>
+  )
+}
+
+/**
+ * Generates a stream of PGlite dumps for all the databases as tar file/directory entries.
+ */
+async function* createDumpStream(
+  dbManager: DbManager,
+  batchSize = 5
+): AsyncIterable<TarStreamInput> {
+  const databases = await dbManager.exportDatabases()
+  const batches = chunk(databases, batchSize)
+
+  // Meta DB has to be dumped separately
+  // We intentionally yield this first so that it is
+  // first in the archive
+  const metaDb = await dbManager.getMetaDb()
+  const metaDump = await metaDb.dumpDataDir('gzip')
+  yield fileToTarStreamFile(new File([metaDump], 'meta.tar.gz', { type: metaDump.type }))
+
+  yield { type: 'directory', path: '/dbs' }
+
+  // Dump in batches to avoid excessive RAM use
+  for (const batch of batches) {
+    // All PGlite instances within a batch are loaded in parallel
+    yield* await Promise.all(
+      batch.map(async ({ id }) => {
+        const db = await dbManager.getDbInstance(id)
+        const dump = await db.dumpDataDir('gzip')
+        const file = new File([dump], `${id}.tar.gz`, { type: dump.type })
+        await dbManager.closeDbInstance(id)
+        return fileToTarStreamFile(file, '/dbs')
+      })
+    )
+  }
+}
+
+/**
+ * Creates a stream of storage files (eg. CSVs) as tar file/directory entries.
+ */
+async function* createStorageStream(): AsyncIterable<TarStreamInput> {
+  yield { type: 'directory', path: '/files' }
+
+  for await (const { id, file } of listFiles()) {
+    // Capture the ID by storing each file in a sub-dir
+    // named after the ID
+    yield { type: 'directory', path: `/files/${id}` }
+    yield fileToTarStreamFile(file, `/files/${id}`)
+  }
+}
```
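
The export page above imports several helpers (`transformStreamFromFn`, `fileFromStream`, `downloadFile`, `mergeIterables`, `readableStreamFromIterable`, `fileToTarStreamFile`) from `~/lib/streams` and `~/lib/util` whose implementations are not part of this section of the diff. Based only on the call sites above, and not on the repo's actual code, minimal versions of three of them could look roughly like this:

```ts
// Hypothetical sketches of the imported helpers — the repo's real versions may differ.

// Wrap a per-chunk mapping function in a TransformStream (used as the progress passthrough).
export function transformStreamFromFn<I, O>(fn: (chunk: I) => O): TransformStream<I, O> {
  return new TransformStream<I, O>({
    transform(chunk, controller) {
      controller.enqueue(fn(chunk))
    },
  })
}

// Buffer a ReadableStream into a File so it can be handed to downloadFile().
export async function fileFromStream(
  stream: ReadableStream<Uint8Array>,
  name: string,
  options?: FilePropertyBag
): Promise<File> {
  const blob = await new Response(stream).blob()
  return new File([blob], name, options)
}

// Trigger a browser download by clicking a temporary object-URL anchor.
export function downloadFile(file: File) {
  const url = URL.createObjectURL(file)
  const a = document.createElement('a')
  a.href = url
  a.download = file.name
  a.click()
  URL.revokeObjectURL(url)
}
```
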
