# tus Resumable Upload Protocol (Condensed for Coding Agents)

## Core Protocol

All Clients and Servers MUST implement the core protocol for resumable uploads.

### Resuming an Upload

1. **Determine Offset (HEAD Request):**
   * **Request:**
     ```
     HEAD /files/{upload_id} HTTP/1.1
     Host: tus.example.org
     Tus-Resumable: 1.0.0
     ```
   * **Response:**
     ```
     HTTP/1.1 200 OK
     Upload-Offset: {current_offset}
     Tus-Resumable: 1.0.0
     ```
   * Server MUST include `Upload-Offset`.
   * Server MUST include `Upload-Length` if known.
   * Server SHOULD return `200 OK` or `204 No Content`.
   * Server MUST prevent caching: `Cache-Control: no-store`.

2. **Resume Upload (PATCH Request):**
   * **Request:**
     ```
     PATCH /files/{upload_id} HTTP/1.1
     Host: tus.example.org
     Content-Type: application/offset+octet-stream
     Content-Length: {chunk_size}
     Upload-Offset: {current_offset}
     Tus-Resumable: 1.0.0

     [binary data chunk]
     ```
   * **Response:**
     ```
     HTTP/1.1 204 No Content
     Tus-Resumable: 1.0.0
     Upload-Offset: {new_offset}
     ```
   * `Content-Type` MUST be `application/offset+octet-stream`.
   * `Upload-Offset` in the request MUST match the server's current offset (else `409 Conflict`).
   * Server MUST acknowledge with `204 No Content` and `Upload-Offset` (new offset).
   * Server SHOULD return `404 Not Found` for non-existent resources.
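The two-step resume flow above can be sketched as a small client-side helper that turns a HEAD response into the headers for the next PATCH request. This is illustrative: only the header names come from the spec; `next_patch_headers` itself is a hypothetical function.

```python
def next_patch_headers(head_headers: dict, chunk_len: int) -> dict:
    """Build PATCH request headers from a HEAD response (core protocol, step 2).

    `head_headers` is a mapping of the HEAD response headers; its
    Upload-Offset value tells us where the server expects the next byte.
    """
    offset = int(head_headers["Upload-Offset"])
    return {
        "Tus-Resumable": "1.0.0",
        "Content-Type": "application/offset+octet-stream",
        "Content-Length": str(chunk_len),
        "Upload-Offset": str(offset),
    }
```

The client then slices `chunk_len` bytes starting at `offset` from the local file and sends them as the PATCH body.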
### Common Headers

* **`Upload-Offset`**: Non-negative integer. Byte offset within the resource.
* **`Upload-Length`**: Non-negative integer. Total size of the upload in bytes.
* **`Tus-Version`**: Comma-separated list of supported protocol versions (server response).
* **`Tus-Resumable`**: Protocol version in use (e.g., `1.0.0`). MUST be present in every request/response except `OPTIONS`. If the client's version is unsupported, the server responds `412 Precondition Failed` with `Tus-Version`.
* **`Tus-Extension`**: Comma-separated list of supported extensions (server response). Omitted if none.
* **`Tus-Max-Size`**: Non-negative integer. Maximum allowed upload size in bytes (server response).
* **`X-HTTP-Method-Override`**: String. Client MAY use it to override the HTTP method (e.g., where `PATCH`/`DELETE` are blocked).
### Server Configuration (OPTIONS Request)

* **Request:**
  ```
  OPTIONS /files HTTP/1.1
  Host: tus.example.org
  ```
* **Response:**
  ```
  HTTP/1.1 204 No Content
  Tus-Resumable: 1.0.0
  Tus-Version: 1.0.0,0.2.2,0.2.1
  Tus-Max-Size: 1073741824
  Tus-Extension: creation,expiration
  ```
* Response MUST contain `Tus-Version`. MAY include `Tus-Extension` and `Tus-Max-Size`.
* Client SHOULD NOT include `Tus-Resumable` in the request.
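A client can turn the OPTIONS response above into a capability map before choosing which extensions to use. A minimal sketch (the `parse_tus_options` function and key names are illustrative, not part of the spec):

```python
def parse_tus_options(headers: dict) -> dict:
    """Extract server capabilities from an OPTIONS response for feature detection.

    Comma-separated list headers (Tus-Version, Tus-Extension,
    Tus-Checksum-Algorithm) become lists; Tus-Max-Size becomes an int.
    """
    def csv(name: str) -> list:
        raw = headers.get(name, "")
        return [t.strip() for t in raw.split(",") if t.strip()]

    caps = {
        "versions": csv("Tus-Version"),
        "extensions": csv("Tus-Extension"),
        "checksum_algorithms": csv("Tus-Checksum-Algorithm"),
    }
    if "Tus-Max-Size" in headers:
        caps["max_size"] = int(headers["Tus-Max-Size"])
    return caps
```

A client would then gate features on membership, e.g. `"creation" in caps["extensions"]`.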
## Protocol Extensions

Clients SHOULD use an `OPTIONS` request and the `Tus-Extension` header for feature detection.

### Creation (`creation` extension)

Create a new upload resource. Server MUST add `creation` to `Tus-Extension`.

* **Request (POST):**
  ```
  POST /files HTTP/1.1
  Host: tus.example.org
  Content-Length: 0
  Upload-Length: {total_size} OR Upload-Defer-Length: 1
  Tus-Resumable: 1.0.0
  Upload-Metadata: filename {base64_filename},is_confidential
  ```
* MUST include `Upload-Length` or `Upload-Defer-Length: 1`.
* If `Upload-Defer-Length: 1`, the client MUST set `Upload-Length` in a subsequent `PATCH`.
* `Upload-Length: 0` creates an immediately complete empty file.
* Client MAY supply `Upload-Metadata` (key-value pairs, values Base64 encoded).
* If `Upload-Length` exceeds `Tus-Max-Size`, the server responds `413 Request Entity Too Large`.
* **Response:**
  ```
  HTTP/1.1 201 Created
  Location: {upload_url}
  Tus-Resumable: 1.0.0
  ```
* Server MUST respond `201 Created` and set the `Location` header to the new resource URL.
* The new resource has an implicit offset of `0`.

#### Headers

* **`Upload-Defer-Length`**: `1`. Indicates the upload size is unknown. Server adds `creation-defer-length` to `Tus-Extension` if supported.
* **`Upload-Metadata`**: Comma-separated `key value` pairs. Key: ASCII, no spaces or commas. Value: Base64 encoded.
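The `Upload-Metadata` rules above can be captured in a pair of helpers. The header format (Base64 values, bare keys as flags) is from the spec; the function names are illustrative.

```python
import base64

def encode_metadata(pairs: dict) -> str:
    """Build an Upload-Metadata header value.

    Keys must be ASCII without spaces or commas; values are Base64 encoded.
    A value of None emits a bare key (a flag), as in `is_confidential`.
    """
    parts = []
    for key, value in pairs.items():
        if value is None:
            parts.append(key)
        else:
            parts.append(f"{key} {base64.b64encode(value.encode()).decode('ascii')}")
    return ",".join(parts)

def decode_metadata(header: str) -> dict:
    """Parse an Upload-Metadata header back into a dict (None for flags)."""
    out = {}
    for item in header.split(","):
        key, _, b64 = item.strip().partition(" ")
        out[key] = base64.b64decode(b64).decode() if b64 else None
    return out
```

For example, `encode_metadata({"filename": "foo.txt", "is_confidential": None})` yields `filename Zm9vLnR4dA==,is_confidential`, matching the request template above.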
### Creation With Upload (`creation-with-upload` extension)

Include initial upload data in the `POST` request. Server MUST add `creation-with-upload` to `Tus-Extension`. Depends on the `creation` extension.

* **Request (POST):**
  ```
  POST /files HTTP/1.1
  Host: tus.example.org
  Content-Length: {initial_chunk_size}
  Upload-Length: {total_size}
  Tus-Resumable: 1.0.0
  Content-Type: application/offset+octet-stream
  Expect: 100-continue

  [initial binary data chunk]
  ```
* The same rules as for `PATCH` apply to the content.
* Client SHOULD include `Expect: 100-continue`.
* **Response:**
  ```
  HTTP/1.1 201 Created
  Location: {upload_url}
  Tus-Resumable: 1.0.0
  Upload-Offset: {accepted_offset}
  ```
* Server MUST include `Upload-Offset` with the number of accepted bytes.
### Expiration (`expiration` extension)

Server MAY remove unfinished uploads. Server MUST add `expiration` to `Tus-Extension`.

* **Response (PATCH/POST):**
  ```
  HTTP/1.1 204 No Content
  Upload-Expires: Wed, 25 Jun 2014 16:00:00 GMT
  Tus-Resumable: 1.0.0
  Upload-Offset: {new_offset}
  ```
* **`Upload-Expires`**: Datetime in RFC 9110 format. Indicates when the upload expires. Client SHOULD use it to check validity before resuming. Server SHOULD respond `404 Not Found` or `410 Gone` for expired uploads.
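Before resuming, a client can check `Upload-Expires` against the current time. The RFC 9110 HTTP-date format is exactly what `email.utils.parsedate_to_datetime` parses; `upload_expired` itself is an illustrative helper, not part of the spec.

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def upload_expired(upload_expires: str, now: datetime) -> bool:
    """Check an Upload-Expires value (RFC 9110 HTTP-date) against `now`.

    `now` should be timezone-aware, e.g. datetime.now(timezone.utc).
    """
    return now >= parsedate_to_datetime(upload_expires)
```

If this returns `True`, the client should expect `404 Not Found` or `410 Gone` and restart the upload from scratch.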
### Checksum (`checksum` extension)

Verify data integrity of `PATCH` requests. Server MUST add `checksum` to `Tus-Extension` and MUST support `sha1`.

* **Request (PATCH):**
  ```
  PATCH /files/{upload_id} HTTP/1.1
  Content-Length: {chunk_size}
  Upload-Offset: {current_offset}
  Tus-Resumable: 1.0.0
  Upload-Checksum: {algorithm} {base64_checksum}

  [binary data chunk]
  ```
* **Response:**
  * `204 No Content`: Checksums match.
  * `400 Bad Request`: Algorithm not supported.
  * `460 Checksum Mismatch`: Checksums do not match.
  * In the `400`/`460` cases, the chunk MUST be discarded and the upload/offset NOT updated.
* **`Tus-Checksum-Algorithm`**: Comma-separated list of supported algorithms (server response to `OPTIONS`).
* **`Upload-Checksum`**: `{algorithm} {Base64_encoded_checksum}`.
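Computing the `Upload-Checksum` value for a chunk is a one-liner over the raw bytes of the PATCH body. The header format is from the spec; the helper name is illustrative.

```python
import base64
import hashlib

def upload_checksum_header(chunk: bytes, algorithm: str = "sha1") -> str:
    """Compute the Upload-Checksum header value for a PATCH body.

    sha1 is the only algorithm a server is required to support; the digest
    of the chunk's raw bytes is Base64 encoded.
    """
    digest = hashlib.new(algorithm, chunk).digest()
    return f"{algorithm} {base64.b64encode(digest).decode('ascii')}"
```

The server recomputes the digest over the exact bytes it received and compares, replying `460 Checksum Mismatch` on any difference.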
### Termination (`termination` extension)

Clients can terminate uploads. Server MUST add `termination` to `Tus-Extension`.

* **Request (DELETE):**
  ```
  DELETE /files/{upload_id} HTTP/1.1
  Host: tus.example.org
  Content-Length: 0
  Tus-Resumable: 1.0.0
  ```
* **Response:**
  ```
  HTTP/1.1 204 No Content
  Tus-Resumable: 1.0.0
  ```
* Server SHOULD free resources and MUST respond `204 No Content`.
* Future requests to the URL SHOULD return `404 Not Found` or `410 Gone`.
### Concatenation (`concatenation` extension)

Concatenate multiple partial uploads into a single final upload. Server MUST add `concatenation` to `Tus-Extension`.

* **Partial Upload Creation (POST):**
  ```
  POST /files HTTP/1.1
  Upload-Concat: partial
  Upload-Length: {partial_size}
  Tus-Resumable: 1.0.0
  ```
  * `Upload-Concat: partial` header.
  * Server SHOULD NOT process partial uploads until they are concatenated.
* **Final Upload Creation (POST):**
  ```
  POST /files HTTP/1.1
  Upload-Concat: final;{url_partial1} {url_partial2} ...
  Tus-Resumable: 1.0.0
  ```
  * `Upload-Concat: final;{space-separated_partial_urls}`.
  * Client MUST NOT include `Upload-Length`.
  * The final upload's length is the sum of the partials.
  * Server MAY delete partials after concatenation.
  * Server MUST respond `403 Forbidden` to `PATCH` requests against a final upload.
* **`concatenation-unfinished`**: Server adds this to `Tus-Extension` if it supports creating a final upload while partial uploads are still in progress.
* **HEAD Request for a Final Upload:**
  * Response SHOULD NOT contain `Upload-Offset` unless concatenation is finished.
  * After success, `Upload-Offset` and `Upload-Length` MUST be equal.
  * Response MUST include the `Upload-Concat` header.
* **HEAD Request for a Partial Upload:**
  * Response MUST contain `Upload-Offset`.
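The `Upload-Concat` header formats above (a bare `partial`, or `final;` followed by space-separated URLs) can be built and parsed with two small helpers; the function names are illustrative.

```python
def final_concat_header(partial_urls: list) -> str:
    """Build the Upload-Concat header for a final upload from partial URLs."""
    return "final;" + " ".join(partial_urls)

def parse_concat_header(value: str):
    """Return ("partial", []) or ("final", [urls...]); None if unrecognized."""
    value = value.strip()
    if value == "partial":
        return ("partial", [])
    if value.startswith("final;"):
        urls = [u for u in value[len("final;"):].split(" ") if u]
        return ("final", urls)
    return None
```

The parser tolerates extra spaces after `final;`, which some clients emit.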
# TUS (1.0.0) — Server-Side Specs (Concise)

## Always

* All requests/responses **except** `OPTIONS` MUST include `Tus-Resumable: 1.0.0`.
  If unsupported → `412 Precondition Failed` + `Tus-Version`.
* Canonical server features via `OPTIONS /files`:
  * `Tus-Version: 1.0.0`
  * `Tus-Extension: creation,creation-with-upload,termination,checksum,concatenation,concatenation-unfinished` (as supported)
  * `Tus-Max-Size: <int>` (if a hard limit exists)
  * `Tus-Checksum-Algorithm: sha1[,md5,crc32...]` (if checksum extension)
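The version rule above reduces to a small, framework-independent check; a sketch (the `check_tus_resumable` helper and its return shape are illustrative, not part of the spec):

```python
def check_tus_resumable(method: str, header_value, supported: str = "1.0.0"):
    """Enforce the Tus-Resumable rule; return (error_status, extra_headers).

    OPTIONS is exempt; a missing or unsupported version yields
    412 Precondition Failed plus Tus-Version so the client can adapt.
    """
    if method == "OPTIONS":
        return (None, {})
    if header_value != supported:
        return (412, {"Tus-Version": supported})
    return (None, {})
```

The FastAPI server below implements the same rule as `_ensure_tus_version`.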
## Core

* **Create:** `POST /files` with `Upload-Length: <int>` OR `Upload-Defer-Length: 1`. Optional `Upload-Metadata`.
  * `201 Created` + `Location: /files/{id}`, echo `Tus-Resumable`.
  * *Creation-With-Upload:* if a body is present → `Content-Type: application/offset+octet-stream`, accept the bytes, respond with `Upload-Offset`.
* **Status:** `HEAD /files/{id}`
  * Always return `Upload-Offset` for partial uploads; include `Upload-Length` if known; if deferred, return `Upload-Defer-Length: 1`. `Cache-Control: no-store`.
* **Upload:** `PATCH /files/{id}`
  * `Content-Type: application/offset+octet-stream` and `Upload-Offset` (must match server).
  * On success → `204 No Content` + new `Upload-Offset`.
  * Mismatch → `409 Conflict`. Bad type → `415 Unsupported Media Type`.
* **Terminate:** `DELETE /files/{id}` (if supported) → `204 No Content`. Subsequent requests → `404`/`410`.
## Checksum (optional but implemented here)

* Client MAY send `Upload-Checksum: <algo> <base64digest>` per `PATCH`.
  * Server MUST verify the checksum of the exact bytes received in the request body.
  * If the algorithm is unsupported → `400 Bad Request`.
  * If mismatched → **discard the chunk** (no offset change) and respond `460 Checksum Mismatch`.
  * If OK → `204 No Content` + new `Upload-Offset`.
* `OPTIONS` MUST include `Tus-Checksum-Algorithm` (comma-separated algorithms).
## Concatenation (optional but implemented here)

* **Partial uploads:** `POST /files` with `Upload-Concat: partial` and `Upload-Length`. (MUST have a length; may use creation-with-upload and `PATCH` thereafter.)
* **Final upload:** `POST /files` with
  `Upload-Concat: final;/files/{a} /files/{b} ...`
  * MUST NOT include `Upload-Length`.
  * Final uploads **cannot** be `PATCH`ed (`403`).
  * Server SHOULD assemble the final upload (in order).
  * If `concatenation-unfinished` is supported, the final upload may be created before the partials are completed; the server completes it once all partials are done.
* **HEAD semantics:**
  * For a *partial*: MUST include `Upload-Offset`.
  * For a *final* before concatenation: SHOULD NOT include `Upload-Offset`. `Upload-Length` MAY be present if computable (= sum of the partials' lengths when known).
  * After finalization: `Upload-Offset == Upload-Length`.

---
# TUS FastAPI Server (disk-only, crash-safe, checksum + concatenation)

**Features**

* All persistent state on disk:

  ```
  TUS_ROOT/
    {upload_id}/
      info.json    # canonical metadata & status
      data.part    # exists while uploading or while building the final upload
      data         # final file after atomic rename
  ```
* Crash recovery: `HEAD` offset = size of `data.part` or `data`.
* `.part` during upload; `os.replace()` (atomic) to `data` on completion.
* Streaming I/O; `fsync` on the file + parent directory.
* Checksum: supports `sha1` (md5/crc32 are easy to add).
* Concatenation: server builds the final upload when partials complete; supports `concatenation-unfinished`.

> Run with: `uv pip install fastapi uvicorn`, then `uvicorn tus_server:app --host 0.0.0.0 --port 8080` (or `python tus_server.py`).
> Set the `TUS_ROOT` env var to choose the storage root.
```python
# tus_server.py
from fastapi import FastAPI, Request, Response, HTTPException
from typing import Optional, Dict, Any, List
import os, json, uuid, base64, asyncio, hashlib

# -----------------------------
# Config
# -----------------------------
TUS_VERSION = "1.0.0"
# Advertise extensions implemented below:
TUS_EXTENSIONS = ",".join([
    "creation",
    "creation-with-upload",
    "termination",
    "checksum",
    "concatenation",
    "concatenation-unfinished",
])
# Supported checksum algorithms (keys = header token)
CHECKSUM_ALGOS = ["sha1"]  # add "md5" if desired

TUS_ROOT = os.environ.get("TUS_ROOT", "/tmp/tus")
MAX_SIZE = 1 << 40  # 1 TiB default

os.makedirs(TUS_ROOT, exist_ok=True)
app = FastAPI()

# Per-process locks to prevent concurrent mutations on the same upload_id
_locks: Dict[str, asyncio.Lock] = {}
def _lock_for(upload_id: str) -> asyncio.Lock:
    return _locks.setdefault(upload_id, asyncio.Lock())

# -----------------------------
# Path helpers
# -----------------------------
def upload_dir(upload_id: str) -> str:
    return os.path.join(TUS_ROOT, upload_id)

def info_path(upload_id: str) -> str:
    return os.path.join(upload_dir(upload_id), "info.json")

def part_path(upload_id: str) -> str:
    return os.path.join(upload_dir(upload_id), "data.part")

def final_path(upload_id: str) -> str:
    return os.path.join(upload_dir(upload_id), "data")

# -----------------------------
# FS utils (crash-safe)
# -----------------------------
def _fsync_dir(path: str) -> None:
    fd = os.open(path, os.O_DIRECTORY)
    try:
        os.fsync(fd)
    finally:
        os.close(fd)

def _write_json_atomic(path: str, obj: Dict[str, Any]) -> None:
    tmp = f"{path}.tmp"
    data = json.dumps(obj, separators=(",", ":"), ensure_ascii=False)
    with open(tmp, "w", encoding="utf-8") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())
    os.replace(tmp, path)
    _fsync_dir(os.path.dirname(path))

def _read_json(path: str) -> Dict[str, Any]:
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

def _size(path: str) -> int:
    try:
        return os.path.getsize(path)
    except FileNotFoundError:
        return 0

def _exists(path: str) -> bool:
    return os.path.exists(path)

# -----------------------------
# TUS helpers
# -----------------------------
def _ensure_tus_version(req: Request):
    if req.method == "OPTIONS":
        return
    v = req.headers.get("Tus-Resumable")
    if v is None:
        raise HTTPException(status_code=412, detail="Missing Tus-Resumable")
    if v != TUS_VERSION:
        raise HTTPException(status_code=412, detail="Unsupported Tus-Resumable",
                            headers={"Tus-Version": TUS_VERSION})

def _parse_metadata(raw: Optional[str]) -> str:
    # Raw passthrough; validate/consume in your app if needed.
    return raw or ""

def _new_upload_info(upload_id: str,
                     kind: str,  # "single" | "partial" | "final"
                     length: Optional[int],
                     defer_length: bool,
                     metadata: str,
                     parts: Optional[List[str]] = None) -> Dict[str, Any]:
    return {
        "upload_id": upload_id,
        "kind": kind,                        # "single" (default), "partial", or "final"
        "length": length,                    # int, or None if deferred/unknown
        "defer_length": bool(defer_length),
        "metadata": metadata,                # raw Upload-Metadata header
        "completed": False,
        "parts": parts or [],                # for finals: list of upload_ids (not URLs)
    }

def _load_info_or_404(upload_id: str) -> Dict[str, Any]:
    p = info_path(upload_id)
    if not _exists(p):
        raise HTTPException(404, "Upload not found")
    try:
        return _read_json(p)
    except Exception as e:
        raise HTTPException(500, f"Corrupt metadata: {e}")

def _set_info(upload_id: str, info: Dict[str, Any]) -> None:
    _write_json_atomic(info_path(upload_id), info)

def _ensure_dir(path: str):
    os.makedirs(path, exist_ok=False)

def _atomic_finalize_file(upload_id: str):
    """Rename data.part → data and mark the upload completed."""
    upath = upload_dir(upload_id)
    p = part_path(upload_id)
    f = final_path(upload_id)
    if _exists(p):
        with open(p, "rb+") as fp:
            fp.flush()
            os.fsync(fp.fileno())
        os.replace(p, f)
        _fsync_dir(upath)
    info = _load_info_or_404(upload_id)
    info["completed"] = True
    _set_info(upload_id, info)

def _current_offsets(upload_id: str):
    """Return (has_final_file, has_part_file, offset) from on-disk state."""
    f, p = final_path(upload_id), part_path(upload_id)
    if _exists(f):
        return True, False, _size(f)
    if _exists(p):
        return False, True, _size(p)
    return False, False, 0

def _parse_concat_header(h: Optional[str]) -> Optional[Dict[str, Any]]:
    if not h:
        return None
    h = h.strip()
    if h == "partial":
        return {"type": "partial", "parts": []}
    if h.startswith("final;"):
        # format: final;/files/a /files/b
        rest = h[len("final;"):].strip()
        urls = [s for s in rest.split(" ") if s]
        return {"type": "final", "parts": urls}
    return None

def _extract_upload_id_from_url(url: str) -> str:
    # Accept relative /files/{id} (common); robust split:
    segs = [s for s in url.split("/") if s]
    return segs[-1] if segs else url

def _sum_lengths_or_none(ids: List[str]) -> Optional[int]:
    total = 0
    for pid in ids:
        info = _load_info_or_404(pid)
        if info.get("length") is None:
            return None
        total += int(info["length"])
    return total

async def _stream_with_checksum_and_append(file_obj, request: Request, algo: Optional[str]) -> int:
    """Stream the request body to file, verifying the checksum if the header is present.

    Returns bytes written. On checksum mismatch, truncates back to the original
    size and raises HTTPException(460)."""
    start_pos = file_obj.tell()
    # Choose hash
    hasher = None
    provided_digest = None
    if algo:
        if algo not in CHECKSUM_ALGOS:
            raise HTTPException(400, "Unsupported checksum algorithm")
        if algo == "sha1":
            hasher = hashlib.sha1()
        # elif algo == "md5": hasher = hashlib.md5()
        # elif algo == "crc32": ... (custom)
    # Read the expected checksum
    if hasher:
        uh = request.headers.get("Upload-Checksum")
        if uh:
            try:
                name, b64 = uh.split(" ", 1)
                if name != algo:
                    raise ValueError()
                provided_digest = base64.b64decode(b64.encode("ascii"))
            except Exception:
                raise HTTPException(400, "Invalid Upload-Checksum")
        # else: callers only pass `algo` parsed from Upload-Checksum, so the
        # header is present whenever `hasher` is set.
    written = 0
    async for chunk in request.stream():
        if not chunk:
            continue
        file_obj.write(chunk)
        if hasher:
            hasher.update(chunk)
        written += len(chunk)
    # Verify the checksum if present
    if hasher and provided_digest is not None:
        if hasher.digest() != provided_digest:
            # Roll back the appended bytes
            file_obj.truncate(start_pos)
            file_obj.flush()
            os.fsync(file_obj.fileno())
            raise HTTPException(status_code=460, detail="Checksum Mismatch")
    file_obj.flush()
    os.fsync(file_obj.fileno())
    return written

def _try_finalize_final(upload_id: str):
    """If this is a final upload and all partials are completed, build the final data and finalize atomically."""
    info = _load_info_or_404(upload_id)
    if info.get("kind") != "final" or info.get("completed"):
        return
    part_ids = info.get("parts", [])
    # Check that all partials are completed and have data
    for pid in part_ids:
        pinf = _load_info_or_404(pid)
        if not pinf.get("completed"):
            return  # not ready yet
        if not _exists(final_path(pid)):
            # Completed partials should have data (.part was renamed to data);
            # if the data file is missing, we cannot finalize.
            return
    # Build the final .part by concatenating the parts' data in order, then atomically rename
    up = upload_dir(upload_id)
    os.makedirs(up, exist_ok=True)
    ppath = part_path(upload_id)
    # Reset/overwrite .part
    with open(ppath, "wb") as out:
        for pid in part_ids:
            with open(final_path(pid), "rb") as src:
                for chunk in iter(lambda: src.read(1024 * 1024), b""):
                    out.write(chunk)
        out.flush()
        os.fsync(out.fileno())
    # If the length can be computed now, record it
    length = _sum_lengths_or_none(part_ids)
    if length is not None:
        info["length"] = length
    _set_info(upload_id, info)
    _atomic_finalize_file(upload_id)

# -----------------------------
# Routes
# -----------------------------
@app.options("/files")
async def tus_options():
    headers = {
        "Tus-Version": TUS_VERSION,
        "Tus-Extension": TUS_EXTENSIONS,
        "Tus-Max-Size": str(MAX_SIZE),
        "Tus-Checksum-Algorithm": ",".join(CHECKSUM_ALGOS),
    }
    return Response(status_code=204, headers=headers)

@app.post("/files")
async def tus_create(request: Request):
    _ensure_tus_version(request)

    metadata = _parse_metadata(request.headers.get("Upload-Metadata"))
    concat = _parse_concat_header(request.headers.get("Upload-Concat"))

    # Validate creation modes
    hdr_len = request.headers.get("Upload-Length")
    hdr_defer = request.headers.get("Upload-Defer-Length")

    if concat and concat["type"] == "partial":
        # Partials MUST have Upload-Length (spec)
        if hdr_len is None:
            raise HTTPException(400, "Partial uploads require Upload-Length")
        if hdr_defer is not None:
            raise HTTPException(400, "Partial uploads cannot defer length")
    elif concat and concat["type"] == "final":
        # Finals MUST NOT include Upload-Length
        if hdr_len is not None or hdr_defer is not None:
            raise HTTPException(400, "Final uploads must not include Upload-Length or Upload-Defer-Length")
    else:
        # Normal single upload: require a length or defer it
        if hdr_len is None and hdr_defer != "1":
            raise HTTPException(400, "Must provide Upload-Length or Upload-Defer-Length: 1")

    # Parse the length
    length: Optional[int] = None
    defer = False
    if hdr_len is not None:
        try:
            length = int(hdr_len)
            if length < 0:
                raise ValueError()
        except ValueError:
            raise HTTPException(400, "Invalid Upload-Length")
        if length > MAX_SIZE:
            raise HTTPException(413, "Upload too large")
    elif not concat or concat["type"] != "final":
        # Finals have no length at creation
        defer = (hdr_defer == "1")

    upload_id = str(uuid.uuid4())
    udir = upload_dir(upload_id)
    _ensure_dir(udir)

    if concat and concat["type"] == "final":
        # Resolve part ids from the URLs
        part_ids = [_extract_upload_id_from_url(u) for u in concat["parts"]]
        # Compute the length if possible
        sum_len = _sum_lengths_or_none(part_ids)
        info = _new_upload_info(upload_id, "final", sum_len, False, metadata, part_ids)
        _set_info(upload_id, info)

        # Prepare an empty .part (filled once the partials complete)
        with open(part_path(upload_id), "wb") as f:
            f.flush(); os.fsync(f.fileno())
        _fsync_dir(udir)

        # If all partials are already complete, finalize immediately
        _try_finalize_final(upload_id)

        return Response(status_code=201,
                        headers={"Location": f"/files/{upload_id}",
                                 "Tus-Resumable": TUS_VERSION})

    # Create a partial or single upload
    kind = "partial" if (concat and concat["type"] == "partial") else "single"
    info = _new_upload_info(upload_id, kind, length, defer, metadata)
    _set_info(upload_id, info)

    # Create an empty .part
    with open(part_path(upload_id), "wb") as f:
        f.flush(); os.fsync(f.fileno())
    _fsync_dir(udir)

    # Creation-With-Upload (optional body). Note: plain creation typically sends
    # Content-Length: 0, which must NOT be treated as a body.
    upload_offset = 0
    content_length = request.headers.get("Content-Length")
    has_body = (content_length not in (None, "0")) or ("Transfer-Encoding" in request.headers)
    if has_body:
        ctype = request.headers.get("Content-Type", "")
        if ctype != "application/offset+octet-stream":
            raise HTTPException(415, "Content-Type must be application/offset+octet-stream for creation-with-upload")
        # The checksum header is optional; if present, take the algorithm token
        uh = request.headers.get("Upload-Checksum")
        algo = uh.split(" ", 1)[0] if uh else None

        async with _lock_for(upload_id):
            with open(part_path(upload_id), "ab+") as f:
                f.seek(0, os.SEEK_END)
                upload_offset = await _stream_with_checksum_and_append(f, request, algo)

        # If the length is known and we hit it, finalize
        inf = _load_info_or_404(upload_id)
        if inf["length"] is not None and upload_offset == int(inf["length"]):
            _atomic_finalize_file(upload_id)
        # If this partial belongs to some final upload, that final is finalized by
        # _try_finalize_final when the final resource is created (or rechecked on a
        # subsequent HEAD/PATCH); per spec the client creates finals explicitly.

    headers = {"Location": f"/files/{upload_id}", "Tus-Resumable": TUS_VERSION}
    if has_body:
        # Spec: creation-with-upload responses MUST report the accepted offset
        headers["Upload-Offset"] = str(upload_offset)
    return Response(status_code=201, headers=headers)

@app.head("/files/{upload_id}")
async def tus_head(upload_id: str, request: Request):
    _ensure_tus_version(request)
    info = _load_info_or_404(upload_id)
    is_final = info.get("kind") == "final"

    headers = {
        "Tus-Resumable": TUS_VERSION,
        "Cache-Control": "no-store",
    }
    if info.get("metadata"):
        headers["Upload-Metadata"] = info["metadata"]

    # Spec: HEAD responses for partial/final uploads MUST include Upload-Concat
    if info.get("kind") == "partial":
        headers["Upload-Concat"] = "partial"
    elif is_final:
        headers["Upload-Concat"] = "final;" + " ".join(
            f"/files/{pid}" for pid in info.get("parts", []))

    if info.get("length") is not None:
        headers["Upload-Length"] = str(int(info["length"]))
    elif info.get("defer_length"):
        headers["Upload-Defer-Length"] = "1"

    if is_final and not info.get("completed"):
        # Before concatenation completes: SHOULD NOT include Upload-Offset.
        # Try to finalize now (e.g., partials completed after a crash).
        _try_finalize_final(upload_id)
        info = _load_info_or_404(upload_id)
        if not info.get("completed"):
            # In-progress final: no Upload-Offset; Upload-Length (if computable)
            # was already set above.
            return Response(status_code=200, headers=headers)
        # otherwise fall through to the completed case

    # For partials, singles, or completed finals
    f = final_path(upload_id)
    p = part_path(upload_id)
    if _exists(f):
        offset = _size(f)
    elif _exists(p):
        offset = _size(p)
    else:
        # Metadata exists but no data: consider the upload gone
        raise HTTPException(410, "Upload gone")

    headers["Upload-Offset"] = str(offset)
    return Response(status_code=200, headers=headers)

@app.patch("/files/{upload_id}")
async def tus_patch(upload_id: str, request: Request):
    _ensure_tus_version(request)
    info = _load_info_or_404(upload_id)

    if info.get("kind") == "final":
        raise HTTPException(403, "Final uploads cannot be patched")

    ctype = request.headers.get("Content-Type", "")
    if ctype != "application/offset+octet-stream":
        raise HTTPException(415, "Content-Type must be application/offset+octet-stream")

    # The client's offset must match the server's
    try:
        client_offset = int(request.headers.get("Upload-Offset", "-1"))
        if client_offset < 0:
            raise ValueError()
    except ValueError:
        raise HTTPException(400, "Invalid or missing Upload-Offset")

    # If the length was deferred, the client may now set Upload-Length (once)
    if info.get("length") is None and info.get("defer_length"):
        if "Upload-Length" in request.headers:
            try:
                new_len = int(request.headers["Upload-Length"])
                if new_len < 0:
                    raise ValueError()
            except ValueError:
                raise HTTPException(400, "Invalid Upload-Length")
            if new_len > MAX_SIZE:
                raise HTTPException(413, "Upload too large")
            info["length"] = new_len
            info["defer_length"] = False
            _set_info(upload_id, info)

    # Determine the current server offset
    f = final_path(upload_id)
    p = part_path(upload_id)
    if _exists(f):
        raise HTTPException(403, "Upload already finalized")
    if not _exists(p):
        raise HTTPException(404, "Upload not found")

    server_offset = _size(p)
    if client_offset != server_offset:
        return Response(status_code=409, headers={"Tus-Resumable": TUS_VERSION})

    # Optional checksum
    uh = request.headers.get("Upload-Checksum")
    algo = uh.split(" ", 1)[0] if uh else None

    # Append the data (with rollback on checksum mismatch)
    async with _lock_for(upload_id):
        with open(p, "ab+") as fobj:
            fobj.seek(0, os.SEEK_END)
            written = await _stream_with_checksum_and_append(fobj, request, algo)

    new_offset = server_offset + written

    # If the length is known and reached exactly, finalize
    info = _load_info_or_404(upload_id)  # reload
    if info.get("length") is not None and new_offset == int(info["length"]):
        _atomic_finalize_file(upload_id)
        # If this is a partial, a corresponding final upload may now be completable.
        # No reverse index is kept; finalization is triggered on the next HEAD of
        # the final. (Optional: scan for finals to finalize proactively.)

    return Response(status_code=204, headers={"Tus-Resumable": TUS_VERSION, "Upload-Offset": str(new_offset)})

@app.delete("/files/{upload_id}")
|
||||
async def tus_delete(upload_id: str, request: Request):
|
||||
_ensure_tus_version(request)
|
||||
async with _lock_for(upload_id):
|
||||
udir = upload_dir(upload_id)
|
||||
for p in (part_path(upload_id), final_path(upload_id), info_path(upload_id)):
|
||||
try:
|
||||
os.remove(p)
|
||||
except FileNotFoundError:
|
||||
pass
|
||||
try:
|
||||
os.rmdir(udir)
|
||||
except OSError:
|
||||
pass
|
||||
return Response(status_code=204, headers={"Tus-Resumable": TUS_VERSION})
|
||||
```
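The finalize step referenced above (`_atomic_finalize_file`) must survive a crash at any point. A minimal sketch of the crash-safe rename it might perform, assuming POSIX `rename` semantics (paths and function name here are illustrative, not the actual implementation):

```python
import os

def atomic_finalize(part: str, final: str) -> None:
    """Durably promote an upload's .part file to its final name."""
    with open(part, "rb") as f:
        os.fsync(f.fileno())        # data is on disk before the rename
    os.replace(part, final)         # atomic on POSIX filesystems
    dir_fd = os.open(os.path.dirname(final) or ".", os.O_RDONLY)
    try:
        os.fsync(dir_fd)            # persist the directory entry itself
    finally:
        os.close(dir_fd)
```

Without the final directory fsync, a crash after the rename could still lose the new directory entry on some filesystems.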

---

## Quick Client Examples (manual)

```bash
# OPTIONS
curl -i -X OPTIONS http://localhost:8080/files

# 1) Single upload (known length)
curl -i -X POST http://localhost:8080/files \
  -H "Tus-Resumable: 1.0.0" \
  -H "Upload-Length: 11" \
  -H "Upload-Metadata: filename Zm9vLnR4dA=="
# → Location: /files/<ID>

# Upload with checksum (sha1 of "hello ")
printf "hello " | curl -i -X PATCH http://localhost:8080/files/<ID> \
  -H "Tus-Resumable: 1.0.0" \
  -H "Content-Type: application/offset+octet-stream" \
  -H "Upload-Offset: 0" \
  -H "Upload-Checksum: sha1 L6v8xR3Lw4N2n9kQox3wL7G0m/I=" \
  --data-binary @-
# (Replace the digest with the correct base64 SHA-1 of your chunk)

# 2) Concatenation
# Create partial A (5 bytes)
curl -i -X POST http://localhost:8080/files \
  -H "Tus-Resumable: 1.0.0" \
  -H "Upload-Length: 5" \
  -H "Upload-Concat: partial"
# → Location: /files/<A>
printf "hello" | curl -i -X PATCH http://localhost:8080/files/<A> \
  -H "Tus-Resumable: 1.0.0" \
  -H "Content-Type: application/offset+octet-stream" \
  -H "Upload-Offset: 0" \
  --data-binary @-

# Create partial B (6 bytes)
curl -i -X POST http://localhost:8080/files \
  -H "Tus-Resumable: 1.0.0" \
  -H "Upload-Length: 6" \
  -H "Upload-Concat: partial"
# → Location: /files/<B>
printf " world" | curl -i -X PATCH http://localhost:8080/files/<B> \
  -H "Tus-Resumable: 1.0.0" \
  -H "Content-Type: application/offset+octet-stream" \
  -H "Upload-Offset: 0" \
  --data-binary @-

# Create the final upload (may happen before or after the partials complete)
curl -i -X POST http://localhost:8080/files \
  -H "Tus-Resumable: 1.0.0" \
  -H "Upload-Concat: final; /files/<A> /files/<B>"
# HEAD on the final will show Upload-Offset once it has been finalized
# (use -I/--head; "-X HEAD" makes curl wait for a body that never comes)
curl -i -I http://localhost:8080/files/<FINAL> -H "Tus-Resumable: 1.0.0"
```
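Two header values in the examples above are easy to get wrong: `Upload-Metadata` values are base64-encoded, and `Upload-Checksum` carries a base64 digest of this request's body. A small client-side sketch (helper names are illustrative, not part of the server code):

```python
import base64
import hashlib

def encode_metadata(pairs: dict) -> str:
    """Build Upload-Metadata: comma-separated 'key base64(value)' pairs."""
    return ",".join(
        f"{k} {base64.b64encode(v.encode()).decode()}" for k, v in pairs.items()
    )

def upload_checksum(algo: str, chunk: bytes) -> str:
    """Build Upload-Checksum: '<algo> <base64 digest>' of this chunk only."""
    digest = hashlib.new(algo, chunk).digest()
    return f"{algo} {base64.b64encode(digest).decode()}"

print(encode_metadata({"filename": "foo.txt"}))  # → filename Zm9vLnR4dA==
print(upload_checksum("sha1", b"hello "))        # digest for the PATCH example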

---

## Implementation Notes (agent hints)

* **Durability:** `fsync` the file after every data write; after `os.replace` of `*.part → data` or `info.json.tmp → info.json`, also `fsync` the parent directory.
* **Checksum:** verify against **this request's** body only; on mismatch, **truncate back** to the previous size and return `460`.
* **Concatenation:** a final upload is never `PATCH`ed. The server builds `final.data.part` by concatenating each partial's **final file** in order, then atomically renames it and marks the upload completed. Finalization is triggered lazily by `HEAD` on the final (and right after creation).
* **Crash recovery:** offset = `size(data.part)` or `size(data)`; `info.json` is canonical for `kind`, `length`, `defer_length`, `completed`, `parts`.
* **Multi-process deployments:** replace `asyncio.Lock` with per-`upload_id` file locks (`fcntl.flock`) to synchronize across workers.
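The checksum bullet above can be illustrated with a non-streaming sketch; the real handler streams the body via `_stream_with_checksum_and_append`, so this is a simplified model, not the actual implementation:

```python
import base64
import hashlib
import os

def append_checked(path, chunk, checksum_header=None):
    """Append chunk to path; on Upload-Checksum mismatch, roll back and
    return None (the caller would then respond 460 Checksum Mismatch)."""
    prev = os.path.getsize(path)
    with open(path, "ab") as f:
        f.write(chunk)
        f.flush()
        os.fsync(f.fileno())
    if checksum_header:
        algo, b64 = checksum_header.split(" ", 1)
        if hashlib.new(algo, chunk).digest() != base64.b64decode(b64):
            with open(path, "ab") as f:
                f.truncate(prev)  # roll back only this request's bytes
            return None
    return prev + len(chunk)
```

Note that the digest covers the current request body only, never the whole upload, which is what makes the truncate-to-previous-size rollback sufficient.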

---
```bash
npm install @uppy/react
```

## Components

Pre-composed, plug-and-play components:

- `<Dashboard />` renders `@uppy/dashboard`
- `<DashboardModal />` renders `@uppy/dashboard` as a modal
- `<DragDrop />` renders `@uppy/drag-drop`
- `<ProgressBar />` renders `@uppy/progress-bar`
- `<StatusBar />` renders `@uppy/status-bar`

For more information, see https://uppy.io/docs/react.

We use a tus server for upload support:

```bash
npm install @uppy/tus
```

For example:

```js
import Uppy from '@uppy/core';
import Dashboard from '@uppy/dashboard';
import Tus from '@uppy/tus';

import '@uppy/core/dist/style.min.css';
import '@uppy/dashboard/dist/style.min.css';

new Uppy()
  .use(Dashboard, { inline: true, target: 'body' })
  .use(Tus, { endpoint: 'https://tusd.tusdemo.net/files/' }); // use your tus endpoint here
```
========================
CODE SNIPPETS
========================

TITLE: React Dashboard Modal Example with TUS
DESCRIPTION: Demonstrates how to use the DashboardModal component from @uppy/react with the Tus plugin for resumable uploads.
LANGUAGE: jsx
CODE:
```
import React from 'react'
import Uppy from '@uppy/core'
import { DashboardModal } from '@uppy/react'
import Tus from '@uppy/tus'

const uppy = new Uppy({ debug: true, autoProceed: false })
  .use(Tus, { endpoint: 'https://tusd.tusdemo.net/files/' })

class Example extends React.Component {
  state = { open: false }

  render() {
    const { open } = this.state
    return (
      <DashboardModal
        uppy={uppy}
        open={open}
        onRequestClose={this.handleClose}
      />
    )
  }
  // ..snip..
}
```
----------------------------------------

TITLE: Installation using npm for @uppy/react
DESCRIPTION: Provides the command to install the @uppy/react package using npm.
LANGUAGE: bash
CODE:
```
$ npm install @uppy/react @uppy/core @uppy/dashboard @uppy/tus
```
----------------------------------------

TITLE: Uppy Dashboard and Tus Integration Example (HTML & JavaScript)
DESCRIPTION: This snippet demonstrates how to initialize Uppy with the Dashboard and Tus plugins, configure them, and handle upload success events.
LANGUAGE: html
CODE:
```
<html>
  <head>
    <link rel="stylesheet" href="https://releases.transloadit.com/uppy/v4.18.0/uppy.min.css" />
  </head>

  <body>
    <div class="DashboardContainer"></div>
    <button class="UppyModalOpenerBtn">Upload</button>
    <div class="uploaded-files">
      <h5>Uploaded files:</h5>
      <ol></ol>
    </div>

    <script type="module">
      import { Uppy, Dashboard, Tus } from 'https://releases.transloadit.com/uppy/v4.18.0/uppy.min.mjs'

      var uppy = new Uppy({
        debug: true,
        autoProceed: false,
      })
        .use(Dashboard, {
          browserBackButtonClose: false,
          height: 470,
          inline: false,
          replaceTargetContent: true,
          showProgressDetails: true,
          target: '.DashboardContainer',
          trigger: '.UppyModalOpenerBtn',
        })
        .use(Tus, { endpoint: 'https://tusd.tusdemo.net/files/' })
        .on('upload-success', function (file, response) {
          var url = response.uploadURL
          var fileName = file.name

          document.querySelector('.uploaded-files ol').innerHTML +=
            '<li><a href="' + url + '" target="_blank">' + fileName + '</a></li>'
        })
    </script>
  </body>
</html>
```
----------------------------------------

TITLE: Initialize Uppy with Tus Plugin (JavaScript)
DESCRIPTION: Demonstrates how to initialize Uppy and configure the Tus plugin for resumable uploads.
LANGUAGE: js
CODE:
```
import Uppy from '@uppy/core'
import Tus from '@uppy/tus'

const uppy = new Uppy()
uppy.use(Tus, {
  endpoint: 'https://tusd.tusdemo.net/files/', // use your tus endpoint here
  resume: true,
  retryDelays: [0, 1000, 3000, 5000],
})
```
----------------------------------------

TITLE: Uppy Core Initialization and Plugin Usage (JavaScript)
DESCRIPTION: This example demonstrates how to initialize Uppy with core functionality and integrate the Tus plugin. It also shows how to listen for upload completion events.
LANGUAGE: javascript
CODE:
```
import Uppy from '@uppy/core'
import Dashboard from '@uppy/dashboard'
import Tus from '@uppy/tus'

const uppy = new Uppy()
  .use(Dashboard, { trigger: '#select-files' })
  .use(Tus, { endpoint: 'https://tusd.tusdemo.net/files/' })
  .on('complete', (result) => {
    console.log('Upload result:', result)
  })
```
----------------------------------------

TITLE: Uppy XHRUpload Configuration (JavaScript)
DESCRIPTION: This snippet shows the basic JavaScript configuration for Uppy, initializing it with the XHRUpload plugin to send files to a specified endpoint.
LANGUAGE: javascript
CODE:
```
import Uppy from '@uppy/core';
import XHRUpload from '@uppy/xhr-upload';

const uppy = new Uppy({
  debug: true,
  autoProceed: false,
  restrictions: {
    maxFileSize: 100000000,
    maxNumberOfFiles: 10,
    allowedFileTypes: ['image/*', 'video/*']
  }
});

uppy.use(XHRUpload, {
  endpoint: 'YOUR_UPLOAD_ENDPOINT_URL',
  fieldName: 'files[]',
  method: 'post'
});

uppy.on('complete', (result) => {
  console.log('Upload complete:', result);
});

uppy.on('error', (error) => {
  console.error('Upload error:', error);
});
```
----------------------------------------

TITLE: Install Uppy Core Packages for TUS
DESCRIPTION: Installs the core Uppy package along with the Dashboard, Tus, and XHRUpload plugins using npm.
LANGUAGE: bash
CODE:
```
npm install @uppy/core @uppy/dashboard @uppy/tus @uppy/xhr-upload
```
========================
QUESTIONS AND ANSWERS
========================

TOPIC: Uppy React Components
Q: What is the purpose of the @uppy/react package?
A: The @uppy/react package provides React component wrappers for Uppy's officially maintained UI plugins. It allows developers to easily integrate Uppy's file-uploading capabilities into their React applications.

----------------------------------------

TOPIC: Uppy React Components
Q: How can @uppy/react be installed in a project?
A: The @uppy/react package can be installed using npm with the command '$ npm install @uppy/react'.

----------------------------------------

TOPIC: Uppy React Components
Q: Where can I find more detailed documentation for the @uppy/react plugin?
A: More detailed documentation for the @uppy/react plugin is available on the Uppy website at https://uppy.io/docs/react.

---

# aiprompts/herolib_core/basic_instructions.md

## instructions for code generation

> when I generate code, the following instructions can never be overruled; they are the basics

- do not try to fix files which end with _.v, because these are generated files

## instruction for vlang scripts

when I generate vlang scripts I will always use the .vsh extension and use the following as the first line:

```
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
```

- a .vsh file is a V shell script and can be executed as is; there is no need to invoke `v` manually
- in a .vsh file there is no need for a main() function
- these scripts can be used for examples or instruction scripts, e.g. an install script

## executing vlang scripts

As an AI agent I should also execute .v or .vsh scripts with vrun:

```bash
vrun ~/code/github/freeflowuniverse/herolib/examples/biztools/bizmodel.vsh
```

## executing test scripts

instruct the user to test as follows (vtest is an alias which gets installed together with herolib); this can be done for a dir and for a file:

```bash
vtest ~/code/github/freeflowuniverse/herolib/lib/osal/package_test.v
```

- use ~ so it works across all machines
- don't use 'v test'; we have vtest as the alternative

## module imports

- in V, all files in a folder are part of the same module, so there is no need to import them; this is an important difference in V

---

```v
import freeflowuniverse.herolib.core.pathlib
```

### Creating Path Objects

This will figure out if the path is a dir or a file and whether it exists.

```v
// Create a Path object for a file
mut file_path := pathlib.get("path/to/file.txt")

// Create a Path object for a directory
mut dir_path := pathlib.get("path/to/directory")
```

if you know in advance whether you expect a dir or a file, it is better to use `pathlib.get_dir(path:...,create:true)` or `pathlib.get_file(path:...,create:true)`.

### Basic Path Operations

```v
// Get absolute path
```