diff --git a/.gitignore b/.gitignore
index 74c2474..ac1261b 100644
--- a/.gitignore
+++ b/.gitignore
@@ -31,7 +31,6 @@ pytorch_connectomics/
.coverage
coverage.xml
dist/
-.env
-.env.*
+
*.log
.github/workflows/docker-test.yml
diff --git a/COPILOT.md b/COPILOT.md
new file mode 100644
index 0000000..f8cdfc1
--- /dev/null
+++ b/COPILOT.md
@@ -0,0 +1,86 @@
+**Project Summary**
+- **Name:** pytc-client — Desktop client and services for PyTorch Connectomics
+- **Purpose:** Electron + React desktop client that interacts with FastAPI backend services and the `pytorch_connectomics` library for connectomics workflows.
+
+**Top-level Structure**
+- **`client/`**: Electron + React frontend (CRA). Important files:
+ - `client/package.json` — frontend scripts (`start`, `build`, `electron`).
+ - `client/main.js` — Electron entrypoint.
+ - `client/src/` — React source; components in `client/src/components/`.
+- **`server_api/`**: FastAPI application (API server)
+ - `server_api/main.py` — API entrypoint.
+ - `server_api/requirements.txt` — Python deps for API.
+- **`server_pytc/`**: PyTC worker service used for model inference or background tasks
+ - `server_pytc/main.py` — worker entrypoint.
+- **`pytorch_connectomics/`**: Library (local package) with models, configs, and utilities used by the worker.
+ - `pytorch_connectomics/setup.py` — library install entry.
+ - `pytorch_connectomics/configs/` — example YAML config files.
+- **`docker/`**: Docker resources for containerized backend; root `docker-compose.yaml` / `Dockerfile` exist.
+- **`scripts/`**: Helper scripts
+ - `scripts/bootstrap.sh` / `scripts/bootstrap.ps1` — one-time install/bootstrap
+  - `start.sh` / `start.bat` (at the repository root) — start the full stack (servers + Electron client)
+- **`docs/`, `notebooks/`, `tests/`**: documentation, Jupyter notebooks, and unit tests respectively.
+
+**Important Root Files**
+- `package.json` (root): convenience npm scripts for operating on the `client/` folder from the repository root.
+- `README.md`: project setup and running instructions.
+
+**Common Developer Commands**
+- Bootstrap (one-time):
+```bash
+./scripts/bootstrap.sh # macOS / Linux
+scripts\bootstrap.ps1 # Windows PowerShell
+```
+- Start full stack (desktop + services):
+```bash
+./start.sh # macOS / Linux
+start.bat # Windows CMD
+```
+- Build frontend from repo root (convenience scripts added):
+```bash
+npm run client:install # install dependencies inside `client/`
+npm run client:build # build production frontend in `client/build`
+npm run client:dev # run frontend dev server (react-scripts start)
+npm run client:electron # run Electron (`electron .`) from `client/`
+```
+- Manual backend run (dev):
+```bash
+python -m venv .venv
+# activate venv
+source .venv/bin/activate # macOS / Linux
+.\.venv\Scripts\activate # Windows PowerShell
+pip install -r server_api/requirements.txt
+python server_api/main.py   # FastAPI API server
+python server_pytc/main.py  # PyTC worker (run in a separate terminal)
+```
+- Tests:
+```bash
+pytest -q
+```
+- Docker backend (build & run):
+```bash
+docker compose build backend
+docker compose up backend
+docker compose down
+```
+
+**Where to Make Changes**
+- Frontend UI / components: edit `client/src/` and update `client/package.json` scripts as needed.
+- Electron behavior: edit `client/main.js` and electron-specific code.
+- API endpoints & backend logic: edit `server_api/` (FastAPI) and `server_pytc/` for worker behavior.
+- Models/configs: `pytorch_connectomics/` contains models, configs, and experiment YAMLs.
+
+**CI / Automation Tips**
+- CI jobs can call `npm --prefix client run build` (equivalent to `npm run client:build`) to build the frontend from the repo root.
+- For parallel dev workflows (backend and frontend running concurrently), consider adding `concurrently` or `npm-run-all` as root dev dependencies and wiring them into a `dev` script in the root `package.json`; see the sketch below.
+
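+A minimal sketch of such a parallel `dev` flow, assuming `concurrently` is installed as a root dev dependency and the entrypoints listed above are used:
+
+```bash
+npx concurrently \
+  "python server_api/main.py" \
+  "python server_pytc/main.py" \
+  "npm --prefix client start"
+```
+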
+**Notes & Recommendations**
+- Root `package.json` now includes `client:*` convenience scripts that use `npm --prefix client` to be cross-platform.
+- `client/package.json` uses `react-scripts` and defines `build` and `electron` scripts; keep those in sync when modifying frontend tooling.
+- Use `start.sh` / `start.bat` for the supported, repository-provided startup flow when possible.
+
+**Contact / Reference**
+- See `README.md` for more detailed setup and video demo link.
+- Use `tests/` for examples of expected behavior and for regression checks.
+
+(Generated by Copilot assistant for developer reference.)
\ No newline at end of file
diff --git a/DEVLOG.md b/DEVLOG.md
new file mode 100644
index 0000000..02ae7d0
--- /dev/null
+++ b/DEVLOG.md
@@ -0,0 +1,95 @@
+# DEVLOG.md — pytc-client Development Log
+
+This file records the major development steps, design decisions, and UI/UX changes made to the pytc-client project. It is formatted for easy reading by humans and AI agents.
+
+---
+
+## Project Context
+- **Repo:** PytorchConnectomics/pytc-client
+- **Frontend:** React + Electron (Ant Design)
+- **Backend:** FastAPI (Python), local user management for prototype
+
+---
+
+## Major Features & Changes
+
+### [2025-11-21] Backend User Management
+* **Backend Auth**: Integrated FastAPI with `python-jose` (JWT) and `passlib` (bcrypt) for secure authentication.
+* **Database**: Added SQLite database (`sql_app.db`) with SQLAlchemy models for Users.
+* **Frontend Integration**: Updated `UserContext` to communicate with backend endpoints (`/register`, `/token`, `/users/me`) instead of local storage (see the request sketch below).
+* **Dependencies**: Added `python-jose`, `passlib`, `sqlalchemy` to requirements.
+
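+These endpoints can be exercised from the command line; a minimal sketch, assuming the API runs on `http://localhost:4242` as configured in `UserContext.js`:
+
+```bash
+# Register a user (JSON body), then request a JWT from the OAuth2 form endpoint
+curl -X POST http://localhost:4242/register \
+  -H "Content-Type: application/json" \
+  -d '{"username": "demo", "password": "secret"}'
+curl -X POST http://localhost:4242/token \
+  -d "username=demo&password=secret"
+# Use the returned access_token to fetch the current user
+curl http://localhost:4242/users/me -H "Authorization: Bearer <access_token>"
+```
+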
+### [2025-11-21] Advanced File Management & UI Polish
+* **Manual Sign Out**: Added a sign-out button to the header, allowing users to return to the Welcome screen.
+* **File Previews**: Implemented a preview modal for files (images and text) triggered by double-click or context menu.
+* **Multi-Select & Drag Selection**: Added drag selection box and keyboard shortcuts (Ctrl/Shift) for selecting multiple files.
+* **Enhanced Drag & Drop**: Enabled moving multiple selected files/folders at once, including within the same parent directory.
+* **Context Menu Enhancements**: Updated context menu to handle multiple selections (bulk Copy/Delete) and hide "Preview" for multi-select.
+* **Bug Fixes**: Resolved issues with drag selection (single item) and Electron path handling on Windows.
+
+### 1. Welcome Page
+- Added a full-screen Welcome page as the app's entry point.
+- Includes project name, intro, and warm message.
+- Two buttons: "Sign in" and "Sign up".
+- Styled to resemble cursor.com (modern, clean, gradient background).
+- Welcome page is always shown on app start (automatic sign out).
+
+### 2. Backend User Management (New)
+- Replaced local storage with production-ready backend auth.
+- Users are stored in `server_api/sql_app.db` (SQLite).
+- Passwords are hashed with bcrypt.
+- JWT tokens used for session management (stored in localStorage).
+
+### 3. Main App Navigation
+- After login, user sees main app view (tabs: Visualization, Model Training, Model Inference, Tensorboard, Files).
+- "Welcome" tab removed from main menu after login.
+- Navigation to Welcome page is blocked after login.
+
+### 4. Files Tab (Google Drive-like)
+- Files tab shows three file slots per user.
+- Each slot displays file info (name, size, type) or "Empty".
+- Upload, rename, and delete actions for each slot (Ant Design components).
+- Upload is local only; rename uses a modal; delete clears the slot.
+
+### 5. Debugging & Build Process
+- Debug banners/messages added and removed for troubleshooting.
+- Reminder: the Electron app loads the static build (`client/build/`), so `npm run build` (in `client/`) is required after code changes.
+- Hot reload only works in browser dev mode (`npm start`).
+
+---
+
+## Known Issues & Fixes
+- [x] Welcome page not showing: fixed by auto sign out on app start.
+- [x] "Welcome" tab visible after login: removed from menu.
+- [x] Debug messages visible: removed.
+- [x] Modals not working in Electron: fixed after proper build/restart.
+- [x] Drag selection not selecting single items: fixed.
+- [x] Electron "ERR_FILE_NOT_FOUND": fixed path separator in main.js.
+
+---
+
+## Next Steps / TODOs
+- [x] Add multi-file support or previews in Files tab.
+- [x] Add manual sign-out button in main app view.
+- [x] Integrate backend user management (FastAPI, JWT, etc.) for production.
+- [ ] Add user profile editing and avatar upload.
+- [ ] Improve file upload to support actual file storage (not just metadata).
+
+---
+
+## How to Develop & Test
+- Make code changes in `client/src/`.
+- Run `npm --prefix client run build` to update the Electron app.
+- Start the app with `start.bat` (Windows) or `./start.sh` (macOS/Linux); see the sketch after this list.
+- For live development with hot reload, use `npm start` inside `client/` (browser only; Electron still loads the static build).
+
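+A typical edit-build-run cycle from the repository root (a sketch; substitute `start.bat` on Windows):
+
+```bash
+npm --prefix client run build   # rebuild the static bundle Electron loads
+./start.sh                      # launch the backend services and the Electron client
+```
+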
+---
+
+## AI Agent Notes
+- All major UI/UX changes, context, and user flows are documented here.
+- Use this file to bootstrap further development, onboarding, or automation.
+- For backend integration, see FastAPI endpoints and user model notes above.
+
+---
+
+_Last updated: 2025-11-21_
diff --git a/bash.exe.stackdump b/bash.exe.stackdump
new file mode 100644
index 0000000..e7b5590
--- /dev/null
+++ b/bash.exe.stackdump
@@ -0,0 +1,29 @@
+Stack trace:
+Frame Function Args
+0007FFFFBBB0 00021006118E (00021028DEE8, 000210272B3E, 0007FFFFBBB0, 0007FFFFAAB0) msys-2.0.dll+0x2118E
+0007FFFFBBB0 0002100469BA (000000000000, 000000000000, 000000000000, 0007FFFFBE88) msys-2.0.dll+0x69BA
+0007FFFFBBB0 0002100469F2 (00021028DF99, 0007FFFFBA68, 0007FFFFBBB0, 000000000000) msys-2.0.dll+0x69F2
+0007FFFFBBB0 00021006A41E (000000000000, 000000000000, 000000000000, 000000000000) msys-2.0.dll+0x2A41E
+0007FFFFBBB0 00021006A545 (0007FFFFBBC0, 000000000000, 000000000000, 000000000000) msys-2.0.dll+0x2A545
+0007FFFFBE90 00021006B9A5 (0007FFFFBBC0, 000000000000, 000000000000, 000000000000) msys-2.0.dll+0x2B9A5
+End of stack trace
+Loaded modules:
+000100400000 bash.exe
+7FFA3F430000 ntdll.dll
+7FFA3E6F0000 KERNEL32.DLL
+7FFA3CF90000 KERNELBASE.dll
+7FFA3EDD0000 USER32.dll
+7FFA3D290000 win32u.dll
+000210040000 msys-2.0.dll
+7FFA3D510000 GDI32.dll
+7FFA3CE70000 gdi32full.dll
+7FFA3CAC0000 msvcp_win.dll
+7FFA3CB60000 ucrtbase.dll
+7FFA3D450000 advapi32.dll
+7FFA3D8B0000 msvcrt.dll
+7FFA3D730000 sechost.dll
+7FFA3E4E0000 RPCRT4.dll
+7FFA3CE40000 bcrypt.dll
+7FFA3C2A0000 CRYPTBASE.DLL
+7FFA3D370000 bcryptPrimitives.dll
+7FFA3EC80000 IMM32.DLL
diff --git a/client/.env b/client/.env
index e92841b..9db1133 100644
--- a/client/.env
+++ b/client/.env
@@ -1,2 +1,4 @@
-REACT_APP_API_PROTOCOL=http
-REACT_APP_API_URL=localhost:4242
+SKIP_PREFLIGHT_CHECK=true
+REACT_APP_SERVER_PROTOCOL=http
+REACT_APP_SERVER_URL=localhost:4242
+PORT=3001
diff --git a/client/main.js b/client/main.js
index e3bfae7..a42a4c1 100644
--- a/client/main.js
+++ b/client/main.js
@@ -7,18 +7,20 @@ require('electron-reload')(__dirname, {
let mainWindow
-function createWindow () {
+function createWindow() {
mainWindow = new BrowserWindow({
width: 800,
height: 600,
webPreferences: {
nodeIntegration: true,
- contextIsolation: false
+ contextIsolation: false,
+ webSecurity: false, // Allow loading iframes from localhost
+ allowRunningInsecureContent: true
}
})
mainWindow.loadURL(url.format({
- pathname: path.join(__dirname, './build/index.html'),
+ pathname: path.join(__dirname, 'build', 'index.html'),
protocol: 'file:',
slashes: true
}))
diff --git a/client/package-lock.json b/client/package-lock.json
index dd9df65..cfd80fa 100644
--- a/client/package-lock.json
+++ b/client/package-lock.json
@@ -30,6 +30,7 @@
},
"devDependencies": {
"@babel/plugin-proposal-private-property-in-object": "^7.16.7",
+ "babel-loader": "8.2.2",
"cross-env": "^7.0.3",
"electron-reload": "^2.0.0-alpha.1",
"prettier": "^3.3.2"
@@ -5704,12 +5705,14 @@
}
},
"node_modules/babel-loader": {
- "version": "8.3.0",
- "resolved": "https://registry.npmjs.org/babel-loader/-/babel-loader-8.3.0.tgz",
- "integrity": "sha512-H8SvsMF+m9t15HNLMipppzkC+Y2Yq+v3SonZyU70RBL/h1gxPkH08Ot8pEE9Z4Kd+czyWJClmFS8qzIP9OZ04Q==",
+ "version": "8.2.2",
+ "resolved": "https://registry.npmjs.org/babel-loader/-/babel-loader-8.2.2.tgz",
+ "integrity": "sha512-JvTd0/D889PQBtUXJ2PXaKU/pjZDMtHA9V2ecm+eNRmmBCMR09a+fmpGTNwnJtFmFl5Ei7Vy47LjBb+L0wQ99g==",
+ "dev": true,
+ "license": "MIT",
"dependencies": {
"find-cache-dir": "^3.3.1",
- "loader-utils": "^2.0.0",
+ "loader-utils": "^1.4.0",
"make-dir": "^3.1.0",
"schema-utils": "^2.6.5"
},
@@ -5721,10 +5724,39 @@
"webpack": ">=2"
}
},
+ "node_modules/babel-loader/node_modules/json5": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/json5/-/json5-1.0.2.tgz",
+ "integrity": "sha512-g1MWMLBiz8FKi1e4w0UyVL3w+iJceWAFBAaBnnGKOpNa5f8TLktkbre1+s6oICydWAm+HRUGTmI+//xv2hvXYA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "minimist": "^1.2.0"
+ },
+ "bin": {
+ "json5": "lib/cli.js"
+ }
+ },
+ "node_modules/babel-loader/node_modules/loader-utils": {
+ "version": "1.4.2",
+ "resolved": "https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.2.tgz",
+ "integrity": "sha512-I5d00Pd/jwMD2QCduo657+YM/6L3KZu++pmX9VFncxaxvHcru9jx1lBaFft+r4Mt2jK0Yhp41XlRAihzPxHNCg==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "big.js": "^5.2.2",
+ "emojis-list": "^3.0.0",
+ "json5": "^1.0.1"
+ },
+ "engines": {
+ "node": ">=4.0.0"
+ }
+ },
"node_modules/babel-loader/node_modules/schema-utils": {
"version": "2.7.1",
"resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-2.7.1.tgz",
"integrity": "sha512-SHiNtMOUGWBQJwzISiVYKu82GiV4QYGePp3odlY1tuKO7gPtphAT5R/py0fA6xtbgLL/RvtJZnU9b8s0F1q0Xg==",
+ "dev": true,
"dependencies": {
"@types/json-schema": "^7.0.5",
"ajv": "^6.12.4",
@@ -6286,9 +6318,9 @@
}
},
"node_modules/caniuse-lite": {
- "version": "1.0.30001643",
- "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001643.tgz",
- "integrity": "sha512-ERgWGNleEilSrHM6iUz/zJNSQTP8Mr21wDWpdgvRwcTXGAq6jMtOUPP4dqFPTdKqZ2wKTdtB+uucZ3MRpAUSmg==",
+ "version": "1.0.30001756",
+ "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001756.tgz",
+ "integrity": "sha512-4HnCNKbMLkLdhJz3TToeVWHSnfJvPaq6vu/eRP0Ahub/07n484XHhBF5AJoSGHdVrS8tKFauUQz8Bp9P7LVx7A==",
"funding": [
{
"type": "opencollective",
@@ -6302,7 +6334,8 @@
"type": "github",
"url": "https://github.com/sponsors/ai"
}
- ]
+ ],
+ "license": "CC-BY-4.0"
},
"node_modules/case-sensitive-paths-webpack-plugin": {
"version": "2.4.0",
@@ -17360,6 +17393,25 @@
}
}
},
+ "node_modules/react-scripts/node_modules/babel-loader": {
+ "version": "8.4.1",
+ "resolved": "https://registry.npmjs.org/babel-loader/-/babel-loader-8.4.1.tgz",
+ "integrity": "sha512-nXzRChX+Z1GoE6yWavBQg6jDslyFF3SDjl2paADuoQtQW10JqShJt62R6eJQ5m/pjJFDT8xgKIWSP85OY8eXeA==",
+ "license": "MIT",
+ "dependencies": {
+ "find-cache-dir": "^3.3.1",
+ "loader-utils": "^2.0.4",
+ "make-dir": "^3.1.0",
+ "schema-utils": "^2.6.5"
+ },
+ "engines": {
+ "node": ">= 8.9"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0",
+ "webpack": ">=2"
+ }
+ },
"node_modules/react-scripts/node_modules/fs-extra": {
"version": "10.1.0",
"resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-10.1.0.tgz",
@@ -17384,6 +17436,24 @@
"graceful-fs": "^4.1.6"
}
},
+ "node_modules/react-scripts/node_modules/schema-utils": {
+ "version": "2.7.1",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-2.7.1.tgz",
+ "integrity": "sha512-SHiNtMOUGWBQJwzISiVYKu82GiV4QYGePp3odlY1tuKO7gPtphAT5R/py0fA6xtbgLL/RvtJZnU9b8s0F1q0Xg==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/json-schema": "^7.0.5",
+ "ajv": "^6.12.4",
+ "ajv-keywords": "^3.5.2"
+ },
+ "engines": {
+ "node": ">= 8.9.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
"node_modules/react-scripts/node_modules/semver": {
"version": "7.6.3",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.6.3.tgz",
diff --git a/client/package.json b/client/package.json
index dbe998a..71742ff 100644
--- a/client/package.json
+++ b/client/package.json
@@ -52,6 +52,7 @@
},
"devDependencies": {
"@babel/plugin-proposal-private-property-in-object": "^7.16.7",
+ "babel-loader": "8.2.2",
"cross-env": "^7.0.3",
"electron-reload": "^2.0.0-alpha.1",
"prettier": "^3.3.2"
@@ -60,6 +61,6 @@
"nth-check": "$nth-check",
"resolve-url-loader": "^5.0.0",
"svgo": "^3.3.2",
- "webpack-dev-server": "^5.2.1"
+ "webpack-dev-server": "^4.15.0"
}
-}
+}
\ No newline at end of file
diff --git a/client/src/App.js b/client/src/App.js
index 9e605b8..26e308e 100644
--- a/client/src/App.js
+++ b/client/src/App.js
@@ -1,7 +1,9 @@
import { useContext, useEffect, useState } from 'react'
import './App.css'
import Views from './views/Views'
+import Welcome from './views/Welcome'
import { AppContext, ContextWrapper } from './contexts/GlobalContext'
+import UserContextWrapper, { UserContext } from './contexts/UserContext'
import { YamlContextWrapper } from './contexts/YamlContext'
function CacheBootstrapper ({ children }) {
@@ -30,17 +32,24 @@ function CacheBootstrapper ({ children }) {
return children
}
+function MainContent () {
+ const { currentUser } = useContext(UserContext)
+  return currentUser ? <Views /> : <Welcome />
+}
+
function App () {
return (
-    <div className='App'>
-      <ContextWrapper>
-        <YamlContextWrapper>
-          <CacheBootstrapper>
-            <Views />
-          </CacheBootstrapper>
-        </YamlContextWrapper>
-      </ContextWrapper>
-    </div>
+    <div className='App'>
+      <UserContextWrapper>
+        <ContextWrapper>
+          <YamlContextWrapper>
+            <CacheBootstrapper>
+              <MainContent />
+            </CacheBootstrapper>
+          </YamlContextWrapper>
+        </ContextWrapper>
+      </UserContextWrapper>
+    </div>
)
}
diff --git a/client/src/components/NeuroglancerViewer.js b/client/src/components/NeuroglancerViewer.js
new file mode 100644
index 0000000..44d6dd4
--- /dev/null
+++ b/client/src/components/NeuroglancerViewer.js
@@ -0,0 +1,150 @@
+import React, { useState, useEffect } from 'react';
+import { Button, Spin, Alert, Typography } from 'antd';
+import { ReloadOutlined } from '@ant-design/icons';
+import axios from 'axios';
+
+const { Text, Title } = Typography;
+const API_BASE = `${process.env.REACT_APP_SERVER_PROTOCOL || 'http'}://${process.env.REACT_APP_SERVER_URL || 'localhost:4243'}`;
+
+/**
+ * NeuroglancerViewer Component
+ *
+ * Loads and displays Neuroglancer viewer in an iframe using the project's image files.
+ * Uses the same approach as the Visualization tab.
+ *
+ * @param {number} projectId - Project ID to load viewer for
+ * @param {object} currentSynapse - Current synapse for position reference
+ */
+function NeuroglancerViewer({ projectId = 1, currentSynapse }) {
+ const [viewerUrl, setViewerUrl] = useState(null);
+ const [loading, setLoading] = useState(true);
+ const [error, setError] = useState(null);
+
+ // Load Neuroglancer viewer on mount
+ useEffect(() => {
+ loadViewer();
+ }, [projectId]);
+
+ const loadViewer = async () => {
+ setLoading(true);
+ setError(null);
+
+ try {
+ const response = await axios.get(
+ `${API_BASE}/api/synanno/ng-url/${projectId}`,
+ { withCredentials: true }
+ );
+
+ if (response.data.url) {
+ setViewerUrl(response.data.url);
+ } else {
+ // If backend returns a message instead of URL (transition state)
+ if (response.data.message) {
+ // We can handle the message here, but for now let's just show the "Setup in Progress" state
+ // by not setting the URL.
+ console.log("Neuroglancer message:", response.data.message);
+ }
+ setError(null); // Clear error if it's just a transition message
+ }
+ } catch (err) {
+ console.error('Failed to load Neuroglancer viewer', err);
+ setError(err.response?.data?.detail || 'Failed to load Neuroglancer viewer');
+ } finally {
+ setLoading(false);
+ }
+ };
+
+ const refreshViewer = () => {
+ loadViewer();
+ };
+
+  if (loading) {
+    return (
+      <div style={{ display: 'flex', justifyContent: 'center', alignItems: 'center', height: '100%' }}>
+        <Spin size="large" />
+      </div>
+    );
+  }
+
+  if (error) {
+    return (
+      <Alert message="Failed to load Neuroglancer viewer" description={error} type="error" showIcon />
+    );
+  }
+
+  // Display viewer in iframe
+  return (
+    <div style={{ height: '100%', display: 'flex', flexDirection: 'column' }}>
+      <div style={{ padding: 8, display: 'flex', alignItems: 'center', gap: 8 }}>
+        <Button
+          icon={<ReloadOutlined />}
+          onClick={refreshViewer}
+          title="Refresh viewer"
+        />
+        {currentSynapse && (
+          <Text strong>
+            Synapse #{currentSynapse.id}
+          </Text>
+        )}
+      </div>
+      {viewerUrl ? (
+        <iframe
+          src={viewerUrl}
+          title="Neuroglancer viewer"
+          style={{ flex: 1, width: '100%', border: 'none' }}
+        />
+      ) : (
+        <div style={{ textAlign: 'center', padding: 24 }}>
+          <Title level={4}>Setup in Progress</Title>
+          <Text type="secondary">Data server is running. Preparing viewer...</Text>
+        </div>
+      )}
+    </div>
+  );
+}
+
+export default NeuroglancerViewer;
diff --git a/client/src/components/ProofreadingControls.js b/client/src/components/ProofreadingControls.js
new file mode 100644
index 0000000..e22068a
--- /dev/null
+++ b/client/src/components/ProofreadingControls.js
@@ -0,0 +1,157 @@
+import React, { useState, useEffect } from 'react';
+import { Button, Input, Space, Typography, Divider } from 'antd';
+import { CheckOutlined, CloseOutlined, QuestionOutlined, ArrowRightOutlined } from '@ant-design/icons';
+
+const { Text } = Typography;
+
+/**
+ * ProofreadingControls Component
+ *
+ * Provides UI controls for classifying synapses and editing neuron IDs.
+ * Includes status buttons, input fields, and save/navigation buttons.
+ *
+ * @param {object} currentSynapse - Currently selected synapse
+ * @param {function} onSave - Callback to save changes
+ * @param {function} onNext - Callback to navigate to next synapse
+ */
+function ProofreadingControls({ currentSynapse, onSave, onNext }) {
+ const [status, setStatus] = useState('error');
+ const [preNeuronId, setPreNeuronId] = useState('');
+ const [postNeuronId, setPostNeuronId] = useState('');
+
+ // Update local state when current synapse changes
+ useEffect(() => {
+ if (currentSynapse) {
+ setStatus(currentSynapse.status);
+ setPreNeuronId(currentSynapse.pre_neuron_id || '');
+ setPostNeuronId(currentSynapse.post_neuron_id || '');
+ }
+ }, [currentSynapse]);
+
+ const handleSave = async () => {
+ const updates = {
+ status,
+ pre_neuron_id: preNeuronId ? parseInt(preNeuronId) : null,
+ post_neuron_id: postNeuronId ? parseInt(postNeuronId) : null
+ };
+
+ await onSave(updates);
+ };
+
+ const handleSaveAndNext = async () => {
+ await handleSave();
+ onNext();
+ };
+
+ if (!currentSynapse) {
+ return (
+
+ No synapse selected
+
+ );
+ }
+
+ return (
+
+ {/* Current Synapse Info */}
+
+
Synapse #{currentSynapse.id}
+
+
+ Position: ({currentSynapse.x.toFixed(1)}, {currentSynapse.y.toFixed(1)}, {currentSynapse.z.toFixed(1)})
+
+ {currentSynapse.confidence && (
+
+ Confidence: {(currentSynapse.confidence * 100).toFixed(0)}%
+
+ )}
+
+
+
+
+
+ {/* Status Classification */}
+
+ Status Classification
+
+ }
+ onClick={() => setStatus('correct')}
+ style={{
+ backgroundColor: status === 'correct' ? '#52c41a' : undefined,
+ borderColor: status === 'correct' ? '#52c41a' : undefined
+ }}
+ >
+ Correct (C)
+
+ }
+ onClick={() => setStatus('incorrect')}
+ >
+ Incorrect (X)
+
+ }
+ onClick={() => setStatus('unsure')}
+ style={{
+ backgroundColor: status === 'unsure' ? '#faad14' : undefined,
+ borderColor: status === 'unsure' ? '#faad14' : undefined,
+ color: status === 'unsure' ? '#fff' : undefined
+ }}
+ >
+ Unsure (U)
+
+
+
+
+
+
+ {/* Neuron ID Inputs */}
+
+ Pre-synaptic Neuron ID
+ setPreNeuronId(e.target.value)}
+ placeholder="Enter neuron ID"
+ type="number"
+ />
+
+
+
+ Post-synaptic Neuron ID
+ setPostNeuronId(e.target.value)}
+ placeholder="Enter neuron ID"
+ type="number"
+ />
+
+
+
+
+ {/* Action Buttons */}
+
+
+ Save (S)
+
+ }
+ onClick={handleSaveAndNext}
+ >
+ Save & Next (→)
+
+
+
+ );
+}
+
+export default ProofreadingControls;
diff --git a/client/src/components/SynapseList.js b/client/src/components/SynapseList.js
new file mode 100644
index 0000000..d80499f
--- /dev/null
+++ b/client/src/components/SynapseList.js
@@ -0,0 +1,105 @@
+import React from 'react';
+import { List, Typography, Progress } from 'antd';
+import { CheckCircleOutlined, CloseCircleOutlined, QuestionCircleOutlined } from '@ant-design/icons';
+
+const { Text } = Typography;
+
+/**
+ * SynapseList Component
+ *
+ * Displays a scrollable list of synapses with status indicators and progress tracking.
+ * Highlights the currently selected synapse and allows clicking to navigate.
+ *
+ * @param {array} synapses - Array of synapse objects
+ * @param {number} currentIndex - Index of currently selected synapse
+ * @param {function} onSelectSynapse - Callback when synapse is clicked
+ * @param {number} reviewedCount - Number of reviewed synapses
+ */
+function SynapseList({ synapses, currentIndex, onSelectSynapse, reviewedCount }) {
+
+ /**
+ * Get icon based on synapse status
+ */
+ const getStatusIcon = (status) => {
+ switch (status) {
+      case 'correct':
+        return <CheckCircleOutlined style={{ color: '#52c41a' }} />;
+      case 'incorrect':
+        return <CloseCircleOutlined style={{ color: '#ff4d4f' }} />;
+      case 'unsure':
+        return <QuestionCircleOutlined style={{ color: '#faad14' }} />;
+ default:
+ return null; // No icon for 'error' status
+ }
+ };
+
+ // Calculate progress
+ const totalErrors = synapses.filter(s => s.status === 'error').length;
+ const progress = totalErrors > 0 ? (reviewedCount / totalErrors) * 100 : 0;
+
+ return (
+
+ {/* Progress Section */}
+
+
Progress
+
+
+ {reviewedCount} / {totalErrors} reviewed
+
+
+
+ {/* Synapse List */}
+
(
+ onSelectSynapse(index)}
+ style={{
+ cursor: 'pointer',
+ backgroundColor: index === currentIndex ? '#e6f7ff' : 'transparent',
+ borderLeft: index === currentIndex ? '3px solid #1890ff' : '3px solid transparent',
+ padding: '8px 12px',
+ transition: 'all 0.2s'
+ }}
+ onMouseEnter={(e) => {
+ if (index !== currentIndex) {
+ e.currentTarget.style.backgroundColor = '#f5f5f5';
+ }
+ }}
+ onMouseLeave={(e) => {
+ if (index !== currentIndex) {
+ e.currentTarget.style.backgroundColor = 'transparent';
+ }
+ }}
+ >
+
+
+ Synapse #{synapse.id}
+ {getStatusIcon(synapse.status)}
+
+
+ ({synapse.x.toFixed(1)}, {synapse.y.toFixed(1)}, {synapse.z.toFixed(1)})
+
+ {synapse.confidence && (
+
+ Confidence: {(synapse.confidence * 100).toFixed(0)}%
+
+ )}
+
+
+ )}
+ />
+
+ );
+}
+
+export default SynapseList;
diff --git a/client/src/contexts/UserContext.js b/client/src/contexts/UserContext.js
new file mode 100644
index 0000000..f846fe2
--- /dev/null
+++ b/client/src/contexts/UserContext.js
@@ -0,0 +1,98 @@
+import React, { createContext, useState, useCallback, useEffect } from 'react';
+
+export const UserContext = createContext();
+
+const API_URL = 'http://localhost:4242';
+
+const UserContextWrapper = ({ children }) => {
+ const [currentUser, setCurrentUser] = useState(null);
+ const [token, setToken] = useState(localStorage.getItem('token'));
+
+ const autoSignOut = useCallback(() => {
+ setCurrentUser(null);
+ setToken(null);
+ localStorage.removeItem('token');
+ }, []);
+
+ // Check token on mount
+ useEffect(() => {
+ const checkUser = async () => {
+ if (token) {
+ try {
+ const response = await fetch(`${API_URL}/users/me`, {
+ headers: { Authorization: `Bearer ${token}` }
+ });
+ if (response.ok) {
+ const userData = await response.json();
+ setCurrentUser({ ...userData, name: userData.username }); // Map username to name for compatibility
+ } else {
+ autoSignOut();
+ }
+ } catch (error) {
+ console.error("Failed to fetch user", error);
+ autoSignOut();
+ }
+ }
+ };
+ checkUser();
+ }, [token, autoSignOut]);
+
+ const signIn = async (username, password) => {
+ try {
+ const formData = new FormData();
+ formData.append('username', username);
+ formData.append('password', password);
+
+ const response = await fetch(`${API_URL}/token`, {
+ method: 'POST',
+ body: formData,
+ });
+
+ if (response.ok) {
+ const data = await response.json();
+ setToken(data.access_token);
+ localStorage.setItem('token', data.access_token);
+ // Fetch user details immediately
+ const userRes = await fetch(`${API_URL}/users/me`, {
+ headers: { Authorization: `Bearer ${data.access_token}` }
+ });
+ if (userRes.ok) {
+ const userData = await userRes.json();
+ setCurrentUser({ ...userData, name: userData.username });
+ return true;
+ }
+ }
+ return false;
+ } catch (error) {
+ console.error("Sign in error", error);
+ return false;
+ }
+ };
+
+ const signUp = async (username, password) => {
+ try {
+ const response = await fetch(`${API_URL}/register`, {
+ method: 'POST',
+ headers: { 'Content-Type': 'application/json' },
+ body: JSON.stringify({ username, password }),
+ });
+
+ if (response.ok) {
+ // Auto sign in after registration
+ return await signIn(username, password);
+ }
+ return false;
+ } catch (error) {
+ console.error("Sign up error", error);
+ return false;
+ }
+ };
+
+ return (
+
+ {children}
+
+ );
+};
+
+export default UserContextWrapper;
diff --git a/client/src/views/Files.js b/client/src/views/Files.js
new file mode 100644
index 0000000..1c44a0c
--- /dev/null
+++ b/client/src/views/Files.js
@@ -0,0 +1,11 @@
+import React from 'react';
+import FilesManager from './FilesManager';
+
+export default function Files() {
+ return (
+    <div style={{ padding: 16 }}>
+      <h2>Files</h2>
+      <FilesManager />
+    </div>
+ );
+}
diff --git a/client/src/views/FilesManager.js b/client/src/views/FilesManager.js
new file mode 100644
index 0000000..b6bbf3a
--- /dev/null
+++ b/client/src/views/FilesManager.js
@@ -0,0 +1,834 @@
+import React, { useState, useEffect, useRef } from 'react';
+import { Button, Input, Modal, message, Menu, Breadcrumb, Empty, Image } from 'antd';
+import { FolderFilled, FileOutlined, FileTextOutlined, HomeOutlined, ArrowUpOutlined, AppstoreOutlined, BarsOutlined, UploadOutlined, EyeOutlined } from '@ant-design/icons';
+import axios from 'axios';
+
+// API base URL (adjust via env vars if needed)
+const API_BASE = `${process.env.REACT_APP_SERVER_PROTOCOL || 'http'}://${process.env.REACT_APP_SERVER_URL || 'localhost:4243'}`;
+
+// Configure axios to include JWT token
+axios.interceptors.request.use((config) => {
+ const token = localStorage.getItem('token');
+ if (token) {
+ config.headers.Authorization = `Bearer ${token}`;
+ }
+ return config;
+});
+
+// Transform backend file list into UI state
+const transformFiles = (fileList) => {
+ const folders = [];
+ const files = {};
+ fileList.forEach((f) => {
+ if (f.is_folder) {
+ folders.push({ key: String(f.id), title: f.name, parent: f.path === 'root' ? 'root' : String(f.path), is_folder: true });
+ } else {
+ const parentKey = f.path || 'root';
+ if (!files[parentKey]) files[parentKey] = [];
+ files[parentKey].push({ key: String(f.id), name: f.name, size: f.size, type: f.type, is_folder: false });
+ }
+ });
+ if (!folders.find((f) => f.key === 'root')) {
+ folders.unshift({ key: 'root', title: 'My Drive', parent: null });
+ }
+ return { folders, files };
+};
+
+function FilesManager() {
+ const [folders, setFolders] = useState([]);
+ const [files, setFiles] = useState({});
+ const [currentFolder, setCurrentFolder] = useState('root');
+ const [viewMode, setViewMode] = useState('grid'); // 'grid' or 'list'
+ const [selectedItems, setSelectedItems] = useState([]);
+ const [clipboard, setClipboard] = useState({ items: [], action: null }); // copy / move
+ const [editingItem, setEditingItem] = useState(null);
+ const [newItemType, setNewItemType] = useState(null);
+ const [tempName, setTempName] = useState('');
+ const inputRef = useRef(null);
+ const [contextMenu, setContextMenu] = useState(null);
+ const [previewFile, setPreviewFile] = useState(null);
+ const [propertiesData, setPropertiesData] = useState(null);
+ const [selectionBox, setSelectionBox] = useState(null);
+ const containerRef = useRef(null);
+ const itemRefs = useRef({});
+ const isDragSelecting = useRef(false);
+
+ // Focus input when editing starts
+ useEffect(() => {
+ if (editingItem && inputRef.current) {
+ inputRef.current.focus();
+ inputRef.current.select();
+ }
+ }, [editingItem]);
+
+ // Load initial data
+ // eslint-disable-next-line react-hooks/exhaustive-deps
+ useEffect(() => {
+ const fetchFiles = async () => {
+ try {
+ const res = await axios.get(`${API_BASE}/files`, { withCredentials: true });
+ const { folders: flds, files: fls } = transformFiles(res.data);
+ setFolders(flds);
+ setFiles(fls);
+ } catch (err) {
+ console.error('Failed to load files', err);
+ message.error('Could not load files');
+ }
+ };
+ fetchFiles();
+ }, []);
+
+ const getCurrentFolderObj = () => folders.find((f) => f.key === currentFolder);
+ // eslint-disable-next-line no-loop-func
+ const getBreadcrumbs = () => {
+ const path = [];
+ let curr = getCurrentFolderObj();
+ while (curr) {
+ path.unshift(curr);
+ curr = folders.find((f) => f.key === curr.parent);
+ }
+ return path;
+ };
+
+ const handleNavigate = (key) => {
+ setCurrentFolder(key);
+ setSelectedItems([]);
+ setEditingItem(null);
+ setNewItemType(null);
+ };
+
+ const handleUp = () => {
+ const curr = getCurrentFolderObj();
+ if (curr && curr.parent) handleNavigate(curr.parent);
+ };
+
+ // Folder creation
+ const startCreateFolder = () => {
+ const key = `new_folder_${Date.now()}`;
+ setNewItemType('folder');
+ setEditingItem(key);
+ setTempName('');
+ };
+
+ const finishCreateFolder = async () => {
+ if (!tempName.trim()) {
+ setEditingItem(null);
+ setNewItemType(null);
+ return;
+ }
+ try {
+ const payload = { name: tempName, path: currentFolder };
+ const res = await axios.post(`${API_BASE}/files/folder`, payload, { withCredentials: true });
+ const newFolder = res.data;
+ setFolders([...folders, { key: String(newFolder.id), title: newFolder.name, parent: newFolder.path }]);
+ setFiles({ ...files, [String(newFolder.id)]: [] });
+ message.success('Folder created');
+ } catch (err) {
+ console.error(err);
+ message.error('Failed to create folder');
+ }
+ setEditingItem(null);
+ setNewItemType(null);
+ };
+
+ // Rename
+ const startRename = (key, currentName) => {
+ setEditingItem(key);
+ setTempName(currentName);
+ };
+
+ const finishRename = async () => {
+ if (!tempName.trim()) {
+ setEditingItem(null);
+ return;
+ }
+ const key = editingItem;
+ const isFolder = folders.some((f) => f.key === key);
+ try {
+ await axios.put(`${API_BASE}/files/${key}`, { name: tempName, path: isFolder ? undefined : currentFolder }, { withCredentials: true });
+ if (isFolder) {
+ setFolders(folders.map((f) => (f.key === key ? { ...f, title: tempName } : f)));
+ } else {
+ setFiles((prev) => ({
+ ...prev,
+ [currentFolder]: prev[currentFolder].map((f) => (f.key === key ? { ...f, name: tempName } : f)),
+ }));
+ }
+ message.success('Renamed successfully');
+ } catch (err) {
+ console.error(err);
+ message.error('Rename failed');
+ }
+ setEditingItem(null);
+ };
+
+ // Delete
+ const handleDelete = async (keys = selectedItems) => {
+ if (keys.length === 0) return;
+ try {
+ await Promise.all(keys.map((id) => axios.delete(`${API_BASE}/files/${id}`, { withCredentials: true })));
+ const folderIds = keys.filter((k) => folders.some((f) => f.key === k));
+ const fileIds = keys.filter((k) => !folderIds.includes(k));
+ setFolders(folders.filter((f) => !folderIds.includes(f.key)));
+ setFiles((prev) => {
+ const newFiles = { ...prev };
+ Object.keys(newFiles).forEach((fk) => {
+ newFiles[fk] = newFiles[fk].filter((f) => !fileIds.includes(f.key));
+ });
+ return newFiles;
+ });
+ setSelectedItems([]);
+ message.success(`Deleted ${keys.length} items`);
+ } catch (err) {
+ console.error(err);
+ message.error('Delete failed');
+ }
+ };
+
+ // Copy / Paste (simple copy creates a new entry via folder endpoint for demo)
+ const handleCopy = (keys = selectedItems) => {
+ if (keys.length === 0) return;
+ setClipboard({ items: keys, action: 'copy' });
+ message.info('Copied to clipboard');
+ };
+
+ const handlePaste = async () => {
+ if (!clipboard.items.length) return;
+ if (clipboard.action === 'copy') {
+ const newEntries = [];
+ for (const id of clipboard.items) {
+ const orig = Object.values(files).flat().find((f) => f.key === id);
+ if (!orig) continue;
+ const payload = { name: `Copy of ${orig.name}`, path: currentFolder };
+ try {
+ const res = await axios.post(`${API_BASE}/files/folder`, payload, { withCredentials: true });
+ newEntries.push({ key: String(res.data.id), name: payload.name, size: orig.size, type: orig.type });
+ } catch (err) {
+ console.error('Paste error', err);
+ }
+ }
+ if (newEntries.length) {
+ setFiles((prev) => ({ ...prev, [currentFolder]: [...(prev[currentFolder] || []), ...newEntries] }));
+ message.success('Pasted items');
+ }
+ }
+ };
+
+ // Preview
+ const handlePreview = (key) => {
+ const file = (files[currentFolder] || []).find((f) => f.key === key);
+ if (file) setPreviewFile(file);
+ };
+
+ // Properties
+ const handleProperties = (keys = selectedItems) => {
+ if (keys.length === 0) return;
+
+ if (keys.length === 1) {
+ // Single item - show detailed info
+ const key = keys[0];
+ const folder = folders.find((f) => f.key === key);
+ const file = Object.values(files).flat().find((f) => f.key === key);
+ const item = folder || file;
+
+ if (item) {
+ setPropertiesData({
+ type: 'single',
+ name: item.title || item.name,
+ isFolder: !!folder,
+ size: item.size || 'N/A',
+ fileType: item.type || 'Folder',
+ created: new Date().toLocaleString(), // Backend should provide this
+ modified: new Date().toLocaleString(), // Backend should provide this
+ });
+ }
+ } else {
+ // Multiple items - show summary
+ const folderKeys = keys.filter((k) => folders.some((f) => f.key === k));
+ const fileKeys = keys.filter((k) => !folderKeys.includes(k));
+
+ // Calculate total size (simplified - would need backend support for accurate calculation)
+ let totalSize = 0;
+ fileKeys.forEach((key) => {
+ const file = Object.values(files).flat().find((f) => f.key === key);
+ if (file && file.size) {
+ // Parse size string (e.g., "1.5MB" or "500KB")
+ const sizeStr = String(file.size);
+ const match = sizeStr.match(/([0-9.]+)\s*(KB|MB|GB)/i);
+ if (match) {
+ const value = parseFloat(match[1]);
+ const unit = match[2].toUpperCase();
+ if (unit === 'KB') totalSize += value;
+ else if (unit === 'MB') totalSize += value * 1024;
+ else if (unit === 'GB') totalSize += value * 1024 * 1024;
+ }
+ }
+ });
+
+ const totalSizeStr = totalSize > 1024
+ ? `${(totalSize / 1024).toFixed(2)} MB`
+ : `${totalSize.toFixed(2)} KB`;
+
+ setPropertiesData({
+ type: 'multiple',
+ totalCount: keys.length,
+ folderCount: folderKeys.length,
+ fileCount: fileKeys.length,
+ totalSize: totalSizeStr,
+ });
+ }
+ };
+
+ // Drag & Drop
+ const handleDragStart = (e, key, type) => {
+ let itemsToDrag = selectedItems;
+ if (!selectedItems.includes(key)) {
+ itemsToDrag = [key];
+ setSelectedItems([key]);
+ }
+ e.dataTransfer.setData('text/plain', JSON.stringify({ keys: itemsToDrag }));
+ };
+
+ const handleDragOver = (e) => e.preventDefault();
+
+ const handleDrop = async (e, targetFolderKey) => {
+ e.preventDefault();
+
+ // Check if dropping files from OS (external files)
+ if (e.dataTransfer.files && e.dataTransfer.files.length > 0) {
+ await handleExternalFileDrop(e.dataTransfer.files, targetFolderKey);
+ return;
+ }
+
+ // Handle internal drag-and-drop
+ const dataStr = e.dataTransfer.getData('text/plain');
+ if (!dataStr) return;
+ const { keys } = JSON.parse(dataStr);
+ if (!keys || keys.length === 0) return;
+ if (keys.includes(targetFolderKey)) return;
+
+ let moved = 0;
+ for (const key of keys) {
+ const isFolder = folders.some((f) => f.key === key);
+ const item = isFolder
+ ? folders.find((f) => f.key === key)
+ : Object.values(files).flat().find((f) => f.key === key);
+
+ if (!item) continue;
+
+ try {
+ // Send name along with path to fix 422 error
+ await axios.put(`${API_BASE}/files/${key}`, {
+ name: item.title || item.name,
+ path: targetFolderKey
+ }, { withCredentials: true });
+
+ if (isFolder) {
+ setFolders((prev) => prev.map((f) => (f.key === key ? { ...f, parent: targetFolderKey } : f)));
+ } else {
+ setFiles((prev) => {
+ const sourceFolder = Object.keys(prev).find((fk) => prev[fk].some((f) => f.key === key));
+ if (!sourceFolder) return prev;
+ const fileObj = prev[sourceFolder].find((f) => f.key === key);
+ const newSource = prev[sourceFolder].filter((f) => f.key !== key);
+ const newTarget = [...(prev[targetFolderKey] || []), fileObj];
+ return { ...prev, [sourceFolder]: newSource, [targetFolderKey]: newTarget };
+ });
+ }
+ moved++;
+ } catch (err) {
+ console.error('Move error', err);
+ message.error(`Failed to move ${item.title || item.name}`);
+ }
+ }
+ if (moved) {
+ message.success(`Moved ${moved} items`);
+ setSelectedItems([]);
+ }
+ };
+
+ // Handle external file drops from OS
+ const handleExternalFileDrop = async (fileList, targetFolder) => {
+ const filesArray = Array.from(fileList);
+ let uploaded = 0;
+
+ for (const file of filesArray) {
+ const form = new FormData();
+ form.append('file', file);
+ form.append('path', targetFolder);
+
+ try {
+ const res = await axios.post(`${API_BASE}/files/upload`, form, {
+ headers: { 'Content-Type': 'multipart/form-data' },
+ withCredentials: true,
+ });
+ const newFile = res.data;
+ setFiles((prev) => ({
+ ...prev,
+ [targetFolder]: [...(prev[targetFolder] || []), {
+ key: String(newFile.id),
+ name: newFile.name,
+ size: newFile.size,
+ type: newFile.type
+ }],
+ }));
+ uploaded++;
+ } catch (err) {
+ console.error('Upload error', err);
+ message.error(`Failed to upload ${file.name}`);
+ }
+ }
+
+ if (uploaded > 0) {
+ message.success(`Uploaded ${uploaded} file${uploaded > 1 ? 's' : ''}`);
+ }
+ };
+
+
+ // Selection box handling
+ const handleMouseDown = (e) => {
+ if (e.target !== containerRef.current && e.target.className !== 'file-manager-content') return;
+ isDragSelecting.current = false;
+ if (!e.ctrlKey && !e.shiftKey) setSelectedItems([]);
+ const rect = containerRef.current.getBoundingClientRect();
+ setSelectionBox({
+ startX: e.clientX - rect.left,
+ startY: e.clientY - rect.top + containerRef.current.scrollTop,
+ currentX: e.clientX - rect.left,
+ currentY: e.clientY - rect.top + containerRef.current.scrollTop,
+ initialSelected: e.ctrlKey ? [...selectedItems] : [],
+ });
+ };
+
+ const handleMouseMove = (e) => {
+ if (!selectionBox) return;
+ const rect = containerRef.current.getBoundingClientRect();
+ const currentX = e.clientX - rect.left;
+ const currentY = e.clientY - rect.top + containerRef.current.scrollTop;
+ if (Math.abs(currentX - selectionBox.startX) > 5 || Math.abs(currentY - selectionBox.startY) > 5) {
+ isDragSelecting.current = true;
+ }
+ setSelectionBox((prev) => ({ ...prev, currentX, currentY }));
+ const left = Math.min(selectionBox.startX, currentX);
+ const top = Math.min(selectionBox.startY, currentY);
+ const width = Math.abs(currentX - selectionBox.startX);
+ const height = Math.abs(currentY - selectionBox.startY);
+ const newSelected = [];
+ Object.keys(itemRefs.current).forEach((key) => {
+ const el = itemRefs.current[key];
+ if (!el) return;
+ const itemRect = el.getBoundingClientRect();
+ const containerRect = containerRef.current.getBoundingClientRect();
+ const itemLeft = itemRect.left - containerRect.left;
+ const itemTop = itemRect.top - containerRect.top + containerRef.current.scrollTop;
+ if (
+ left < itemLeft + itemRect.width &&
+ left + width > itemLeft &&
+ top < itemTop + itemRect.height &&
+ top + height > itemTop
+ ) {
+ newSelected.push(key);
+ }
+ });
+ const merged = Array.from(new Set([...selectionBox.initialSelected, ...newSelected]));
+ setSelectedItems(merged);
+ };
+
+ const handleMouseUp = () => setSelectionBox(null);
+
+ // Keyboard shortcuts
+ // eslint-disable-next-line react-hooks/exhaustive-deps
+ useEffect(() => {
+ const handleKeyDown = (e) => {
+ if (editingItem) {
+ if (e.key === 'Enter') newItemType ? finishCreateFolder() : finishRename();
+ if (e.key === 'Escape') {
+ setEditingItem(null);
+ setNewItemType(null);
+ }
+ return;
+ }
+ if (e.target.tagName === 'INPUT') return;
+ if (e.key === 'Delete') handleDelete();
+ if (e.ctrlKey && e.key === 'c') handleCopy();
+ if (e.ctrlKey && e.key === 'v') handlePaste();
+ if (e.ctrlKey && e.key === 'a') {
+ e.preventDefault();
+ const allKeys = [
+ ...folders.filter((f) => f.parent === currentFolder).map((f) => f.key),
+ ...(files[currentFolder] || []).map((f) => f.key),
+ ];
+ setSelectedItems(allKeys);
+ }
+ };
+ window.addEventListener('keydown', handleKeyDown);
+ return () => window.removeEventListener('keydown', handleKeyDown);
+ }, [selectedItems, clipboard, currentFolder, folders, files, editingItem, newItemType, tempName]);
+
+ // Context menu handling
+ const handleContextMenu = (e, type, key) => {
+ e.preventDefault();
+ e.stopPropagation();
+ if (type === 'item') {
+ if (!selectedItems.includes(key)) setSelectedItems([key]);
+ } else if (type === 'container') {
+ if (e.target === containerRef.current || e.target.className === 'file-manager-content') setSelectedItems([]);
+ }
+ setContextMenu({ x: e.clientX, y: e.clientY, type, key });
+ };
+
+ useEffect(() => {
+ const handleClick = () => setContextMenu(null);
+ window.addEventListener('click', handleClick);
+ return () => window.removeEventListener('click', handleClick);
+ }, []);
+
+ const renderItem = (item, type) => {
+ const isSelected = selectedItems.includes(item.key);
+ const isEditing = editingItem === item.key;
+    const icon = type === 'folder' ? <FolderFilled /> : <FileOutlined />;
+ return (
+ (itemRefs.current[item.key] = el)}
+ draggable={!isEditing}
+ onDragStart={(e) => handleDragStart(e, item.key, type)}
+ onDragOver={type === 'folder' ? handleDragOver : undefined}
+ onDrop={type === 'folder' ? (e) => handleDrop(e, item.key) : undefined}
+ onContextMenu={(e) => handleContextMenu(e, 'item', item.key)}
+ onClick={(e) => {
+ e.stopPropagation();
+ if (e.ctrlKey) {
+ setSelectedItems((prev) => (isSelected ? prev.filter((k) => k !== item.key) : [...prev, item.key]));
+ } else if (e.shiftKey && selectedItems.length) {
+ setSelectedItems((prev) => [...prev, item.key]);
+ } else {
+ setSelectedItems([item.key]);
+ }
+ }}
+ onDoubleClick={() => {
+ if (isEditing) return;
+ if (type === 'folder') handleNavigate(item.key);
+ else handlePreview(item.key);
+ }}
+ style={{
+ width: viewMode === 'grid' ? 100 : '100%',
+ padding: 8,
+ margin: viewMode === 'grid' ? 8 : 0,
+ textAlign: viewMode === 'grid' ? 'center' : 'left',
+ cursor: 'pointer',
+ borderRadius: 4,
+ backgroundColor: isSelected ? '#e6f7ff' : 'transparent',
+ border: isSelected ? '1px solid #1890ff' : '1px solid transparent',
+ display: viewMode === 'list' ? 'flex' : 'block',
+ alignItems: 'center',
+ userSelect: 'none',
+ }}
+ >
+
+ {React.cloneElement(icon, { style: { fontSize: viewMode === 'grid' ? 48 : 24, color: type === 'folder' ? '#1890ff' : '#555' } })}
+
+ {isEditing ? (
+
setTempName(e.target.value)}
+ onBlur={() => (newItemType ? finishCreateFolder() : finishRename())}
+ onClick={(e) => e.stopPropagation()}
+ onKeyDown={(e) => e.stopPropagation()}
+ style={{ width: viewMode === 'grid' ? '100%' : 300 }}
+ />
+ ) : (
+
{item.title || item.name}
+ )}
+
+ );
+ };
+
+ const renderNewFolderPlaceholder = () => {
+ if (newItemType !== 'folder') return null;
+ return (
+
+
+
+
+
setTempName(e.target.value)}
+ onBlur={finishCreateFolder}
+ onClick={(e) => e.stopPropagation()}
+ onKeyDown={(e) => e.stopPropagation()}
+ style={{ width: viewMode === 'grid' ? '100%' : 300 }}
+ />
+
+ );
+ };
+
+ const currentFolders = folders.filter((f) => f.parent === currentFolder);
+ const currentFiles = files[currentFolder] || [];
+
+ const getContextMenuItems = () => {
+ if (contextMenu?.type === 'container') {
+ return [
+        { key: 'new_folder', label: 'Create Folder', icon: <FolderFilled /> },
+        { key: 'upload', label: 'Upload File...', icon: <UploadOutlined /> },
+ ];
+ }
+ const items = [];
+ // Only show preview for files, not folders
+ if (selectedItems.length === 1) {
+ const selectedKey = selectedItems[0];
+ const isFolder = folders.some((f) => f.key === selectedKey);
+ if (!isFolder) {
+        items.push({ key: 'preview', label: 'Preview', icon: <EyeOutlined /> });
+ }
+ }
+ items.push(
+ { key: 'rename', label: 'Rename', icon: , disabled: selectedItems.length > 1 },
+ { key: 'copy', label: `Copy${selectedItems.length > 1 ? ` (${selectedItems.length})` : ''}`, icon: },
+ { key: 'delete', label: `Delete${selectedItems.length > 1 ? ` (${selectedItems.length})` : ''}`, danger: true, icon: },
+ { type: 'divider' },
+ { key: 'properties', label: 'Properties', icon: }
+ );
+ return items;
+ };
+
+ const handleUploadClick = () => {
+ const input = document.createElement('input');
+ input.type = 'file';
+ input.multiple = true;
+ input.onchange = async (e) => {
+ const filesSelected = Array.from(e.target.files);
+ for (const file of filesSelected) {
+ const form = new FormData();
+ form.append('file', file);
+ form.append('path', currentFolder);
+ try {
+ const res = await axios.post(`${API_BASE}/files/upload`, form, {
+ headers: { 'Content-Type': 'multipart/form-data' },
+ });
+ const newFile = res.data;
+ setFiles((prev) => ({
+ ...prev,
+ [currentFolder]: [...(prev[currentFolder] || []), { key: String(newFile.id), name: newFile.name, size: newFile.size, type: newFile.type }],
+ }));
+ message.success(`${file.name} uploaded`);
+ } catch (err) {
+ console.error('Upload error', err);
+ message.error(`Failed to upload ${file.name}`);
+ }
+ }
+ };
+ input.click();
+ };
+
+ return (
+ {
+ if (isDragSelecting.current) {
+ isDragSelecting.current = false;
+ return;
+ }
+ setSelectedItems([]);
+ setContextMenu(null);
+ }}
+ onContextMenu={(e) => handleContextMenu(e, 'container')}
+ >
+ {/* Toolbar & Breadcrumbs */}
+
+
} onClick={handleUp} disabled={currentFolder === 'root'} size="small" />
+
+ {getBreadcrumbs().map((f) => (
+ handleNavigate(f.key)} style={{ cursor: 'pointer' }}>
+ {f.key === 'root' ? : f.title}
+
+ ))}
+
+
:
}
+ onClick={() => setViewMode(viewMode === 'grid' ? 'list' : 'grid')}
+ title={viewMode === 'grid' ? 'Switch to List View' : 'Switch to Grid View'}
+ />
+
+
+ {/* Content Area */}
+
handleDrop(e, currentFolder)}
+ >
+ {currentFolders.length === 0 && currentFiles.length === 0 && !newItemType && (
+
+
+
+ )}
+ {currentFolders.map((f) => renderItem(f, 'folder'))}
+ {renderNewFolderPlaceholder()}
+ {currentFiles.map((f) => renderItem(f, 'file'))}
+ {selectionBox && (
+
+ )}
+
+
+ {/* Context Menu */}
+ {contextMenu && (
+
+
{
+ setContextMenu(null);
+ if (key === 'new_folder') startCreateFolder();
+ if (key === 'upload') handleUploadClick();
+ if (key === 'rename') {
+ const item = folders.find((f) => f.key === contextMenu.key) || (files[currentFolder] || []).find((f) => f.key === contextMenu.key);
+ startRename(contextMenu.key, item.title || item.name);
+ }
+ if (key === 'copy') handleCopy(selectedItems.length > 0 ? selectedItems : [contextMenu.key]);
+ if (key === 'delete') handleDelete(selectedItems.length > 0 ? selectedItems : [contextMenu.key]);
+ if (key === 'preview') handlePreview(contextMenu.key);
+ if (key === 'properties') handleProperties(selectedItems.length > 0 ? selectedItems : [contextMenu.key]);
+ }}
+ items={getContextMenuItems()}
+ />
+
+ )}
+
+ {/* Preview Modal */}
+
setPreviewFile(null)}
+ footer={null}
+ width={800}
+ >
+ {previewFile && (
+
+ {previewFile.type?.startsWith('image') ? (
+
+ ) : (
+
+
{previewFile.name === 'readme.txt' ? "This is a dummy text file content.\n\nIn a real app, this would fetch the file content from the server." : "Preview not available for this file type."}
+
+ )}
+
+ )}
+
+
+ {/* Properties Modal */}
+
setPropertiesData(null)}
+ footer={[
+ setPropertiesData(null)}>
+ Close
+ ,
+ ]}
+ width={500}
+ >
+ {propertiesData && propertiesData.type === 'single' && (
+
+
+ {propertiesData.isFolder ? (
+
+ ) : (
+
+ )}
+
+
{propertiesData.name}
+
{propertiesData.isFolder ? 'Folder' : 'File'}
+
+
+
+
+ Type:
+ {propertiesData.fileType}
+
+
+ Size:
+ {propertiesData.size}
+
+
+ Created:
+ {propertiesData.created}
+
+
+ Modified:
+ {propertiesData.modified}
+
+
+
+ )}
+ {propertiesData && propertiesData.type === 'multiple' && (
+
+
Selection Summary
+
+
+ Total Items:
+ {propertiesData.totalCount}
+
+
+ Folders:
+ {propertiesData.folderCount}
+
+
+ Files:
+ {propertiesData.fileCount}
+
+
+ Total Size:
+ {propertiesData.totalSize}
+
+
+
+ )}
+
+
+ );
+}
+
+export default FilesManager;
diff --git a/client/src/views/ProofReading.js b/client/src/views/ProofReading.js
new file mode 100644
index 0000000..208d5d1
--- /dev/null
+++ b/client/src/views/ProofReading.js
@@ -0,0 +1,272 @@
+import React, { useState, useEffect } from 'react';
+import { Layout, message, Spin, Empty } from 'antd';
+import axios from 'axios';
+import NeuroglancerViewer from '../components/NeuroglancerViewer';
+import SynapseList from '../components/SynapseList';
+import ProofreadingControls from '../components/ProofreadingControls';
+
+const { Sider, Content } = Layout;
+const API_BASE = `${process.env.REACT_APP_SERVER_PROTOCOL || 'http'}://${process.env.REACT_APP_SERVER_URL || 'localhost:4243'}`;
+
+/**
+ * ProofReading Component
+ *
+ * Main view for synapse proofreading. Integrates Neuroglancer viewer,
+ * synapse list, and proofreading controls. Supports keyboard shortcuts
+ * for efficient workflow.
+ */
+function ProofReading() {
+ const [synapses, setSynapses] = useState([]);
+ const [currentIndex, setCurrentIndex] = useState(0);
+ const [neuroglancerUrl, setNeuroglancerUrl] = useState('');
+ const [loading, setLoading] = useState(true);
+ const [projectId, setProjectId] = useState(1); // Default to first project
+ const [reviewedCount, setReviewedCount] = useState(0);
+
+ // Fetch synapses on mount
+ useEffect(() => {
+ fetchSynapses();
+ fetchNeuroglancerUrl();
+ }, [projectId]);
+
+ // Keyboard shortcuts
+ useEffect(() => {
+ const handleKeyPress = (e) => {
+ // Don't trigger shortcuts when typing in input fields
+ if (e.target.tagName === 'INPUT') return;
+
+ switch (e.key.toLowerCase()) {
+ case 'c':
+ updateStatus('correct');
+ break;
+ case 'x':
+ updateStatus('incorrect');
+ break;
+ case 'u':
+ updateStatus('unsure');
+ break;
+ case 'arrowright':
+ goToNext();
+ break;
+ case 'arrowleft':
+ goToPrevious();
+ break;
+ case 's':
+ saveCurrent();
+ break;
+ default:
+ break;
+ }
+ };
+
+ window.addEventListener('keydown', handleKeyPress);
+ return () => window.removeEventListener('keydown', handleKeyPress);
+ }, [currentIndex, synapses]);
+
+ /**
+ * Fetch synapses from backend
+ */
+ const fetchSynapses = async () => {
+ try {
+ setLoading(true);
+ const res = await axios.get(`${API_BASE}/api/projects/${projectId}/synapses`, {
+ withCredentials: true
+ });
+ setSynapses(res.data);
+
+ // Count reviewed synapses (not in error state)
+ const reviewed = res.data.filter(s => s.status !== 'error').length;
+ setReviewedCount(reviewed);
+
+ setLoading(false);
+ } catch (err) {
+ console.error('Failed to fetch synapses', err);
+ message.error('Failed to load synapses');
+ setLoading(false);
+ }
+ };
+
+ /**
+ * Fetch Neuroglancer URL for the project
+ */
+ const fetchNeuroglancerUrl = async () => {
+ try {
+ const res = await axios.get(`${API_BASE}/api/synanno/ng-url/${projectId}`, {
+ withCredentials: true
+ });
+ setNeuroglancerUrl(res.data.url);
+ } catch (err) {
+ console.error('Failed to fetch Neuroglancer URL', err);
+ message.error('Failed to load Neuroglancer viewer');
+ }
+ };
+
+ /**
+ * Update status of current synapse locally
+ */
+ const updateStatus = (status) => {
+ if (synapses[currentIndex]) {
+ const updated = [...synapses];
+ updated[currentIndex] = { ...updated[currentIndex], status };
+ setSynapses(updated);
+ }
+ };
+
+ /**
+ * Save current synapse to backend
+ */
+ const saveCurrent = async () => {
+ if (!synapses[currentIndex]) return;
+
+ try {
+ await axios.put(
+ `${API_BASE}/api/synapses/${synapses[currentIndex].id}`,
+ {
+ status: synapses[currentIndex].status,
+ pre_neuron_id: synapses[currentIndex].pre_neuron_id,
+ post_neuron_id: synapses[currentIndex].post_neuron_id
+ },
+ { withCredentials: true }
+ );
+ message.success('Synapse updated');
+
+ // Update reviewed count
+ const reviewed = synapses.filter(s => s.status !== 'error').length;
+ setReviewedCount(reviewed);
+ } catch (err) {
+ console.error('Failed to update synapse', err);
+ message.error('Failed to save changes');
+ }
+ };
+
+ /**
+ * Save synapse with provided updates
+ */
+ const handleSave = async (updates) => {
+ if (!synapses[currentIndex]) return;
+
+ try {
+ await axios.put(
+ `${API_BASE}/api/synapses/${synapses[currentIndex].id}`,
+ updates,
+ { withCredentials: true }
+ );
+
+ // Update local state
+ const updated = [...synapses];
+ updated[currentIndex] = { ...updated[currentIndex], ...updates };
+ setSynapses(updated);
+
+ message.success('Synapse updated');
+
+ // Update reviewed count
+ const reviewed = updated.filter(s => s.status !== 'error').length;
+ setReviewedCount(reviewed);
+ } catch (err) {
+ console.error('Failed to update synapse', err);
+ message.error('Failed to save changes');
+ }
+ };
+
+ /**
+ * Navigate to next synapse
+ */
+ const goToNext = () => {
+ if (currentIndex < synapses.length - 1) {
+ setCurrentIndex(currentIndex + 1);
+ } else {
+ message.info('You have reached the last synapse');
+ }
+ };
+
+ /**
+ * Navigate to previous synapse
+ */
+ const goToPrevious = () => {
+ if (currentIndex > 0) {
+ setCurrentIndex(currentIndex - 1);
+ } else {
+ message.info('You are at the first synapse');
+ }
+ };
+
+ // Loading state
+ if (loading) {
+ return (
+
+
+
+ );
+ }
+
+ // Empty state
+ if (synapses.length === 0) {
+ return (
+
+
+
+ );
+ }
+
+ return (
+
+ {/* Left Panel: Synapse List */}
+
+
+
+
+ {/* Center: Neuroglancer Viewer */}
+
+
+
+
+ {/* Right Panel: Proofreading Controls */}
+
+
+
+
+ );
+}
+
+export default ProofReading;
diff --git a/client/src/views/Views.js b/client/src/views/Views.js
index c77ae9c..1feeca3 100644
--- a/client/src/views/Views.js
+++ b/client/src/views/Views.js
@@ -1,150 +1,49 @@
-import React, { useState, useEffect } from 'react'
-import DataLoader from './DataLoader'
-import Visualization from '../views/Visualization'
-import ModelTraining from '../views/ModelTraining'
-import ModelInference from '../views/ModelInference'
-import Monitoring from '../views/Monitoring'
-import Chatbot from '../components/Chatbot'
-import { Layout, Menu, Button } from 'antd'
-import { MessageOutlined } from '@ant-design/icons'
-import { getNeuroglancerViewer } from '../utils/api'
+import React, { useState, useContext } from 'react'
+import { Layout, Menu, Avatar, Typography, Space, Button, Tooltip } from 'antd'
+import { FolderOpenOutlined, DesktopOutlined, UserOutlined, LogoutOutlined } from '@ant-design/icons'
+import FilesManager from './FilesManager'
+import Workspace from './Workspace'
+import { UserContext } from '../contexts/UserContext'
-const { Content, Sider } = Layout
+const { Content } = Layout
+const { Text } = Typography
-function Views () {
- const [current, setCurrent] = useState('visualization')
- const [viewers, setViewers] = useState([])
- const [isLoading, setIsLoading] = useState(false)
- const [isInferring, setIsInferring] = useState(false)
- const [isChatOpen, setIsChatOpen] = useState(false)
- console.log(viewers)
-
- const onClick = (e) => {
- setCurrent(e.key)
- }
+function Views() {
+ const [current, setCurrent] = useState('files')
+ const { currentUser, autoSignOut } = useContext(UserContext)
const items = [
- { label: 'Visualization', key: 'visualization' },
- { label: 'Model Training', key: 'training' },
- { label: 'Model Inference', key: 'inference' },
- { label: 'Tensorboard', key: 'monitoring' }
+    { label: 'File Management', key: 'files', icon: <FolderOpenOutlined /> },
+    { label: 'Work Space', key: 'workspace', icon: <DesktopOutlined /> }
]
- const renderMenu = () => {
- if (current === 'visualization') {
- return
- } else if (current === 'training') {
- return
- } else if (current === 'monitoring') {
- return
- } else if (current === 'inference') {
- return
- }
- }
-
- const [collapsed, setCollapsed] = useState(false)
-
- const fetchNeuroglancerViewer = async (
- currentImage,
- currentLabel,
- scales
- ) => {
- setIsLoading(true)
- try {
- const viewerId = currentImage.uid + currentLabel.uid + JSON.stringify(scales)
- let updatedViewers = viewers
- const exists = viewers.find(
- // (viewer) => viewer.key === currentImage.uid + currentLabel.uid
- (viewer) => viewer.key === viewerId
- )
- // console.log(exists, viewers)
- if (exists) {
- updatedViewers = viewers.filter((viewer) => viewer.key !== viewerId)
- }
- const res = await getNeuroglancerViewer(
- currentImage,
- currentLabel,
- scales
- )
- const newUrl = res.replace(/\/\/[^:/]+/, '//localhost')
- console.log('Current Viewer at ', newUrl)
-
- setViewers([
- ...updatedViewers,
- {
- key: viewerId,
- title: currentImage.name + ' & ' + currentLabel.name,
- viewer: newUrl
- }
- ])
- setIsLoading(false)
- } catch (e) {
- console.log(e)
- setIsLoading(false)
- }
+ const onClick = (e) => {
+ setCurrent(e.key)
}
- useEffect(() => { // This function makes sure that the inferring will continue when current tab changes
- if (current === 'inference' && isInferring) {
- console.log('Inference process is continuing...')
- }
- }, [current, isInferring])
-
return (
-
- {isLoading
- ? (Loading the viewer ...
)
- : (
- <>
- setCollapsed(value)}
- theme='light'
- collapsedWidth='0'
- >
-
-
-
-
-
- {renderMenu()}
-
-
- {isChatOpen ? (
-
- setIsChatOpen(false)} />
-
- ) : (
- }
- onClick={() => setIsChatOpen(true)}
- style={{
- margin: '8px 8px'
- }}
- />
- )}
- >
- )}
+    // NOTE: the original JSX markup was lost from this hunk; the layout below is a minimal
+    // reconstruction using the imported antd components (header styling is an assumption).
+    <Layout style={{ minHeight: '100vh' }}>
+      <div style={{ display: 'flex', alignItems: 'center', padding: '0 16px', background: '#fff' }}>
+        <Menu onClick={onClick} selectedKeys={[current]} mode='horizontal' items={items} style={{ flex: 1 }} />
+        <Space>
+          <Avatar icon={<UserOutlined />} style={{ backgroundColor: '#87d068' }} />
+          <Text>{currentUser ? currentUser.name : 'Guest'}</Text>
+          <Tooltip title='Sign out'>
+            <Button icon={<LogoutOutlined />} onClick={autoSignOut} type="text" danger />
+          </Tooltip>
+        </Space>
+      </div>
+      <Content>
+        {current === 'files' ? <FilesManager /> : <Workspace />}
+      </Content>
+    </Layout>
)
}
diff --git a/client/src/views/Welcome.js b/client/src/views/Welcome.js
new file mode 100644
index 0000000..5970858
--- /dev/null
+++ b/client/src/views/Welcome.js
@@ -0,0 +1,177 @@
+import React, { useState, useContext, useEffect } from 'react';
+import { Card, Button, Input, Modal, Typography, message, Space } from 'antd';
+import { UserOutlined, LockOutlined, RocketOutlined } from '@ant-design/icons';
+import { UserContext } from '../contexts/UserContext';
+
+const { Title, Text } = Typography;
+
+function Welcome() {
+ const { signIn, signUp, autoSignOut } = useContext(UserContext);
+ const [showSignIn, setShowSignIn] = useState(false);
+ const [showSignUp, setShowSignUp] = useState(false);
+ const [signInName, setSignInName] = useState('');
+ const [signInPassword, setSignInPassword] = useState('');
+ const [signUpName, setSignUpName] = useState('');
+ const [signUpPassword, setSignUpPassword] = useState('');
+
+ useEffect(() => {
+ autoSignOut();
+ }, [autoSignOut]);
+
+  const handleSignIn = async () => {
+    const ok = await signIn(signInName, signInPassword);
+    if (!ok) {
+      message.error('Invalid credentials');
+      return; // keep the modal open so the user can retry
+    }
+    setShowSignIn(false);
+  };
+
+  const handleSignUp = async () => {
+    const ok = await signUp(signUpName, signUpPassword);
+    if (!ok) {
+      message.error('Sign up failed');
+      return; // keep the modal open so the user can retry
+    }
+    setShowSignUp(false);
+  };
+
+  return (
+    // NOTE: the original JSX markup was lost from this hunk. The layout below is a minimal
+    // reconstruction using the imported antd components; styling and any copy other than the
+    // surviving strings are assumptions.
+    <div
+      style={{
+        minHeight: '100vh',
+        display: 'flex',
+        alignItems: 'center',
+        justifyContent: 'center',
+        background: 'linear-gradient(135deg, #667eea 0%, #764ba2 100%)'
+      }}
+    >
+      <Card style={{ width: 380, textAlign: 'center' }}>
+        <Space direction='vertical' size='large' style={{ width: '100%' }}>
+          <RocketOutlined style={{ fontSize: 48, color: '#764ba2' }} />
+          <div>
+            <Title level={2} style={{ marginBottom: 0 }}>PyTC Client</Title>
+            <Text type='secondary'>Advanced Connectomics Workflow Interface</Text>
+          </div>
+          <Space direction='vertical' style={{ width: '100%' }}>
+            <Button type='primary' block size='large' onClick={() => setShowSignIn(true)}>
+              Sign In
+            </Button>
+            <Button block size='large' onClick={() => setShowSignUp(true)}>
+              Create Account
+            </Button>
+          </Space>
+          <Text type='secondary' style={{ fontSize: 12 }}>
+            © {new Date().getFullYear()} PyTC Client. All rights reserved.
+          </Text>
+        </Space>
+      </Card>
+
+      {/* Sign In Modal */}
+      <Modal
+        title={<Title level={4} style={{ margin: 0 }}>Sign In</Title>}
+        open={showSignIn}
+        onCancel={() => setShowSignIn(false)}
+        onOk={handleSignIn}
+        okText='Sign In'
+        centered
+        width={360}
+        okButtonProps={{ style: { background: '#764ba2', borderColor: '#764ba2' } }}
+      >
+        <Space direction='vertical' style={{ width: '100%' }}>
+          <Input
+            prefix={<UserOutlined />}
+            placeholder='Username'
+            value={signInName}
+            onChange={e => setSignInName(e.target.value)}
+          />
+          <Input.Password
+            prefix={<LockOutlined />}
+            placeholder='Password'
+            value={signInPassword}
+            onChange={e => setSignInPassword(e.target.value)}
+            onPressEnter={handleSignIn}
+          />
+        </Space>
+      </Modal>
+
+      {/* Sign Up Modal */}
+      <Modal
+        title={<Title level={4} style={{ margin: 0 }}>Create Account</Title>}
+        open={showSignUp}
+        onCancel={() => setShowSignUp(false)}
+        onOk={handleSignUp}
+        okText='Sign Up'
+        centered
+        width={360}
+        okButtonProps={{ style: { background: '#764ba2', borderColor: '#764ba2' } }}
+      >
+        <Space direction='vertical' style={{ width: '100%' }}>
+          <Input
+            prefix={<UserOutlined />}
+            placeholder='Username'
+            value={signUpName}
+            onChange={e => setSignUpName(e.target.value)}
+          />
+          <Input.Password
+            prefix={<LockOutlined />}
+            placeholder='Password'
+            value={signUpPassword}
+            onChange={e => setSignUpPassword(e.target.value)}
+            onPressEnter={handleSignUp}
+          />
+        </Space>
+      </Modal>
+    </div>
+  );
+}
+
+export default Welcome;
diff --git a/client/src/views/Workspace.js b/client/src/views/Workspace.js
new file mode 100644
index 0000000..e6249d9
--- /dev/null
+++ b/client/src/views/Workspace.js
@@ -0,0 +1,137 @@
+import React, { useState, useEffect } from 'react'
+import DataLoader from './DataLoader'
+import Visualization from '../views/Visualization'
+import ModelTraining from '../views/ModelTraining'
+import ModelInference from '../views/ModelInference'
+import Monitoring from '../views/Monitoring'
+import ProofReading from '../views/ProofReading'
+import Chatbot from '../components/Chatbot'
+import { Layout, Menu, Button } from 'antd'
+import { MessageOutlined } from '@ant-design/icons'
+import { getNeuroglancerViewer } from '../utils/api'
+
+const { Content, Sider } = Layout
+
+function Workspace() {
+ const [current, setCurrent] = useState('visualization')
+ const [viewers, setViewers] = useState([])
+ const [isLoading, setIsLoading] = useState(false)
+ const [isInferring, setIsInferring] = useState(false)
+ const [isChatOpen, setIsChatOpen] = useState(false)
+ const [collapsed, setCollapsed] = useState(false)
+
+ const onClick = (e) => {
+ setCurrent(e.key)
+ }
+
+ const items = [
+ { label: 'Visualization', key: 'visualization' },
+ { label: 'Model Training', key: 'training' },
+ { label: 'Model Inference', key: 'inference' },
+ { label: 'Tensorboard', key: 'monitoring' },
+ { label: 'Proof Reading', key: 'proofreading' }
+ ]
+
+  const renderMenu = () => {
+    // NOTE: the JSX returned below was stripped from the original hunk; the props passed to each
+    // view (viewers / fetchNeuroglancerViewer / isInferring) are assumptions based on local state.
+    if (current === 'visualization') {
+      return <Visualization viewers={viewers} fetchNeuroglancerViewer={fetchNeuroglancerViewer} />
+    } else if (current === 'training') {
+      return <ModelTraining />
+    } else if (current === 'monitoring') {
+      return <Monitoring />
+    } else if (current === 'inference') {
+      return <ModelInference isInferring={isInferring} setIsInferring={setIsInferring} />
+    } else if (current === 'proofreading') {
+      return <ProofReading />
+    }
+  }
+
+ const fetchNeuroglancerViewer = async (
+ currentImage,
+ currentLabel,
+ scales
+ ) => {
+ setIsLoading(true)
+ try {
+ const viewerId = currentImage.uid + currentLabel.uid + JSON.stringify(scales)
+ let updatedViewers = viewers
+ const exists = viewers.find(
+ (viewer) => viewer.key === viewerId
+ )
+ if (exists) {
+ updatedViewers = viewers.filter((viewer) => viewer.key !== viewerId)
+ }
+ const res = await getNeuroglancerViewer(
+ currentImage,
+ currentLabel,
+ scales
+ )
+ const newUrl = res.replace(/\/\/[^:/]+/, '//localhost')
+ console.log('Current Viewer at ', newUrl)
+
+ setViewers([
+ ...updatedViewers,
+ {
+ key: viewerId,
+ title: currentImage.name + ' & ' + currentLabel.name,
+ viewer: newUrl
+ }
+ ])
+ setIsLoading(false)
+ } catch (e) {
+ console.log(e)
+ setIsLoading(false)
+ }
+ }
+
+ useEffect(() => {
+ if (current === 'inference' && isInferring) {
+ console.log('Inference process is continuing...')
+ }
+ }, [current, isInferring])
+
+  return (
+    // NOTE: the original JSX markup was lost from this hunk; the structure below is a minimal
+    // reconstruction using the imported antd Layout/Menu components. The Chatbot `onClose`
+    // prop name is an assumption.
+    <Layout style={{ minHeight: '100%' }}>
+      {isLoading
+        ? (<div>Loading the viewer ...</div>)
+        : (
+          <>
+            <Sider
+              collapsible
+              collapsed={collapsed}
+              onCollapse={(value) => setCollapsed(value)}
+              theme='light'
+              collapsedWidth='0'
+            >
+              <Menu
+                onClick={onClick}
+                selectedKeys={[current]}
+                mode='inline'
+                items={items}
+              />
+            </Sider>
+            <Content>
+              {renderMenu()}
+
+              {isChatOpen ? (
+                <Chatbot onClose={() => setIsChatOpen(false)} />
+              ) : (
+                <Button
+                  icon={<MessageOutlined />}
+                  onClick={() => setIsChatOpen(true)}
+                  style={{ margin: '8px 8px' }}
+                />
+              )}
+            </Content>
+          </>
+        )}
+    </Layout>
+  )
+}
+
+export default Workspace
diff --git a/neuroglancer_url.txt b/neuroglancer_url.txt
new file mode 100644
index 0000000..b17dc38
--- /dev/null
+++ b/neuroglancer_url.txt
@@ -0,0 +1 @@
+http://127.0.0.1:9000/v/4d27b74747a352dbc6a1ad52c3a9fc123fa56f42/
\ No newline at end of file
diff --git a/pyproject.toml b/pyproject.toml
index 8512f27..6d865ba 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -4,6 +4,8 @@ version = "0.1.0"
description = "PyTC Client backend services"
requires-python = ">=3.10,<3.12"
dependencies = [
+ "argon2-cffi>=25.1.0",
+ "bcrypt>=5.0.0",
"faiss-cpu==1.12.0",
"fastapi==0.119.0",
"h5py>=3.11",
@@ -13,8 +15,10 @@ dependencies = [
"langchain-ollama==1.0.0",
"neuroglancer==2.38",
"numpy>=1.24",
+ "passlib>=1.7.4",
"pillow>=10.0",
"psutil>=5.9.0",
+ "python-jose>=3.5.0",
"python-multipart==0.0.20",
"requests>=2.31",
"scipy>=1.11",
diff --git a/scripts/dev.sh b/scripts/dev.sh
index fdbfd84..417b69e 100755
--- a/scripts/dev.sh
+++ b/scripts/dev.sh
@@ -23,12 +23,19 @@ cleanup() {
if [[ -n "${PYTC_PID:-}" ]] && ps -p "${PYTC_PID}" >/dev/null 2>&1; then
kill "${PYTC_PID}" >/dev/null 2>&1 || true
fi
+ if [[ -n "${DATA_SERVER_PID:-}" ]] && ps -p "${DATA_SERVER_PID}" >/dev/null 2>&1; then
+ kill "${DATA_SERVER_PID}" >/dev/null 2>&1 || true
+ fi
wait || true
exit "${exit_code}"
}
trap cleanup EXIT INT TERM
+echo "Starting Data Server (port 8000)..."
+uv run --directory "${ROOT_DIR}" python server_api/scripts/serve_data.py &
+DATA_SERVER_PID=$!
+
echo "Starting API server..."
uv run --directory "${ROOT_DIR}" python server_api/main.py &
API_PID=$!
diff --git a/server_api/auth/database.py b/server_api/auth/database.py
new file mode 100644
index 0000000..0d6d70e
--- /dev/null
+++ b/server_api/auth/database.py
@@ -0,0 +1,20 @@
+from sqlalchemy import create_engine
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import sessionmaker
+
+SQLALCHEMY_DATABASE_URL = "sqlite:///./sql_app.db"
+# SQLALCHEMY_DATABASE_URL = "postgresql://user:password@postgresserver/db"
+
+engine = create_engine(
+ SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
+)
+SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
+
+Base = declarative_base()
+
+def get_db():
+ db = SessionLocal()
+ try:
+ yield db
+ finally:
+ db.close()
diff --git a/server_api/auth/models.py b/server_api/auth/models.py
new file mode 100644
index 0000000..3882bd9
--- /dev/null
+++ b/server_api/auth/models.py
@@ -0,0 +1,133 @@
+from sqlalchemy import Column, Integer, String, DateTime, ForeignKey, Boolean, Float
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql import func
+from .database import Base
+from pydantic import BaseModel
+from typing import Optional, List
+from datetime import datetime
+
+# SQLAlchemy Model
+class User(Base):
+ __tablename__ = "users"
+
+ id = Column(Integer, primary_key=True, index=True)
+ username = Column(String, unique=True, index=True)
+ email = Column(String, unique=True, index=True, nullable=True)
+ hashed_password = Column(String)
+ created_at = Column(DateTime(timezone=True), server_default=func.now())
+
+ files = relationship("File", back_populates="owner")
+
+class File(Base):
+ __tablename__ = "files"
+
+ id = Column(Integer, primary_key=True, index=True)
+ user_id = Column(Integer, ForeignKey("users.id"))
+ name = Column(String, index=True)
+ path = Column(String, default="root") # Virtual parent folder key
+ is_folder = Column(Boolean, default=False)
+ size = Column(String, default="0KB")
+ type = Column(String, default="unknown")
+ physical_path = Column(String, nullable=True) # Path on disk
+ created_at = Column(DateTime(timezone=True), server_default=func.now())
+
+ owner = relationship("User", back_populates="files")
+
+class Project(Base):
+ __tablename__ = "projects"
+
+ id = Column(Integer, primary_key=True, index=True)
+ name = Column(String, nullable=False)
+ user_id = Column(Integer, ForeignKey("users.id"), nullable=True)
+ neuroglancer_url = Column(String, nullable=True)
+ image_path = Column(String, nullable=True)
+ label_path = Column(String, nullable=True)
+ created_at = Column(DateTime(timezone=True), server_default=func.now())
+
+class Synapse(Base):
+ __tablename__ = "synapses"
+
+ id = Column(Integer, primary_key=True, index=True)
+ project_id = Column(Integer, ForeignKey("projects.id"))
+ pre_neuron_id = Column(Integer, nullable=True)
+ post_neuron_id = Column(Integer, nullable=True)
+ x = Column(Float, nullable=False)
+ y = Column(Float, nullable=False)
+ z = Column(Float, nullable=False)
+ status = Column(String, default="error") # error, correct, incorrect, unsure
+ confidence = Column(Float, nullable=True)
+ reviewed_by = Column(Integer, ForeignKey("users.id"), nullable=True)
+ reviewed_at = Column(DateTime(timezone=True), nullable=True)
+ created_at = Column(DateTime(timezone=True), server_default=func.now())
+
+# Pydantic Schemas
+class UserBase(BaseModel):
+ username: str
+ email: Optional[str] = None
+
+class UserCreate(UserBase):
+ password: str
+
+class UserResponse(UserBase):
+ id: int
+ created_at: datetime
+
+ class Config:
+ from_attributes = True
+
+class FileBase(BaseModel):
+ name: str
+ path: str = "root"
+ is_folder: bool = False
+ size: str = "0KB"
+ type: str = "unknown"
+
+class FileCreate(FileBase):
+ pass
+
+class FileUpdate(BaseModel):
+ name: Optional[str] = None
+ path: Optional[str] = None
+
+class FileResponse(FileBase):
+ id: int
+ user_id: int
+ created_at: datetime
+
+ class Config:
+ from_attributes = True
+
+class Token(BaseModel):
+ access_token: str
+ token_type: str
+
+class TokenData(BaseModel):
+ username: Optional[str] = None
+
+# Synapse Proofreading Schemas
+class SynapseResponse(BaseModel):
+ id: int
+ project_id: int
+ pre_neuron_id: Optional[int]
+ post_neuron_id: Optional[int]
+ x: float
+ y: float
+ z: float
+ status: str
+ confidence: Optional[float]
+
+ class Config:
+ from_attributes = True
+
+class SynapseUpdate(BaseModel):
+ status: Optional[str] = None
+ pre_neuron_id: Optional[int] = None
+ post_neuron_id: Optional[int] = None
+
+class ProjectResponse(BaseModel):
+ id: int
+ name: str
+ neuroglancer_url: Optional[str]
+
+ class Config:
+ from_attributes = True
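+
+# Illustrative note: with `from_attributes = True`, ORM rows can be serialized directly
+# (e.g. SynapseResponse.model_validate(row)), and a partial request body such as
+# {"status": "correct"} validates into SynapseUpdate with the remaining fields left as None,
+# which is what lets the PUT endpoint apply only the provided fields.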
diff --git a/server_api/auth/router.py b/server_api/auth/router.py
new file mode 100644
index 0000000..dc1a18e
--- /dev/null
+++ b/server_api/auth/router.py
@@ -0,0 +1,185 @@
+from fastapi import APIRouter, Depends, HTTPException, status, UploadFile, File, Form
+from fastapi.security import OAuth2PasswordBearer, OAuth2PasswordRequestForm
+from sqlalchemy.orm import Session
+from . import models, utils, database
+from jose import JWTError, jwt
+from typing import List, Optional
+import shutil
+import os
+import uuid
+
+router = APIRouter()
+
+oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")
+
+def get_current_user(token: str = Depends(oauth2_scheme), db: Session = Depends(database.get_db)):
+ credentials_exception = HTTPException(
+ status_code=status.HTTP_401_UNAUTHORIZED,
+ detail="Could not validate credentials",
+ headers={"WWW-Authenticate": "Bearer"},
+ )
+ try:
+ payload = jwt.decode(token, utils.SECRET_KEY, algorithms=[utils.ALGORITHM])
+ username: str = payload.get("sub")
+ if username is None:
+ raise credentials_exception
+ token_data = models.TokenData(username=username)
+ except JWTError:
+ raise credentials_exception
+ user = db.query(models.User).filter(models.User.username == token_data.username).first()
+ if user is None:
+ raise credentials_exception
+ return user
+
+@router.post("/register", response_model=models.UserResponse)
+def register(user: models.UserCreate, db: Session = Depends(database.get_db)):
+ db_user = db.query(models.User).filter(models.User.username == user.username).first()
+ if db_user:
+ raise HTTPException(status_code=400, detail="Username already registered")
+
+ hashed_password = utils.get_password_hash(user.password)
+ new_user = models.User(username=user.username, email=user.email, hashed_password=hashed_password)
+ db.add(new_user)
+ db.commit()
+ db.refresh(new_user)
+ return new_user
+
+@router.post("/token", response_model=models.Token)
+def login_for_access_token(form_data: OAuth2PasswordRequestForm = Depends(), db: Session = Depends(database.get_db)):
+ user = db.query(models.User).filter(models.User.username == form_data.username).first()
+ if not user or not utils.verify_password(form_data.password, user.hashed_password):
+ raise HTTPException(
+ status_code=status.HTTP_401_UNAUTHORIZED,
+ detail="Incorrect username or password",
+ headers={"WWW-Authenticate": "Bearer"},
+ )
+ access_token_expires = utils.timedelta(minutes=utils.ACCESS_TOKEN_EXPIRE_MINUTES)
+ access_token = utils.create_access_token(
+ data={"sub": user.username}, expires_delta=access_token_expires
+ )
+ return {"access_token": access_token, "token_type": "bearer"}
+
+@router.get("/users/me", response_model=models.UserResponse)
+def read_users_me(current_user: models.User = Depends(get_current_user)):
+ return current_user
+
+# File Management Endpoints
+
+@router.get("/files", response_model=List[models.FileResponse])
+def get_files(current_user: models.User = Depends(get_current_user), db: Session = Depends(database.get_db)):
+ return current_user.files
+
+@router.post("/files/upload", response_model=models.FileResponse)
+def upload_file(
+ path: str = Form(...),
+ file: UploadFile = File(...),
+ current_user: models.User = Depends(get_current_user),
+ db: Session = Depends(database.get_db)
+):
+ # Create uploads directory if not exists
+ upload_dir = f"uploads/{current_user.id}"
+ os.makedirs(upload_dir, exist_ok=True)
+
+ # Generate unique filename
+ file_ext = os.path.splitext(file.filename)[1]
+ unique_filename = f"{uuid.uuid4()}{file_ext}"
+ file_path = f"{upload_dir}/{unique_filename}"
+
+ # Save file
+ with open(file_path, "wb") as buffer:
+ shutil.copyfileobj(file.file, buffer)
+
+ # Calculate size (approx)
+ size_bytes = os.path.getsize(file_path)
+ size_str = f"{size_bytes / 1024:.1f}KB" if size_bytes < 1024 * 1024 else f"{size_bytes / (1024 * 1024):.1f}MB"
+
+ new_file = models.File(
+ user_id=current_user.id,
+ name=file.filename,
+ path=path,
+ is_folder=False,
+ size=size_str,
+ type=file.content_type or "unknown",
+ physical_path=file_path
+ )
+ db.add(new_file)
+ db.commit()
+ db.refresh(new_file)
+ return new_file
+
+@router.post("/files/folder", response_model=models.FileResponse)
+def create_folder(
+ file: models.FileCreate,
+ current_user: models.User = Depends(get_current_user),
+ db: Session = Depends(database.get_db)
+):
+ new_folder = models.File(
+ user_id=current_user.id,
+ name=file.name,
+ path=file.path,
+ is_folder=True,
+ size="0KB",
+ type="folder"
+ )
+ db.add(new_folder)
+ db.commit()
+ db.refresh(new_folder)
+ return new_folder
+
+@router.put("/files/{file_id}", response_model=models.FileResponse)
+def update_file(
+ file_id: int,
+ file_update: models.FileUpdate,
+ current_user: models.User = Depends(get_current_user),
+ db: Session = Depends(database.get_db)
+):
+ file = db.query(models.File).filter(models.File.id == file_id, models.File.user_id == current_user.id).first()
+ if not file:
+ raise HTTPException(status_code=404, detail="File not found")
+
+ # Only update provided fields
+ if file_update.name is not None:
+ file.name = file_update.name
+ if file_update.path is not None:
+ file.path = file_update.path
+
+ db.commit()
+ db.refresh(file)
+ return file
+
+@router.delete("/files/{file_id}")
+def delete_file(
+ file_id: int,
+ current_user: models.User = Depends(get_current_user),
+ db: Session = Depends(database.get_db)
+):
+ file = db.query(models.File).filter(models.File.id == file_id, models.File.user_id == current_user.id).first()
+ if not file:
+ raise HTTPException(status_code=404, detail="File not found")
+
+ # If it's a file, delete from disk
+ if not file.is_folder and file.physical_path and os.path.exists(file.physical_path):
+ os.remove(file.physical_path)
+
+    # Folder deletion: children reference their parent through `path`. The helper below assumes
+    # the frontend keys folders by backend ID (as a string); it is defined but not yet invoked,
+    # so deleting a folder currently orphans its children until frontend keys and backend IDs
+    # are aligned.
+    def delete_recursive(parent_key):
+        children = db.query(models.File).filter(models.File.path == parent_key, models.File.user_id == current_user.id).all()
+        for child in children:
+            if child.is_folder:
+                delete_recursive(str(child.id))
+            elif child.physical_path and os.path.exists(child.physical_path):
+                os.remove(child.physical_path)
+            db.delete(child)
+
+    # TODO: call delete_recursive(str(file.id)) for folders once the keys are aligned.
+    # For now, delete only the item itself.
+ db.delete(file)
+ db.commit()
+ return {"message": "File deleted"}
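+
+# Example usage (illustrative; host/port and credentials are placeholders — the React client
+# talks to this API on localhost:4243 in development):
+#
+#   curl -X POST http://localhost:4243/register \
+#        -H "Content-Type: application/json" \
+#        -d '{"username": "demo", "password": "demo-pass"}'
+#
+#   curl -X POST http://localhost:4243/token \
+#        -d "username=demo&password=demo-pass"
+#
+#   curl http://localhost:4243/users/me -H "Authorization: Bearer <access_token>"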
diff --git a/server_api/auth/utils.py b/server_api/auth/utils.py
new file mode 100644
index 0000000..6d4cf54
--- /dev/null
+++ b/server_api/auth/utils.py
@@ -0,0 +1,27 @@
+from passlib.context import CryptContext
+from datetime import datetime, timedelta
+from typing import Optional
+from jose import JWTError, jwt
+
+# Constants
+SECRET_KEY = "your-secret-key-keep-it-secret" # TODO: Move to env var
+ALGORITHM = "HS256"
+ACCESS_TOKEN_EXPIRE_MINUTES = 30
+
+pwd_context = CryptContext(schemes=["argon2"], deprecated="auto")
+
+def verify_password(plain_password, hashed_password):
+ return pwd_context.verify(plain_password, hashed_password)
+
+def get_password_hash(password):
+ return pwd_context.hash(password)
+
+def create_access_token(data: dict, expires_delta: Optional[timedelta] = None):
+ to_encode = data.copy()
+ if expires_delta:
+ expire = datetime.utcnow() + expires_delta
+ else:
+ expire = datetime.utcnow() + timedelta(minutes=15)
+ to_encode.update({"exp": expire})
+ encoded_jwt = jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)
+ return encoded_jwt
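+
+# Illustrative self-check (not part of the API): exercises the helpers above with a placeholder
+# username/password so the module can be run directly, e.g. `python server_api/auth/utils.py`.
+if __name__ == "__main__":
+    hashed = get_password_hash("example-password")
+    assert verify_password("example-password", hashed)
+
+    token = create_access_token({"sub": "demo-user"}, expires_delta=timedelta(minutes=5))
+    decoded = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
+    print("token issued for:", decoded["sub"])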
diff --git a/server_api/main.py b/server_api/main.py
index 648c5d9..b4e6727 100644
--- a/server_api/main.py
+++ b/server_api/main.py
@@ -11,12 +11,26 @@
from utils.io import readVol
from utils.utils import process_path
from chatbot.chatbot import chain, memory
+from auth import models, database, router as auth_router
+from synanno import router as synanno_router
+
+from fastapi.staticfiles import StaticFiles
+import os
REACT_APP_SERVER_PROTOCOL = "http"
REACT_APP_SERVER_URL = "localhost:4243"
+models.Base.metadata.create_all(bind=database.engine)
+
app = FastAPI()
+# Ensure uploads directory exists
+os.makedirs("uploads", exist_ok=True)
+app.mount("/uploads", StaticFiles(directory="uploads"), name="uploads")
+
+app.include_router(auth_router.router)
+app.include_router(synanno_router.router, tags=["synanno"])
+
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
diff --git a/server_api/requirements.txt b/server_api/requirements.txt
index f18d91d..56c96f2 100644
--- a/server_api/requirements.txt
+++ b/server_api/requirements.txt
@@ -14,5 +14,9 @@ requests>=2.31
scipy>=1.11
tensorboard==2.20.0
tensorboard-data-server==0.7.2
+python-jose[cryptography]
+passlib[argon2]
+argon2-cffi
+sqlalchemy
uvicorn==0.38.0
zarr>=2.18
diff --git a/server_api/scripts/convert_to_nifti.py b/server_api/scripts/convert_to_nifti.py
new file mode 100644
index 0000000..0e218ff
--- /dev/null
+++ b/server_api/scripts/convert_to_nifti.py
@@ -0,0 +1,76 @@
+import os
+import numpy as np
+import nibabel as nib
+from PIL import Image
+import glob
+
+def convert_tiff_to_nifti(tiff_path, output_path, resolution=(5, 5, 5)):
+ """
+ Convert a multi-page TIFF or sequence of TIFFs to a NIfTI file.
+ """
+ print(f"Converting {tiff_path} to {output_path}...")
+
+ # Check if it's a single multi-page TIFF or a pattern
+ if '*' in tiff_path:
+ files = sorted(glob.glob(tiff_path))
+ if not files:
+ print(f"No files found for pattern {tiff_path}")
+ return False
+ # Read first image to get dimensions
+ img0 = np.array(Image.open(files[0]))
+ volume = np.zeros((len(files), img0.shape[0], img0.shape[1]), dtype=img0.dtype)
+ for i, f in enumerate(files):
+ volume[i] = np.array(Image.open(f))
+ else:
+ # Multi-page TIFF
+ if not os.path.exists(tiff_path):
+ print(f"File not found: {tiff_path}")
+ return False
+
+ img = Image.open(tiff_path)
+ frames = []
+ try:
+ while True:
+ frames.append(np.array(img))
+ img.seek(img.tell() + 1)
+ except EOFError:
+ pass
+ volume = np.array(frames)
+
+    # NIfTI expects (x, y, z) [optionally + t]; the frames were read in as (z, y, x),
+    # so transpose to (x, y, z), which is what Neuroglancer expects from a NIfTI source.
+    volume = np.transpose(volume, (2, 1, 0))
+
+    # NIfTI affines are conventionally in mm (5 nm = 5e-6 mm). Use an identity affine and record
+    # the nominal per-voxel resolution via the header zooms below; the real physical scale is
+    # declared in the Neuroglancer state instead.
+    affine = np.eye(4)
+
+ nifti_img = nib.Nifti1Image(volume, affine)
+ nifti_img.header.set_zooms(resolution) # This sets pixdim
+
+ nib.save(nifti_img, output_path)
+ print(f"Saved {output_path}")
+ return True
+
+if __name__ == "__main__":
+ # Use current working directory to find samples
+ # Assuming we run from project root
+ samples_dir = os.path.join(os.getcwd(), "samples_pytc")
+
+ if not os.path.exists(samples_dir):
+ print(f"Samples directory not found at {samples_dir}")
+ exit(1)
+ img_path = os.path.join(samples_dir, "lucchiIm.tif")
+ img_out = os.path.join(samples_dir, "lucchiIm.nii.gz")
+ convert_tiff_to_nifti(img_path, img_out)
+
+ # Convert Labels
+ lbl_path = os.path.join(samples_dir, "lucchiLabels.tif")
+ lbl_out = os.path.join(samples_dir, "lucchiLabels.nii.gz")
+ convert_tiff_to_nifti(lbl_path, lbl_out)
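+
+    # Optional sanity check (illustrative): confirm the converted volumes load and report the
+    # expected (x, y, z) shape and voxel zooms before pointing Neuroglancer at them.
+    for out_path in (img_out, lbl_out):
+        if os.path.exists(out_path):
+            vol = nib.load(out_path)
+            print(f"{out_path}: shape={vol.shape}, zooms={vol.header.get_zooms()}")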
diff --git a/server_api/scripts/create_mock_synapses.py b/server_api/scripts/create_mock_synapses.py
new file mode 100644
index 0000000..65708d4
--- /dev/null
+++ b/server_api/scripts/create_mock_synapses.py
@@ -0,0 +1,140 @@
+"""
+Create mock synapse data for testing the Proof Reading tab
+"""
+import sqlite3
+import random
+from datetime import datetime
+
+# Connect to database
+conn = sqlite3.connect('sql_app.db')
+cursor = conn.cursor()
+
+print("Creating database tables...")
+
+# Create projects table
+cursor.execute('''
+CREATE TABLE IF NOT EXISTS projects (
+ id INTEGER PRIMARY KEY AUTOINCREMENT,
+ name TEXT NOT NULL,
+ user_id INTEGER,
+ neuroglancer_url TEXT,
+ created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+)
+''')
+
+# Create synapses table
+cursor.execute('''
+CREATE TABLE IF NOT EXISTS synapses (
+ id INTEGER PRIMARY KEY AUTOINCREMENT,
+ project_id INTEGER,
+ pre_neuron_id INTEGER,
+ post_neuron_id INTEGER,
+ x REAL NOT NULL,
+ y REAL NOT NULL,
+ z REAL NOT NULL,
+ status TEXT DEFAULT 'error',
+ confidence REAL,
+ reviewed_by INTEGER,
+ reviewed_at TIMESTAMP,
+ created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+ FOREIGN KEY (project_id) REFERENCES projects(id)
+)
+''')
+
+print("Tables created successfully!")
+
+# Check if mock project already exists
+cursor.execute("SELECT id FROM projects WHERE name = 'Mock Synapse Project'")
+existing_project = cursor.fetchone()
+
+if existing_project:
+ project_id = existing_project[0]
+ print(f"\nMock project already exists with ID {project_id}")
+
+ # Ask if user wants to recreate data
+ response = input("Do you want to recreate the synapse data? (y/n): ")
+ if response.lower() == 'y':
+ cursor.execute("DELETE FROM synapses WHERE project_id = ?", (project_id,))
+ print("Deleted existing synapses")
+ else:
+ print("Keeping existing data")
+ conn.close()
+ exit(0)
+else:
+ # Create mock project
+ print("\nCreating mock project...")
+ cursor.execute('''
+ INSERT INTO projects (name, user_id, neuroglancer_url)
+ VALUES (?, ?, ?)
+ ''', ('Mock Synapse Project', 1, 'http://localhost:8080'))
+
+ project_id = cursor.lastrowid
+ print(f"Created project with ID {project_id}")
+
+# Create 100 mock synapses
+print(f"\nGenerating 100 mock synapses...")
+
+# Distribution: 80 errors, 15 correct, 5 incorrect
+statuses = ['error'] * 80 + ['correct'] * 15 + ['incorrect'] * 5
+random.shuffle(statuses)
+
+for i in range(100):
+ # Random coordinates within 100x100x100 volume
+ x = random.uniform(0, 100)
+ y = random.uniform(0, 100)
+ z = random.uniform(0, 100)
+
+ # Random neuron IDs (some may be None for errors)
+ if statuses[i] == 'error':
+ # Errors may have missing or incorrect neuron IDs
+ pre_id = random.randint(1, 50) if random.random() > 0.3 else None
+ post_id = random.randint(1, 50) if random.random() > 0.3 else None
+ else:
+ pre_id = random.randint(1, 50)
+ post_id = random.randint(1, 50)
+
+ confidence = random.uniform(0.5, 0.99)
+
+ cursor.execute('''
+ INSERT INTO synapses (
+ project_id, pre_neuron_id, post_neuron_id,
+ x, y, z, status, confidence
+ ) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
+ ''', (
+ project_id,
+ pre_id,
+ post_id,
+ x, y, z,
+ statuses[i],
+ confidence
+ ))
+
+conn.commit()
+
+# Print summary
+print(f"\n{'='*60}")
+print("Mock data created successfully!")
+print(f"{'='*60}")
+print(f"Project ID: {project_id}")
+print(f"Project Name: Mock Synapse Project")
+print(f"Total Synapses: 100")
+print(f" - Errors (need review): 80")
+print(f" - Correct: 15")
+print(f" - Incorrect: 5")
+print(f"{'='*60}\n")
+
+# Show sample data
+print("Sample synapses:")
+cursor.execute('''
+SELECT id, x, y, z, status, pre_neuron_id, post_neuron_id
+FROM synapses
+WHERE project_id = ?
+LIMIT 5
+''', (project_id,))
+
+for row in cursor.fetchall():
+ print(f" ID {row[0]}: ({row[1]:.1f}, {row[2]:.1f}, {row[3]:.1f}) - Status: {row[4]}, Pre: {row[5]}, Post: {row[6]}")
+
+conn.close()
+print("\nDatabase connection closed.")
+print("You can now start the backend server and use the Proof Reading tab!")
diff --git a/server_api/scripts/create_synthetic_samples.py b/server_api/scripts/create_synthetic_samples.py
new file mode 100644
index 0000000..8f56438
--- /dev/null
+++ b/server_api/scripts/create_synthetic_samples.py
@@ -0,0 +1,123 @@
+"""
+Create synthetic sample TIFF files for Neuroglancer demonstration.
+These are placeholders until real Lucchi dataset files are downloaded.
+
+The Lucchi dataset specifications:
+- Image dimensions: typically 165 x 1024 x 768 (z, y, x)
+- Resolution: 5nm per voxel
+- Type: EM (electron microscopy) grayscale images
+"""
+
+import numpy as np
+from PIL import Image
+import os
+
+def create_synthetic_volume(output_path, shape=(165, 1024, 768), is_label=False):
+ """
+ Create a synthetic 3D volume and save as multi-page TIFF
+
+ Args:
+ output_path: Path to save the TIFF file
+ shape: (z, y, x) dimensions
+ is_label: If True, create segmentation labels; if False, create grayscale image
+ """
+ print(f"Creating synthetic volume: {output_path}")
+ print(f"Dimensions (z,y,x): {shape}")
+
+ # Create directory if it doesn't exist
+ os.makedirs(os.path.dirname(output_path), exist_ok=True)
+
+ if is_label:
+ # Create synthetic segmentation labels
+ # Simulate mitochondria as random ellipsoids
+ volume = np.zeros(shape, dtype=np.uint8)
+
+ # Add some random "mitochondria" labels
+ num_mitochondria = 50
+ for i in range(1, num_mitochondria + 1):
+ # Random center
+ z_center = np.random.randint(10, shape[0] - 10)
+ y_center = np.random.randint(100, shape[1] - 100)
+ x_center = np.random.randint(100, shape[2] - 100)
+
+ # Random size
+ z_radius = np.random.randint(3, 8)
+ y_radius = np.random.randint(20, 50)
+ x_radius = np.random.randint(20, 50)
+
+ # Create ellipsoid
+ for z in range(max(0, z_center - z_radius), min(shape[0], z_center + z_radius)):
+ for y in range(max(0, y_center - y_radius), min(shape[1], y_center + y_radius)):
+ for x in range(max(0, x_center - x_radius), min(shape[2], x_center + x_radius)):
+ # Ellipsoid equation
+ if ((z - z_center)**2 / z_radius**2 +
+ (y - y_center)**2 / y_radius**2 +
+ (x - x_center)**2 / x_radius**2) <= 1:
+ volume[z, y, x] = i # Label ID
+
+ print(f"Created {num_mitochondria} synthetic mitochondria")
+ else:
+ # Create synthetic grayscale EM-like image
+ # Use Perlin-like noise for realistic texture
+ volume = np.random.randint(50, 200, shape, dtype=np.uint8)
+
+ # Add some structure (simulate cell membranes)
+ for _ in range(20):
+ z = np.random.randint(0, shape[0])
+ y_start = np.random.randint(0, shape[1] - 100)
+ x_start = np.random.randint(0, shape[2] - 100)
+
+ # Draw a "membrane" line
+ for i in range(100):
+ y = min(y_start + i, shape[1] - 1)
+ x = min(x_start + i + np.random.randint(-5, 5), shape[2] - 1)
+ volume[z, y, x] = 30 # Dark line
+
+ # Save as multi-page TIFF
+ images = [Image.fromarray(volume[i]) for i in range(shape[0])]
+ images[0].save(
+ output_path,
+ save_all=True,
+ append_images=images[1:],
+ compression='tiff_deflate'
+ )
+
+ print(f"Saved to: {output_path}")
+ print(f"File size: {os.path.getsize(output_path) / (1024*1024):.2f} MB\n")
+
+if __name__ == "__main__":
+ # Create samples_pytc directory
+ os.makedirs("samples_pytc", exist_ok=True)
+
+ print("="*60)
+ print("Creating Synthetic Lucchi-like Dataset")
+ print("="*60)
+ print("\nNOTE: These are SYNTHETIC placeholder files.")
+ print("For real data, download from:")
+ print(" https://www.epfl.ch/labs/cvlab/data/data-em/")
+ print(" or search for 'Lucchi mitochondria dataset'\n")
+ print("="*60)
+ print()
+
+ # Create synthetic image (smaller for demo - 50 slices instead of 165)
+ create_synthetic_volume(
+ "samples_pytc/lucchiIm.tif",
+ shape=(50, 512, 512), # Smaller for faster loading
+ is_label=False
+ )
+
+ # Create synthetic labels
+ create_synthetic_volume(
+ "samples_pytc/lucchiLabels.tif",
+ shape=(50, 512, 512),
+ is_label=True
+ )
+
+ print("="*60)
+ print("✅ Synthetic sample files created successfully!")
+ print("="*60)
+ print("\nFiles created:")
+ print(" - samples_pytc/lucchiIm.tif (grayscale EM-like image)")
+ print(" - samples_pytc/lucchiLabels.tif (mitochondria segmentation)")
+ print("\nThese files are ready to use with Neuroglancer!")
+ print("="*60)
diff --git a/server_api/scripts/serve_data.py b/server_api/scripts/serve_data.py
new file mode 100644
index 0000000..3c001e8
--- /dev/null
+++ b/server_api/scripts/serve_data.py
@@ -0,0 +1,37 @@
+import http.server
+import socketserver
+import os
+import sys
+
+PORT = 8000
+DIRECTORY = "samples_pytc"
+
+class CORSRequestHandler(http.server.SimpleHTTPRequestHandler):
+ def end_headers(self):
+ self.send_header('Access-Control-Allow-Origin', '*')
+ self.send_header('Access-Control-Allow-Methods', 'GET')
+ self.send_header('Cache-Control', 'no-store, no-cache, must-revalidate')
+ return super().end_headers()
+
+ def translate_path(self, path):
+ # Ensure we serve from the correct directory relative to where the script is run
+ # or where the data is located.
+ # This script assumes it's run from the project root or we explicitly set the dir.
+ return super().translate_path(path)
+
+if __name__ == "__main__":
+ # Change to the directory we want to serve
+ target_dir = os.path.join(os.getcwd(), DIRECTORY)
+ if not os.path.exists(target_dir):
+ print(f"Directory {target_dir} not found. Creating it...")
+ os.makedirs(target_dir, exist_ok=True)
+
+ os.chdir(target_dir)
+
+ print(f"Serving directory {target_dir} at http://localhost:{PORT}")
+
+ with socketserver.TCPServer(("", PORT), CORSRequestHandler) as httpd:
+ try:
+ httpd.serve_forever()
+ except KeyboardInterrupt:
+ pass
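+
+# Example check (illustrative): with this server running, the converted sample volumes should be
+# reachable with CORS headers, which is what lets the Neuroglancer web client load them through
+# a source like "nifti://http://localhost:8000/lucchiIm.nii.gz":
+#
+#   curl -I http://localhost:8000/lucchiIm.nii.gz
+#   # expect HTTP 200 plus "Access-Control-Allow-Origin: *"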
diff --git a/server_api/scripts/update_synapses_with_samples.py b/server_api/scripts/update_synapses_with_samples.py
new file mode 100644
index 0000000..af05f2b
--- /dev/null
+++ b/server_api/scripts/update_synapses_with_samples.py
@@ -0,0 +1,141 @@
+"""
+Update mock synapse data to use realistic positions based on Lucchi sample dimensions.
+Also update the project to link to the sample image files.
+
+Sample dimensions: 50 x 512 x 512 (z, y, x)
+Resolution: 5nm per voxel
+"""
+import sqlite3
+import random
+import os
+from datetime import datetime
+
+# Connect to database
+conn = sqlite3.connect('sql_app.db')
+cursor = conn.cursor()
+
+print("Updating database with Lucchi sample integration...")
+
+# Get absolute paths to sample files
+base_dir = os.path.abspath('.')
+image_path = os.path.join(base_dir, 'samples_pytc', 'lucchiIm.tif')
+label_path = os.path.join(base_dir, 'samples_pytc', 'lucchiLabels.tif')
+
+print(f"\nImage path: {image_path}")
+print(f"Label path: {label_path}")
+
+# Check if files exist
+if not os.path.exists(image_path):
+ print(f"WARNING: Image file not found: {image_path}")
+ print("Run: python server_api/scripts/create_synthetic_samples.py")
+else:
+ print(f"✓ Image file exists ({os.path.getsize(image_path) / (1024*1024):.2f} MB)")
+
+if not os.path.exists(label_path):
+ print(f"WARNING: Label file not found: {label_path}")
+else:
+ print(f"✓ Label file exists ({os.path.getsize(label_path) / (1024*1024):.2f} MB)")
+
+# Update the projects table to include image and label paths. SQLite's ALTER TABLE has no
+# IF NOT EXISTS, so ignore the error when the columns were already added by a previous run.
+for column in ("image_path", "label_path"):
+    try:
+        cursor.execute(f"ALTER TABLE projects ADD COLUMN {column} TEXT")
+    except sqlite3.OperationalError:
+        pass  # column already exists
+
+print("\n✓ Ensured image_path and label_path columns exist on the projects table")
+
+# Update the mock project with file paths
+cursor.execute('''
+UPDATE projects
+SET image_path = ?, label_path = ?
+WHERE name = 'Mock Synapse Project'
+''', (image_path, label_path))
+
+print("✓ Updated Mock Synapse Project with sample file paths")
+
+# Delete existing synapses
+cursor.execute("DELETE FROM synapses WHERE project_id = 1")
+print("\n✓ Deleted old synapse data")
+
+# Image dimensions (z, y, x) in voxels
+z_max, y_max, x_max = 50, 512, 512
+resolution_nm = 5 # 5nm per voxel
+
+# Convert to nm for synapse positions
+z_max_nm = z_max * resolution_nm
+y_max_nm = y_max * resolution_nm
+x_max_nm = x_max * resolution_nm
+
+print(f"\nImage dimensions:")
+print(f" Voxels: {z_max} x {y_max} x {x_max}")
+print(f" Physical size: {z_max_nm}nm x {y_max_nm}nm x {x_max_nm}nm")
+print(f" ({z_max_nm/1000:.1f}µm x {y_max_nm/1000:.1f}µm x {x_max_nm/1000:.1f}µm)")
+
+# Create 100 new synapses with realistic positions
+print(f"\nGenerating 100 synapses within image bounds...")
+
+# Distribution: 80 errors, 15 correct, 5 incorrect
+statuses = ['error'] * 80 + ['correct'] * 15 + ['incorrect'] * 5
+random.shuffle(statuses)
+
+for i in range(100):
+ # Random coordinates within image bounds (in nm)
+ x = random.uniform(0, x_max_nm)
+ y = random.uniform(0, y_max_nm)
+ z = random.uniform(0, z_max_nm)
+
+ # Random neuron IDs (some may be None for errors)
+ if statuses[i] == 'error':
+ pre_id = random.randint(1, 50) if random.random() > 0.3 else None
+ post_id = random.randint(1, 50) if random.random() > 0.3 else None
+ else:
+ pre_id = random.randint(1, 50)
+ post_id = random.randint(1, 50)
+
+ confidence = random.uniform(0.6, 0.95)
+
+ cursor.execute('''
+ INSERT INTO synapses (
+ project_id, pre_neuron_id, post_neuron_id,
+ x, y, z, status, confidence
+ ) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
+ ''', (
+ 1, # project_id
+ pre_id,
+ post_id,
+ x, y, z,
+ statuses[i],
+ confidence
+ ))
+
+conn.commit()
+
+print(f"✓ Created 100 synapses")
+print(f" - Errors (need review): 80")
+print(f" - Correct: 15")
+print(f" - Incorrect: 5")
+
+# Show sample data
+print("\nSample synapses:")
+cursor.execute('''
+SELECT id, x, y, z, status, pre_neuron_id, post_neuron_id
+FROM synapses
+WHERE project_id = 1
+LIMIT 5
+''')
+
+for row in cursor.fetchall():
+ print(f" ID {row[0]}: ({row[1]:.1f}, {row[2]:.1f}, {row[3]:.1f})nm - Status: {row[4]}, Pre: {row[5]}, Post: {row[6]}")
+
+conn.close()
+
+print("\n" + "="*60)
+print("Database updated successfully!")
+print("="*60)
+print("\nNext steps:")
+print("1. Restart the backend server")
+print("2. Navigate to Proof Reading tab")
+print("3. Neuroglancer will load with the sample data!")
+print("="*60)
diff --git a/server_api/synanno/__init__.py b/server_api/synanno/__init__.py
new file mode 100644
index 0000000..4ef953c
--- /dev/null
+++ b/server_api/synanno/__init__.py
@@ -0,0 +1 @@
+from . import router
diff --git a/server_api/synanno/router.py b/server_api/synanno/router.py
new file mode 100644
index 0000000..83f3506
--- /dev/null
+++ b/server_api/synanno/router.py
@@ -0,0 +1,201 @@
+from fastapi import APIRouter, Depends, HTTPException, Query
+from sqlalchemy.orm import Session
+from typing import List, Optional
+from datetime import datetime
+from auth import models, database
+from auth.router import get_current_user
+
+router = APIRouter()
+
+@router.get("/api/synanno/ng-url/{project_id}")
+def get_neuroglancer_url(
+ project_id: int,
+ current_user: models.User = Depends(get_current_user),
+ db: Session = Depends(database.get_db)
+):
+ """
+ Get Neuroglancer URL for a project by generating a viewer from its image files
+
+ Args:
+ project_id: ID of the project
+ current_user: Authenticated user
+ db: Database session
+
+ Returns:
+ Dictionary with 'url' key containing the Neuroglancer URL
+ """
+    import os
+
+ # Get project with image paths
+ project = db.query(models.Project).filter(
+ models.Project.id == project_id
+ ).first()
+
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ # Check if project has image files
+ image_path = getattr(project, 'image_path', None)
+ label_path = getattr(project, 'label_path', None)
+
+ if not image_path or not os.path.exists(image_path):
+ # Fallback: return a placeholder message
+ return {"url": None, "message": "No image file associated with this project"}
+
+ # Generate Neuroglancer web client URL
+ try:
+ import json
+ import urllib.parse
+
+ # We assume the NIfTI files are/will be generated in the samples directory
+ # and served by the static file server at localhost:8000
+
+ # Base URL for the official Neuroglancer web client
+ # We can use the demo instance or a specific deployment
+ ng_base_url = "https://neuroglancer-demo.appspot.com/"
+
+ # Construct the state
+ # Note: NIfTI files need to be served with CORS enabled
+ ng_state = {
+ "dimensions": {
+ "x": [5e-9, "m"],
+ "y": [5e-9, "m"],
+ "z": [5e-9, "m"]
+ },
+ "position": [256, 256, 25], # Default center
+ "crossSectionScale": 1,
+ "projectionScale": 256,
+ "layers": [
+ {
+ "type": "image",
+ "source": "nifti://http://localhost:8000/lucchiIm.nii.gz",
+ "name": "EM Image",
+ "visible": True
+ },
+ {
+ "type": "segmentation",
+ "source": "nifti://http://localhost:8000/lucchiLabels.nii.gz",
+ "name": "Mitochondria",
+ "visible": True
+ }
+ ],
+ "layout": "4panel"
+ }
+
+        # Encode the state into the URL fragment. Neuroglancer reads the JSON state after "#!";
+        # percent-encode it so spaces and special characters survive being passed to the frontend,
+        # which loads this URL directly (e.g. in an iframe).
+        json_state = json.dumps(ng_state)
+        viewer_url = f"{ng_base_url}#!{urllib.parse.quote(json_state)}"
+
+ return {
+ "url": viewer_url,
+ "message": "Viewer ready (requires NIfTI files)"
+ }
+
+ except Exception as e:
+ import traceback
+ traceback.print_exc()
+ raise HTTPException(
+ status_code=500,
+ detail=f"Error preparing Neuroglancer URL: {str(e)}"
+ )
+
+@router.get("/api/projects/{id}/synapses", response_model=List[models.SynapseResponse])
+def get_synapses(
+ id: int,
+ status: Optional[str] = Query(None, description="Filter by status: error, correct, incorrect, unsure"),
+ current_user: models.User = Depends(get_current_user),
+ db: Session = Depends(database.get_db)
+):
+ """
+ Get synapses for a project, optionally filtered by status
+
+ Args:
+ id: Project ID
+ status: Optional status filter
+ current_user: Authenticated user
+ db: Database session
+
+ Returns:
+ List of synapses
+ """
+ # Verify project exists
+ project = db.query(models.Project).filter(models.Project.id == id).first()
+ if not project:
+ raise HTTPException(status_code=404, detail="Project not found")
+
+ query = db.query(models.Synapse).filter(models.Synapse.project_id == id)
+
+ if status:
+ query = query.filter(models.Synapse.status == status)
+
+ synapses = query.all()
+ return synapses
+
+@router.put("/api/synapses/{id}", response_model=models.SynapseResponse)
+def update_synapse(
+ id: int,
+ synapse_update: models.SynapseUpdate,
+ current_user: models.User = Depends(get_current_user),
+ db: Session = Depends(database.get_db)
+):
+ """
+ Update a synapse's status and/or neuron IDs
+
+ Args:
+ id: Synapse ID
+ synapse_update: Update data
+ current_user: Authenticated user
+ db: Database session
+
+ Returns:
+ Updated synapse
+ """
+ synapse = db.query(models.Synapse).filter(models.Synapse.id == id).first()
+
+ if not synapse:
+ raise HTTPException(status_code=404, detail="Synapse not found")
+
+ # Update fields if provided
+ if synapse_update.status is not None:
+ synapse.status = synapse_update.status
+ if synapse_update.pre_neuron_id is not None:
+ synapse.pre_neuron_id = synapse_update.pre_neuron_id
+ if synapse_update.post_neuron_id is not None:
+ synapse.post_neuron_id = synapse_update.post_neuron_id
+
+ # Track who reviewed and when
+ synapse.reviewed_by = current_user.id
+ synapse.reviewed_at = datetime.utcnow()
+
+ db.commit()
+ db.refresh(synapse)
+ return synapse
+
+@router.get("/api/projects", response_model=List[models.ProjectResponse])
+def get_projects(
+ current_user: models.User = Depends(get_current_user),
+ db: Session = Depends(database.get_db)
+):
+ """
+ Get all projects
+
+ Args:
+ current_user: Authenticated user
+ db: Database session
+
+ Returns:
+ List of projects
+ """
+ projects = db.query(models.Project).all()
+ return projects
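+
+# Example usage (illustrative; host/port, project and synapse IDs are placeholders):
+#
+#   # list synapses still needing review for project 1
+#   curl "http://localhost:4243/api/projects/1/synapses?status=error" \
+#        -H "Authorization: Bearer <access_token>"
+#
+#   # mark synapse 42 as correct and fill in its partner IDs
+#   curl -X PUT http://localhost:4243/api/synapses/42 \
+#        -H "Authorization: Bearer <access_token>" \
+#        -H "Content-Type: application/json" \
+#        -d '{"status": "correct", "pre_neuron_id": 7, "post_neuron_id": 12}'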
diff --git a/sql_app.db b/sql_app.db
new file mode 100644
index 0000000..ef9d952
Binary files /dev/null and b/sql_app.db differ
diff --git a/start.sh b/start.sh
index 4e756aa..5bb5bbd 100755
--- a/start.sh
+++ b/start.sh
@@ -1,6 +1,11 @@
#!/usr/bin/env bash
+# Wrapper script that runs the main dev script
+# Neuroglancer Python server is no longer started here as we are using the web client
+
set -euo pipefail
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-exec "${SCRIPT_DIR}/scripts/dev.sh" "$@"
+ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+
+# Run the main dev script
+exec "${ROOT_DIR}/scripts/dev.sh" "$@"
diff --git a/start_neuroglancer.py b/start_neuroglancer.py
new file mode 100644
index 0000000..431e479
--- /dev/null
+++ b/start_neuroglancer.py
@@ -0,0 +1,49 @@
+import neuroglancer
+import numpy as np
+import sys
+
+# Configure server - use port 9000 to avoid conflict with PyTC server on 8080
+neuroglancer.set_server_bind_address('127.0.0.1', 9000)
+
+# Create viewer
+viewer = neuroglancer.Viewer()
+
+# Create mock 3D volume (100x100x100 voxels)
+print("Generating mock volume data...", flush=True)
+volume = np.random.randint(0, 255, (100, 100, 100), dtype=np.uint8)
+
+# Add image layer
+with viewer.txn() as s:
+ s.layers['image'] = neuroglancer.ImageLayer(
+ source=neuroglancer.LocalVolume(
+ data=volume,
+ dimensions=neuroglancer.CoordinateSpace(
+ names=['x', 'y', 'z'],
+ units=['nm', 'nm', 'nm'],
+ scales=[10, 10, 10]
+ )
+ )
+ )
+
+viewer_url = str(viewer)
+
+print(f"\n{'='*60}", flush=True)
+print(f"Neuroglancer is running!", flush=True)
+print(f"{'='*60}", flush=True)
+print(f"URL: {viewer_url}", flush=True)
+print(f"\nThis URL will be used by the Proof Reading tab.", flush=True)
+print(f"Keep this script running while using the application.", flush=True)
+print(f"\nPress Ctrl+C to stop the server", flush=True)
+print(f"{'='*60}\n", flush=True)
+
+# Save URL to file for the backend to read
+with open('neuroglancer_url.txt', 'w') as f:
+ f.write(viewer_url)
+
+# Keep server running
+try:
+ import time
+ while True:
+ time.sleep(1)
+except KeyboardInterrupt:
+ print("\nShutting down Neuroglancer server...", flush=True)
diff --git a/uploads/1/156b7d8b-385b-4590-8418-7144aac6807e.txt b/uploads/1/156b7d8b-385b-4590-8418-7144aac6807e.txt
new file mode 100644
index 0000000..236ec26
--- /dev/null
+++ b/uploads/1/156b7d8b-385b-4590-8418-7144aac6807e.txt
@@ -0,0 +1 @@
+https://apply.workable.com/flexcompute/j/75BA5AB5DC/
\ No newline at end of file
diff --git a/uploads/1/202cd802-2f6c-4d38-bf98-c31dbcda3b9c.txt b/uploads/1/202cd802-2f6c-4d38-bf98-c31dbcda3b9c.txt
new file mode 100644
index 0000000..236ec26
--- /dev/null
+++ b/uploads/1/202cd802-2f6c-4d38-bf98-c31dbcda3b9c.txt
@@ -0,0 +1 @@
+https://apply.workable.com/flexcompute/j/75BA5AB5DC/
\ No newline at end of file
diff --git a/uploads/1/6a83e4a1-4253-44dc-ae65-d1c3c2c2fd37.txt b/uploads/1/6a83e4a1-4253-44dc-ae65-d1c3c2c2fd37.txt
new file mode 100644
index 0000000..236ec26
--- /dev/null
+++ b/uploads/1/6a83e4a1-4253-44dc-ae65-d1c3c2c2fd37.txt
@@ -0,0 +1 @@
+https://apply.workable.com/flexcompute/j/75BA5AB5DC/
\ No newline at end of file
diff --git a/uploads/1/a4fb5cbe-bd76-40e8-81b5-360fb8427c71.pdf b/uploads/1/a4fb5cbe-bd76-40e8-81b5-360fb8427c71.pdf
new file mode 100644
index 0000000..98d01cf
Binary files /dev/null and b/uploads/1/a4fb5cbe-bd76-40e8-81b5-360fb8427c71.pdf differ
diff --git a/uploads/2/9bd86436-5a80-4a0c-a014-daef1aad612b.xlsx b/uploads/2/9bd86436-5a80-4a0c-a014-daef1aad612b.xlsx
new file mode 100644
index 0000000..0048d7f
Binary files /dev/null and b/uploads/2/9bd86436-5a80-4a0c-a014-daef1aad612b.xlsx differ
diff --git a/uploads/3/2b64fb56-cfdd-491c-83b3-761997045880.txt b/uploads/3/2b64fb56-cfdd-491c-83b3-761997045880.txt
new file mode 100644
index 0000000..0cf7eb8
--- /dev/null
+++ b/uploads/3/2b64fb56-cfdd-491c-83b3-761997045880.txt
@@ -0,0 +1,32 @@
+Julian
+
+FDA devices development
+
+aim at reducing the medical devices complicity with AI.
+
+Seeking external help.
+
+QMB:
+
+
+C10: accelerator for start-ups. QMB is a client of C10. C10 as a investor in QMB.
+
+FastAPI.
+
+
+
+
+
+
+
+In the nation wide University Entrance Exam of year 2002, I scored 649 out of 750, which is the 2nd place in my high school (670 graduates/year), which is the 2nd best high school in Fuzhou, my hometown in Fuzhou. I was awarded the scholarship for graduate from my high school and the scholarship for new student from my college for this achievement.
+
+
+
+My university is not using any credit system by the time I graduated. However, I was awarded academic scholarship for all 4 years of college, and was admitted to Ph.D program in Chinese Academy of Sciences with advanced admission with National Graduate School Entrance Exam Waiver.
+
+
+2025/03/03:
+Applying to Job #23-00875
+
+Set a Password*: Romans16@07
diff --git a/uploads/3/999bb717-fbec-40bf-9138-43091ddb75a4.xlsx b/uploads/3/999bb717-fbec-40bf-9138-43091ddb75a4.xlsx
new file mode 100644
index 0000000..0048d7f
Binary files /dev/null and b/uploads/3/999bb717-fbec-40bf-9138-43091ddb75a4.xlsx differ
diff --git a/uv.lock b/uv.lock
index 3bf083b..1a1a7a1 100644
--- a/uv.lock
+++ b/uv.lock
@@ -113,6 +113,44 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/15/b3/9b1a8074496371342ec1e796a96f99c82c945a339cd81a8e73de28b4cf9e/anyio-4.11.0-py3-none-any.whl", hash = "sha256:0287e96f4d26d4149305414d4e3bc32f0dcd0862365a4bddea19d7a1ec38c4fc", size = 109097, upload-time = "2025-09-23T09:19:10.601Z" },
]
+[[package]]
+name = "argon2-cffi"
+version = "25.1.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "argon2-cffi-bindings" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/0e/89/ce5af8a7d472a67cc819d5d998aa8c82c5d860608c4db9f46f1162d7dab9/argon2_cffi-25.1.0.tar.gz", hash = "sha256:694ae5cc8a42f4c4e2bf2ca0e64e51e23a040c6a517a85074683d3959e1346c1", size = 45706, upload-time = "2025-06-03T06:55:32.073Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/4f/d3/a8b22fa575b297cd6e3e3b0155c7e25db170edf1c74783d6a31a2490b8d9/argon2_cffi-25.1.0-py3-none-any.whl", hash = "sha256:fdc8b074db390fccb6eb4a3604ae7231f219aa669a2652e0f20e16ba513d5741", size = 14657, upload-time = "2025-06-03T06:55:30.804Z" },
+]
+
+[[package]]
+name = "argon2-cffi-bindings"
+version = "25.1.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "cffi" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/5c/2d/db8af0df73c1cf454f71b2bbe5e356b8c1f8041c979f505b3d3186e520a9/argon2_cffi_bindings-25.1.0.tar.gz", hash = "sha256:b957f3e6ea4d55d820e40ff76f450952807013d361a65d7f28acc0acbf29229d", size = 1783441, upload-time = "2025-07-30T10:02:05.147Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/1d/57/96b8b9f93166147826da5f90376e784a10582dd39a393c99bb62cfcf52f0/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:aecba1723ae35330a008418a91ea6cfcedf6d31e5fbaa056a166462ff066d500", size = 54121, upload-time = "2025-07-30T10:01:50.815Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/08/a9bebdb2e0e602dde230bdde8021b29f71f7841bd54801bcfd514acb5dcf/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_10_9_x86_64.whl", hash = "sha256:2630b6240b495dfab90aebe159ff784d08ea999aa4b0d17efa734055a07d2f44", size = 29177, upload-time = "2025-07-30T10:01:51.681Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/02/d297943bcacf05e4f2a94ab6f462831dc20158614e5d067c35d4e63b9acb/argon2_cffi_bindings-25.1.0-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:7aef0c91e2c0fbca6fc68e7555aa60ef7008a739cbe045541e438373bc54d2b0", size = 31090, upload-time = "2025-07-30T10:01:53.184Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/93/44365f3d75053e53893ec6d733e4a5e3147502663554b4d864587c7828a7/argon2_cffi_bindings-25.1.0-cp39-abi3-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1e021e87faa76ae0d413b619fe2b65ab9a037f24c60a1e6cc43457ae20de6dc6", size = 81246, upload-time = "2025-07-30T10:01:54.145Z" },
+ { url = "https://files.pythonhosted.org/packages/09/52/94108adfdd6e2ddf58be64f959a0b9c7d4ef2fa71086c38356d22dc501ea/argon2_cffi_bindings-25.1.0-cp39-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d3e924cfc503018a714f94a49a149fdc0b644eaead5d1f089330399134fa028a", size = 87126, upload-time = "2025-07-30T10:01:55.074Z" },
+ { url = "https://files.pythonhosted.org/packages/72/70/7a2993a12b0ffa2a9271259b79cc616e2389ed1a4d93842fac5a1f923ffd/argon2_cffi_bindings-25.1.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:c87b72589133f0346a1cb8d5ecca4b933e3c9b64656c9d175270a000e73b288d", size = 80343, upload-time = "2025-07-30T10:01:56.007Z" },
+ { url = "https://files.pythonhosted.org/packages/78/9a/4e5157d893ffc712b74dbd868c7f62365618266982b64accab26bab01edc/argon2_cffi_bindings-25.1.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1db89609c06afa1a214a69a462ea741cf735b29a57530478c06eb81dd403de99", size = 86777, upload-time = "2025-07-30T10:01:56.943Z" },
+ { url = "https://files.pythonhosted.org/packages/74/cd/15777dfde1c29d96de7f18edf4cc94c385646852e7c7b0320aa91ccca583/argon2_cffi_bindings-25.1.0-cp39-abi3-win32.whl", hash = "sha256:473bcb5f82924b1becbb637b63303ec8d10e84c8d241119419897a26116515d2", size = 27180, upload-time = "2025-07-30T10:01:57.759Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/c6/a759ece8f1829d1f162261226fbfd2c6832b3ff7657384045286d2afa384/argon2_cffi_bindings-25.1.0-cp39-abi3-win_amd64.whl", hash = "sha256:a98cd7d17e9f7ce244c0803cad3c23a7d379c301ba618a5fa76a67d116618b98", size = 31715, upload-time = "2025-07-30T10:01:58.56Z" },
+ { url = "https://files.pythonhosted.org/packages/42/b9/f8d6fa329ab25128b7e98fd83a3cb34d9db5b059a9847eddb840a0af45dd/argon2_cffi_bindings-25.1.0-cp39-abi3-win_arm64.whl", hash = "sha256:b0fdbcf513833809c882823f98dc2f931cf659d9a1429616ac3adebb49f5db94", size = 27149, upload-time = "2025-07-30T10:01:59.329Z" },
+ { url = "https://files.pythonhosted.org/packages/11/2d/ba4e4ca8d149f8dcc0d952ac0967089e1d759c7e5fcf0865a317eb680fbb/argon2_cffi_bindings-25.1.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:6dca33a9859abf613e22733131fc9194091c1fa7cb3e131c143056b4856aa47e", size = 24549, upload-time = "2025-07-30T10:02:00.101Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/82/9b2386cc75ac0bd3210e12a44bfc7fd1632065ed8b80d573036eecb10442/argon2_cffi_bindings-25.1.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:21378b40e1b8d1655dd5310c84a40fc19a9aa5e6366e835ceb8576bf0fea716d", size = 25539, upload-time = "2025-07-30T10:02:00.929Z" },
+ { url = "https://files.pythonhosted.org/packages/31/db/740de99a37aa727623730c90d92c22c9e12585b3c98c54b7960f7810289f/argon2_cffi_bindings-25.1.0-pp310-pypy310_pp73-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d588dec224e2a83edbdc785a5e6f3c6cd736f46bfd4b441bbb5aa1f5085e584", size = 28467, upload-time = "2025-07-30T10:02:02.08Z" },
+ { url = "https://files.pythonhosted.org/packages/71/7a/47c4509ea18d755f44e2b92b7178914f0c113946d11e16e626df8eaa2b0b/argon2_cffi_bindings-25.1.0-pp310-pypy310_pp73-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5acb4e41090d53f17ca1110c3427f0a130f944b896fc8c83973219c97f57b690", size = 27355, upload-time = "2025-07-30T10:02:02.867Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/82/82745642d3c46e7cea25e1885b014b033f4693346ce46b7f47483cf5d448/argon2_cffi_bindings-25.1.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:da0c79c23a63723aa5d782250fbf51b768abca630285262fb5144ba5ae01e520", size = 29187, upload-time = "2025-07-30T10:02:03.674Z" },
+]
+
[[package]]
name = "asciitree"
version = "0.3.3"
@@ -143,6 +181,48 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/3a/2a/7cc015f5b9f5db42b7d48157e23356022889fc354a2813c15934b7cb5c0e/attrs-25.4.0-py3-none-any.whl", hash = "sha256:adcf7e2a1fb3b36ac48d97835bb6d8ade15b8dcce26aba8bf1d14847b57a3373", size = 67615, upload-time = "2025-10-06T13:54:43.17Z" },
]

+[[package]]
+name = "bcrypt"
+version = "5.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d4/36/3329e2518d70ad8e2e5817d5a4cac6bba05a47767ec416c7d020a965f408/bcrypt-5.0.0.tar.gz", hash = "sha256:f748f7c2d6fd375cc93d3fba7ef4a9e3a092421b8dbf34d8d4dc06be9492dfdd", size = 25386, upload-time = "2025-09-25T19:50:47.829Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/84/29/6237f151fbfe295fe3e074ecc6d44228faa1e842a81f6d34a02937ee1736/bcrypt-5.0.0-cp38-abi3-macosx_10_12_universal2.whl", hash = "sha256:fc746432b951e92b58317af8e0ca746efe93e66555f1b40888865ef5bf56446b", size = 494553, upload-time = "2025-09-25T19:49:49.006Z" },
+ { url = "https://files.pythonhosted.org/packages/45/b6/4c1205dde5e464ea3bd88e8742e19f899c16fa8916fb8510a851fae985b5/bcrypt-5.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c2388ca94ffee269b6038d48747f4ce8df0ffbea43f31abfa18ac72f0218effb", size = 275009, upload-time = "2025-09-25T19:49:50.581Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/71/427945e6ead72ccffe77894b2655b695ccf14ae1866cd977e185d606dd2f/bcrypt-5.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:560ddb6ec730386e7b3b26b8b4c88197aaed924430e7b74666a586ac997249ef", size = 278029, upload-time = "2025-09-25T19:49:52.533Z" },
+ { url = "https://files.pythonhosted.org/packages/17/72/c344825e3b83c5389a369c8a8e58ffe1480b8a699f46c127c34580c4666b/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d79e5c65dcc9af213594d6f7f1fa2c98ad3fc10431e7aa53c176b441943efbdd", size = 275907, upload-time = "2025-09-25T19:49:54.709Z" },
+ { url = "https://files.pythonhosted.org/packages/0b/7e/d4e47d2df1641a36d1212e5c0514f5291e1a956a7749f1e595c07a972038/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2b732e7d388fa22d48920baa267ba5d97cca38070b69c0e2d37087b381c681fd", size = 296500, upload-time = "2025-09-25T19:49:56.013Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/c3/0ae57a68be2039287ec28bc463b82e4b8dc23f9d12c0be331f4782e19108/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0c8e093ea2532601a6f686edbc2c6b2ec24131ff5c52f7610dd64fa4553b5464", size = 278412, upload-time = "2025-09-25T19:49:57.356Z" },
+ { url = "https://files.pythonhosted.org/packages/45/2b/77424511adb11e6a99e3a00dcc7745034bee89036ad7d7e255a7e47be7d8/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:5b1589f4839a0899c146e8892efe320c0fa096568abd9b95593efac50a87cb75", size = 275486, upload-time = "2025-09-25T19:49:59.116Z" },
+ { url = "https://files.pythonhosted.org/packages/43/0a/405c753f6158e0f3f14b00b462d8bca31296f7ecfc8fc8bc7919c0c7d73a/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:89042e61b5e808b67daf24a434d89bab164d4de1746b37a8d173b6b14f3db9ff", size = 277940, upload-time = "2025-09-25T19:50:00.869Z" },
+ { url = "https://files.pythonhosted.org/packages/62/83/b3efc285d4aadc1fa83db385ec64dcfa1707e890eb42f03b127d66ac1b7b/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:e3cf5b2560c7b5a142286f69bde914494b6d8f901aaa71e453078388a50881c4", size = 310776, upload-time = "2025-09-25T19:50:02.393Z" },
+ { url = "https://files.pythonhosted.org/packages/95/7d/47ee337dacecde6d234890fe929936cb03ebc4c3a7460854bbd9c97780b8/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f632fd56fc4e61564f78b46a2269153122db34988e78b6be8b32d28507b7eaeb", size = 312922, upload-time = "2025-09-25T19:50:04.232Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/3a/43d494dfb728f55f4e1cf8fd435d50c16a2d75493225b54c8d06122523c6/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:801cad5ccb6b87d1b430f183269b94c24f248dddbbc5c1f78b6ed231743e001c", size = 341367, upload-time = "2025-09-25T19:50:05.559Z" },
+ { url = "https://files.pythonhosted.org/packages/55/ab/a0727a4547e383e2e22a630e0f908113db37904f58719dc48d4622139b5c/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3cf67a804fc66fc217e6914a5635000259fbbbb12e78a99488e4d5ba445a71eb", size = 359187, upload-time = "2025-09-25T19:50:06.916Z" },
+ { url = "https://files.pythonhosted.org/packages/1b/bb/461f352fdca663524b4643d8b09e8435b4990f17fbf4fea6bc2a90aa0cc7/bcrypt-5.0.0-cp38-abi3-win32.whl", hash = "sha256:3abeb543874b2c0524ff40c57a4e14e5d3a66ff33fb423529c88f180fd756538", size = 153752, upload-time = "2025-09-25T19:50:08.515Z" },
+ { url = "https://files.pythonhosted.org/packages/41/aa/4190e60921927b7056820291f56fc57d00d04757c8b316b2d3c0d1d6da2c/bcrypt-5.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:35a77ec55b541e5e583eb3436ffbbf53b0ffa1fa16ca6782279daf95d146dcd9", size = 150881, upload-time = "2025-09-25T19:50:09.742Z" },
+ { url = "https://files.pythonhosted.org/packages/54/12/cd77221719d0b39ac0b55dbd39358db1cd1246e0282e104366ebbfb8266a/bcrypt-5.0.0-cp38-abi3-win_arm64.whl", hash = "sha256:cde08734f12c6a4e28dc6755cd11d3bdfea608d93d958fffbe95a7026ebe4980", size = 144931, upload-time = "2025-09-25T19:50:11.016Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/ba/2af136406e1c3839aea9ecadc2f6be2bcd1eff255bd451dd39bcf302c47a/bcrypt-5.0.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:0c418ca99fd47e9c59a301744d63328f17798b5947b0f791e9af3c1c499c2d0a", size = 495313, upload-time = "2025-09-25T19:50:12.309Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/ee/2f4985dbad090ace5ad1f7dd8ff94477fe089b5fab2040bd784a3d5f187b/bcrypt-5.0.0-cp39-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddb4e1500f6efdd402218ffe34d040a1196c072e07929b9820f363a1fd1f4191", size = 275290, upload-time = "2025-09-25T19:50:13.673Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/6e/b77ade812672d15cf50842e167eead80ac3514f3beacac8902915417f8b7/bcrypt-5.0.0-cp39-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7aeef54b60ceddb6f30ee3db090351ecf0d40ec6e2abf41430997407a46d2254", size = 278253, upload-time = "2025-09-25T19:50:15.089Z" },
+ { url = "https://files.pythonhosted.org/packages/36/c4/ed00ed32f1040f7990dac7115f82273e3c03da1e1a1587a778d8cea496d8/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f0ce778135f60799d89c9693b9b398819d15f1921ba15fe719acb3178215a7db", size = 276084, upload-time = "2025-09-25T19:50:16.699Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/c4/fa6e16145e145e87f1fa351bbd54b429354fd72145cd3d4e0c5157cf4c70/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a71f70ee269671460b37a449f5ff26982a6f2ba493b3eabdd687b4bf35f875ac", size = 297185, upload-time = "2025-09-25T19:50:18.525Z" },
+ { url = "https://files.pythonhosted.org/packages/24/b4/11f8a31d8b67cca3371e046db49baa7c0594d71eb40ac8121e2fc0888db0/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f8429e1c410b4073944f03bd778a9e066e7fad723564a52ff91841d278dfc822", size = 278656, upload-time = "2025-09-25T19:50:19.809Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/31/79f11865f8078e192847d2cb526e3fa27c200933c982c5b2869720fa5fce/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:edfcdcedd0d0f05850c52ba3127b1fce70b9f89e0fe5ff16517df7e81fa3cbb8", size = 275662, upload-time = "2025-09-25T19:50:21.567Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/8d/5e43d9584b3b3591a6f9b68f755a4da879a59712981ef5ad2a0ac1379f7a/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:611f0a17aa4a25a69362dcc299fda5c8a3d4f160e2abb3831041feb77393a14a", size = 278240, upload-time = "2025-09-25T19:50:23.305Z" },
+ { url = "https://files.pythonhosted.org/packages/89/48/44590e3fc158620f680a978aafe8f87a4c4320da81ed11552f0323aa9a57/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:db99dca3b1fdc3db87d7c57eac0c82281242d1eabf19dcb8a6b10eb29a2e72d1", size = 311152, upload-time = "2025-09-25T19:50:24.597Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/85/e4fbfc46f14f47b0d20493669a625da5827d07e8a88ee460af6cd9768b44/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:5feebf85a9cefda32966d8171f5db7e3ba964b77fdfe31919622256f80f9cf42", size = 313284, upload-time = "2025-09-25T19:50:26.268Z" },
+ { url = "https://files.pythonhosted.org/packages/25/ae/479f81d3f4594456a01ea2f05b132a519eff9ab5768a70430fa1132384b1/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:3ca8a166b1140436e058298a34d88032ab62f15aae1c598580333dc21d27ef10", size = 341643, upload-time = "2025-09-25T19:50:28.02Z" },
+ { url = "https://files.pythonhosted.org/packages/df/d2/36a086dee1473b14276cd6ea7f61aef3b2648710b5d7f1c9e032c29b859f/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:61afc381250c3182d9078551e3ac3a41da14154fbff647ddf52a769f588c4172", size = 359698, upload-time = "2025-09-25T19:50:31.347Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/f6/688d2cd64bfd0b14d805ddb8a565e11ca1fb0fd6817175d58b10052b6d88/bcrypt-5.0.0-cp39-abi3-win32.whl", hash = "sha256:64d7ce196203e468c457c37ec22390f1a61c85c6f0b8160fd752940ccfb3a683", size = 153725, upload-time = "2025-09-25T19:50:34.384Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/b9/9d9a641194a730bda138b3dfe53f584d61c58cd5230e37566e83ec2ffa0d/bcrypt-5.0.0-cp39-abi3-win_amd64.whl", hash = "sha256:64ee8434b0da054d830fa8e89e1c8bf30061d539044a39524ff7dec90481e5c2", size = 150912, upload-time = "2025-09-25T19:50:35.69Z" },
+ { url = "https://files.pythonhosted.org/packages/27/44/d2ef5e87509158ad2187f4dd0852df80695bb1ee0cfe0a684727b01a69e0/bcrypt-5.0.0-cp39-abi3-win_arm64.whl", hash = "sha256:f2347d3534e76bf50bca5500989d6c1d05ed64b440408057a37673282c654927", size = 144953, upload-time = "2025-09-25T19:50:37.32Z" },
+ { url = "https://files.pythonhosted.org/packages/8a/75/4aa9f5a4d40d762892066ba1046000b329c7cd58e888a6db878019b282dc/bcrypt-5.0.0-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:7edda91d5ab52b15636d9c30da87d2cc84f426c72b9dba7a9b4fe142ba11f534", size = 271180, upload-time = "2025-09-25T19:50:38.575Z" },
+ { url = "https://files.pythonhosted.org/packages/54/79/875f9558179573d40a9cc743038ac2bf67dfb79cecb1e8b5d70e88c94c3d/bcrypt-5.0.0-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:046ad6db88edb3c5ece4369af997938fb1c19d6a699b9c1b27b0db432faae4c4", size = 273791, upload-time = "2025-09-25T19:50:39.913Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/fe/975adb8c216174bf70fc17535f75e85ac06ed5252ea077be10d9cff5ce24/bcrypt-5.0.0-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:dcd58e2b3a908b5ecc9b9df2f0085592506ac2d5110786018ee5e160f28e0911", size = 270746, upload-time = "2025-09-25T19:50:43.306Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/f8/972c96f5a2b6c4b3deca57009d93e946bbdbe2241dca9806d502f29dd3ee/bcrypt-5.0.0-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:6b8f520b61e8781efee73cba14e3e8c9556ccfb375623f4f97429544734545b4", size = 273375, upload-time = "2025-09-25T19:50:45.43Z" },
+]
+
[[package]]
name = "cachetools"
version = "6.2.1"
@@ -161,6 +241,42 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e4/37/af0d2ef3967ac0d6113837b44a4f0bfe1328c2b9763bd5b1744520e5cfed/certifi-2025.10.5-py3-none-any.whl", hash = "sha256:0f212c2744a9bb6de0c56639a6f68afe01ecd92d91f14ae897c4fe7bbeeef0de", size = 163286, upload-time = "2025-10-05T04:12:14.03Z" },
]

+[[package]]
+name = "cffi"
+version = "2.0.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pycparser", marker = "implementation_name != 'PyPy'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/93/d7/516d984057745a6cd96575eea814fe1edd6646ee6efd552fb7b0921dec83/cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44", size = 184283, upload-time = "2025-09-08T23:22:08.01Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/84/ad6a0b408daa859246f57c03efd28e5dd1b33c21737c2db84cae8c237aa5/cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49", size = 180504, upload-time = "2025-09-08T23:22:10.637Z" },
+ { url = "https://files.pythonhosted.org/packages/50/bd/b1a6362b80628111e6653c961f987faa55262b4002fcec42308cad1db680/cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c", size = 208811, upload-time = "2025-09-08T23:22:12.267Z" },
+ { url = "https://files.pythonhosted.org/packages/4f/27/6933a8b2562d7bd1fb595074cf99cc81fc3789f6a6c05cdabb46284a3188/cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb", size = 216402, upload-time = "2025-09-08T23:22:13.455Z" },
+ { url = "https://files.pythonhosted.org/packages/05/eb/b86f2a2645b62adcfff53b0dd97e8dfafb5c8aa864bd0d9a2c2049a0d551/cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0", size = 203217, upload-time = "2025-09-08T23:22:14.596Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/e0/6cbe77a53acf5acc7c08cc186c9928864bd7c005f9efd0d126884858a5fe/cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4", size = 203079, upload-time = "2025-09-08T23:22:15.769Z" },
+ { url = "https://files.pythonhosted.org/packages/98/29/9b366e70e243eb3d14a5cb488dfd3a0b6b2f1fb001a203f653b93ccfac88/cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453", size = 216475, upload-time = "2025-09-08T23:22:17.427Z" },
+ { url = "https://files.pythonhosted.org/packages/21/7a/13b24e70d2f90a322f2900c5d8e1f14fa7e2a6b3332b7309ba7b2ba51a5a/cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495", size = 218829, upload-time = "2025-09-08T23:22:19.069Z" },
+ { url = "https://files.pythonhosted.org/packages/60/99/c9dc110974c59cc981b1f5b66e1d8af8af764e00f0293266824d9c4254bc/cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5", size = 211211, upload-time = "2025-09-08T23:22:20.588Z" },
+ { url = "https://files.pythonhosted.org/packages/49/72/ff2d12dbf21aca1b32a40ed792ee6b40f6dc3a9cf1644bd7ef6e95e0ac5e/cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb", size = 218036, upload-time = "2025-09-08T23:22:22.143Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/cc/027d7fb82e58c48ea717149b03bcadcbdc293553edb283af792bd4bcbb3f/cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a", size = 172184, upload-time = "2025-09-08T23:22:23.328Z" },
+ { url = "https://files.pythonhosted.org/packages/33/fa/072dd15ae27fbb4e06b437eb6e944e75b068deb09e2a2826039e49ee2045/cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739", size = 182790, upload-time = "2025-09-08T23:22:24.752Z" },
+ { url = "https://files.pythonhosted.org/packages/12/4a/3dfd5f7850cbf0d06dc84ba9aa00db766b52ca38d8b86e3a38314d52498c/cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe", size = 184344, upload-time = "2025-09-08T23:22:26.456Z" },
+ { url = "https://files.pythonhosted.org/packages/4f/8b/f0e4c441227ba756aafbe78f117485b25bb26b1c059d01f137fa6d14896b/cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c", size = 180560, upload-time = "2025-09-08T23:22:28.197Z" },
+ { url = "https://files.pythonhosted.org/packages/b1/b7/1200d354378ef52ec227395d95c2576330fd22a869f7a70e88e1447eb234/cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92", size = 209613, upload-time = "2025-09-08T23:22:29.475Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/56/6033f5e86e8cc9bb629f0077ba71679508bdf54a9a5e112a3c0b91870332/cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93", size = 216476, upload-time = "2025-09-08T23:22:31.063Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/7f/55fecd70f7ece178db2f26128ec41430d8720f2d12ca97bf8f0a628207d5/cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5", size = 203374, upload-time = "2025-09-08T23:22:32.507Z" },
+ { url = "https://files.pythonhosted.org/packages/84/ef/a7b77c8bdc0f77adc3b46888f1ad54be8f3b7821697a7b89126e829e676a/cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664", size = 202597, upload-time = "2025-09-08T23:22:34.132Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/91/500d892b2bf36529a75b77958edfcd5ad8e2ce4064ce2ecfeab2125d72d1/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26", size = 215574, upload-time = "2025-09-08T23:22:35.443Z" },
+ { url = "https://files.pythonhosted.org/packages/44/64/58f6255b62b101093d5df22dcb752596066c7e89dd725e0afaed242a61be/cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9", size = 218971, upload-time = "2025-09-08T23:22:36.805Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/49/fa72cebe2fd8a55fbe14956f9970fe8eb1ac59e5df042f603ef7c8ba0adc/cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414", size = 211972, upload-time = "2025-09-08T23:22:38.436Z" },
+ { url = "https://files.pythonhosted.org/packages/0b/28/dd0967a76aab36731b6ebfe64dec4e981aff7e0608f60c2d46b46982607d/cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743", size = 217078, upload-time = "2025-09-08T23:22:39.776Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/c0/015b25184413d7ab0a410775fdb4a50fca20f5589b5dab1dbbfa3baad8ce/cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5", size = 172076, upload-time = "2025-09-08T23:22:40.95Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/8f/dc5531155e7070361eb1b7e4c1a9d896d0cb21c49f807a6c03fd63fc877e/cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5", size = 182820, upload-time = "2025-09-08T23:22:42.463Z" },
+ { url = "https://files.pythonhosted.org/packages/95/5c/1b493356429f9aecfd56bc171285a4c4ac8697f76e9bbbbb105e537853a1/cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d", size = 177635, upload-time = "2025-09-08T23:22:43.623Z" },
+]
+
[[package]]
name = "charset-normalizer"
version = "3.4.4"
@@ -279,6 +395,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/0c/d5/c5db1ea3394c6e1732fb3286b3bd878b59507a8f77d32a2cebda7d7b7cd4/donfig-0.8.1.post1-py3-none-any.whl", hash = "sha256:2a3175ce74a06109ff9307d90a230f81215cbac9a751f4d1c6194644b8204f9d", size = 21592, upload-time = "2024-05-23T14:13:55.283Z" },
]

+[[package]]
+name = "ecdsa"
+version = "0.19.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "six" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c0/1f/924e3caae75f471eae4b26bd13b698f6af2c44279f67af317439c2f4c46a/ecdsa-0.19.1.tar.gz", hash = "sha256:478cba7b62555866fcb3bb3fe985e06decbdb68ef55713c4e5ab98c57d508e61", size = 201793, upload-time = "2025-03-13T11:52:43.25Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/cb/a3/460c57f094a4a165c84a1341c373b0a4f5ec6ac244b998d5021aade89b77/ecdsa-0.19.1-py2.py3-none-any.whl", hash = "sha256:30638e27cf77b7e15c4c4cc1973720149e1033827cfd00661ca5c8cc0cdb24c3", size = 150607, upload-time = "2025-03-13T11:52:41.757Z" },
+]
+
[[package]]
name = "exceptiongroup"
version = "1.3.0"
@@ -1025,6 +1153,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
]

+[[package]]
+name = "passlib"
+version = "1.7.4"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/b6/06/9da9ee59a67fae7761aab3ccc84fa4f3f33f125b370f1ccdb915bf967c11/passlib-1.7.4.tar.gz", hash = "sha256:defd50f72b65c5402ab2c573830a6978e5f202ad0d984793c8dde2c4152ebe04", size = 689844, upload-time = "2020-10-08T19:00:52.121Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3b/a4/ab6b7589382ca3df236e03faa71deac88cae040af60c071a78d254a62172/passlib-1.7.4-py2.py3-none-any.whl", hash = "sha256:aa6bca462b8d8bda89c70b382f0c298a20b5560af6cbfa2dce410c0a2fb669f1", size = 525554, upload-time = "2020-10-08T19:00:49.856Z" },
+]
+
[[package]]
name = "pillow"
version = "12.0.0"
@@ -1151,6 +1288,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/47/8d/d529b5d697919ba8c11ad626e835d4039be708a35b0d22de83a269a6682c/pyasn1_modules-0.4.2-py3-none-any.whl", hash = "sha256:29253a9207ce32b64c3ac6600edc75368f98473906e8fd1043bd6b5b1de2c14a", size = 181259, upload-time = "2025-03-28T02:41:19.028Z" },
]

+[[package]]
+name = "pycparser"
+version = "2.23"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/fe/cf/d2d3b9f5699fb1e4615c8e32ff220203e43b248e1dfcc6736ad9057731ca/pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2", size = 173734, upload-time = "2025-09-09T13:23:47.91Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a0/e3/59cd50310fc9b59512193629e1984c1f95e5c8ae6e5d8c69532ccc65a7fe/pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934", size = 118140, upload-time = "2025-09-09T13:23:46.651Z" },
+]
+
[[package]]
name = "pydantic"
version = "2.12.3"
@@ -1252,6 +1398,8 @@ name = "pytc-client"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
+ { name = "argon2-cffi" },
+ { name = "bcrypt" },
{ name = "faiss-cpu" },
{ name = "fastapi" },
{ name = "h5py" },
@@ -1262,8 +1410,10 @@ dependencies = [
{ name = "neuroglancer" },
{ name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
{ name = "numpy", version = "2.3.4", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "passlib" },
{ name = "pillow" },
{ name = "psutil" },
+ { name = "python-jose" },
{ name = "python-multipart" },
{ name = "requests" },
{ name = "scipy", version = "1.15.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
@@ -1277,6 +1427,8 @@

[package.metadata]
requires-dist = [
+ { name = "argon2-cffi", specifier = ">=25.1.0" },
+ { name = "bcrypt", specifier = ">=5.0.0" },
{ name = "faiss-cpu", specifier = "==1.12.0" },
{ name = "fastapi", specifier = "==0.119.0" },
{ name = "h5py", specifier = ">=3.11" },
@@ -1286,8 +1438,10 @@ requires-dist = [
{ name = "langchain-ollama", specifier = "==1.0.0" },
{ name = "neuroglancer", specifier = "==2.38" },
{ name = "numpy", specifier = ">=1.24" },
+ { name = "passlib", specifier = ">=1.7.4" },
{ name = "pillow", specifier = ">=10.0" },
{ name = "psutil", specifier = ">=5.9.0" },
+ { name = "python-jose", specifier = ">=3.5.0" },
{ name = "python-multipart", specifier = "==0.0.20" },
{ name = "requests", specifier = ">=2.31" },
{ name = "scipy", specifier = ">=1.11" },
@@ -1306,6 +1460,20 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/14/1b/a298b06749107c305e1fe0f814c6c74aea7b2f1e10989cb30f544a1b3253/python_dotenv-1.2.1-py3-none-any.whl", hash = "sha256:b81ee9561e9ca4004139c6cbba3a238c32b03e4894671e181b671e8cb8425d61", size = 21230, upload-time = "2025-10-26T15:12:09.109Z" },
]

+[[package]]
+name = "python-jose"
+version = "3.5.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "ecdsa" },
+ { name = "pyasn1" },
+ { name = "rsa" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c6/77/3a1c9039db7124eb039772b935f2244fbb73fc8ee65b9acf2375da1c07bf/python_jose-3.5.0.tar.gz", hash = "sha256:fb4eaa44dbeb1c26dcc69e4bd7ec54a1cb8dd64d3b4d81ef08d90ff453f2b01b", size = 92726, upload-time = "2025-05-28T17:31:54.288Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d9/c3/0bd11992072e6a1c513b16500a5d07f91a24017c5909b02c72c62d7ad024/python_jose-3.5.0-py2.py3-none-any.whl", hash = "sha256:abd1202f23d34dfad2c3d28cb8617b90acf34132c7afd60abd0b0b7d3cb55771", size = 34624, upload-time = "2025-05-28T17:31:52.802Z" },
+]
+
[[package]]
name = "python-multipart"
version = "0.0.20"