Merge pull request 'feature/exif-endpoint' (#44) from feature/exif-endpoint into master
Reviewed-on: #44
This commit was merged in pull request #44.
7
.idea/sqldialects.xml
generated
Normal file
@@ -0,0 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
  <component name="SqlDialectMappings">
    <file url="file://$PROJECT_DIR$/migrations/2021-09-02-000740_create_tags/up.sql" dialect="GenericSQL" />
    <file url="PROJECT" dialect="SQLite" />
  </component>
</project>

266
CLAUDE.md
Normal file
@@ -0,0 +1,266 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

An Actix-web REST API for serving images and videos from a filesystem with automatic thumbnail generation, EXIF extraction, tag organization, and a memories feature for browsing photos by date. Uses SQLite with the Diesel ORM for data persistence and ffmpeg for video processing.

## Development Commands

### Building & Running
```bash
# Build for development
cargo build

# Build for release (uses thin LTO optimization)
cargo build --release

# Run the server (requires .env file with DATABASE_URL, BASE_PATH, THUMBNAILS, VIDEO_PATH, BIND_URL, SECRET_KEY)
cargo run

# Run with a specific log level
RUST_LOG=debug cargo run
```

### Testing
```bash
# Run all tests (requires BASE_PATH in .env)
cargo test

# Run a specific test
cargo test test_name

# Run tests with output
cargo test -- --nocapture
```

### Database Migrations
```bash
# Install diesel CLI (one-time setup)
cargo install diesel_cli --no-default-features --features sqlite

# Create a new migration
diesel migration generate migration_name

# Run migrations (also runs automatically on app startup)
diesel migration run

# Revert the last migration
diesel migration revert

# Regenerate schema.rs after manual migration changes
diesel print-schema > src/database/schema.rs
```

### Code Quality
```bash
# Format code
cargo fmt

# Run the clippy linter
cargo clippy

# Fix automatically fixable issues
cargo fix
```

### Utility Binaries
```bash
# Two-phase cleanup: resolve missing files and validate file types
# (reads BASE_PATH from .env; flags: --dry-run, --auto-fix, --skip-phase1, --skip-phase2)
cargo run --bin cleanup_files -- --dry-run

# Batch-extract EXIF for existing files (add --skip-existing for incremental runs)
cargo run --bin migrate_exif
```

## Architecture Overview

### Core Components

**Layered Architecture:**
- **HTTP Layer** (`main.rs`): Route handlers for images, videos, metadata, tags, favorites, memories
- **Auth Layer** (`auth.rs`): JWT token validation; `Claims` extraction via the `FromRequest` trait
- **Service Layer** (`files.rs`, `exif.rs`, `memories.rs`): Business logic for file operations and EXIF extraction
- **DAO Layer** (`database/mod.rs`): Trait-based data access (`ExifDao`, `UserDao`, `FavoriteDao`, `TagDao`)
- **Database Layer**: Diesel ORM with SQLite, schema in `database/schema.rs`

**Async Actor System (Actix):**
- `StreamActor`: Manages the ffmpeg video processing lifecycle
- `VideoPlaylistManager`: Scans directories and queues videos
- `PlaylistGenerator`: Creates HLS playlists for video streaming

### Database Schema & Patterns

**Tables:**
- `users`: Authentication (id, username, password_hash)
- `favorites`: User-specific favorites (userid, path)
- `tags`: Custom labels with timestamps
- `tagged_photo`: Many-to-many photo-tag relationships
- `image_exif`: Rich metadata (file_path plus 16 EXIF fields: camera, GPS, dates, exposure settings)

**DAO Pattern:**
All database access goes through trait-based DAOs (e.g., `ExifDao`, implemented by `SqliteExifDao`). The connection is shared behind `Arc<Mutex<SqliteConnection>>` (a single guarded connection rather than a true pool). All DB operations are traced with OpenTelemetry in release builds.
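
A minimal sketch of the shape this takes; the real trait and model types (`InsertImageExif`, `ImageExif`) live in the database module, and the exact signatures may differ:

```rust
use opentelemetry::Context;

// Illustrative sketch of the DAO trait described above -- method names come
// from this document, signatures are assumptions.
pub trait ExifDao: Send {
    fn store_exif(&mut self, ctx: &Context, exif: InsertImageExif) -> anyhow::Result<()>;
    fn get_exif(&mut self, ctx: &Context, path: &str) -> anyhow::Result<Option<ImageExif>>;
    fn get_exif_batch(&mut self, ctx: &Context, paths: &[String]) -> anyhow::Result<Vec<ImageExif>>;
}

// Handlers hold a shared trait object rather than a concrete SQLite type:
// let dao: Arc<Mutex<dyn ExifDao>> = Arc::new(Mutex::new(SqliteExifDao::new()));
```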

**Key DAO Methods:**
- `store_exif()`, `get_exif()`, `get_exif_batch()`: EXIF CRUD operations
- `query_by_exif()`: Complex filtering by camera, GPS bounds, date ranges
- Batch operations minimize DB hits during file watching

### File Processing Pipeline

**Thumbnail Generation:**
1. Startup scan: Rayon parallel walk of BASE_PATH
2. Creates 200x200 thumbnails in the THUMBNAILS directory (mirrors the source structure)
3. Videos: extracts a frame at the 3-second mark via ffmpeg
4. Images: uses the `image` crate for JPEG/PNG processing (see the sketch below)
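
For the image branch, a sketch of the 200x200 step with the `image` crate; `create_thumbnail` is an illustrative name, not the repo's helper, and error handling is simplified:

```rust
use std::path::Path;

// Sketch only: mirror a source image into the thumbnail tree at 200x200.
// `thumbnail` fits the image inside the bounding box, preserving aspect ratio.
fn create_thumbnail(src: &Path, dest: &Path) -> anyhow::Result<()> {
    if let Some(parent) = dest.parent() {
        std::fs::create_dir_all(parent)?; // mirror the source directory structure
    }
    let img = image::open(src)?;
    img.thumbnail(200, 200).save(dest)?;
    Ok(())
}
```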

**File Watching:**
Runs in a background thread with a two-tier strategy:
- **Quick scan** (default 60s): Recently modified files only
- **Full scan** (default 3600s): Comprehensive directory check
- Batch-queries the EXIF DB to detect new files
- Configurable via `WATCH_QUICK_INTERVAL_SECONDS` and `WATCH_FULL_INTERVAL_SECONDS`

**EXIF Extraction:**
- Uses the `kamadak-exif` crate
- Supports: JPEG, TIFF, RAW (NEF, CR2, CR3), HEIF/HEIC, PNG, WebP
- Extracts: camera make/model, lens, dimensions, GPS coordinates, focal length, aperture, shutter speed, ISO, date taken
- Triggered on upload and during file watching

**File Upload Behavior:**
If a file already exists, a timestamp is appended to the new filename (`photo_1735124234.jpg`) to preserve history without overwrites, as sketched below.
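
A sketch of that collision rule; `deduplicated_path` is a hypothetical name, not the repo's helper:

```rust
use chrono::Utc;
use std::path::{Path, PathBuf};

// Hypothetical helper illustrating the documented behavior: if photo.jpg
// already exists, write photo_1735124234.jpg instead of overwriting it.
fn deduplicated_path(dest: &Path) -> PathBuf {
    if !dest.exists() {
        return dest.to_path_buf();
    }
    let stem = dest.file_stem().and_then(|s| s.to_str()).unwrap_or("upload");
    let ts = Utc::now().timestamp();
    match dest.extension().and_then(|e| e.to_str()) {
        Some(ext) => dest.with_file_name(format!("{stem}_{ts}.{ext}")),
        None => dest.with_file_name(format!("{stem}_{ts}")),
    }
}
```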

### Authentication Flow

**Login:**
1. POST `/login` with username/password
2. Verify with `bcrypt::verify()` against `password_hash`
3. Generate a JWT with claims: `{ sub: user_id, exp: 5_days_from_now }`
4. Sign with HS256 using the `SECRET_KEY` environment variable (see the sketch below)
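
A sketch of steps 3-4; the `Claims` field types here are assumptions based on the shape above:

```rust
use jsonwebtoken::{encode, EncodingKey, Header};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct Claims {
    sub: i32,   // user id (type assumed)
    exp: usize, // expiry as a Unix timestamp
}

fn issue_token(user_id: i32, secret: &str) -> jsonwebtoken::errors::Result<String> {
    let exp = (chrono::Utc::now() + chrono::Duration::days(5)).timestamp() as usize;
    // Header::default() is HS256, matching the flow above.
    encode(
        &Header::default(),
        &Claims { sub: user_id, exp },
        &EncodingKey::from_secret(secret.as_bytes()),
    )
}
```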

**Authorization:**
All protected endpoints extract `Claims` via a `FromRequest` trait implementation. The token is passed in an `Authorization: Bearer <token>` header.
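
A sketch of what that extractor looks like; the real implementation is in `auth.rs`, and `jwt_secret()` below is a hypothetical helper for reading `SECRET_KEY`:

```rust
use actix_web::{dev::Payload, error::ErrorUnauthorized, FromRequest, HttpRequest};
use futures::future::{ready, Ready};
use jsonwebtoken::{decode, DecodingKey, Validation};

fn jwt_secret() -> Vec<u8> {
    // Hypothetical helper; the real code may cache or inject this.
    std::env::var("SECRET_KEY").expect("SECRET_KEY must be set").into_bytes()
}

impl FromRequest for Claims {
    type Error = actix_web::Error;
    type Future = Ready<Result<Self, Self::Error>>;

    fn from_request(req: &HttpRequest, _: &mut Payload) -> Self::Future {
        let token = req
            .headers()
            .get("Authorization")
            .and_then(|h| h.to_str().ok())
            .and_then(|h| h.strip_prefix("Bearer "));
        ready(match token {
            // Validation::default() checks the `exp` claim with HS256.
            Some(t) => decode::<Claims>(t, &DecodingKey::from_secret(&jwt_secret()), &Validation::default())
                .map(|data| data.claims)
                .map_err(|_| ErrorUnauthorized("invalid token")),
            None => Err(ErrorUnauthorized("missing bearer token")),
        })
    }
}
```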

### API Structure

**Key Endpoint Patterns:**

```rust
// Image serving & upload
GET  /image?path=...&size=...&format=...
POST /image (multipart file upload)

// Metadata & EXIF
GET /image/metadata?path=...

// Advanced search with filters
GET /photos?path=...&recursive=true&sort=DateTakenDesc&camera_make=Canon&gps_lat=...&gps_lon=...&gps_radius_km=10&date_from=...&date_to=...&tag_ids=1,2,3&media_type=Photo

// Video streaming (HLS)
POST /video/generate (creates .m3u8 playlist + .ts segments)
GET  /video/stream?path=... (serves playlist)

// Tags
GET    /image/tags/all
POST   /image/tags (add tag to file)
DELETE /image/tags (remove tag from file)
POST   /image/tags/batch (bulk tag updates)

// Memories (week-based grouping)
GET /memories?path=...&recursive=true
```

**Request Types:**
- `FilesRequest`: Supports complex filtering (tags, EXIF fields, GPS radius, date ranges)
- `SortType`: Shuffle, NameAsc/Desc, TagCountAsc/Desc, DateTakenAsc/Desc

### Important Patterns

**Service Builder Pattern:**
Routes are registered via a composable `ServiceBuilder` trait in `service.rs`, which lets features be added modularly (sketched below).
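
The exact trait isn't reproduced here; this sketch assumes the common Actix pattern of composable route-registration functions:

```rust
use actix_web::web::ServiceConfig;

// Assumed shape of the composable builder; the real trait in service.rs
// may carry state or differ in signature.
pub trait ServiceBuilder {
    fn register(&self, cfg: &mut ServiceConfig);
}

// App::new()
//     .configure(|cfg| image_service.register(cfg))
//     .configure(|cfg| tag_service.register(cfg));
```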

**Path Validation:**
Always use `is_valid_full_path(&base_path, &requested_path, check_exists)` to prevent directory traversal attacks.
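
A usage sketch, assuming from the `is_` prefix that the helper returns a `bool`:

```rust
// Validate before touching the filesystem; reject traversal attempts like
// ?path=../../etc/passwd up front. (Return type assumed here.)
if !is_valid_full_path(&base_path, &requested_path, true) {
    return Err(actix_web::error::ErrorBadRequest("invalid path"));
}
```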

**File Type Detection:**
Centralized in `file_types.rs` with the constants `IMAGE_EXTENSIONS` and `VIDEO_EXTENSIONS`. Provides both `Path` and `DirEntry` variants for performance.
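
A sketch of what the centralized check looks like; the extension list below is an illustrative subset, not the repo's constants:

```rust
use std::path::Path;

const IMAGE_EXTENSIONS: &[&str] = &["jpg", "jpeg", "png"]; // illustrative subset

// Case-insensitive extension match, as described under Common Gotchas.
fn is_image(path: &Path) -> bool {
    path.extension()
        .and_then(|e| e.to_str())
        .map(|e| IMAGE_EXTENSIONS.contains(&e.to_ascii_lowercase().as_str()))
        .unwrap_or(false)
}
```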

**OpenTelemetry Tracing:**
All database operations and HTTP handlers are wrapped in spans. Release builds export to the OTLP collector at `OTLP_OTLS_ENDPOINT`; debug builds use a basic logger.

**Memory Exclusion:**
`PathExcluder` in `memories.rs` filters directories out of the memories API via the `EXCLUDED_DIRS` environment variable (comma-separated paths or substring patterns), as sketched below.
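
A sketch of the exclusion check, with the substring semantics assumed from the description above:

```rust
struct PathExcluder {
    patterns: Vec<String>,
}

impl PathExcluder {
    // Parse EXCLUDED_DIRS once at startup, e.g. "/private,/archive".
    fn from_env() -> Self {
        let patterns = std::env::var("EXCLUDED_DIRS")
            .unwrap_or_default()
            .split(',')
            .map(|s| s.trim().to_string())
            .filter(|s| !s.is_empty())
            .collect();
        Self { patterns }
    }

    // Substring match, so "/archive" also excludes "/media/archive/2021".
    fn is_excluded(&self, path: &str) -> bool {
        self.patterns.iter().any(|p| path.contains(p.as_str()))
    }
}
```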

### Startup Sequence

1. Load the `.env` file
2. Run embedded Diesel migrations
3. Spawn the file watcher thread
4. Create initial thumbnails (parallel scan)
5. Generate video GIF thumbnails
6. Initialize AppState with Actix actors
7. Set up Prometheus metrics (`imageserver_image_total`, `imageserver_video_total`)
8. Scan the directory for videos and queue HLS processing
9. Start the HTTP server on `BIND_URL` (and on localhost:8088)

## Testing Patterns

Tests require the `BASE_PATH` environment variable. Many integration tests create temporary directories and files.

When testing database code (a setup sketch follows this list):
- Use in-memory SQLite: `DATABASE_URL=":memory:"`
- Run migrations in the test setup
- Clean up with `DROP TABLE`, or use `#[serial]` from the `serial_test` crate if parallel tests conflict
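
A sketch of that setup, assuming the diesel_migrations 2.x embedded-migrations API:

```rust
use diesel::{sqlite::SqliteConnection, Connection};
use diesel_migrations::{embed_migrations, EmbeddedMigrations, MigrationHarness};

pub const MIGRATIONS: EmbeddedMigrations = embed_migrations!();

// Each test gets a fresh in-memory database with the schema applied.
fn test_connection() -> SqliteConnection {
    let mut conn = SqliteConnection::establish(":memory:").expect("in-memory SQLite");
    conn.run_pending_migrations(MIGRATIONS).expect("migrations should run");
    conn
}
```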

## Common Gotchas

**EXIF Date Parsing:**
Multiple formats are supported (EXIF DateTime, ISO 8601, Unix timestamp); a fallback chain tries each parser in turn.
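
A sketch of such a chain with `chrono`; the format list is illustrative, and the real parser may try more formats:

```rust
use chrono::{DateTime, NaiveDateTime};

fn parse_date_taken(raw: &str) -> Option<i64> {
    // 1. EXIF DateTime, e.g. "2021:09:02 14:30:00"
    if let Ok(dt) = NaiveDateTime::parse_from_str(raw, "%Y:%m:%d %H:%M:%S") {
        return Some(dt.and_utc().timestamp());
    }
    // 2. ISO 8601 / RFC 3339
    if let Ok(dt) = DateTime::parse_from_rfc3339(raw) {
        return Some(dt.timestamp());
    }
    // 3. Raw Unix timestamp
    raw.trim().parse::<i64>().ok()
}
```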

**Video Processing:**
ffmpeg processes run asynchronously via actors. Use `StreamActor` to track completion. HLS segments are written to `VIDEO_PATH`.

**File Extensions:**
Extension detection is case-insensitive. Use the `file_types.rs` helpers rather than manual string matching.

**Migration Workflow:**
After creating a migration, manually edit the SQL, then regenerate `schema.rs` with `diesel print-schema`. Migrations auto-run on startup via the `embed_migrations!` macro (diesel_migrations 2.x).

**Path Absolutization:**
Use the `path-absolutize` crate's `.absolutize()` method when converting user-provided paths, to ensure they stay within `BASE_PATH`.
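
A sketch of that containment check; `absolutize()` resolves `.` and `..` lexically, without requiring the file to exist:

```rust
use path_absolutize::Absolutize;
use std::path::Path;

fn within_base(base: &Path, requested: &Path) -> std::io::Result<bool> {
    let abs = requested.absolutize()?;
    Ok(abs.starts_with(base))
}
```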

## Required Environment Variables

```bash
DATABASE_URL=./database.db          # SQLite database path
BASE_PATH=/path/to/media            # Root media directory
THUMBNAILS=/path/to/thumbnails      # Thumbnail storage
VIDEO_PATH=/path/to/video/hls       # HLS playlist output
GIFS_DIRECTORY=/path/to/gifs        # Video GIF thumbnails
BIND_URL=0.0.0.0:8080               # Server binding
CORS_ALLOWED_ORIGINS=http://localhost:3000
SECRET_KEY=your-secret-key-here     # JWT signing secret
RUST_LOG=info                       # Log level
EXCLUDED_DIRS=/private,/archive     # Comma-separated paths to exclude from memories
```

Optional:
```bash
WATCH_QUICK_INTERVAL_SECONDS=60     # Quick scan interval
WATCH_FULL_INTERVAL_SECONDS=3600    # Full scan interval
OTLP_OTLS_ENDPOINT=http://...       # OpenTelemetry collector (release builds)
```

## Dependencies of Note

- **actix-web**: HTTP framework
- **diesel**: ORM for SQLite
- **jsonwebtoken**: JWT implementation
- **kamadak-exif**: EXIF parsing
- **image**: Thumbnail generation
- **walkdir**: Directory traversal
- **rayon**: Parallel processing
- **opentelemetry**: Distributed tracing
- **bcrypt**: Password hashing
- **infer**: Magic-number file type detection

390
Cargo.lock
generated
@@ -11,7 +11,7 @@ dependencies = [
"actix-macros",
"actix-rt",
"actix_derive",
"bitflags 2.9.3",
"bitflags",
"bytes",
"crossbeam-channel",
"futures-core",
@@ -33,7 +33,7 @@ version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5f7b0a21988c1bf877cf4759ef5ddaac04c1c9fe808c9142ecb78ba97d97a28a"
dependencies = [
"bitflags 2.9.3",
"bitflags",
"bytes",
"futures-core",
"futures-sink",
@@ -44,6 +44,21 @@ dependencies = [
"tracing",
]

[[package]]
name = "actix-cors"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "daa239b93927be1ff123eebada5a3ff23e89f0124ccb8609234e5103d5a5ae6d"
dependencies = [
"actix-utils",
"actix-web",
"derive_more 2.0.1",
"futures-util",
"log",
"once_cell",
"smallvec",
]

[[package]]
name = "actix-files"
version = "0.6.7"
@@ -54,7 +69,7 @@ dependencies = [
"actix-service",
"actix-utils",
"actix-web",
"bitflags 2.9.3",
"bitflags",
"bytes",
"derive_more 2.0.1",
"futures-core",
@@ -78,7 +93,7 @@ dependencies = [
"actix-service",
"actix-utils",
"base64",
"bitflags 2.9.3",
"bitflags",
"brotli",
"bytes",
"bytestring",
@@ -191,7 +206,7 @@ dependencies = [
"actix-utils",
"futures-core",
"futures-util",
"mio 1.0.4",
"mio",
"socket2 0.5.10",
"tokio",
"tracing",
@@ -509,23 +524,17 @@ checksum = "72b3254f16251a8381aa12e40e3c4d2f0199f8c6508fbecb9d91f575e0fbb8c6"

[[package]]
name = "bcrypt"
version = "0.16.0"
version = "0.17.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2b1866ecef4f2d06a0bb77880015fdf2b89e25a1c2e5addacb87e459c86dc67e"
checksum = "abaf6da45c74385272ddf00e1ac074c7d8a6c1a1dda376902bd6a427522a8b2c"
dependencies = [
"base64",
"blowfish",
"getrandom 0.2.16",
"getrandom 0.3.3",
"subtle",
"zeroize",
]

[[package]]
name = "bitflags"
version = "1.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"

[[package]]
name = "bitflags"
version = "2.9.3"
@@ -635,6 +644,17 @@ dependencies = [
"shlex",
]

[[package]]
name = "cfb"
version = "0.7.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d38f2da7a0a2c4ccf0065be06397cc26a81f4e528be095826eee9d4adbb8c60f"
dependencies = [
"byteorder",
"fnv",
"uuid",
]

[[package]]
name = "cfg-expr"
version = "0.15.8"
@@ -675,12 +695,65 @@ dependencies = [
"inout",
]

[[package]]
name = "clap"
version = "4.5.53"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c9e340e012a1bf4935f5282ed1436d1489548e8f72308207ea5df0e23d2d03f8"
dependencies = [
"clap_builder",
"clap_derive",
]

[[package]]
name = "clap_builder"
version = "4.5.53"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d76b5d13eaa18c901fd2f7fca939fefe3a0727a953561fefdf3b2922b8569d00"
dependencies = [
"anstream",
"anstyle",
"clap_lex",
"strsim",
]

[[package]]
name = "clap_derive"
version = "4.5.49"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2a0b5487afeab2deb2ff4e03a807ad1a03ac532ff5a2cee5d86884440c7f7671"
dependencies = [
"heck",
"proc-macro2",
"quote",
"syn",
]

[[package]]
name = "clap_lex"
version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a1d728cc89cf3aee9ff92b05e62b19ee65a02b5702cff7d5a377e32c6ae29d8d"

[[package]]
name = "colorchoice"
version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b05b61dc5112cbb17e4b6cd61790d9845d13888356391624cbe7e41efeac1e75"

[[package]]
name = "console"
version = "0.15.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "054ccb5b10f9f2cbf51eb355ca1d05c2d279ce1804688d0db74b4733a5aeafd8"
dependencies = [
"encode_unicode",
"libc",
"once_cell",
"unicode-width",
"windows-sys 0.59.0",
]

[[package]]
name = "convert_case"
version = "0.4.0"
@@ -844,6 +917,19 @@ dependencies = [
"unicode-xid",
]

[[package]]
name = "dialoguer"
version = "0.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "658bce805d770f407bc62102fca7c2c64ceef2fbcb2b8bd19d2765ce093980de"
dependencies = [
"console",
"shell-words",
"tempfile",
"thiserror 1.0.69",
"zeroize",
]

[[package]]
name = "diesel"
version = "2.2.12"
@@ -935,6 +1021,12 @@ version = "1.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "48c757948c5ede0e46177b7add2e67155f70e33c07fea8284df6576da70b3719"

[[package]]
name = "encode_unicode"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34aa73646ffb006b8f5147f3dc182bd4bcb190227ce861fc4a4844bf8e3cb2c0"

[[package]]
name = "encoding_rs"
version = "0.8.35"
@@ -1018,18 +1110,6 @@ dependencies = [
"simd-adler32",
]

[[package]]
name = "filetime"
version = "0.2.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bc0505cd1b6fa6580283f6bdf70a73fcf4aba1184038c90902b92b3dd0df63ed"
dependencies = [
"cfg-if",
"libc",
"libredox",
"windows-sys 0.60.2",
]

[[package]]
name = "find-msvc-tools"
version = "0.1.0"
@@ -1067,15 +1147,6 @@ dependencies = [
"percent-encoding",
]

[[package]]
name = "fsevent-sys"
version = "4.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "76ee7a02da4d231650c7cea31349b889be2f45ddb3ef3032d2ec8185f6313fd2"
dependencies = [
"libc",
]

[[package]]
name = "futures"
version = "0.3.31"
@@ -1534,9 +1605,10 @@ dependencies = [

[[package]]
name = "image-api"
version = "0.3.1"
version = "0.4.0"
dependencies = [
"actix",
"actix-cors",
"actix-files",
"actix-multipart",
"actix-rt",
@@ -1545,16 +1617,19 @@ dependencies = [
"anyhow",
"bcrypt",
"chrono",
"clap",
"dialoguer",
"diesel",
"diesel_migrations",
"dotenv",
"env_logger",
"futures",
"image",
"infer",
"jsonwebtoken",
"kamadak-exif",
"lazy_static",
"log",
"notify",
"opentelemetry",
"opentelemetry-appender-log",
"opentelemetry-otlp",
@@ -1595,23 +1670,12 @@ dependencies = [
]

[[package]]
name = "inotify"
version = "0.9.6"
name = "infer"
version = "0.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f8069d3ec154eb856955c1c0fbffefbf5f3c40a104ec912d4797314c1801abff"
checksum = "bc150e5ce2330295b8616ce0e3f53250e53af31759a9dbedad1621ba29151847"
dependencies = [
"bitflags 1.3.2",
"inotify-sys",
"libc",
]

[[package]]
name = "inotify-sys"
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e05c02b5e89bff3b946cedeca278abc628fe811e604f027c45a8aa3cf793d0eb"
dependencies = [
"libc",
"cfb",
]

[[package]]
@@ -1640,7 +1704,7 @@ version = "0.7.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "046fa2d4d00aea763528b4950358d0ead425372445dc8ff86312b3c69ff7727b"
dependencies = [
"bitflags 2.9.3",
"bitflags",
"cfg-if",
"libc",
]
@@ -1751,23 +1815,12 @@ dependencies = [
]

[[package]]
name = "kqueue"
version = "1.1.1"
name = "kamadak-exif"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eac30106d7dce88daf4a3fcb4879ea939476d5074a9b7ddd0fb97fa4bed5596a"
checksum = "1130d80c7374efad55a117d715a3af9368f0fa7a2c54573afc15a188cd984837"
dependencies = [
"kqueue-sys",
"libc",
]

[[package]]
name = "kqueue-sys"
version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed9625ffda8729b85e45cf04090035ac368927b8cebc34898e7c120f52e4838b"
dependencies = [
"bitflags 1.3.2",
"libc",
"mutate_once",
]

[[package]]
@@ -1798,17 +1851,6 @@ dependencies = [
"cc",
]

[[package]]
name = "libredox"
version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "391290121bad3d37fbddad76d8f5d1c1c314cfc646d143d7e07a3086ddff0ce3"
dependencies = [
"bitflags 2.9.3",
"libc",
"redox_syscall",
]

[[package]]
name = "libsqlite3-sys"
version = "0.35.0"
@@ -1942,18 +1984,6 @@ dependencies = [
"simd-adler32",
]

[[package]]
name = "mio"
version = "0.8.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a4a650543ca06a924e8b371db273b2756685faae30f8487da1b56505a8f78b0c"
dependencies = [
"libc",
"log",
"wasi 0.11.1+wasi-snapshot-preview1",
"windows-sys 0.48.0",
]

[[package]]
name = "mio"
version = "1.0.4"
@@ -1976,6 +2006,12 @@ dependencies = [
"pxfm",
]

[[package]]
name = "mutate_once"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "13d2233c9842d08cfe13f9eac96e207ca6a2ea10b80259ebe8ad0268be27d2af"

[[package]]
name = "new_debug_unreachable"
version = "1.0.6"
@@ -1998,25 +2034,6 @@ version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0676bb32a98c1a483ce53e500a81ad9c3d5b3f7c920c28c24e9cb0980d0b5bc8"

[[package]]
name = "notify"
version = "6.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6205bd8bb1e454ad2e27422015fb5e4f2bcc7e08fa8f27058670d208324a4d2d"
dependencies = [
"bitflags 2.9.3",
"crossbeam-channel",
"filetime",
"fsevent-sys",
"inotify",
"kqueue",
"libc",
"log",
"mio 0.8.11",
"walkdir",
"windows-sys 0.48.0",
]

[[package]]
name = "num-bigint"
version = "0.4.6"
@@ -2096,9 +2113,9 @@ checksum = "a4895175b425cb1f87721b59f0f286c2092bd4af812243672510e1ac53e2e0ad"

[[package]]
name = "opentelemetry"
version = "0.30.0"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aaf416e4cb72756655126f7dd7bb0af49c674f4c1b9903e80c009e0c37e552e6"
checksum = "b84bcd6ae87133e903af7ef497404dda70c60d0ea14895fc8a5e6722754fc2a0"
dependencies = [
"futures-core",
"futures-sink",
@@ -2110,9 +2127,9 @@ dependencies = [

[[package]]
name = "opentelemetry-appender-log"
version = "0.30.0"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e688026e48f4603494f619583e0aa0b0edd9c0b9430e1c46804df2ff32bc8798"
checksum = "9e50c59a96bd6a723a4329c5db31eb04fa4488c5f141ae7b9d4fd587439e6ee1"
dependencies = [
"log",
"opentelemetry",
@@ -2120,9 +2137,9 @@ dependencies = [

[[package]]
name = "opentelemetry-http"
version = "0.30.0"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "50f6639e842a97dbea8886e3439710ae463120091e2e064518ba8e716e6ac36d"
checksum = "d7a6d09a73194e6b66df7c8f1b680f156d916a1a942abf2de06823dd02b7855d"
dependencies = [
"async-trait",
"bytes",
@@ -2133,9 +2150,9 @@ dependencies = [

[[package]]
name = "opentelemetry-otlp"
version = "0.30.0"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dbee664a43e07615731afc539ca60c6d9f1a9425e25ca09c57bc36c87c55852b"
checksum = "7a2366db2dca4d2ad033cad11e6ee42844fd727007af5ad04a1730f4cb8163bf"
dependencies = [
"http 1.3.1",
"opentelemetry",
@@ -2152,21 +2169,22 @@ dependencies = [

[[package]]
name = "opentelemetry-proto"
version = "0.30.0"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2e046fd7660710fe5a05e8748e70d9058dc15c94ba914e7c4faa7c728f0e8ddc"
checksum = "a7175df06de5eaee9909d4805a3d07e28bb752c34cab57fa9cff549da596b30f"
dependencies = [
"opentelemetry",
"opentelemetry_sdk",
"prost",
"tonic",
"tonic-prost",
]

[[package]]
name = "opentelemetry-stdout"
version = "0.30.0"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "447191061af41c3943e082ea359ab8b64ff27d6d34d30d327df309ddef1eef6f"
checksum = "bc8887887e169414f637b18751487cce4e095be787d23fad13c454e2fb1b3811"
dependencies = [
"chrono",
"opentelemetry",
@@ -2175,9 +2193,9 @@ dependencies = [

[[package]]
name = "opentelemetry_sdk"
version = "0.30.0"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "11f644aa9e5e31d11896e024305d7e3c98a88884d9f8919dbf37a9991bc47a4b"
checksum = "e14ae4f5991976fd48df6d843de219ca6d31b01daaab2dad5af2badeded372bd"
dependencies = [
"futures-channel",
"futures-executor",
@@ -2185,7 +2203,6 @@ dependencies = [
"opentelemetry",
"percent-encoding",
"rand 0.9.2",
"serde_json",
"thiserror 2.0.16",
"tokio",
"tokio-stream",
@@ -2304,7 +2321,7 @@ version = "0.18.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "97baced388464909d42d89643fe4361939af9b7ce7a31ee32a168f832a70f2a0"
dependencies = [
"bitflags 2.9.3",
"bitflags",
"crc32fast",
"fdeflate",
"flate2",
@@ -2395,9 +2412,9 @@ dependencies = [

[[package]]
name = "prost"
version = "0.13.5"
version = "0.14.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2796faa41db3ec313a31f7624d9286acf277b52de526150b7e69f3debf891ee5"
checksum = "7231bd9b3d3d33c86b58adbac74b5ec0ad9f496b19d22801d773636feaa95f3d"
dependencies = [
"bytes",
"prost-derive",
@@ -2405,9 +2422,9 @@ dependencies = [

[[package]]
name = "prost-derive"
version = "0.13.5"
version = "0.14.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8a56d757972c98b346a9b766e3f02746cde6dd1cd1d1d563472929fdd74bec4d"
checksum = "9120690fafc389a67ba3803df527d0ec9cbbc9cc45e4cc20b332996dfb672425"
dependencies = [
"anyhow",
"itertools 0.14.0",
@@ -2587,7 +2604,7 @@ version = "0.5.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5407465600fb0548f1442edf71dd20683c6ed326200ace4b1ef0763521bb3b77"
dependencies = [
"bitflags 2.9.3",
"bitflags",
]

[[package]]
@@ -2700,7 +2717,7 @@ version = "1.0.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "11181fbabf243db407ef8df94a6ce0b2f9a733bd8be4ad02b4eda9602296cac8"
dependencies = [
"bitflags 2.9.3",
"bitflags",
"errno",
"libc",
"linux-raw-sys",
@@ -2822,6 +2839,12 @@ dependencies = [
"digest",
]

[[package]]
name = "shell-words"
version = "1.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dc6fe69c597f9c37bfeeeeeb33da3530379845f10be461a66d16d03eca2ded77"

[[package]]
name = "shlex"
version = "1.3.0"
@@ -3073,7 +3096,7 @@ dependencies = [
"bytes",
"io-uring",
"libc",
"mio 1.0.4",
"mio",
"parking_lot",
"pin-project-lite",
"signal-hook-registry",
@@ -3181,9 +3204,9 @@ checksum = "fcc842091f2def52017664b53082ecbbeb5c7731092bad69d2c63050401dfd64"

[[package]]
name = "tonic"
version = "0.13.1"
version = "0.14.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7e581ba15a835f4d9ea06c55ab1bd4dce26fc53752c69a04aac00703bfb49ba9"
checksum = "eb7613188ce9f7df5bfe185db26c5814347d110db17920415cf2fbcad85e7203"
dependencies = [
"async-trait",
"base64",
@@ -3196,7 +3219,7 @@ dependencies = [
"hyper-util",
"percent-encoding",
"pin-project",
"prost",
"sync_wrapper",
"tokio",
"tokio-stream",
"tower",
@@ -3205,6 +3228,17 @@ dependencies = [
"tracing",
]

[[package]]
name = "tonic-prost"
version = "0.14.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "66bd50ad6ce1252d87ef024b3d64fe4c3cf54a86fb9ef4c631fdd0ded7aeaa67"
dependencies = [
"bytes",
"prost",
"tonic",
]

[[package]]
name = "tower"
version = "0.5.2"
@@ -3230,7 +3264,7 @@ version = "0.6.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "adc82fd73de2a9722ac5da747f12383d2bfdb93591ee6c58486e0097890f05f2"
dependencies = [
"bitflags 2.9.3",
"bitflags",
"bytes",
"futures-util",
"http 1.3.1",
@@ -3310,6 +3344,12 @@ version = "1.0.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a5f39404a5da50712a4c1eecf25e90dd62b613502b7e925fd4e4d19b5c96512"

[[package]]
name = "unicode-width"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b4ac048d71ede7ee76d585517add45da530660ef4390e49b098733c6e897f254"

[[package]]
name = "unicode-xid"
version = "0.2.6"
@@ -3346,6 +3386,16 @@ version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "06abde3611657adf66d383f00b093d7faecc7fa57071cce2578660c9f1010821"

[[package]]
name = "uuid"
version = "1.19.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e2e054861b4bd027cd373e18e8d8d8e6548085000e41290d95ce0c373a654b4a"
dependencies = [
"js-sys",
"wasm-bindgen",
]

[[package]]
name = "v_frame"
version = "0.3.9"
@@ -3564,15 +3614,6 @@ dependencies = [
"windows-link",
]

[[package]]
name = "windows-sys"
version = "0.48.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "677d2418bec65e3338edb076e806bc1ec15693c5d0104683f2efe857f61056a9"
dependencies = [
"windows-targets 0.48.5",
]

[[package]]
name = "windows-sys"
version = "0.52.0"
@@ -3600,21 +3641,6 @@ dependencies = [
"windows-targets 0.53.3",
]

[[package]]
name = "windows-targets"
version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9a2fa6e2155d7247be68c096456083145c183cbbbc2764150dda45a87197940c"
dependencies = [
"windows_aarch64_gnullvm 0.48.5",
"windows_aarch64_msvc 0.48.5",
"windows_i686_gnu 0.48.5",
"windows_i686_msvc 0.48.5",
"windows_x86_64_gnu 0.48.5",
"windows_x86_64_gnullvm 0.48.5",
"windows_x86_64_msvc 0.48.5",
]

[[package]]
name = "windows-targets"
version = "0.52.6"
@@ -3648,12 +3674,6 @@ dependencies = [
"windows_x86_64_msvc 0.53.0",
]

[[package]]
name = "windows_aarch64_gnullvm"
version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2b38e32f0abccf9987a4e3079dfb67dcd799fb61361e53e2882c3cbaf0d905d8"

[[package]]
name = "windows_aarch64_gnullvm"
version = "0.52.6"
@@ -3666,12 +3686,6 @@ version = "0.53.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "86b8d5f90ddd19cb4a147a5fa63ca848db3df085e25fee3cc10b39b6eebae764"

[[package]]
name = "windows_aarch64_msvc"
version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dc35310971f3b2dbbf3f0690a219f40e2d9afcf64f9ab7cc1be722937c26b4bc"

[[package]]
name = "windows_aarch64_msvc"
version = "0.52.6"
@@ -3684,12 +3698,6 @@ version = "0.53.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c7651a1f62a11b8cbd5e0d42526e55f2c99886c77e007179efff86c2b137e66c"

[[package]]
name = "windows_i686_gnu"
version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a75915e7def60c94dcef72200b9a8e58e5091744960da64ec734a6c6e9b3743e"

[[package]]
name = "windows_i686_gnu"
version = "0.52.6"
@@ -3714,12 +3722,6 @@ version = "0.53.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9ce6ccbdedbf6d6354471319e781c0dfef054c81fbc7cf83f338a4296c0cae11"

[[package]]
name = "windows_i686_msvc"
version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8f55c233f70c4b27f66c523580f78f1004e8b5a8b659e05a4eb49d4166cca406"

[[package]]
name = "windows_i686_msvc"
version = "0.52.6"
@@ -3732,12 +3734,6 @@ version = "0.53.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "581fee95406bb13382d2f65cd4a908ca7b1e4c2f1917f143ba16efe98a589b5d"

[[package]]
name = "windows_x86_64_gnu"
version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "53d40abd2583d23e4718fddf1ebec84dbff8381c07cae67ff7768bbf19c6718e"

[[package]]
name = "windows_x86_64_gnu"
version = "0.52.6"
@@ -3750,12 +3746,6 @@ version = "0.53.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2e55b5ac9ea33f2fc1716d1742db15574fd6fc8dadc51caab1c16a3d3b4190ba"

[[package]]
name = "windows_x86_64_gnullvm"
version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0b7b52767868a23d5bab768e390dc5f5c55825b6d30b86c844ff2dc7414044cc"

[[package]]
name = "windows_x86_64_gnullvm"
version = "0.52.6"
@@ -3768,12 +3758,6 @@ version = "0.53.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0a6e035dd0599267ce1ee132e51c27dd29437f63325753051e71dd9e42406c57"

[[package]]
name = "windows_x86_64_msvc"
version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538"

[[package]]
name = "windows_x86_64_msvc"
version = "0.52.6"

24
Cargo.toml
@@ -1,13 +1,13 @@
[package]
name = "image-api"
version = "0.3.1"
version = "0.4.0"
authors = ["Cameron Cordes <cameronc.dev@gmail.com>"]
edition = "2024"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[profile.release]
lto = true
lto = "thin"

[dependencies]
actix = "0.13.1"
@@ -15,6 +15,7 @@ actix-web = "4"
actix-rt = "2.6"
tokio = { version = "1.42.0", features = ["default", "process", "sync"] }
actix-files = "0.6"
actix-cors = "0.7"
actix-multipart = "0.7.2"
futures = "0.3.5"
jsonwebtoken = "9.3.0"
@@ -23,12 +24,14 @@ serde_json = "1"
diesel = { version = "2.2.10", features = ["sqlite"] }
diesel_migrations = "2.2.0"
chrono = "0.4"
clap = { version = "4.5", features = ["derive"] }
dialoguer = "0.11"
dotenv = "0.15"
bcrypt = "0.16.0"
bcrypt = "0.17.1"
image = { version = "0.25.5", default-features = false, features = ["jpeg", "png", "rayon"] }
infer = "0.16"
walkdir = "2.4.0"
rayon = "1.5"
notify = "6.1.1"
path-absolutize = "3.1"
log = "0.4"
env_logger = "0.11.5"
@@ -37,10 +40,11 @@ prometheus = "0.13"
lazy_static = "1.5"
anyhow = "1.0"
rand = "0.8.5"
opentelemetry = { version = "0.30.0", features = ["default", "metrics", "tracing"] }
opentelemetry_sdk = { version = "0.30.0", features = ["default", "rt-tokio-current-thread", "metrics"] }
opentelemetry-otlp = { version = "0.30.0", features = ["default", "metrics", "tracing", "grpc-tonic"] }
opentelemetry-stdout = "0.30.0"
opentelemetry-appender-log = "0.30.0"
opentelemetry = { version = "0.31.0", features = ["default", "metrics", "tracing"] }
opentelemetry_sdk = { version = "0.31.0", features = ["default", "rt-tokio-current-thread", "metrics"] }
opentelemetry-otlp = { version = "0.31.0", features = ["default", "metrics", "tracing", "grpc-tonic"] }
opentelemetry-stdout = "0.31.0"
opentelemetry-appender-log = "0.31.0"
tempfile = "3.20.0"
regex = "1.11.1"
regex = "1.11.1"
exif = { package = "kamadak-exif", version = "0.6.1" }

11
README.md
@@ -2,6 +2,14 @@
This is an Actix-web server for serving images and videos from a filesystem.
Upon first run it will generate thumbnails for all images and videos at `BASE_PATH`.

## Features
- Automatic thumbnail generation for images and videos
- EXIF data extraction and storage for photos
- File watching with NFS support (polling-based)
- Video streaming with HLS
- Tag-based organization
- Memories API for browsing photos by date

## Environment
There are a handful of required environment variables to have the API run.
They should be defined where the binary is located or above it in an `.env` file.
@@ -15,3 +23,6 @@ You must have `ffmpeg` installed for streaming video and generating video thumbn
- `SECRET_KEY` is the *hopefully* random string to sign Tokens with
- `RUST_LOG` is one of `off, error, warn, info, debug, trace`, from least to most noisy [error is default]
- `EXCLUDED_DIRS` is a comma separated list of directories to exclude from the Memories API
- `WATCH_QUICK_INTERVAL_SECONDS` (optional) is the interval in seconds for quick file scans [default: 60]
- `WATCH_FULL_INTERVAL_SECONDS` (optional) is the interval in seconds for full file scans [default: 3600]

2
migrations/2025-12-17-000000_create_image_exif/down.sql
Normal file
@@ -0,0 +1,2 @@
DROP INDEX IF EXISTS idx_image_exif_file_path;
DROP TABLE IF EXISTS image_exif;

32
migrations/2025-12-17-000000_create_image_exif/up.sql
Normal file
@@ -0,0 +1,32 @@
CREATE TABLE image_exif (
    id INTEGER PRIMARY KEY NOT NULL,
    file_path TEXT NOT NULL UNIQUE,

    -- Camera Information
    camera_make TEXT,
    camera_model TEXT,
    lens_model TEXT,

    -- Image Properties
    width INTEGER,
    height INTEGER,
    orientation INTEGER,

    -- GPS Coordinates
    gps_latitude REAL,
    gps_longitude REAL,
    gps_altitude REAL,

    -- Capture Settings
    focal_length REAL,
    aperture REAL,
    shutter_speed TEXT,
    iso INTEGER,
    date_taken BIGINT,

    -- Housekeeping
    created_time BIGINT NOT NULL,
    last_modified BIGINT NOT NULL
);

CREATE INDEX idx_image_exif_file_path ON image_exif(file_path);

9
migrations/2025-12-17-230000_add_indexes/down.sql
Normal file
@@ -0,0 +1,9 @@
-- Rollback indexes

DROP INDEX IF EXISTS idx_favorites_userid;
DROP INDEX IF EXISTS idx_favorites_path;
DROP INDEX IF EXISTS idx_tags_name;
DROP INDEX IF EXISTS idx_tagged_photo_photo_name;
DROP INDEX IF EXISTS idx_tagged_photo_tag_id;
DROP INDEX IF EXISTS idx_image_exif_camera;
DROP INDEX IF EXISTS idx_image_exif_gps;

17
migrations/2025-12-17-230000_add_indexes/up.sql
Normal file
@@ -0,0 +1,17 @@
-- Add indexes for improved query performance

-- Favorites table indexes
CREATE INDEX IF NOT EXISTS idx_favorites_userid ON favorites(userid);
CREATE INDEX IF NOT EXISTS idx_favorites_path ON favorites(path);

-- Tags table indexes
CREATE INDEX IF NOT EXISTS idx_tags_name ON tags(name);

-- Tagged photos indexes
CREATE INDEX IF NOT EXISTS idx_tagged_photo_photo_name ON tagged_photo(photo_name);
CREATE INDEX IF NOT EXISTS idx_tagged_photo_tag_id ON tagged_photo(tag_id);

-- EXIF table indexes (date_taken already has index from previous migration)
-- Adding composite index for common EXIF queries
CREATE INDEX IF NOT EXISTS idx_image_exif_camera ON image_exif(camera_make, camera_model);
CREATE INDEX IF NOT EXISTS idx_image_exif_gps ON image_exif(gps_latitude, gps_longitude);

3
migrations/2025-12-17-230100_unique_favorites/down.sql
Normal file
@@ -0,0 +1,3 @@
-- Rollback unique constraint on favorites

DROP INDEX IF EXISTS idx_favorites_unique;

12
migrations/2025-12-17-230100_unique_favorites/up.sql
Normal file
@@ -0,0 +1,12 @@
-- Add unique constraint to prevent duplicate favorites per user

-- First, remove any existing duplicates (keep the oldest one)
DELETE FROM favorites
WHERE rowid NOT IN (
    SELECT MIN(rowid)
    FROM favorites
    GROUP BY userid, path
);

-- Add unique index to enforce constraint
CREATE UNIQUE INDEX idx_favorites_unique ON favorites(userid, path);

2
migrations/2025-12-18-120000_add_date_taken_index/down.sql
Normal file
@@ -0,0 +1,2 @@
-- Remove date_taken index
DROP INDEX IF EXISTS idx_image_exif_date_taken;

2
migrations/2025-12-18-120000_add_date_taken_index/up.sql
Normal file
@@ -0,0 +1,2 @@
-- Add index on date_taken for efficient date range queries
CREATE INDEX IF NOT EXISTS idx_image_exif_date_taken ON image_exif(date_taken);
@@ -64,7 +64,6 @@ pub async fn login<D: UserDao>(

#[cfg(test)]
mod tests {

    use super::*;
    use crate::testhelpers::{BodyReader, TestUserDao};

143
src/bin/cleanup_files.rs
Normal file
@@ -0,0 +1,143 @@
use std::path::PathBuf;
use std::sync::{Arc, Mutex};

use clap::Parser;

use image_api::cleanup::{
    CleanupConfig, DatabaseUpdater, resolve_missing_files, validate_file_types,
};
use image_api::database::{SqliteExifDao, SqliteFavoriteDao};
use image_api::tags::SqliteTagDao;

#[derive(Parser, Debug)]
#[command(name = "cleanup_files")]
#[command(about = "File cleanup and fix utility for ImageApi", long_about = None)]
struct Args {
    #[arg(long, help = "Preview changes without making them")]
    dry_run: bool,

    #[arg(long, help = "Auto-fix all issues without prompting")]
    auto_fix: bool,

    #[arg(long, help = "Skip phase 1 (missing file resolution)")]
    skip_phase1: bool,

    #[arg(long, help = "Skip phase 2 (file type validation)")]
    skip_phase2: bool,
}

fn main() -> anyhow::Result<()> {
    // Initialize logging
    env_logger::init();

    // Load environment variables
    dotenv::dotenv()?;

    // Parse CLI arguments
    let args = Args::parse();

    // Get base path from environment
    let base_path = dotenv::var("BASE_PATH")?;
    let base = PathBuf::from(&base_path);

    println!("File Cleanup and Fix Utility");
    println!("============================");
    println!("Base path: {}", base.display());
    println!("Dry run: {}", args.dry_run);
    println!("Auto fix: {}", args.auto_fix);
    println!();

    // Pre-flight checks
    if !base.exists() {
        eprintln!("Error: Base path does not exist: {}", base.display());
        std::process::exit(1);
    }

    if !base.is_dir() {
        eprintln!("Error: Base path is not a directory: {}", base.display());
        std::process::exit(1);
    }

    // Create configuration
    let config = CleanupConfig {
        base_path: base,
        dry_run: args.dry_run,
        auto_fix: args.auto_fix,
    };

    // Create DAOs
    println!("Connecting to database...");
    let tag_dao: Arc<Mutex<dyn image_api::tags::TagDao>> =
        Arc::new(Mutex::new(SqliteTagDao::default()));
    let exif_dao: Arc<Mutex<dyn image_api::database::ExifDao>> =
        Arc::new(Mutex::new(SqliteExifDao::new()));
    let favorites_dao: Arc<Mutex<dyn image_api::database::FavoriteDao>> =
        Arc::new(Mutex::new(SqliteFavoriteDao::new()));

    // Create database updater
    let mut db_updater = DatabaseUpdater::new(tag_dao, exif_dao, favorites_dao);

    println!("✓ Database connected\n");

    // Track overall statistics
    let mut total_issues_found = 0;
    let mut total_issues_fixed = 0;
    let mut total_errors = Vec::new();

    // Phase 1: Missing file resolution
    if !args.skip_phase1 {
        match resolve_missing_files(&config, &mut db_updater) {
            Ok(stats) => {
                total_issues_found += stats.issues_found;
                total_issues_fixed += stats.issues_fixed;
                total_errors.extend(stats.errors);
            }
            Err(e) => {
                eprintln!("Phase 1 failed: {:?}", e);
                total_errors.push(format!("Phase 1 error: {}", e));
            }
        }
    } else {
        println!("Phase 1: Skipped (--skip-phase1)");
    }

    // Phase 2: File type validation
    if !args.skip_phase2 {
        match validate_file_types(&config, &mut db_updater) {
            Ok(stats) => {
                total_issues_found += stats.issues_found;
                total_issues_fixed += stats.issues_fixed;
                total_errors.extend(stats.errors);
            }
            Err(e) => {
                eprintln!("Phase 2 failed: {:?}", e);
                total_errors.push(format!("Phase 2 error: {}", e));
            }
        }
    } else {
        println!("\nPhase 2: Skipped (--skip-phase2)");
    }

    // Final summary
    println!("\n============================");
    println!("Cleanup Complete!");
    println!("============================");
    println!("Total issues found: {}", total_issues_found);
    if config.dry_run {
        println!("Total issues that would be fixed: {}", total_issues_found);
    } else {
        println!("Total issues fixed: {}", total_issues_fixed);
    }

    if !total_errors.is_empty() {
        println!("\nErrors encountered:");
        for (i, error) in total_errors.iter().enumerate() {
            println!("  {}. {}", i + 1, error);
        }
        println!("\nSome operations failed. Review errors above.");
    } else {
        println!("\n✓ No errors encountered");
    }

    Ok(())
}

196
src/bin/migrate_exif.rs
Normal file
@@ -0,0 +1,196 @@
use std::path::PathBuf;
use std::sync::{Arc, Mutex};

use chrono::Utc;
use clap::Parser;
use opentelemetry;
use rayon::prelude::*;
use walkdir::WalkDir;

use image_api::database::models::InsertImageExif;
use image_api::database::{ExifDao, SqliteExifDao};
use image_api::exif;

#[derive(Parser, Debug)]
#[command(name = "migrate_exif")]
#[command(about = "Extract and store EXIF data from images", long_about = None)]
struct Args {
    #[arg(long, help = "Skip files that already have EXIF data in database")]
    skip_existing: bool,
}

fn main() -> anyhow::Result<()> {
    env_logger::init();
    dotenv::dotenv()?;

    let args = Args::parse();
    let base_path = dotenv::var("BASE_PATH")?;
    let base = PathBuf::from(&base_path);

    println!("EXIF Migration Tool");
    println!("===================");
    println!("Base path: {}", base.display());
    if args.skip_existing {
        println!("Mode: Skip existing (incremental)");
    } else {
        println!("Mode: Upsert (insert new, update existing)");
    }
    println!();

    // Collect all image files that support EXIF
    println!("Scanning for images...");
    let image_files: Vec<PathBuf> = WalkDir::new(&base)
        .into_iter()
        .filter_map(|e| e.ok())
        .filter(|e| e.file_type().is_file())
        .filter(|e| exif::supports_exif(e.path()))
        .map(|e| e.path().to_path_buf())
        .collect();

    println!("Found {} images to process", image_files.len());

    if image_files.is_empty() {
        println!("No EXIF-supporting images found. Exiting.");
        return Ok(());
    }

    println!();
    println!("Extracting EXIF data...");

    // Create a thread-safe DAO
    let dao = Arc::new(Mutex::new(SqliteExifDao::new()));

    // Process in parallel using rayon
    let results: Vec<_> = image_files
        .par_iter()
        .map(|path| {
            // Create context for this processing iteration
            let context = opentelemetry::Context::new();

            let relative_path = match path.strip_prefix(&base) {
                Ok(p) => p.to_str().unwrap().to_string(),
                Err(_) => {
                    eprintln!(
                        "Error: Could not create relative path for {}",
                        path.display()
                    );
                    return Err(anyhow::anyhow!("Path error"));
                }
            };

            // Check if EXIF data already exists
            let existing = if let Ok(mut dao_lock) = dao.lock() {
                dao_lock.get_exif(&context, &relative_path).ok().flatten()
            } else {
                eprintln!("✗ {} - Failed to acquire database lock", relative_path);
                return Err(anyhow::anyhow!("Lock error"));
            };

            // Skip if exists and skip_existing flag is set
            if args.skip_existing && existing.is_some() {
                return Ok(("skip".to_string(), relative_path));
            }

            match exif::extract_exif_from_path(path) {
                Ok(exif_data) => {
                    let timestamp = Utc::now().timestamp();
                    let insert_exif = InsertImageExif {
                        file_path: relative_path.clone(),
                        camera_make: exif_data.camera_make,
                        camera_model: exif_data.camera_model,
                        lens_model: exif_data.lens_model,
                        width: exif_data.width,
                        height: exif_data.height,
                        orientation: exif_data.orientation,
                        gps_latitude: exif_data.gps_latitude,
                        gps_longitude: exif_data.gps_longitude,
                        gps_altitude: exif_data.gps_altitude,
                        focal_length: exif_data.focal_length,
                        aperture: exif_data.aperture,
                        shutter_speed: exif_data.shutter_speed,
                        iso: exif_data.iso,
                        date_taken: exif_data.date_taken,
                        created_time: existing
                            .as_ref()
                            .map(|e| e.created_time)
                            .unwrap_or(timestamp),
                        last_modified: timestamp,
                    };

                    // Store or update in database
                    if let Ok(mut dao_lock) = dao.lock() {
                        let result = if existing.is_some() {
                            // Update existing record
                            dao_lock
                                .update_exif(&context, insert_exif)
                                .map(|_| "update")
                        } else {
                            // Insert new record
                            dao_lock.store_exif(&context, insert_exif).map(|_| "insert")
                        };

                        match result {
                            Ok(action) => {
                                if action == "update" {
                                    println!("↻ {} (updated)", relative_path);
                                } else {
                                    println!("✓ {} (inserted)", relative_path);
                                }
                                Ok((action.to_string(), relative_path))
                            }
                            Err(e) => {
                                eprintln!("✗ {} - Database error: {:?}", relative_path, e);
                                Err(anyhow::anyhow!("Database error"))
                            }
                        }
                    } else {
                        eprintln!("✗ {} - Failed to acquire database lock", relative_path);
                        Err(anyhow::anyhow!("Lock error"))
                    }
                }
                Err(e) => {
                    eprintln!("✗ {} - No EXIF data: {:?}", relative_path, e);
                    Err(e)
                }
            }
        })
        .collect();

    // Count results
    let mut success_count = 0;
    let mut inserted_count = 0;
    let mut updated_count = 0;
    let mut skipped_count = 0;

    for (action, _) in results.iter().flatten() {
        success_count += 1;
        match action.as_str() {
            "insert" => inserted_count += 1,
            "update" => updated_count += 1,
            "skip" => skipped_count += 1,
            _ => {}
        }
    }

    // `success_count` already includes skipped files, so the error count is
    // simply everything that did not return Ok.
    let error_count = results.len() - success_count;

    println!();
    println!("===================");
    println!("Migration complete!");
    println!("Total images processed: {}", image_files.len());

    if inserted_count > 0 {
        println!("  New EXIF records inserted: {}", inserted_count);
    }
    if updated_count > 0 {
        println!("  Existing records updated: {}", updated_count);
    }
    if skipped_count > 0 {
        println!("  Skipped (already exists): {}", skipped_count);
    }
    if error_count > 0 {
        println!("  Errors (no EXIF data or failures): {}", error_count);
    }

    Ok(())
}

154
src/cleanup/database_updater.rs
Normal file
@@ -0,0 +1,154 @@
|
||||
use crate::database::{ExifDao, FavoriteDao};
use crate::tags::TagDao;
use anyhow::Result;
use log::{error, info};
use opentelemetry;
use std::sync::{Arc, Mutex};

pub struct DatabaseUpdater {
    tag_dao: Arc<Mutex<dyn TagDao>>,
    exif_dao: Arc<Mutex<dyn ExifDao>>,
    favorites_dao: Arc<Mutex<dyn FavoriteDao>>,
}

impl DatabaseUpdater {
    pub fn new(
        tag_dao: Arc<Mutex<dyn TagDao>>,
        exif_dao: Arc<Mutex<dyn ExifDao>>,
        favorites_dao: Arc<Mutex<dyn FavoriteDao>>,
    ) -> Self {
        Self {
            tag_dao,
            exif_dao,
            favorites_dao,
        }
    }

    /// Update file path across all three database tables
    /// Returns Ok(()) if successful, continues on partial failures but logs errors
    pub fn update_file_path(&mut self, old_path: &str, new_path: &str) -> Result<()> {
        let context = opentelemetry::Context::current();
        let mut success_count = 0;
        let mut error_count = 0;

        // Update tagged_photo table
        if let Ok(mut dao) = self.tag_dao.lock() {
            match dao.update_photo_name(old_path, new_path, &context) {
                Ok(_) => {
                    info!("Updated tagged_photo: {} -> {}", old_path, new_path);
                    success_count += 1;
                }
                Err(e) => {
                    error!("Failed to update tagged_photo for {}: {:?}", old_path, e);
                    error_count += 1;
                }
            }
        } else {
            error!("Failed to acquire lock on TagDao");
            error_count += 1;
        }

        // Update image_exif table
        if let Ok(mut dao) = self.exif_dao.lock() {
            match dao.update_file_path(&context, old_path, new_path) {
                Ok(_) => {
                    info!("Updated image_exif: {} -> {}", old_path, new_path);
                    success_count += 1;
                }
                Err(e) => {
                    error!("Failed to update image_exif for {}: {:?}", old_path, e);
                    error_count += 1;
                }
            }
        } else {
            error!("Failed to acquire lock on ExifDao");
            error_count += 1;
        }

        // Update favorites table
        if let Ok(mut dao) = self.favorites_dao.lock() {
            match dao.update_path(old_path, new_path) {
                Ok(_) => {
                    info!("Updated favorites: {} -> {}", old_path, new_path);
                    success_count += 1;
                }
                Err(e) => {
                    error!("Failed to update favorites for {}: {:?}", old_path, e);
                    error_count += 1;
                }
            }
        } else {
            error!("Failed to acquire lock on FavoriteDao");
            error_count += 1;
        }

        if success_count > 0 {
            info!(
                "Updated {}/{} tables for {} -> {}",
                success_count,
                success_count + error_count,
                old_path,
                new_path
            );
            Ok(())
        } else {
            Err(anyhow::anyhow!(
                "Failed to update any tables for {} -> {}",
                old_path,
                new_path
            ))
        }
    }

    /// Get all file paths from all three database tables
    pub fn get_all_file_paths(&mut self) -> Result<Vec<String>> {
        let context = opentelemetry::Context::current();
        let mut all_paths = Vec::new();

        // Get from tagged_photo
        if let Ok(mut dao) = self.tag_dao.lock() {
            match dao.get_all_photo_names(&context) {
                Ok(paths) => {
                    info!("Found {} paths in tagged_photo", paths.len());
                    all_paths.extend(paths);
                }
                Err(e) => {
                    error!("Failed to get paths from tagged_photo: {:?}", e);
                }
            }
        }

        // Get from image_exif
        if let Ok(mut dao) = self.exif_dao.lock() {
            match dao.get_all_file_paths(&context) {
                Ok(paths) => {
                    info!("Found {} paths in image_exif", paths.len());
                    all_paths.extend(paths);
                }
                Err(e) => {
                    error!("Failed to get paths from image_exif: {:?}", e);
                }
            }
        }

        // Get from favorites
        if let Ok(mut dao) = self.favorites_dao.lock() {
            match dao.get_all_paths() {
                Ok(paths) => {
                    info!("Found {} paths in favorites", paths.len());
                    all_paths.extend(paths);
                }
                Err(e) => {
                    error!("Failed to get paths from favorites: {:?}", e);
                }
            }
        }

        // Deduplicate
        all_paths.sort();
        all_paths.dedup();

        info!("Total unique paths across all tables: {}", all_paths.len());
        Ok(all_paths)
    }
}
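`DatabaseUpdater` only borrows trait objects, so wiring it up amounts to sharing the existing DAOs. A minimal sketch, assuming a `SqliteTagDao` constructor exists (only the `TagDao` trait appears in this diff; `SqliteExifDao::new()` and `SqliteFavoriteDao::new()` do appear later in the diff):

```rust
use std::sync::{Arc, Mutex};

use crate::cleanup::DatabaseUpdater;
use crate::database::{SqliteExifDao, SqliteFavoriteDao};
use crate::tags::SqliteTagDao; // assumed name; not confirmed by this diff

fn build_updater() -> DatabaseUpdater {
    // Each DAO owns its own SQLite connection; the Arc<Mutex<_>> wrappers let
    // the updater share them with the rest of the application, coercing the
    // concrete types to the dyn trait objects the constructor expects.
    DatabaseUpdater::new(
        Arc::new(Mutex::new(SqliteTagDao::new())),
        Arc::new(Mutex::new(SqliteExifDao::new())),
        Arc::new(Mutex::new(SqliteFavoriteDao::new())),
    )
}
```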
103 src/cleanup/file_type_detector.rs Normal file
@@ -0,0 +1,103 @@
use anyhow::{Context, Result};
use std::fs::File;
use std::io::Read;
use std::path::Path;

/// Detect the actual file type by reading the magic number (file header)
/// Returns the canonical extension for the detected type, or None if unknown
pub fn detect_file_type(path: &Path) -> Result<Option<String>> {
    let mut file = File::open(path).with_context(|| format!("Failed to open file: {:?}", path))?;

    // Read first 512 bytes for magic number detection
    let mut buffer = vec![0; 512];
    let bytes_read = file
        .read(&mut buffer)
        .with_context(|| format!("Failed to read file: {:?}", path))?;
    buffer.truncate(bytes_read);

    // Detect type using infer crate
    let detected_type = infer::get(&buffer);

    Ok(detected_type.map(|t| get_canonical_extension(t.mime_type())))
}

/// Map MIME type to canonical file extension
pub fn get_canonical_extension(mime_type: &str) -> String {
    match mime_type {
        // Images
        "image/jpeg" => "jpg",
        "image/png" => "png",
        "image/webp" => "webp",
        "image/tiff" => "tiff",
        "image/heif" | "image/heic" => "heic",
        "image/avif" => "avif",

        // Videos
        "video/mp4" => "mp4",
        "video/quicktime" => "mov",

        // Fallback: use the last part of MIME type
        _ => mime_type.split('/').next_back().unwrap_or("unknown"),
    }
    .to_string()
}

/// Check if a file should be renamed based on current vs detected extension
/// Handles aliases (jpg/jpeg are equivalent)
pub fn should_rename(current_ext: &str, detected_ext: &str) -> bool {
    let current = current_ext.to_lowercase();
    let detected = detected_ext.to_lowercase();

    // Direct match
    if current == detected {
        return false;
    }

    // Handle JPEG aliases (jpg and jpeg are equivalent)
    if (current == "jpg" || current == "jpeg") && (detected == "jpg" || detected == "jpeg") {
        return false;
    }

    // Handle TIFF aliases (tiff and tif are equivalent)
    if (current == "tiff" || current == "tif") && (detected == "tiff" || detected == "tif") {
        return false;
    }

    // Extensions differ and are not aliases
    true
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_get_canonical_extension() {
        assert_eq!(get_canonical_extension("image/jpeg"), "jpg");
        assert_eq!(get_canonical_extension("image/png"), "png");
        assert_eq!(get_canonical_extension("image/webp"), "webp");
        assert_eq!(get_canonical_extension("video/mp4"), "mp4");
        assert_eq!(get_canonical_extension("video/quicktime"), "mov");
    }

    #[test]
    fn test_should_rename() {
        // Same extension - no rename
        assert!(!should_rename("jpg", "jpg"));
        assert!(!should_rename("png", "png"));

        // JPEG aliases - no rename
        assert!(!should_rename("jpg", "jpeg"));
        assert!(!should_rename("jpeg", "jpg"));
        assert!(!should_rename("JPG", "jpeg"));

        // TIFF aliases - no rename
        assert!(!should_rename("tiff", "tif"));
        assert!(!should_rename("tif", "tiff"));

        // Different types - should rename
        assert!(should_rename("png", "jpg"));
        assert!(should_rename("jpg", "png"));
        assert!(should_rename("webp", "png"));
    }
}
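Magic-number detection needs only the first few bytes of the file. A self-contained sketch of the same `infer` call on an in-memory buffer (the bytes are the standard 8-byte PNG signature, zero-padded to mirror the 512-byte header read above):

```rust
// Requires the `infer` crate already used by detect_file_type above.
fn main() {
    // 8-byte PNG signature, zero-padded like the 512-byte header read.
    let mut buf = vec![0x89, b'P', b'N', b'G', 0x0D, 0x0A, 0x1A, 0x0A];
    buf.resize(512, 0);

    match infer::get(&buf) {
        Some(kind) => println!("{}", kind.mime_type()), // prints "image/png"
        None => println!("unknown file type"),
    }
}
```

A file named `photo.jpg` that detects as `image/png` would then hit `should_rename("jpg", "png") == true`, which is what drives the phase-2 renames below.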
11 src/cleanup/mod.rs Normal file
@@ -0,0 +1,11 @@
pub mod database_updater;
pub mod file_type_detector;
pub mod phase1;
pub mod phase2;
pub mod types;

pub use database_updater::DatabaseUpdater;
pub use file_type_detector::{detect_file_type, get_canonical_extension, should_rename};
pub use phase1::resolve_missing_files;
pub use phase2::validate_file_types;
pub use types::{CleanupConfig, CleanupStats, FileIssue, IssueType};
147 src/cleanup/phase1.rs Normal file
@@ -0,0 +1,147 @@
use crate::cleanup::database_updater::DatabaseUpdater;
use crate::cleanup::types::{CleanupConfig, CleanupStats};
use crate::file_types::IMAGE_EXTENSIONS;
use anyhow::Result;
use log::{error, warn};
use std::path::PathBuf;

// All supported image extensions to try
const SUPPORTED_EXTENSIONS: &[&str] = IMAGE_EXTENSIONS;

/// Phase 1: Resolve missing files by searching for alternative extensions
pub fn resolve_missing_files(
    config: &CleanupConfig,
    db_updater: &mut DatabaseUpdater,
) -> Result<CleanupStats> {
    let mut stats = CleanupStats::new();

    println!("\nPhase 1: Missing File Resolution");
    println!("---------------------------------");

    // Get all file paths from database
    println!("Scanning database for file references...");
    let all_paths = db_updater.get_all_file_paths()?;
    println!("Found {} unique file paths\n", all_paths.len());

    stats.files_checked = all_paths.len();

    println!("Checking file existence...");
    let mut missing_count = 0;
    let mut resolved_count = 0;

    for path_str in all_paths {
        let full_path = config.base_path.join(&path_str);

        // Check if file exists
        if full_path.exists() {
            continue;
        }

        missing_count += 1;
        stats.issues_found += 1;

        // Try to find the file with different extensions
        match find_file_with_alternative_extension(&config.base_path, &path_str) {
            Some(new_path_str) => {
                println!(
                    "✓ {} → found as {} {}",
                    path_str,
                    new_path_str,
                    if config.dry_run {
                        "(dry-run, not updated)"
                    } else {
                        ""
                    }
                );

                if !config.dry_run {
                    // Update database
                    match db_updater.update_file_path(&path_str, &new_path_str) {
                        Ok(_) => {
                            resolved_count += 1;
                            stats.issues_fixed += 1;
                        }
                        Err(e) => {
                            error!("Failed to update database for {}: {:?}", path_str, e);
                            stats.add_error(format!("DB update failed for {}: {}", path_str, e));
                        }
                    }
                } else {
                    resolved_count += 1;
                }
            }
            None => {
                warn!("✗ {} → not found with any extension", path_str);
            }
        }
    }

    println!("\nResults:");
    println!("- Files checked: {}", stats.files_checked);
    println!("- Missing files: {}", missing_count);
    println!("- Resolved: {}", resolved_count);
    println!(
        "- Still missing: {}",
        missing_count - if config.dry_run { 0 } else { resolved_count }
    );

    if !stats.errors.is_empty() {
        println!("- Errors: {}", stats.errors.len());
    }

    Ok(stats)
}

/// Find a file with an alternative extension
/// Returns the relative path with the new extension if found
fn find_file_with_alternative_extension(
    base_path: &PathBuf,
    relative_path: &str,
) -> Option<String> {
    let full_path = base_path.join(relative_path);

    // Get the parent directory and file stem (name without extension)
    let parent = full_path.parent()?;
    let stem = full_path.file_stem()?.to_str()?;

    // Try each supported extension
    for ext in SUPPORTED_EXTENSIONS {
        let test_path = parent.join(format!("{}.{}", stem, ext));
        if test_path.exists() {
            // Convert back to relative path
            if let Ok(rel) = test_path.strip_prefix(base_path)
                && let Some(rel_str) = rel.to_str()
            {
                return Some(rel_str.to_string());
            }
        }
    }

    None
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::fs;
    use tempfile::TempDir;

    #[test]
    fn test_find_file_with_alternative_extension() {
        let temp_dir = TempDir::new().unwrap();
        let base_path = temp_dir.path().to_path_buf();

        // Create a test file with .jpeg extension
        let test_file = base_path.join("test.jpeg");
        fs::write(&test_file, b"test").unwrap();

        // Try to find it as .jpg
        let result = find_file_with_alternative_extension(&base_path, "test.jpg");
        assert!(result.is_some());
        assert_eq!(result.unwrap(), "test.jpeg");

        // Try to find non-existent file
        let result = find_file_with_alternative_extension(&base_path, "nonexistent.jpg");
        assert!(result.is_none());
    }
}
241 src/cleanup/phase2.rs Normal file
@@ -0,0 +1,241 @@
use crate::cleanup::database_updater::DatabaseUpdater;
use crate::cleanup::file_type_detector::{detect_file_type, should_rename};
use crate::cleanup::types::{CleanupConfig, CleanupStats};
use anyhow::Result;
use log::{error, warn};
use std::fs;
use std::path::{Path, PathBuf};
use walkdir::WalkDir;

/// Phase 2: Validate file types and rename mismatches
pub fn validate_file_types(
    config: &CleanupConfig,
    db_updater: &mut DatabaseUpdater,
) -> Result<CleanupStats> {
    let mut stats = CleanupStats::new();
    let mut auto_fix_all = config.auto_fix;
    let mut skip_all = false;

    println!("\nPhase 2: File Type Validation");
    println!("------------------------------");

    // Walk the filesystem
    println!("Scanning filesystem...");
    let files: Vec<PathBuf> = WalkDir::new(&config.base_path)
        .into_iter()
        .filter_map(|e| e.ok())
        .filter(|e| e.file_type().is_file())
        .filter(|e| is_supported_media_file(e.path()))
        .map(|e| e.path().to_path_buf())
        .collect();

    println!("Files found: {}\n", files.len());
    stats.files_checked = files.len();

    println!("Detecting file types...");
    let mut mismatches_found = 0;
    let mut files_renamed = 0;
    let mut user_skipped = 0;

    for file_path in files {
        // Get current extension
        let current_ext = match file_path.extension() {
            Some(ext) => ext.to_str().unwrap_or(""),
            None => continue, // Skip files without extensions
        };

        // Detect actual file type
        match detect_file_type(&file_path) {
            Ok(Some(detected_ext)) => {
                // Check if we should rename
                if should_rename(current_ext, &detected_ext) {
                    mismatches_found += 1;
                    stats.issues_found += 1;

                    // Get relative path for display and database
                    let relative_path = match file_path.strip_prefix(&config.base_path) {
                        Ok(rel) => rel.to_str().unwrap_or(""),
                        Err(_) => {
                            error!("Failed to get relative path for {:?}", file_path);
                            continue;
                        }
                    };

                    println!("\nFile type mismatch:");
                    println!("  Path: {}", relative_path);
                    println!("  Current: .{}", current_ext);
                    println!("  Actual: .{}", detected_ext);

                    // Calculate new path
                    let new_file_path = file_path.with_extension(&detected_ext);
                    let new_relative_path = match new_file_path.strip_prefix(&config.base_path) {
                        Ok(rel) => rel.to_str().unwrap_or(""),
                        Err(_) => {
                            error!("Failed to get new relative path for {:?}", new_file_path);
                            continue;
                        }
                    };

                    // Check if destination already exists
                    if new_file_path.exists() {
                        warn!("✗ Destination already exists: {}", new_relative_path);
                        stats.add_error(format!(
                            "Destination exists for {}: {}",
                            relative_path, new_relative_path
                        ));
                        continue;
                    }

                    // Determine if we should proceed
                    let should_proceed = if config.dry_run {
                        println!("  (dry-run mode - would rename to {})", new_relative_path);
                        false
                    } else if skip_all {
                        println!("  Skipped (skip all)");
                        user_skipped += 1;
                        false
                    } else if auto_fix_all {
                        true
                    } else {
                        // Interactive prompt
                        match prompt_for_rename(new_relative_path) {
                            RenameDecision::Yes => true,
                            RenameDecision::No => {
                                user_skipped += 1;
                                false
                            }
                            RenameDecision::All => {
                                auto_fix_all = true;
                                true
                            }
                            RenameDecision::SkipAll => {
                                skip_all = true;
                                user_skipped += 1;
                                false
                            }
                        }
                    };

                    if should_proceed {
                        // Rename the file
                        match fs::rename(&file_path, &new_file_path) {
                            Ok(_) => {
                                println!("✓ Renamed file");

                                // Update database
                                match db_updater.update_file_path(relative_path, new_relative_path) {
                                    Ok(_) => {
                                        files_renamed += 1;
                                        stats.issues_fixed += 1;
                                    }
                                    Err(e) => {
                                        error!(
                                            "File renamed but DB update failed for {}: {:?}",
                                            relative_path, e
                                        );
                                        stats.add_error(format!(
                                            "DB update failed for {}: {}",
                                            relative_path, e
                                        ));
                                    }
                                }
                            }
                            Err(e) => {
                                error!("✗ Failed to rename file: {:?}", e);
                                stats.add_error(format!(
                                    "Rename failed for {}: {}",
                                    relative_path, e
                                ));
                            }
                        }
                    }
                }
            }
            Ok(None) => {
                // Could not detect file type - skip
                // This is normal for some RAW formats or corrupted files
            }
            Err(e) => {
                warn!("Failed to detect type for {:?}: {:?}", file_path, e);
            }
        }
    }

    println!("\nResults:");
    println!("- Files scanned: {}", stats.files_checked);
    println!("- Mismatches found: {}", mismatches_found);
    if config.dry_run {
        println!("- Would rename: {}", mismatches_found);
    } else {
        println!("- Files renamed: {}", files_renamed);
        if user_skipped > 0 {
            println!("- User skipped: {}", user_skipped);
        }
    }

    if !stats.errors.is_empty() {
        println!("- Errors: {}", stats.errors.len());
    }

    Ok(stats)
}

/// Check if a file is a supported media file based on extension
fn is_supported_media_file(path: &Path) -> bool {
    use crate::file_types::is_media_file;
    is_media_file(path)
}

#[derive(Debug)]
enum RenameDecision {
    Yes,
    No,
    All,
    SkipAll,
}

/// Prompt the user for rename decision
fn prompt_for_rename(new_path: &str) -> RenameDecision {
    println!("\nRename to {}?", new_path);
    println!("  [y] Yes");
    println!("  [n] No (default)");
    println!("  [a] Yes to all");
    println!("  [s] Skip all remaining");
    print!("Choice: ");

    // Force flush stdout
    use std::io::{self, Write};
    let _ = io::stdout().flush();

    let mut input = String::new();
    match io::stdin().read_line(&mut input) {
        Ok(_) => {
            let choice = input.trim().to_lowercase();
            match choice.as_str() {
                "y" | "yes" => RenameDecision::Yes,
                "a" | "all" => RenameDecision::All,
                "s" | "skip" => RenameDecision::SkipAll,
                _ => RenameDecision::No,
            }
        }
        Err(_) => RenameDecision::No,
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_is_supported_media_file() {
        assert!(is_supported_media_file(Path::new("test.jpg")));
        assert!(is_supported_media_file(Path::new("test.JPG")));
        assert!(is_supported_media_file(Path::new("test.png")));
        assert!(is_supported_media_file(Path::new("test.webp")));
        assert!(is_supported_media_file(Path::new("test.mp4")));
        assert!(is_supported_media_file(Path::new("test.mov")));
        assert!(!is_supported_media_file(Path::new("test.txt")));
        assert!(!is_supported_media_file(Path::new("test")));
    }
}
39 src/cleanup/types.rs Normal file
@@ -0,0 +1,39 @@
use std::path::PathBuf;

#[derive(Debug, Clone)]
pub struct CleanupConfig {
    pub base_path: PathBuf,
    pub dry_run: bool,
    pub auto_fix: bool,
}

#[derive(Debug, Clone)]
pub struct FileIssue {
    pub current_path: String,
    pub issue_type: IssueType,
    pub suggested_path: Option<String>,
}

#[derive(Debug, Clone)]
pub enum IssueType {
    MissingFile,
    ExtensionMismatch { current: String, actual: String },
}

#[derive(Debug, Clone, Default)]
pub struct CleanupStats {
    pub files_checked: usize,
    pub issues_found: usize,
    pub issues_fixed: usize,
    pub errors: Vec<String>,
}

impl CleanupStats {
    pub fn new() -> Self {
        Self::default()
    }

    pub fn add_error(&mut self, error: String) {
        self.errors.push(error);
    }
}
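Putting the pieces together, a minimal sketch of how a cleanup run might drive both phases with these types; the `DatabaseUpdater` wiring is sketched earlier, and the path value is a placeholder:

```rust
use std::path::PathBuf;

use crate::cleanup::{resolve_missing_files, validate_file_types, CleanupConfig, DatabaseUpdater};

fn run_cleanup(updater: &mut DatabaseUpdater) -> anyhow::Result<()> {
    let config = CleanupConfig {
        base_path: PathBuf::from("/path/to/media"), // placeholder
        dry_run: true,   // report only; no renames, no DB writes
        auto_fix: false, // prompt per mismatch when dry_run is off
    };

    let phase1 = resolve_missing_files(&config, updater)?;
    let phase2 = validate_file_types(&config, updater)?;

    println!(
        "Fixed {}/{} issues",
        phase1.issues_fixed + phase2.issues_fixed,
        phase1.issues_found + phase2.issues_found
    );
    Ok(())
}
```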
125 src/data/mod.rs
@@ -1,5 +1,6 @@
use std::{fs, str::FromStr};

use crate::database::models::ImageExif;
use anyhow::{Context, anyhow};

use chrono::{DateTime, Utc};
@@ -108,6 +109,8 @@ pub enum SortType {
    NameDesc,
    TagCountAsc,
    TagCountDesc,
    DateTakenAsc,
    DateTakenDesc,
}

#[derive(Deserialize)]
@@ -119,6 +122,23 @@ pub struct FilesRequest {
    pub tag_filter_mode: Option<FilterMode>,
    pub recursive: Option<bool>,
    pub sort: Option<SortType>,

    // EXIF-based search parameters
    pub camera_make: Option<String>,
    pub camera_model: Option<String>,
    pub lens_model: Option<String>,

    // GPS location search
    pub gps_lat: Option<f64>,
    pub gps_lon: Option<f64>,
    pub gps_radius_km: Option<f64>,

    // Date range filtering (Unix timestamps)
    pub date_from: Option<i64>,
    pub date_to: Option<i64>,

    // Media type filtering
    pub media_type: Option<MediaType>,
}

#[derive(Copy, Clone, Deserialize, PartialEq, Debug)]
@@ -127,6 +147,14 @@ pub enum FilterMode {
    All,
}

#[derive(Copy, Clone, Deserialize, PartialEq, Debug)]
#[serde(rename_all = "lowercase")]
pub enum MediaType {
    Photo,
    Video,
    All,
}

#[derive(Copy, Clone, Deserialize, PartialEq, Debug)]
#[serde(rename_all = "lowercase")]
pub enum PhotoSize {
@@ -173,6 +201,7 @@ pub struct MetadataResponse {
    pub created: Option<i64>,
    pub modified: Option<i64>,
    pub size: u64,
    pub exif: Option<ExifMetadata>,
}

impl From<fs::Metadata> for MetadataResponse {
@@ -187,6 +216,102 @@ impl From<fs::Metadata> for MetadataResponse {
                utc.timestamp()
            }),
            size: metadata.len(),
            exif: None,
        }
    }
}

#[derive(Debug, Serialize)]
pub struct ExifMetadata {
    pub camera: Option<CameraInfo>,
    pub image_properties: Option<ImageProperties>,
    pub gps: Option<GpsCoordinates>,
    pub capture_settings: Option<CaptureSettings>,
    pub date_taken: Option<i64>,
}

#[derive(Debug, Serialize)]
pub struct CameraInfo {
    pub make: Option<String>,
    pub model: Option<String>,
    pub lens: Option<String>,
}

#[derive(Debug, Serialize)]
pub struct ImageProperties {
    pub width: Option<i32>,
    pub height: Option<i32>,
    pub orientation: Option<i32>,
}

#[derive(Debug, Serialize)]
pub struct GpsCoordinates {
    pub latitude: Option<f64>,
    pub longitude: Option<f64>,
    pub altitude: Option<f64>,
}

#[derive(Debug, Serialize)]
pub struct CaptureSettings {
    pub focal_length: Option<f64>,
    pub aperture: Option<f64>,
    pub shutter_speed: Option<String>,
    pub iso: Option<i32>,
}

impl From<ImageExif> for ExifMetadata {
    fn from(exif: ImageExif) -> Self {
        let has_camera_info =
            exif.camera_make.is_some() || exif.camera_model.is_some() || exif.lens_model.is_some();
        let has_image_properties =
            exif.width.is_some() || exif.height.is_some() || exif.orientation.is_some();
        let has_gps = exif.gps_latitude.is_some()
            || exif.gps_longitude.is_some()
            || exif.gps_altitude.is_some();
        let has_capture_settings = exif.focal_length.is_some()
            || exif.aperture.is_some()
            || exif.shutter_speed.is_some()
            || exif.iso.is_some();

        ExifMetadata {
            camera: if has_camera_info {
                Some(CameraInfo {
                    make: exif.camera_make,
                    model: exif.camera_model,
                    lens: exif.lens_model,
                })
            } else {
                None
            },
            image_properties: if has_image_properties {
                Some(ImageProperties {
                    width: exif.width,
                    height: exif.height,
                    orientation: exif.orientation,
                })
            } else {
                None
            },
            gps: if has_gps {
                Some(GpsCoordinates {
                    latitude: exif.gps_latitude,
                    longitude: exif.gps_longitude,
                    altitude: exif.gps_altitude,
                })
            } else {
                None
            },
            capture_settings: if has_capture_settings {
                Some(CaptureSettings {
                    focal_length: exif.focal_length,
                    aperture: exif.aperture,
                    shutter_speed: exif.shutter_speed,
                    iso: exif.iso,
                })
            } else {
                None
            },
            date_taken: exif.date_taken,
        }
    }
}
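For orientation, the grouping logic above yields a nested JSON payload roughly like the comment below; the values are illustrative, and the sketch assumes `serde_json` is available (not confirmed by this diff):

```rust
use crate::data::ExifMetadata;
use crate::database::models::ImageExif;

// `exif_row` would come from ExifDao::get_exif; output values are illustrative.
fn print_exif_shape(exif_row: ImageExif) -> anyhow::Result<()> {
    let metadata = ExifMetadata::from(exif_row);
    println!("{}", serde_json::to_string_pretty(&metadata)?);
    // {
    //   "camera": { "make": "NIKON CORPORATION", "model": "NIKON D750", "lens": "50mm f/1.8" },
    //   "image_properties": { "width": 6016, "height": 4016, "orientation": 1 },
    //   "gps": { "latitude": 37.808333, "longitude": -122.41, "altitude": 12.0 },
    //   "capture_settings": { "focal_length": 50.0, "aperture": 1.8, "shutter_speed": "1/250", "iso": 100 },
    //   "date_taken": 1693526400
    // }
    // A group whose fields are all None serializes as null rather than an object.
    Ok(())
}
```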
src/database/mod.rs
@@ -4,7 +4,10 @@ use diesel::sqlite::SqliteConnection;
use std::ops::DerefMut;
use std::sync::{Arc, Mutex};

use crate::database::models::{Favorite, InsertFavorite, InsertUser, User};
use crate::database::models::{
    Favorite, ImageExif, InsertFavorite, InsertImageExif, InsertUser, User,
};
use crate::otel::trace_db_call;

pub mod models;
pub mod schema;
@@ -19,6 +22,12 @@ pub struct SqliteUserDao {
    connection: SqliteConnection,
}

impl Default for SqliteUserDao {
    fn default() -> Self {
        Self::new()
    }
}

impl SqliteUserDao {
    pub fn new() -> Self {
        Self {
@@ -91,7 +100,8 @@ impl UserDao for SqliteUserDao {
        !users
            .filter(username.eq(user))
            .load::<User>(&mut self.connection)
            .unwrap_or_default().is_empty()
            .unwrap_or_default()
            .is_empty()
    }
}

@@ -120,18 +130,27 @@ pub enum DbErrorKind {
    AlreadyExists,
    InsertError,
    QueryError,
    UpdateError,
}

pub trait FavoriteDao: Sync + Send {
    fn add_favorite(&mut self, user_id: i32, favorite_path: &str) -> Result<usize, DbError>;
    fn remove_favorite(&mut self, user_id: i32, favorite_path: String);
    fn get_favorites(&mut self, user_id: i32) -> Result<Vec<Favorite>, DbError>;
    fn update_path(&mut self, old_path: &str, new_path: &str) -> Result<(), DbError>;
    fn get_all_paths(&mut self) -> Result<Vec<String>, DbError>;
}

pub struct SqliteFavoriteDao {
    connection: Arc<Mutex<SqliteConnection>>,
}

impl Default for SqliteFavoriteDao {
    fn default() -> Self {
        Self::new()
    }
}

impl SqliteFavoriteDao {
    pub fn new() -> Self {
        SqliteFavoriteDao {
@@ -180,4 +199,370 @@ impl FavoriteDao for SqliteFavoriteDao {
            .load::<Favorite>(self.connection.lock().unwrap().deref_mut())
            .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }

    fn update_path(&mut self, old_path: &str, new_path: &str) -> Result<(), DbError> {
        use schema::favorites::dsl::*;

        diesel::update(favorites.filter(path.eq(old_path)))
            .set(path.eq(new_path))
            .execute(self.connection.lock().unwrap().deref_mut())
            .map_err(|_| DbError::new(DbErrorKind::UpdateError))?;
        Ok(())
    }

    fn get_all_paths(&mut self) -> Result<Vec<String>, DbError> {
        use schema::favorites::dsl::*;

        favorites
            .select(path)
            .distinct()
            .load(self.connection.lock().unwrap().deref_mut())
            .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }
}

pub trait ExifDao: Sync + Send {
    fn store_exif(
        &mut self,
        context: &opentelemetry::Context,
        exif_data: InsertImageExif,
    ) -> Result<ImageExif, DbError>;
    fn get_exif(
        &mut self,
        context: &opentelemetry::Context,
        file_path: &str,
    ) -> Result<Option<ImageExif>, DbError>;
    fn update_exif(
        &mut self,
        context: &opentelemetry::Context,
        exif_data: InsertImageExif,
    ) -> Result<ImageExif, DbError>;
    fn delete_exif(
        &mut self,
        context: &opentelemetry::Context,
        file_path: &str,
    ) -> Result<(), DbError>;
    fn get_all_with_date_taken(
        &mut self,
        context: &opentelemetry::Context,
    ) -> Result<Vec<(String, i64)>, DbError>;

    /// Batch load EXIF data for multiple file paths (single query)
    fn get_exif_batch(
        &mut self,
        context: &opentelemetry::Context,
        file_paths: &[String],
    ) -> Result<Vec<ImageExif>, DbError>;

    /// Query files by EXIF criteria with optional filters
    fn query_by_exif(
        &mut self,
        context: &opentelemetry::Context,
        camera_make: Option<&str>,
        camera_model: Option<&str>,
        lens_model: Option<&str>,
        gps_bounds: Option<(f64, f64, f64, f64)>, // (min_lat, max_lat, min_lon, max_lon)
        date_from: Option<i64>,
        date_to: Option<i64>,
    ) -> Result<Vec<ImageExif>, DbError>;

    /// Get distinct camera makes with counts
    fn get_camera_makes(
        &mut self,
        context: &opentelemetry::Context,
    ) -> Result<Vec<(String, i64)>, DbError>;

    /// Update file path in EXIF database
    fn update_file_path(
        &mut self,
        context: &opentelemetry::Context,
        old_path: &str,
        new_path: &str,
    ) -> Result<(), DbError>;

    /// Get all file paths from EXIF database
    fn get_all_file_paths(
        &mut self,
        context: &opentelemetry::Context,
    ) -> Result<Vec<String>, DbError>;
}

pub struct SqliteExifDao {
    connection: Arc<Mutex<SqliteConnection>>,
}

impl Default for SqliteExifDao {
    fn default() -> Self {
        Self::new()
    }
}

impl SqliteExifDao {
    pub fn new() -> Self {
        SqliteExifDao {
            connection: Arc::new(Mutex::new(connect())),
        }
    }
}

impl ExifDao for SqliteExifDao {
    fn store_exif(
        &mut self,
        context: &opentelemetry::Context,
        exif_data: InsertImageExif,
    ) -> Result<ImageExif, DbError> {
        trace_db_call(context, "insert", "store_exif", |_span| {
            use schema::image_exif::dsl::*;

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");

            diesel::insert_into(image_exif)
                .values(&exif_data)
                .execute(connection.deref_mut())
                .map_err(|_| anyhow::anyhow!("Insert error"))?;

            image_exif
                .filter(file_path.eq(&exif_data.file_path))
                .first::<ImageExif>(connection.deref_mut())
                .map_err(|_| anyhow::anyhow!("Query error"))
        })
        .map_err(|_| DbError::new(DbErrorKind::InsertError))
    }

    fn get_exif(
        &mut self,
        context: &opentelemetry::Context,
        path: &str,
    ) -> Result<Option<ImageExif>, DbError> {
        trace_db_call(context, "query", "get_exif", |_span| {
            use schema::image_exif::dsl::*;

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");

            match image_exif
                .filter(file_path.eq(path))
                .first::<ImageExif>(connection.deref_mut())
            {
                Ok(exif) => Ok(Some(exif)),
                Err(diesel::result::Error::NotFound) => Ok(None),
                Err(_) => Err(anyhow::anyhow!("Query error")),
            }
        })
        .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }

    fn update_exif(
        &mut self,
        context: &opentelemetry::Context,
        exif_data: InsertImageExif,
    ) -> Result<ImageExif, DbError> {
        trace_db_call(context, "update", "update_exif", |_span| {
            use schema::image_exif::dsl::*;

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");

            diesel::update(image_exif.filter(file_path.eq(&exif_data.file_path)))
                .set((
                    camera_make.eq(&exif_data.camera_make),
                    camera_model.eq(&exif_data.camera_model),
                    lens_model.eq(&exif_data.lens_model),
                    width.eq(&exif_data.width),
                    height.eq(&exif_data.height),
                    orientation.eq(&exif_data.orientation),
                    gps_latitude.eq(&exif_data.gps_latitude),
                    gps_longitude.eq(&exif_data.gps_longitude),
                    gps_altitude.eq(&exif_data.gps_altitude),
                    focal_length.eq(&exif_data.focal_length),
                    aperture.eq(&exif_data.aperture),
                    shutter_speed.eq(&exif_data.shutter_speed),
                    iso.eq(&exif_data.iso),
                    date_taken.eq(&exif_data.date_taken),
                    last_modified.eq(&exif_data.last_modified),
                ))
                .execute(connection.deref_mut())
                .map_err(|_| anyhow::anyhow!("Update error"))?;

            image_exif
                .filter(file_path.eq(&exif_data.file_path))
                .first::<ImageExif>(connection.deref_mut())
                .map_err(|_| anyhow::anyhow!("Query error"))
        })
        .map_err(|_| DbError::new(DbErrorKind::UpdateError))
    }

    fn delete_exif(&mut self, context: &opentelemetry::Context, path: &str) -> Result<(), DbError> {
        trace_db_call(context, "delete", "delete_exif", |_span| {
            use schema::image_exif::dsl::*;

            diesel::delete(image_exif.filter(file_path.eq(path)))
                .execute(self.connection.lock().unwrap().deref_mut())
                .map(|_| ())
                .map_err(|_| anyhow::anyhow!("Delete error"))
        })
        .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }

    fn get_all_with_date_taken(
        &mut self,
        context: &opentelemetry::Context,
    ) -> Result<Vec<(String, i64)>, DbError> {
        trace_db_call(context, "query", "get_all_with_date_taken", |_span| {
            use schema::image_exif::dsl::*;

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");

            image_exif
                .select((file_path, date_taken))
                .filter(date_taken.is_not_null())
                .load::<(String, Option<i64>)>(connection.deref_mut())
                .map(|records| {
                    records
                        .into_iter()
                        .filter_map(|(path, dt)| dt.map(|ts| (path, ts)))
                        .collect()
                })
                .map_err(|_| anyhow::anyhow!("Query error"))
        })
        .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }

    fn get_exif_batch(
        &mut self,
        context: &opentelemetry::Context,
        file_paths: &[String],
    ) -> Result<Vec<ImageExif>, DbError> {
        trace_db_call(context, "query", "get_exif_batch", |_span| {
            use schema::image_exif::dsl::*;

            if file_paths.is_empty() {
                return Ok(Vec::new());
            }

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");

            image_exif
                .filter(file_path.eq_any(file_paths))
                .load::<ImageExif>(connection.deref_mut())
                .map_err(|_| anyhow::anyhow!("Query error"))
        })
        .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }

    fn query_by_exif(
        &mut self,
        context: &opentelemetry::Context,
        camera_make_filter: Option<&str>,
        camera_model_filter: Option<&str>,
        lens_model_filter: Option<&str>,
        gps_bounds: Option<(f64, f64, f64, f64)>,
        date_from: Option<i64>,
        date_to: Option<i64>,
    ) -> Result<Vec<ImageExif>, DbError> {
        trace_db_call(context, "query", "query_by_exif", |_span| {
            use schema::image_exif::dsl::*;

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");
            let mut query = image_exif.into_boxed();

            // Camera filters (case-insensitive partial match)
            if let Some(make) = camera_make_filter {
                query = query.filter(camera_make.like(format!("%{}%", make)));
            }
            if let Some(model) = camera_model_filter {
                query = query.filter(camera_model.like(format!("%{}%", model)));
            }
            if let Some(lens) = lens_model_filter {
                query = query.filter(lens_model.like(format!("%{}%", lens)));
            }

            // GPS bounding box
            if let Some((min_lat, max_lat, min_lon, max_lon)) = gps_bounds {
                query = query
                    .filter(gps_latitude.between(min_lat, max_lat))
                    .filter(gps_longitude.between(min_lon, max_lon))
                    .filter(gps_latitude.is_not_null())
                    .filter(gps_longitude.is_not_null());
            }

            // Date range
            if let Some(from) = date_from {
                query = query.filter(date_taken.ge(from));
            }
            if let Some(to) = date_to {
                query = query.filter(date_taken.le(to));
            }
            if date_from.is_some() || date_to.is_some() {
                query = query.filter(date_taken.is_not_null());
            }

            query
                .load::<ImageExif>(connection.deref_mut())
                .map_err(|_| anyhow::anyhow!("Query error"))
        })
        .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }

    fn get_camera_makes(
        &mut self,
        context: &opentelemetry::Context,
    ) -> Result<Vec<(String, i64)>, DbError> {
        trace_db_call(context, "query", "get_camera_makes", |_span| {
            use diesel::dsl::count;
            use schema::image_exif::dsl::*;

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");

            image_exif
                .filter(camera_make.is_not_null())
                .group_by(camera_make)
                .select((camera_make, count(id)))
                .order(count(id).desc())
                .load::<(Option<String>, i64)>(connection.deref_mut())
                .map(|records| {
                    records
                        .into_iter()
                        .filter_map(|(make, cnt)| make.map(|m| (m, cnt)))
                        .collect()
                })
                .map_err(|_| anyhow::anyhow!("Query error"))
        })
        .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }

    fn update_file_path(
        &mut self,
        context: &opentelemetry::Context,
        old_path: &str,
        new_path: &str,
    ) -> Result<(), DbError> {
        trace_db_call(context, "update", "update_file_path", |_span| {
            use schema::image_exif::dsl::*;

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");

            diesel::update(image_exif.filter(file_path.eq(old_path)))
                .set(file_path.eq(new_path))
                .execute(connection.deref_mut())
                .map_err(|_| anyhow::anyhow!("Update error"))?;
            Ok(())
        })
        .map_err(|_| DbError::new(DbErrorKind::UpdateError))
    }

    fn get_all_file_paths(
        &mut self,
        context: &opentelemetry::Context,
    ) -> Result<Vec<String>, DbError> {
        trace_db_call(context, "query", "get_all_file_paths", |_span| {
            use schema::image_exif::dsl::*;

            let mut connection = self.connection.lock().expect("Unable to get ExifDao");

            image_exif
                .select(file_path)
                .load(connection.deref_mut())
                .map_err(|_| anyhow::anyhow!("Query error"))
        })
        .map_err(|_| DbError::new(DbErrorKind::QueryError))
    }
}
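A hypothetical call site for `query_by_exif`, showing how the optional filters and the `(min_lat, max_lat, min_lon, max_lon)` bounding-box tuple compose; every filter value here is a placeholder:

```rust
use crate::database::{DbError, ExifDao, SqliteExifDao};

// Sketch: find Sony shots taken during 2023 inside a rough bounding box
// around Berlin. Filters are combined with AND; None means "no filter".
fn sony_in_berlin_2023() -> Result<(), DbError> {
    let ctx = opentelemetry::Context::current();
    let mut dao = SqliteExifDao::new();
    let hits = dao.query_by_exif(
        &ctx,
        Some("Sony"),                   // camera_make, partial match via LIKE
        None,                           // camera_model
        None,                           // lens_model
        Some((52.3, 52.7, 13.1, 13.8)), // (min_lat, max_lat, min_lon, max_lon)
        Some(1_672_531_200),            // date_from: 2023-01-01 00:00:00 UTC
        Some(1_704_067_199),            // date_to:   2023-12-31 23:59:59 UTC
    )?;
    for exif in &hits {
        println!("{} ({:?})", exif.file_path, exif.camera_model);
    }
    Ok(())
}
```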
src/database/models.rs
@@ -1,4 +1,4 @@
use crate::database::schema::{favorites, users};
use crate::database::schema::{favorites, image_exif, users};
use serde::Serialize;

#[derive(Insertable)]
@@ -29,3 +29,47 @@ pub struct Favorite {
    pub userid: i32,
    pub path: String,
}

#[derive(Insertable)]
#[diesel(table_name = image_exif)]
pub struct InsertImageExif {
    pub file_path: String,
    pub camera_make: Option<String>,
    pub camera_model: Option<String>,
    pub lens_model: Option<String>,
    pub width: Option<i32>,
    pub height: Option<i32>,
    pub orientation: Option<i32>,
    pub gps_latitude: Option<f64>,
    pub gps_longitude: Option<f64>,
    pub gps_altitude: Option<f64>,
    pub focal_length: Option<f64>,
    pub aperture: Option<f64>,
    pub shutter_speed: Option<String>,
    pub iso: Option<i32>,
    pub date_taken: Option<i64>,
    pub created_time: i64,
    pub last_modified: i64,
}

#[derive(Serialize, Queryable, Clone, Debug)]
pub struct ImageExif {
    pub id: i32,
    pub file_path: String,
    pub camera_make: Option<String>,
    pub camera_model: Option<String>,
    pub lens_model: Option<String>,
    pub width: Option<i32>,
    pub height: Option<i32>,
    pub orientation: Option<i32>,
    pub gps_latitude: Option<f64>,
    pub gps_longitude: Option<f64>,
    pub gps_altitude: Option<f64>,
    pub focal_length: Option<f64>,
    pub aperture: Option<f64>,
    pub shutter_speed: Option<String>,
    pub iso: Option<i32>,
    pub date_taken: Option<i64>,
    pub created_time: i64,
    pub last_modified: i64,
}
src/database/schema.rs
@@ -6,6 +6,29 @@ table! {
    }
}

table! {
    image_exif (id) {
        id -> Integer,
        file_path -> Text,
        camera_make -> Nullable<Text>,
        camera_model -> Nullable<Text>,
        lens_model -> Nullable<Text>,
        width -> Nullable<Integer>,
        height -> Nullable<Integer>,
        orientation -> Nullable<Integer>,
        gps_latitude -> Nullable<Double>,
        gps_longitude -> Nullable<Double>,
        gps_altitude -> Nullable<Double>,
        focal_length -> Nullable<Double>,
        aperture -> Nullable<Double>,
        shutter_speed -> Nullable<Text>,
        iso -> Nullable<Integer>,
        date_taken -> Nullable<BigInt>,
        created_time -> BigInt,
        last_modified -> BigInt,
    }
}

table! {
    tagged_photo (id) {
        id -> Integer,
@@ -33,4 +56,4 @@ table! {

joinable!(tagged_photo -> tags (tag_id));

allow_tables_to_appear_in_same_query!(favorites, tagged_photo, tags, users,);
allow_tables_to_appear_in_same_query!(favorites, image_exif, tagged_photo, tags, users,);
319 src/exif.rs Normal file
@@ -0,0 +1,319 @@
use std::fs::File;
use std::io::BufReader;
use std::path::Path;

use anyhow::{Result, anyhow};
use exif::{In, Reader, Tag, Value};
use log::debug;
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct ExifData {
    pub camera_make: Option<String>,
    pub camera_model: Option<String>,
    pub lens_model: Option<String>,
    pub width: Option<i32>,
    pub height: Option<i32>,
    pub orientation: Option<i32>,
    pub gps_latitude: Option<f64>,
    pub gps_longitude: Option<f64>,
    pub gps_altitude: Option<f64>,
    pub focal_length: Option<f64>,
    pub aperture: Option<f64>,
    pub shutter_speed: Option<String>,
    pub iso: Option<i32>,
    pub date_taken: Option<i64>,
}

pub fn supports_exif(path: &Path) -> bool {
    if let Some(ext) = path.extension() {
        let ext_lower = ext.to_string_lossy().to_lowercase();
        matches!(
            ext_lower.as_str(),
            // JPEG formats
            "jpg" | "jpeg" |
            // TIFF and RAW formats based on TIFF
            "tiff" | "tif" | "nef" | "cr2" | "cr3" | "arw" | "dng" | "raf" | "orf" | "rw2" | "pef" | "srw" |
            // HEIF and variants
            "heif" | "heic" | "avif" |
            // PNG
            "png" |
            // WebP
            "webp"
        )
    } else {
        false
    }
}

pub fn extract_exif_from_path(path: &Path) -> Result<ExifData> {
    debug!("Extracting EXIF from: {:?}", path);

    if !supports_exif(path) {
        return Err(anyhow!("File type does not support EXIF"));
    }

    let file = File::open(path)?;
    let mut bufreader = BufReader::new(file);

    let exifreader = Reader::new();
    let exif = exifreader.read_from_container(&mut bufreader)?;

    let mut data = ExifData::default();

    for field in exif.fields() {
        match field.tag {
            Tag::Make => {
                data.camera_make = get_string_value(field);
            }
            Tag::Model => {
                data.camera_model = get_string_value(field);
            }
            Tag::LensModel => {
                data.lens_model = get_string_value(field);
            }
            Tag::PixelXDimension | Tag::ImageWidth => {
                if data.width.is_none() {
                    data.width = get_u32_value(field).map(|v| v as i32);
                }
            }
            Tag::PixelYDimension | Tag::ImageLength => {
                if data.height.is_none() {
                    data.height = get_u32_value(field).map(|v| v as i32);
                }
            }
            Tag::Orientation => {
                data.orientation = get_u32_value(field).map(|v| v as i32);
            }
            Tag::FocalLength => {
                data.focal_length = get_rational_value(field);
            }
            Tag::FNumber => {
                data.aperture = get_rational_value(field);
            }
            Tag::ExposureTime => {
                data.shutter_speed = get_rational_string(field);
            }
            Tag::PhotographicSensitivity | Tag::ISOSpeed => {
                if data.iso.is_none() {
                    data.iso = get_u32_value(field).map(|v| v as i32);
                }
            }
            Tag::DateTime | Tag::DateTimeOriginal => {
                if data.date_taken.is_none() {
                    data.date_taken = parse_exif_datetime(field);
                }
            }
            _ => {}
        }
    }

    // Extract GPS coordinates
    if let Some(lat) = extract_gps_coordinate(&exif, Tag::GPSLatitude, Tag::GPSLatitudeRef) {
        data.gps_latitude = Some(lat);
    }
    if let Some(lon) = extract_gps_coordinate(&exif, Tag::GPSLongitude, Tag::GPSLongitudeRef) {
        data.gps_longitude = Some(lon);
    }
    if let Some(alt) = extract_gps_altitude(&exif) {
        data.gps_altitude = Some(alt);
    }

    debug!("Extracted EXIF data: {:?}", data);
    Ok(data)
}

fn get_string_value(field: &exif::Field) -> Option<String> {
    match &field.value {
        Value::Ascii(vec) => {
            if let Some(bytes) = vec.first() {
                String::from_utf8(bytes.to_vec())
                    .ok()
                    .map(|s| s.trim_end_matches('\0').to_string())
            } else {
                None
            }
        }
        _ => {
            let display = field.display_value().to_string();
            if display.is_empty() {
                None
            } else {
                Some(display)
            }
        }
    }
}

fn get_u32_value(field: &exif::Field) -> Option<u32> {
    match &field.value {
        Value::Short(vec) => vec.first().map(|&v| v as u32),
        Value::Long(vec) => vec.first().copied(),
        _ => None,
    }
}

fn get_rational_value(field: &exif::Field) -> Option<f64> {
    match &field.value {
        Value::Rational(vec) => {
            if let Some(rational) = vec.first() {
                if rational.denom == 0 {
                    None
                } else {
                    Some(rational.num as f64 / rational.denom as f64)
                }
            } else {
                None
            }
        }
        _ => None,
    }
}

fn get_rational_string(field: &exif::Field) -> Option<String> {
    match &field.value {
        Value::Rational(vec) => {
            if let Some(rational) = vec.first() {
                if rational.denom == 0 {
                    None
                } else if rational.num < rational.denom {
                    Some(format!("{}/{}", rational.num, rational.denom))
                } else {
                    let value = rational.num as f64 / rational.denom as f64;
                    Some(format!("{:.2}", value))
                }
            } else {
                None
            }
        }
        _ => None,
    }
}

fn parse_exif_datetime(field: &exif::Field) -> Option<i64> {
    if let Some(datetime_str) = get_string_value(field) {
        use chrono::NaiveDateTime;

        // EXIF datetime format: "YYYY:MM:DD HH:MM:SS"
        // Note: EXIF dates are local time without timezone info
        // We return the timestamp as if it were UTC, and the client will display it as-is
        NaiveDateTime::parse_from_str(&datetime_str, "%Y:%m:%d %H:%M:%S")
            .ok()
            .map(|dt| dt.and_utc().timestamp())
    } else {
        None
    }
}

fn extract_gps_coordinate(exif: &exif::Exif, coord_tag: Tag, ref_tag: Tag) -> Option<f64> {
    let coord_field = exif.get_field(coord_tag, In::PRIMARY)?;
    let ref_field = exif.get_field(ref_tag, In::PRIMARY)?;

    let coordinates = match &coord_field.value {
        Value::Rational(vec) => {
            if vec.len() < 3 {
                return None;
            }
            let degrees = vec[0].num as f64 / vec[0].denom as f64;
            let minutes = vec[1].num as f64 / vec[1].denom as f64;
            let seconds = vec[2].num as f64 / vec[2].denom as f64;
            degrees + (minutes / 60.0) + (seconds / 3600.0)
        }
        _ => return None,
    };

    let reference = get_string_value(ref_field)?;
    let sign = if reference.starts_with('S') || reference.starts_with('W') {
        -1.0
    } else {
        1.0
    };

    Some(coordinates * sign)
}

fn extract_gps_altitude(exif: &exif::Exif) -> Option<f64> {
    let alt_field = exif.get_field(Tag::GPSAltitude, In::PRIMARY)?;

    match &alt_field.value {
        Value::Rational(vec) => {
            if let Some(rational) = vec.first() {
                if rational.denom == 0 {
                    None
                } else {
                    let altitude = rational.num as f64 / rational.denom as f64;

                    // Check if below sea level
                    if let Some(ref_field) = exif.get_field(Tag::GPSAltitudeRef, In::PRIMARY)
                        && let Some(ref_val) = get_u32_value(ref_field)
                        && ref_val == 1
                    {
                        return Some(-altitude);
                    }

                    Some(altitude)
                }
            } else {
                None
            }
        }
        _ => None,
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_supports_exif_jpeg() {
        assert!(supports_exif(Path::new("test.jpg")));
        assert!(supports_exif(Path::new("test.jpeg")));
        assert!(supports_exif(Path::new("test.JPG")));
    }

    #[test]
    fn test_supports_exif_raw_formats() {
        assert!(supports_exif(Path::new("test.nef"))); // Nikon
        assert!(supports_exif(Path::new("test.NEF")));
        assert!(supports_exif(Path::new("test.cr2"))); // Canon
        assert!(supports_exif(Path::new("test.cr3"))); // Canon
        assert!(supports_exif(Path::new("test.arw"))); // Sony
        assert!(supports_exif(Path::new("test.dng"))); // Adobe DNG
    }

    #[test]
    fn test_supports_exif_tiff() {
        assert!(supports_exif(Path::new("test.tiff")));
        assert!(supports_exif(Path::new("test.tif")));
        assert!(supports_exif(Path::new("test.TIFF")));
    }

    #[test]
    fn test_supports_exif_heif() {
        assert!(supports_exif(Path::new("test.heif")));
        assert!(supports_exif(Path::new("test.heic")));
        assert!(supports_exif(Path::new("test.avif")));
    }

    #[test]
    fn test_supports_exif_png_webp() {
        assert!(supports_exif(Path::new("test.png")));
        assert!(supports_exif(Path::new("test.PNG")));
        assert!(supports_exif(Path::new("test.webp")));
        assert!(supports_exif(Path::new("test.WEBP")));
    }

    #[test]
    fn test_supports_exif_unsupported() {
        assert!(!supports_exif(Path::new("test.mp4")));
        assert!(!supports_exif(Path::new("test.mov")));
        assert!(!supports_exif(Path::new("test.txt")));
        assert!(!supports_exif(Path::new("test.gif")));
    }

    #[test]
    fn test_supports_exif_no_extension() {
        assert!(!supports_exif(Path::new("test")));
    }
}
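A standalone sketch of the degrees/minutes/seconds arithmetic used by `extract_gps_coordinate` above, with the sign flip for southern and western references: 37° 48' 30" S becomes roughly -37.8083:

```rust
// DMS -> decimal degrees, mirroring extract_gps_coordinate: the value is
// deg + min/60 + sec/3600, negated for 'S' (south) or 'W' (west) references.
fn dms_to_decimal(deg: f64, min: f64, sec: f64, reference: char) -> f64 {
    let value = deg + min / 60.0 + sec / 3600.0;
    if reference == 'S' || reference == 'W' { -value } else { value }
}

fn main() {
    let lat = dms_to_decimal(37.0, 48.0, 30.0, 'S');
    assert!((lat + 37.808333).abs() < 1e-5);
    println!("{lat}"); // -37.80833333333333
}
```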
85 src/file_types.rs Normal file
@@ -0,0 +1,85 @@
use std::path::Path;
use walkdir::DirEntry;

/// Supported image file extensions
pub const IMAGE_EXTENSIONS: &[&str] = &[
    "jpg", "jpeg", "png", "webp", "tiff", "tif", "heif", "heic", "avif", "nef",
];

/// Supported video file extensions
pub const VIDEO_EXTENSIONS: &[&str] = &["mp4", "mov", "avi", "mkv"];

/// Check if a path has an image extension
pub fn is_image_file(path: &Path) -> bool {
    if let Some(ext) = path.extension().and_then(|e| e.to_str()) {
        let ext_lower = ext.to_lowercase();
        IMAGE_EXTENSIONS.contains(&ext_lower.as_str())
    } else {
        false
    }
}

/// Check if a path has a video extension
pub fn is_video_file(path: &Path) -> bool {
    if let Some(ext) = path.extension().and_then(|e| e.to_str()) {
        let ext_lower = ext.to_lowercase();
        VIDEO_EXTENSIONS.contains(&ext_lower.as_str())
    } else {
        false
    }
}

/// Check if a path has a supported media extension (image or video)
pub fn is_media_file(path: &Path) -> bool {
    is_image_file(path) || is_video_file(path)
}

/// Check if a DirEntry is an image file (for walkdir usage)
pub fn direntry_is_image(entry: &DirEntry) -> bool {
    is_image_file(&entry.path())
}

/// Check if a DirEntry is a video file (for walkdir usage)
pub fn direntry_is_video(entry: &DirEntry) -> bool {
    is_video_file(&entry.path())
}

/// Check if a DirEntry is a media file (for walkdir usage)
pub fn direntry_is_media(entry: &DirEntry) -> bool {
    is_media_file(&entry.path())
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::path::Path;

    #[test]
    fn test_is_image_file() {
        assert!(is_image_file(Path::new("photo.jpg")));
        assert!(is_image_file(Path::new("photo.JPG")));
        assert!(is_image_file(Path::new("photo.png")));
        assert!(is_image_file(Path::new("photo.nef")));
        assert!(!is_image_file(Path::new("video.mp4")));
        assert!(!is_image_file(Path::new("document.txt")));
    }

    #[test]
    fn test_is_video_file() {
        assert!(is_video_file(Path::new("video.mp4")));
        assert!(is_video_file(Path::new("video.MP4")));
        assert!(is_video_file(Path::new("video.mov")));
        assert!(is_video_file(Path::new("video.avi")));
        assert!(!is_video_file(Path::new("photo.jpg")));
        assert!(!is_video_file(Path::new("document.txt")));
    }

    #[test]
    fn test_is_media_file() {
        assert!(is_media_file(Path::new("photo.jpg")));
        assert!(is_media_file(Path::new("video.mp4")));
        assert!(is_media_file(Path::new("photo.PNG")));
        assert!(!is_media_file(Path::new("document.txt")));
        assert!(!is_media_file(Path::new("no_extension")));
    }
}
760 src/files.rs
@@ -1,3 +1,4 @@
use std::collections::HashSet;
|
||||
use std::fmt::Debug;
|
||||
use std::fs::read_dir;
|
||||
use std::io;
|
||||
@@ -9,14 +10,18 @@ use ::anyhow;
|
||||
use actix::{Handler, Message};
|
||||
use anyhow::{Context, anyhow};
|
||||
|
||||
use crate::data::{Claims, FilesRequest, FilterMode, PhotosResponse, SortType};
|
||||
use crate::data::{Claims, FilesRequest, FilterMode, MediaType, PhotosResponse, SortType};
|
||||
use crate::database::ExifDao;
|
||||
use crate::file_types;
|
||||
use crate::geo::{gps_bounding_box, haversine_distance};
|
||||
use crate::memories::extract_date_from_filename;
|
||||
use crate::{AppState, create_thumbnails};
|
||||
use actix_web::web::Data;
|
||||
use actix_web::{
|
||||
HttpRequest, HttpResponse,
|
||||
web::{self, Query},
|
||||
};
|
||||
use log::{debug, error, info, trace};
|
||||
use log::{debug, error, info, trace, warn};
|
||||
use opentelemetry::KeyValue;
|
||||
use opentelemetry::trace::{Span, Status, TraceContextExt, Tracer};
|
||||
|
||||
@@ -28,8 +33,66 @@ use crate::video::actors::StreamActor;
|
||||
use path_absolutize::*;
|
||||
use rand::prelude::SliceRandom;
|
||||
use rand::thread_rng;
|
||||
|
||||
/// File metadata for sorting and filtering
|
||||
/// Includes tag count and optional date for date-based sorting
|
||||
pub struct FileWithMetadata {
|
||||
pub file_name: String,
|
||||
pub tag_count: i64,
|
||||
pub date_taken: Option<i64>, // Unix timestamp from EXIF or filename extraction
|
||||
}
|
||||
use serde::Deserialize;
|
||||
|
||||
/// Apply sorting to files with EXIF data support for date-based sorting
|
||||
/// Handles both date sorting (with EXIF/filename fallback) and regular sorting
|
||||
fn apply_sorting_with_exif(
|
||||
files: Vec<FileWithTagCount>,
|
||||
sort_type: SortType,
|
||||
exif_dao: &mut Box<dyn ExifDao>,
|
||||
span_context: &opentelemetry::Context,
|
||||
) -> Vec<String> {
|
||||
match sort_type {
|
||||
SortType::DateTakenAsc | SortType::DateTakenDesc => {
|
||||
info!("Date sorting requested, fetching EXIF data");
|
||||
|
||||
// Collect file paths for batch EXIF query
|
||||
let file_paths: Vec<String> = files.iter().map(|f| f.file_name.clone()).collect();
|
||||
|
||||
// Batch fetch EXIF data
|
||||
let exif_map: std::collections::HashMap<String, i64> = exif_dao
|
||||
.get_exif_batch(span_context, &file_paths)
|
||||
.unwrap_or_default()
|
||||
.into_iter()
|
||||
.filter_map(|exif| exif.date_taken.map(|dt| (exif.file_path, dt)))
|
||||
.collect();
|
||||
|
||||
// Convert to FileWithMetadata with date fallback logic
|
||||
let files_with_metadata: Vec<FileWithMetadata> = files
|
||||
.into_iter()
|
||||
.map(|f| {
|
||||
// Try EXIF date first
|
||||
let date_taken = exif_map.get(&f.file_name).copied().or_else(|| {
|
||||
// Fallback to filename extraction
|
||||
extract_date_from_filename(&f.file_name).map(|dt| dt.timestamp())
|
||||
});
|
||||
|
||||
FileWithMetadata {
|
||||
file_name: f.file_name,
|
||||
tag_count: f.tag_count,
|
||||
date_taken,
|
||||
}
|
||||
})
|
||||
.collect();
|
||||
|
||||
sort_with_metadata(files_with_metadata, sort_type)
|
||||
}
|
||||
_ => {
|
||||
// Use regular sort for non-date sorting
|
||||
sort(files, sort_type)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
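For reference, a self-contained sketch of the fallback chain used above: prefer the indexed EXIF timestamp, otherwise derive one from the filename. The `YYYY-MM-DD` filename prefix and a recent chrono 0.4 (`and_utc`) are assumptions of this sketch; the real parsing lives in `extract_date_from_filename`.

```rust
use std::collections::HashMap;
use chrono::NaiveDate;

/// Prefer an indexed EXIF timestamp; otherwise try to parse a `YYYY-MM-DD`
/// prefix out of the filename (format assumed for this sketch).
fn resolve_date(exif_map: &HashMap<String, i64>, file_name: &str) -> Option<i64> {
    exif_map.get(file_name).copied().or_else(|| {
        let date = NaiveDate::parse_from_str(file_name.get(..10)?, "%Y-%m-%d").ok()?;
        Some(date.and_hms_opt(0, 0, 0)?.and_utc().timestamp())
    })
}

fn main() {
    let mut exif_map = HashMap::new();
    exif_map.insert("beach.jpg".to_string(), 1_662_076_800);

    // EXIF wins when present; filename is the fallback; otherwise no date.
    assert_eq!(resolve_date(&exif_map, "beach.jpg"), Some(1_662_076_800));
    assert!(resolve_date(&exif_map, "2021-09-02_hike.jpg").is_some());
    assert!(resolve_date(&exif_map, "untitled.jpg").is_none());
}
```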
pub async fn list_photos<TagD: TagDao, FS: FileSystemAccess>(
    _: Claims,
    request: HttpRequest,
@@ -37,6 +100,7 @@ pub async fn list_photos<TagD: TagDao, FS: FileSystemAccess>(
    app_state: web::Data<AppState>,
    file_system: web::Data<FS>,
    tag_dao: web::Data<Mutex<TagD>>,
    exif_dao: web::Data<Mutex<Box<dyn ExifDao>>>,
) -> HttpResponse {
    let search_path = &req.path;

@@ -59,93 +123,253 @@ pub async fn list_photos<TagD: TagDao, FS: FileSystemAccess>(
            req.exclude_tag_ids.clone().unwrap_or_default().to_string(),
        ),
        KeyValue::new("sort", format!("{:?}", &req.sort.unwrap_or(NameAsc))),
        // EXIF search parameters
        KeyValue::new("camera_make", req.camera_make.clone().unwrap_or_default()),
        KeyValue::new("camera_model", req.camera_model.clone().unwrap_or_default()),
        KeyValue::new("lens_model", req.lens_model.clone().unwrap_or_default()),
        KeyValue::new(
            "gps_lat",
            req.gps_lat.map(|v| v.to_string()).unwrap_or_default(),
        ),
        KeyValue::new(
            "gps_lon",
            req.gps_lon.map(|v| v.to_string()).unwrap_or_default(),
        ),
        KeyValue::new(
            "gps_radius_km",
            req.gps_radius_km.map(|v| v.to_string()).unwrap_or_default(),
        ),
        KeyValue::new(
            "date_from",
            req.date_from.map(|v| v.to_string()).unwrap_or_default(),
        ),
        KeyValue::new(
            "date_to",
            req.date_to.map(|v| v.to_string()).unwrap_or_default(),
        ),
        KeyValue::new(
            "media_type",
            req.media_type
                .as_ref()
                .map(|mt| format!("{:?}", mt))
                .unwrap_or_default(),
        ),
    ]);

    let span_context = opentelemetry::Context::current_with_span(span);

    let search_recursively = req.recursive.unwrap_or(false);
    if let Some(tag_ids) = &req.tag_ids
        && search_recursively {
        let filter_mode = &req.tag_filter_mode.unwrap_or(FilterMode::Any);
        info!(
            "Searching for tags: {}. With path: '{}' and filter mode: {:?}",
            tag_ids, search_path, filter_mode
        );
    // Check if EXIF filtering is requested
    let has_exif_filters = req.camera_make.is_some()
        || req.camera_model.is_some()
        || req.lens_model.is_some()
        || req.gps_lat.is_some()
        || req.date_from.is_some()
        || req.date_to.is_some();

        let mut dao = tag_dao.lock().expect("Unable to get TagDao");
        let tag_ids = tag_ids
            .split(',')
            .filter_map(|t| t.parse().ok())
            .collect::<Vec<i32>>();

        let exclude_tag_ids = req
            .exclude_tag_ids
            .clone()
            .unwrap_or_default()
            .split(',')
            .filter_map(|t| t.parse().ok())
            .collect::<Vec<i32>>();

        return match filter_mode {
            FilterMode::Any => {
                dao.get_files_with_any_tag_ids(tag_ids.clone(), exclude_tag_ids, &span_context)
            }
            FilterMode::All => {
                dao.get_files_with_all_tag_ids(tag_ids.clone(), exclude_tag_ids, &span_context)
            }
        }
        .context(format!(
            "Failed to get files with tag_ids: {:?} with filter_mode: {:?}",
            tag_ids, filter_mode
        ))
        .inspect(|files| {
            info!(
                "Found {:?} tagged files, filtering down by search path {:?}",
                files.len(),
                search_path
            )
        })
        .map(|tagged_files| {
            tagged_files
                .into_iter()
                .filter(|f| {
                    // When searching at the root, everything matches recursively
                    if search_path.trim() == "" {
                        return true;
                    }

                    f.file_name.starts_with(&format!(
                        "{}/",
                        search_path.strip_suffix('/').unwrap_or_else(|| search_path)
                    ))
                })
                .collect::<Vec<FileWithTagCount>>()
        })
        .map(|files| sort(files, req.sort.unwrap_or(NameAsc)))
        .inspect(|files| debug!("Found {:?} files", files.len()))
        .map(|tagged_files: Vec<String>| {
            info!(
                "Found {:?} tagged files: {:?}",
                tagged_files.len(),
                tagged_files
            );
            span_context
                .span()
                .set_attribute(KeyValue::new("file_count", tagged_files.len().to_string()));
            span_context.span().set_status(Status::Ok);

            HttpResponse::Ok().json(PhotosResponse {
                photos: tagged_files,
                dirs: vec![],
            })
        })
        .into_http_internal_err()
        .unwrap_or_else(|e| e.error_response());
    // Apply EXIF-based filtering if requested
    let exif_matched_files: Option<HashSet<String>> = if has_exif_filters {
        // Validate GPS parameters (all 3 must be present together)
        if (req.gps_lat.is_some() || req.gps_lon.is_some() || req.gps_radius_km.is_some())
            && !(req.gps_lat.is_some() && req.gps_lon.is_some() && req.gps_radius_km.is_some())
        {
            warn!("GPS search requires lat, lon, and radius_km to all be specified");
            span_context
                .span()
                .set_status(Status::error("Invalid GPS parameters"));
            return HttpResponse::BadRequest().body("GPS search requires lat, lon, and radius_km");
        }

    match file_system.get_files_for_path(search_path) {
        // Calculate GPS bounding box if GPS search is requested
        let gps_bounds = if let (Some(lat), Some(lon), Some(radius_km)) =
            (req.gps_lat, req.gps_lon, req.gps_radius_km)
        {
            let (min_lat, max_lat, min_lon, max_lon) = gps_bounding_box(lat, lon, radius_km);
            Some((min_lat, max_lat, min_lon, max_lon))
        } else {
            None
        };

        // Query EXIF database
        let mut exif_dao_guard = exif_dao.lock().expect("Unable to get ExifDao");
        let exif_results = exif_dao_guard
            .query_by_exif(
                &span_context,
                req.camera_make.as_deref(),
                req.camera_model.as_deref(),
                req.lens_model.as_deref(),
                gps_bounds,
                req.date_from,
                req.date_to,
            )
            .unwrap_or_else(|e| {
                warn!("EXIF query failed: {:?}", e);
                Vec::new()
            });

        // Apply precise GPS distance filtering if GPS search was requested
        let filtered_results = if let (Some(lat), Some(lon), Some(radius_km)) =
            (req.gps_lat, req.gps_lon, req.gps_radius_km)
        {
            exif_results
                .into_iter()
                .filter(|exif| {
                    if let (Some(photo_lat), Some(photo_lon)) =
                        (exif.gps_latitude, exif.gps_longitude)
                    {
                        let distance = haversine_distance(lat, lon, photo_lat, photo_lon);
                        distance <= radius_km
                    } else {
                        false
                    }
                })
                .map(|exif| exif.file_path)
                .collect::<HashSet<String>>()
        } else {
            exif_results
                .into_iter()
                .map(|exif| exif.file_path)
                .collect::<HashSet<String>>()
        };

        info!("EXIF filtering matched {} files", filtered_results.len());
        Some(filtered_results)
    } else {
        None
    };

    let search_recursively = req.recursive.unwrap_or(false);
    if let Some(tag_ids) = &req.tag_ids
        && search_recursively
    {
        let filter_mode = &req.tag_filter_mode.unwrap_or(FilterMode::Any);
        info!(
            "Searching for tags: {}. With path: '{}' and filter mode: {:?}",
            tag_ids, search_path, filter_mode
        );

        let mut dao = tag_dao.lock().expect("Unable to get TagDao");
        let tag_ids = tag_ids
            .split(',')
            .filter_map(|t| t.parse().ok())
            .collect::<Vec<i32>>();

        let exclude_tag_ids = req
            .exclude_tag_ids
            .clone()
            .unwrap_or_default()
            .split(',')
            .filter_map(|t| t.parse().ok())
            .collect::<Vec<i32>>();

        return match filter_mode {
            FilterMode::Any => {
                dao.get_files_with_any_tag_ids(tag_ids.clone(), exclude_tag_ids, &span_context)
            }
            FilterMode::All => {
                dao.get_files_with_all_tag_ids(tag_ids.clone(), exclude_tag_ids, &span_context)
            }
        }
        .context(format!(
            "Failed to get files with tag_ids: {:?} with filter_mode: {:?}",
            tag_ids, filter_mode
        ))
        .inspect(|files| {
            info!(
                "Found {:?} tagged files, filtering down by search path {:?}",
                files.len(),
                search_path
            )
        })
        .map(|tagged_files| {
            tagged_files
                .into_iter()
                .filter(|f| {
                    // When searching at the root, everything matches recursively
                    if search_path.trim() == "" {
                        return true;
                    }

                    f.file_name.starts_with(&format!(
                        "{}/",
                        search_path.strip_suffix('/').unwrap_or_else(|| search_path)
                    ))
                })
                .filter(|f| {
                    // Apply EXIF filtering if present
                    if let Some(ref exif_files) = exif_matched_files {
                        exif_files.contains(&f.file_name)
                    } else {
                        true
                    }
                })
                .filter(|f| {
                    // Apply media type filtering if specified
                    if let Some(ref media_type) = req.media_type {
                        let path = PathBuf::from(&f.file_name);
                        matches_media_type(&path, media_type)
                    } else {
                        true
                    }
                })
                .collect::<Vec<FileWithTagCount>>()
        })
        .map(|files| {
            // Handle sorting - use helper function that supports EXIF date sorting
            let sort_type = req.sort.unwrap_or(NameAsc);
            let mut exif_dao_guard = exif_dao.lock().expect("Unable to get ExifDao");
            let result =
                apply_sorting_with_exif(files, sort_type, &mut exif_dao_guard, &span_context);
            drop(exif_dao_guard);
            result
        })
        .inspect(|files| debug!("Found {:?} files", files.len()))
        .map(|tagged_files: Vec<String>| {
            info!(
                "Found {:?} tagged files: {:?}",
                tagged_files.len(),
                tagged_files
            );
            span_context
                .span()
                .set_attribute(KeyValue::new("file_count", tagged_files.len().to_string()));
            span_context.span().set_status(Status::Ok);

            HttpResponse::Ok().json(PhotosResponse {
                photos: tagged_files,
                dirs: vec![],
            })
        })
        .into_http_internal_err()
        .unwrap_or_else(|e| e.error_response());
    }

    // Use recursive or non-recursive file listing based on flag
    let files_result = if search_recursively {
        // For recursive search without tags, manually list files recursively
        is_valid_full_path(
            &PathBuf::from(&app_state.base_path),
            &PathBuf::from(search_path),
            false,
        )
        .map(|path| {
            debug!("Valid path for recursive search: {:?}", path);
            list_files_recursive(&path).unwrap_or_default()
        })
        .context("Invalid path")
    } else {
        file_system.get_files_for_path(search_path)
    };

    match files_result {
        Ok(files) => {
            info!("Found {:?} files in path: {:?}", files.len(), search_path);
            info!(
                "Found {:?} files in path: {:?} (recursive: {})",
                files.len(),
                search_path,
                search_recursively
            );

            info!("Starting to filter {} files from filesystem", files.len());

            let photos = files
                .iter()
@@ -202,21 +426,45 @@ pub async fn list_photos<TagD: TagDao, FS: FileSystemAccess>(

                    true
                })
                .filter(|(file_name, _)| {
                    // Apply EXIF filtering if present
                    if let Some(ref exif_files) = exif_matched_files {
                        exif_files.contains(file_name)
                    } else {
                        true
                    }
                })
                .filter(|(file_name, _)| {
                    // Apply media type filtering if specified
                    if let Some(ref media_type) = req.media_type {
                        let path = PathBuf::from(file_name);
                        matches_media_type(&path, media_type)
                    } else {
                        true
                    }
                })
                .map(|(file_name, tags)| FileWithTagCount {
                    file_name,
                    tag_count: tags.len() as i64,
                })
                .collect::<Vec<FileWithTagCount>>();

            let mut response_files = photos
                .clone()
                .into_iter()
                .map(|f| f.file_name)
                .collect::<Vec<String>>();
            if let Some(sort_type) = req.sort {
                debug!("Sorting files: {:?}", sort_type);
                response_files = sort(photos, sort_type)
            }
            info!("After all filters, {} files remain", photos.len());

            // Handle sorting - use helper function that supports EXIF date sorting
            let response_files = if let Some(sort_type) = req.sort {
                let mut exif_dao_guard = exif_dao.lock().expect("Unable to get ExifDao");
                let result =
                    apply_sorting_with_exif(photos, sort_type, &mut exif_dao_guard, &span_context);
                drop(exif_dao_guard);
                result
            } else {
                // No sorting requested
                photos
                    .into_iter()
                    .map(|f| f.file_name)
                    .collect::<Vec<String>>()
            };

            let dirs = files
                .iter()
@@ -263,6 +511,52 @@ fn sort(mut files: Vec<FileWithTagCount>, sort_type: SortType) -> Vec<String> {
        SortType::TagCountDesc => {
            files.sort_by(|l, r| r.tag_count.cmp(&l.tag_count));
        }
        SortType::DateTakenAsc | SortType::DateTakenDesc => {
            // Date sorting not implemented for FileWithTagCount
            // We shouldn't be hitting this code
            warn!("Date sorting not implemented for FileWithTagCount");
            files.sort_by(|l, r| l.file_name.cmp(&r.file_name));
        }
    }

    files
        .iter()
        .map(|f| f.file_name.clone())
        .collect::<Vec<String>>()
}

/// Sort files with metadata support (including date sorting)
fn sort_with_metadata(mut files: Vec<FileWithMetadata>, sort_type: SortType) -> Vec<String> {
    match sort_type {
        SortType::Shuffle => files.shuffle(&mut thread_rng()),
        SortType::NameAsc => {
            files.sort_by(|l, r| l.file_name.cmp(&r.file_name));
        }
        SortType::NameDesc => {
            files.sort_by(|l, r| r.file_name.cmp(&l.file_name));
        }
        SortType::TagCountAsc => {
            files.sort_by(|l, r| l.tag_count.cmp(&r.tag_count));
        }
        SortType::TagCountDesc => {
            files.sort_by(|l, r| r.tag_count.cmp(&l.tag_count));
        }
        SortType::DateTakenAsc | SortType::DateTakenDesc => {
            files.sort_by(|l, r| {
                match (l.date_taken, r.date_taken) {
                    (Some(a), Some(b)) => {
                        if sort_type == SortType::DateTakenAsc {
                            a.cmp(&b)
                        } else {
                            b.cmp(&a)
                        }
                    }
                    (Some(_), None) => std::cmp::Ordering::Less, // Dated photos first
                    (None, Some(_)) => std::cmp::Ordering::Greater,
                    (None, None) => l.file_name.cmp(&r.file_name), // Fallback to name
                }
            });
        }
    }

    files
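The comparator above orders dated items before undated ones and falls back to name order for ties; a standalone sketch of the same Option ordering (an annotation, not part of the diff):

```rust
use std::cmp::Ordering;

// Dated items sort before undated ones; undated items fall back to name order.
fn by_date_asc(l: (Option<i64>, &str), r: (Option<i64>, &str)) -> Ordering {
    match (l.0, r.0) {
        (Some(a), Some(b)) => a.cmp(&b),
        (Some(_), None) => Ordering::Less,
        (None, Some(_)) => Ordering::Greater,
        (None, None) => l.1.cmp(r.1),
    }
}

fn main() {
    let mut items = vec![
        (None, "b.jpg"),
        (Some(200), "c.jpg"),
        (Some(100), "a.jpg"),
        (None, "a.jpg"),
    ];
    items.sort_by(|l, r| by_date_asc(*l, *r));
    assert_eq!(
        items.iter().map(|i| i.1).collect::<Vec<_>>(),
        vec!["a.jpg", "c.jpg", "a.jpg", "b.jpg"]
    );
}
```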
@@ -290,19 +584,66 @@ pub fn list_files(dir: &Path) -> io::Result<Vec<PathBuf>> {
    Ok(files)
}

pub fn list_files_recursive(dir: &Path) -> io::Result<Vec<PathBuf>> {
    let tracer = global_tracer();
    let mut span = tracer.start("list_files_recursive");
    let dir_name_string = dir.to_str().unwrap_or_default().to_string();
    span.set_attribute(KeyValue::new("dir", dir_name_string));
    info!("Recursively listing files in: {:?}", dir);

    let mut result = Vec::new();

    fn visit_dirs(dir: &Path, files: &mut Vec<PathBuf>) -> io::Result<()> {
        if dir.is_dir() {
            for entry in read_dir(dir)? {
                let entry = entry?;
                let path = entry.path();

                if path.is_dir() {
                    visit_dirs(&path, files)?;
                } else if is_image_or_video(&path) {
                    files.push(path);
                }
            }
        }
        Ok(())
    }

    visit_dirs(dir, &mut result)?;

    span.set_attribute(KeyValue::new("file_count", result.len().to_string()));
    span.set_status(Status::Ok);
    info!(
        "Found {:?} files recursively in directory: {:?}",
        result.len(),
        dir
    );
    Ok(result)
}

pub fn is_image_or_video(path: &Path) -> bool {
    file_types::is_media_file(path)
}

/// Check if a file matches the media type filter
fn matches_media_type(path: &Path, media_type: &MediaType) -> bool {
    let result = match media_type {
        MediaType::All => file_types::is_image_file(path) || file_types::is_video_file(path),
        MediaType::Photo => file_types::is_image_file(path),
        MediaType::Video => file_types::is_video_file(path),
    };

    let extension = path
        .extension()
        .and_then(|p| p.to_str())
        .map_or(String::from(""), |p| p.to_lowercase());

    extension == "png"
        || extension == "jpg"
        || extension == "jpeg"
        || extension == "mp4"
        || extension == "mov"
        || extension == "nef"
        || extension == "webp"
    debug!(
        "Media type check: path={:?}, extension='{}', type={:?}, match={}",
        path, extension, media_type, result
    );

    result
}

pub fn is_valid_full_path<P: AsRef<Path> + Debug + AsRef<std::ffi::OsStr>>(
@@ -467,14 +808,15 @@ impl Handler<RefreshThumbnailsMessage> for StreamActor {

#[cfg(test)]
mod tests {
    use super::*;
    use crate::database::DbError;
    use std::collections::HashMap;
    use std::env;
    use std::fs::File;

    use super::*;

    struct FakeFileSystem {
        files: HashMap<String, Vec<String>>,
        base_path: String,
        err: bool,
    }

@@ -482,12 +824,19 @@ mod tests {
    fn with_error() -> FakeFileSystem {
        FakeFileSystem {
            files: HashMap::new(),
            base_path: String::new(),
            err: true,
        }
    }

    fn new(files: HashMap<String, Vec<String>>) -> FakeFileSystem {
        FakeFileSystem { files, err: false }
        // Use temp dir as base path for consistency
        let base_path = env::temp_dir();
        FakeFileSystem {
            files,
            base_path: base_path.to_str().unwrap().to_string(),
            err: false,
        }
    }
}

@@ -496,7 +845,11 @@ mod tests {
        if self.err {
            Err(anyhow!("Error for test"))
        } else if let Some(files) = self.files.get(path) {
            Ok(files.iter().map(PathBuf::from).collect::<Vec<PathBuf>>())
            // Prepend base_path to all returned files
            Ok(files
                .iter()
                .map(|f| PathBuf::from(&self.base_path).join(f))
                .collect::<Vec<PathBuf>>())
        } else {
            Ok(Vec::new())
        }
@@ -507,6 +860,133 @@ mod tests {
    }
}

    struct MockExifDao;

    impl crate::database::ExifDao for MockExifDao {
        fn store_exif(
            &mut self,
            _context: &opentelemetry::Context,
            data: crate::database::models::InsertImageExif,
        ) -> Result<crate::database::models::ImageExif, crate::database::DbError> {
            // Return a dummy ImageExif for tests
            Ok(crate::database::models::ImageExif {
                id: 1,
                file_path: data.file_path.to_string(),
                camera_make: data.camera_make.map(|s| s.to_string()),
                camera_model: data.camera_model.map(|s| s.to_string()),
                lens_model: data.lens_model.map(|s| s.to_string()),
                width: data.width,
                height: data.height,
                orientation: data.orientation,
                gps_latitude: data.gps_latitude,
                gps_longitude: data.gps_longitude,
                gps_altitude: data.gps_altitude,
                focal_length: data.focal_length,
                aperture: data.aperture,
                shutter_speed: data.shutter_speed,
                iso: data.iso,
                date_taken: data.date_taken,
                created_time: data.created_time,
                last_modified: data.last_modified,
            })
        }

        fn get_exif(
            &mut self,
            _context: &opentelemetry::Context,
            _: &str,
        ) -> Result<Option<crate::database::models::ImageExif>, crate::database::DbError> {
            Ok(None)
        }

        fn update_exif(
            &mut self,
            _context: &opentelemetry::Context,
            data: crate::database::models::InsertImageExif,
        ) -> Result<crate::database::models::ImageExif, crate::database::DbError> {
            // Return a dummy ImageExif for tests
            Ok(crate::database::models::ImageExif {
                id: 1,
                file_path: data.file_path.to_string(),
                camera_make: data.camera_make.map(|s| s.to_string()),
                camera_model: data.camera_model.map(|s| s.to_string()),
                lens_model: data.lens_model.map(|s| s.to_string()),
                width: data.width,
                height: data.height,
                orientation: data.orientation,
                gps_latitude: data.gps_latitude,
                gps_longitude: data.gps_longitude,
                gps_altitude: data.gps_altitude,
                focal_length: data.focal_length,
                aperture: data.aperture,
                shutter_speed: data.shutter_speed,
                iso: data.iso,
                date_taken: data.date_taken,
                created_time: data.created_time,
                last_modified: data.last_modified,
            })
        }

        fn delete_exif(
            &mut self,
            _context: &opentelemetry::Context,
            _: &str,
        ) -> Result<(), crate::database::DbError> {
            Ok(())
        }

        fn get_all_with_date_taken(
            &mut self,
            _context: &opentelemetry::Context,
        ) -> Result<Vec<(String, i64)>, crate::database::DbError> {
            Ok(Vec::new())
        }

        fn get_exif_batch(
            &mut self,
            _context: &opentelemetry::Context,
            _: &[String],
        ) -> Result<Vec<crate::database::models::ImageExif>, crate::database::DbError> {
            Ok(Vec::new())
        }

        fn query_by_exif(
            &mut self,
            _context: &opentelemetry::Context,
            _: Option<&str>,
            _: Option<&str>,
            _: Option<&str>,
            _: Option<(f64, f64, f64, f64)>,
            _: Option<i64>,
            _: Option<i64>,
        ) -> Result<Vec<crate::database::models::ImageExif>, crate::database::DbError> {
            Ok(Vec::new())
        }

        fn get_camera_makes(
            &mut self,
            _context: &opentelemetry::Context,
        ) -> Result<Vec<(String, i64)>, crate::database::DbError> {
            Ok(Vec::new())
        }

        fn update_file_path(
            &mut self,
            _context: &opentelemetry::Context,
            _old_path: &str,
            _new_path: &str,
        ) -> Result<(), DbError> {
            Ok(())
        }

        fn get_all_file_paths(
            &mut self,
            _context: &opentelemetry::Context,
        ) -> Result<Vec<String>, DbError> {
            Ok(Vec::new())
        }
    }

    mod api {
        use super::*;
        use actix_web::{HttpResponse, web::Query};
@@ -517,6 +997,7 @@ mod tests {
            testhelpers::BodyReader,
        };

        use crate::database::SqliteExifDao;
        use crate::database::test::in_memory_db_connection;
        use crate::tags::SqliteTagDao;
        use actix_web::test::TestRequest;
@@ -538,23 +1019,40 @@ mod tests {

        let request: Query<FilesRequest> = Query::from_query("path=").unwrap();

        let mut temp_photo = env::temp_dir();
        let mut tmp = temp_photo.clone();
        // Create a dedicated test directory to avoid interference from other files in system temp
        let mut test_base = env::temp_dir();
        test_base.push("image_api_test_list_photos");
        fs::create_dir_all(&test_base).unwrap();

        tmp.push("test-dir");
        fs::create_dir_all(tmp).unwrap();
        let mut test_dir = test_base.clone();
        test_dir.push("test-dir");
        fs::create_dir_all(&test_dir).unwrap();

        temp_photo.push("photo.jpg");
        let mut photo_path = test_base.clone();
        photo_path.push("photo.jpg");
        File::create(&photo_path).unwrap();

        File::create(temp_photo.clone()).unwrap();
        // Create AppState with the same base_path as RealFileSystem
        use actix::Actor;
        let test_state = AppState::new(
            std::sync::Arc::new(crate::video::actors::StreamActor {}.start()),
            test_base.to_str().unwrap().to_string(),
            test_base.join("thumbnails").to_str().unwrap().to_string(),
            test_base.join("videos").to_str().unwrap().to_string(),
            test_base.join("gifs").to_str().unwrap().to_string(),
            Vec::new(),
        );

        let response: HttpResponse = list_photos(
            claims,
            TestRequest::default().to_http_request(),
            request,
            Data::new(AppState::test_state()),
            Data::new(RealFileSystem::new(String::from("/tmp"))),
            Data::new(test_state),
            Data::new(RealFileSystem::new(test_base.to_str().unwrap().to_string())),
            Data::new(Mutex::new(SqliteTagDao::default())),
            Data::new(Mutex::new(
                Box::new(MockExifDao) as Box<dyn crate::database::ExifDao>
            )),
        )
        .await;
        let status = response.status();
@@ -574,6 +1072,9 @@ mod tests {
            .collect::<Vec<&String>>()
            .is_empty()
        );

        // Cleanup
        let _ = fs::remove_dir_all(test_base);
    }

    #[actix_rt::test]
@@ -587,13 +1088,17 @@ mod tests {

        let request: Query<FilesRequest> = Query::from_query("path=..").unwrap();

        let temp_dir = env::temp_dir();
        let response = list_photos(
            claims,
            TestRequest::default().to_http_request(),
            request,
            Data::new(AppState::test_state()),
            Data::new(RealFileSystem::new(String::from("./"))),
            Data::new(RealFileSystem::new(temp_dir.to_str().unwrap().to_string())),
            Data::new(Mutex::new(SqliteTagDao::default())),
            Data::new(Mutex::new(
                Box::new(MockExifDao) as Box<dyn crate::database::ExifDao>
            )),
        )
        .await;

@@ -609,7 +1114,8 @@ mod tests {
            exp: 12345,
        };

        let request: Query<FilesRequest> = Query::from_query("path=&tag_ids=1,3").unwrap();
        let request: Query<FilesRequest> =
            Query::from_query("path=&tag_ids=1,3&recursive=true").unwrap();

        let mut tag_dao = SqliteTagDao::new(in_memory_db_connection());

@@ -630,23 +1136,16 @@ mod tests {
            .tag_file(&opentelemetry::Context::current(), "test.jpg", tag3.id)
            .unwrap();

        let mut files = HashMap::new();
        files.insert(
            String::from(""),
            vec![
                String::from("file1.txt"),
                String::from("test.jpg"),
                String::from("some-other.jpg"),
            ],
        );

        let response: HttpResponse = list_photos(
            claims,
            TestRequest::default().to_http_request(),
            request,
            Data::new(AppState::test_state()),
            Data::new(FakeFileSystem::new(files)),
            Data::new(FakeFileSystem::new(HashMap::new())),
            Data::new(Mutex::new(tag_dao)),
            Data::new(Mutex::new(
                Box::new(MockExifDao) as Box<dyn crate::database::ExifDao>
            )),
        )
        .await;

@@ -694,18 +1193,8 @@ mod tests {
        )
        .unwrap();

        let mut files = HashMap::new();
        files.insert(
            String::from(""),
            vec![
                String::from("file1.txt"),
                String::from("test.jpg"),
                String::from("some-other.jpg"),
            ],
        );

        let request: Query<FilesRequest> = Query::from_query(&format!(
            "path=&tag_ids={},{}&tag_filter_mode=All",
            "path=&tag_ids={},{}&tag_filter_mode=All&recursive=true",
            tag1.id, tag3.id
        ))
        .unwrap();
@@ -715,8 +1204,11 @@ mod tests {
            TestRequest::default().to_http_request(),
            request,
            Data::new(AppState::test_state()),
            Data::new(FakeFileSystem::new(files)),
            Data::new(FakeFileSystem::new(HashMap::new())),
            Data::new(Mutex::new(tag_dao)),
            Data::new(Mutex::new(
                Box::new(MockExifDao) as Box<dyn crate::database::ExifDao>
            )),
        )
        .await;

121
src/geo.rs
Normal file
@@ -0,0 +1,121 @@
/// Geographic calculation utilities for GPS-based search
use std::f64;

/// Calculate distance between two GPS coordinates using the Haversine formula.
/// Returns distance in kilometers.
///
/// # Arguments
/// * `lat1` - Latitude of first point in decimal degrees
/// * `lon1` - Longitude of first point in decimal degrees
/// * `lat2` - Latitude of second point in decimal degrees
/// * `lon2` - Longitude of second point in decimal degrees
///
/// # Example
/// ```
/// use image_api::geo::haversine_distance;
/// let distance = haversine_distance(37.7749, -122.4194, 34.0522, -118.2437);
/// // Distance between San Francisco and Los Angeles (~559 km)
/// ```
pub fn haversine_distance(lat1: f64, lon1: f64, lat2: f64, lon2: f64) -> f64 {
    const EARTH_RADIUS_KM: f64 = 6371.0;

    let lat1_rad = lat1.to_radians();
    let lat2_rad = lat2.to_radians();
    let delta_lat = (lat2 - lat1).to_radians();
    let delta_lon = (lon2 - lon1).to_radians();

    let a = (delta_lat / 2.0).sin().powi(2)
        + lat1_rad.cos() * lat2_rad.cos() * (delta_lon / 2.0).sin().powi(2);
    let c = 2.0 * a.sqrt().atan2((1.0 - a).sqrt());

    EARTH_RADIUS_KM * c
}

/// Calculate bounding box for GPS radius query.
/// Returns (min_lat, max_lat, min_lon, max_lon) that encompasses the search radius.
///
/// This is used as a fast first-pass filter for GPS queries, narrowing down
/// candidates before applying the more expensive Haversine distance calculation.
///
/// # Arguments
/// * `lat` - Center latitude in decimal degrees
/// * `lon` - Center longitude in decimal degrees
/// * `radius_km` - Search radius in kilometers
///
/// # Returns
/// A tuple of (min_lat, max_lat, min_lon, max_lon) in decimal degrees
pub fn gps_bounding_box(lat: f64, lon: f64, radius_km: f64) -> (f64, f64, f64, f64) {
    const EARTH_RADIUS_KM: f64 = 6371.0;

    // Calculate latitude delta (same at all latitudes)
    let lat_delta = (radius_km / EARTH_RADIUS_KM) * (180.0 / f64::consts::PI);

    // Calculate longitude delta (varies with latitude)
    let lon_delta = lat_delta / lat.to_radians().cos();

    (
        lat - lat_delta, // min_lat
        lat + lat_delta, // max_lat
        lon - lon_delta, // min_lon
        lon + lon_delta, // max_lon
    )
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_haversine_distance_sf_to_la() {
        // San Francisco to Los Angeles
        let distance = haversine_distance(37.7749, -122.4194, 34.0522, -118.2437);
        // Should be approximately 559 km
        assert!(
            (distance - 559.0).abs() < 10.0,
            "Distance should be ~559km, got {}",
            distance
        );
    }

    #[test]
    fn test_haversine_distance_same_point() {
        // Same point should have zero distance
        let distance = haversine_distance(37.7749, -122.4194, 37.7749, -122.4194);
        assert!(
            distance < 0.001,
            "Same point should have ~0 distance, got {}",
            distance
        );
    }

    #[test]
    fn test_gps_bounding_box() {
        // Test bounding box calculation for 10km radius around San Francisco
        let (min_lat, max_lat, min_lon, max_lon) = gps_bounding_box(37.7749, -122.4194, 10.0);

        // Verify the bounds are reasonable
        assert!(min_lat < 37.7749, "min_lat should be less than center");
        assert!(max_lat > 37.7749, "max_lat should be greater than center");
        assert!(min_lon < -122.4194, "min_lon should be less than center");
        assert!(max_lon > -122.4194, "max_lon should be greater than center");

        // Verify bounds span roughly the right distance
        let lat_span = max_lat - min_lat;
        assert!(
            lat_span > 0.1 && lat_span < 0.3,
            "Latitude span should be reasonable for 10km"
        );
    }

    #[test]
    fn test_haversine_distance_across_equator() {
        // Test across equator
        let distance = haversine_distance(1.0, 0.0, -1.0, 0.0);
        // Should be approximately 222 km
        assert!(
            (distance - 222.0).abs() < 5.0,
            "Distance should be ~222km, got {}",
            distance
        );
    }
}
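A usage sketch of the two-phase GPS search these utilities enable: the rectangular bounding box is the cheap prefilter (in the real code it narrows the SQL query), and `haversine_distance` then enforces the exact radius. The candidate coordinates are hypothetical; the middle one lies inside the box but outside the radius, so only the refinement rejects it:

```rust
use image_api::geo::{gps_bounding_box, haversine_distance};

fn main() {
    // Search for photos within 10 km of San Francisco.
    let (lat, lon, radius_km) = (37.7749, -122.4194, 10.0);

    // Phase 1: cheap rectangular prefilter.
    let (min_lat, max_lat, min_lon, max_lon) = gps_bounding_box(lat, lon, radius_km);

    // Hypothetical candidate coordinates from the EXIF index.
    let candidates = vec![
        (37.7793, -122.4193), // downtown SF: in box, within radius
        (37.86, -122.31),     // box corner: in box, ~13 km away
        (34.0522, -118.2437), // Los Angeles: rejected by the box
    ];

    // Phase 2: exact great-circle distance on the survivors.
    let hits: Vec<_> = candidates
        .into_iter()
        .filter(|(plat, plon)| {
            *plat >= min_lat && *plat <= max_lat && *plon >= min_lon && *plon <= max_lon
        })
        .filter(|(plat, plon)| haversine_distance(lat, lon, *plat, *plon) <= radius_km)
        .collect();

    assert_eq!(hits.len(), 1);
}
```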
42
src/lib.rs
Normal file
@@ -0,0 +1,42 @@
#[macro_use]
extern crate diesel;

pub mod auth;
pub mod cleanup;
pub mod data;
pub mod database;
pub mod error;
pub mod exif;
pub mod file_types;
pub mod files;
pub mod geo;
pub mod memories;
pub mod otel;
pub mod service;
pub mod state;
pub mod tags;
#[cfg(test)]
pub mod testhelpers;
pub mod video;

// Re-export commonly used types
pub use data::{Claims, ThumbnailRequest};
pub use database::{connect, schema};
pub use state::AppState;

// Stub functions for modules that reference main.rs
// These are not used by cleanup_files binary
use std::path::Path;
use walkdir::DirEntry;

pub fn create_thumbnails() {
    // Stub - implemented in main.rs
}

pub fn update_media_counts(_media_dir: &Path) {
    // Stub - implemented in main.rs
}

pub fn is_video(entry: &DirEntry) -> bool {
    file_types::direntry_is_video(entry)
}
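A minimal sketch of how a utility binary could consume the new library crate; the binary path and name here are hypothetical, only the `image_api::file_types` module and the `BASE_PATH` variable come from the repo:

```rust
// src/bin/list_media.rs (hypothetical utility binary)
use image_api::file_types;
use walkdir::WalkDir;

fn main() {
    let base = std::env::var("BASE_PATH").expect("BASE_PATH not set");
    // Count every supported media file under the configured base path.
    let count = WalkDir::new(&base)
        .into_iter()
        .filter_map(Result::ok)
        .filter(|e| e.file_type().is_file())
        .filter(file_types::direntry_is_media)
        .count();
    println!("{count} media files under {base}");
}
```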
436
src/main.rs
@@ -9,8 +9,8 @@ use futures::stream::StreamExt;
use lazy_static::lazy_static;
use prometheus::{self, IntGauge};
use std::error::Error;
use std::sync::Mutex;
use std::sync::mpsc::channel;
use std::sync::{Arc, Mutex};
use std::time::{Duration, SystemTime};
use std::{collections::HashMap, io::prelude::*};
use std::{env, fs::File};
use std::{
@@ -19,20 +19,20 @@ use std::{
};
use walkdir::{DirEntry, WalkDir};

use actix_cors::Cors;
use actix_files::NamedFile;
use actix_multipart as mp;
use actix_web::{
    App, HttpRequest, HttpResponse, HttpServer, Responder, delete, get, middleware, post, put,
    web::{self, BufMut, BytesMut},
};
use anyhow::Context;
use chrono::Utc;
use diesel::sqlite::Sqlite;
use notify::{Config, EventKind, RecommendedWatcher, RecursiveMode, Watcher};
use rayon::prelude::*;

use crate::auth::login;
use crate::data::*;
use crate::database::models::InsertImageExif;
use crate::database::*;
use crate::files::{
    RealFileSystem, RefreshThumbnailsMessage, is_image_or_video, is_valid_full_path, move_file,
@@ -53,7 +53,10 @@ mod auth;
mod data;
mod database;
mod error;
mod exif;
mod file_types;
mod files;
mod geo;
mod state;
mod tags;
mod video;
@@ -111,13 +114,23 @@ async fn get_image(
    if let Ok(file) = NamedFile::open(&thumb_path) {
        span.set_status(Status::Ok);
        // The NamedFile will automatically set the correct content-type
        return file.into_response(&request);
        // Enable ETag and set cache headers for thumbnails (1 day cache)
        return file
            .use_etag(true)
            .use_last_modified(true)
            .prefer_utf8(true)
            .into_response(&request);
    }
}

if let Ok(file) = NamedFile::open(&path) {
    span.set_status(Status::Ok);
    return file.into_response(&request);
    // Enable ETag and set cache headers for full images (1 hour cache)
    return file
        .use_etag(true)
        .use_last_modified(true)
        .prefer_utf8(true)
        .into_response(&request);
}

span.set_status(Status::error("Not found"));
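Note that `use_etag`/`use_last_modified` only enable validation-based caching; the durations mentioned in the comments ("1 day", "1 hour") would additionally need an explicit Cache-Control header. A sketch of one way to add it (helper name hypothetical; actix-web 4 API assumed):

```rust
use actix_files::NamedFile;
use actix_web::http::header::{HeaderValue, CACHE_CONTROL};
use actix_web::{HttpRequest, HttpResponse};

// Hypothetical helper: validators (ETag/Last-Modified) plus an explicit max-age.
fn serve_cached(file: NamedFile, request: &HttpRequest, max_age_secs: u64) -> HttpResponse {
    let mut response = file
        .use_etag(true)
        .use_last_modified(true)
        .into_response(request);
    let value = format!("max-age={max_age_secs}");
    response.headers_mut().insert(
        CACHE_CONTROL,
        HeaderValue::from_str(&value).expect("valid Cache-Control value"),
    );
    response
}
```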
@@ -130,14 +143,8 @@ async fn get_image(
}

fn is_video_file(path: &Path) -> bool {
    if let Some(extension) = path.extension() {
        matches!(
            extension.to_str().unwrap_or("").to_lowercase().as_str(),
            "mp4" | "mov" | "avi" | "mkv"
        )
    } else {
        false
    }
    use image_api::file_types;
    file_types::is_video_file(path)
}

#[get("/image/metadata")]
@@ -146,17 +153,31 @@ async fn get_file_metadata(
    request: HttpRequest,
    path: web::Query<ThumbnailRequest>,
    app_state: Data<AppState>,
    exif_dao: Data<Mutex<Box<dyn ExifDao>>>,
) -> impl Responder {
    let tracer = global_tracer();
    let context = extract_context_from_request(&request);
    let mut span = tracer.start_with_context("get_file_metadata", &context);
    match is_valid_full_path(&app_state.base_path, &path.path, false)
    let span_context =
        opentelemetry::Context::new().with_remote_span_context(span.span_context().clone());

    let full_path = is_valid_full_path(&app_state.base_path, &path.path, false);

    match full_path
        .ok_or_else(|| ErrorKind::InvalidData.into())
        .and_then(File::open)
        .and_then(|file| file.metadata())
    {
        Ok(metadata) => {
            let response: MetadataResponse = metadata.into();
            let mut response: MetadataResponse = metadata.into();

            // Query EXIF data if available
            if let Ok(mut dao) = exif_dao.lock()
                && let Ok(Some(exif)) = dao.get_exif(&span_context, &path.path)
            {
                response.exif = Some(exif.into());
            }

            span.add_event(
                "Metadata fetched",
                vec![KeyValue::new("file", path.path.clone())],
@@ -181,10 +202,13 @@ async fn upload_image(
    request: HttpRequest,
    mut payload: mp::Multipart,
    app_state: Data<AppState>,
    exif_dao: Data<Mutex<Box<dyn ExifDao>>>,
) -> impl Responder {
    let tracer = global_tracer();
    let context = extract_context_from_request(&request);
    let mut span = tracer.start_with_context("upload_image", &context);
    let span_context =
        opentelemetry::Context::new().with_remote_span_context(span.span_context().clone());

    let mut file_content: BytesMut = BytesMut::new();
    let mut file_name: Option<String> = None;
@@ -224,11 +248,12 @@ async fn upload_image(
        .span_builder("file write")
        .start_with_context(&tracer, &context);

    if !full_path.is_file() && is_image_or_video(&full_path) {
    let uploaded_path = if !full_path.is_file() && is_image_or_video(&full_path) {
        let mut file = File::create(&full_path).unwrap();
        file.write_all(&file_content).unwrap();

        info!("Uploaded: {:?}", full_path);
        full_path
    } else {
        warn!("File already exists: {:?}", full_path);

@@ -245,8 +270,60 @@ async fn upload_image(
        );
        info!("Uploaded: {}", new_path);

        let mut file = File::create(new_path).unwrap();
        let new_path_buf = PathBuf::from(&new_path);
        let mut file = File::create(&new_path_buf).unwrap();
        file.write_all(&file_content).unwrap();
        new_path_buf
    };

    // Extract and store EXIF data if file supports it
    if exif::supports_exif(&uploaded_path) {
        let relative_path = uploaded_path
            .strip_prefix(&app_state.base_path)
            .expect("Error stripping base path prefix")
            .to_str()
            .unwrap()
            .to_string();

        match exif::extract_exif_from_path(&uploaded_path) {
            Ok(exif_data) => {
                let timestamp = Utc::now().timestamp();
                let insert_exif = InsertImageExif {
                    file_path: relative_path.clone(),
                    camera_make: exif_data.camera_make,
                    camera_model: exif_data.camera_model,
                    lens_model: exif_data.lens_model,
                    width: exif_data.width,
                    height: exif_data.height,
                    orientation: exif_data.orientation,
                    gps_latitude: exif_data.gps_latitude,
                    gps_longitude: exif_data.gps_longitude,
                    gps_altitude: exif_data.gps_altitude,
                    focal_length: exif_data.focal_length,
                    aperture: exif_data.aperture,
                    shutter_speed: exif_data.shutter_speed,
                    iso: exif_data.iso,
                    date_taken: exif_data.date_taken,
                    created_time: timestamp,
                    last_modified: timestamp,
                };

                if let Ok(mut dao) = exif_dao.lock() {
                    if let Err(e) = dao.store_exif(&span_context, insert_exif) {
                        error!("Failed to store EXIF data for {}: {:?}", relative_path, e);
                    } else {
                        debug!("EXIF data stored for {}", relative_path);
                    }
                }
            }
            Err(e) => {
                debug!(
                    "No EXIF data or error extracting from {}: {:?}",
                    uploaded_path.display(),
                    e
                );
            }
        }
    }
} else {
    error!("Invalid path for upload: {:?}", full_path);
@@ -362,8 +439,34 @@ async fn get_video_part(
    let mut file_part = PathBuf::new();
    file_part.push(app_state.video_path.clone());
    file_part.push(part);
    // TODO: Do we need to guard against directory attacks here?
    match NamedFile::open(&file_part) {

    // Guard against directory traversal attacks
    let canonical_base = match std::fs::canonicalize(&app_state.video_path) {
        Ok(path) => path,
        Err(e) => {
            error!("Failed to canonicalize video path: {:?}", e);
            span.set_status(Status::error("Invalid video path configuration"));
            return HttpResponse::InternalServerError().finish();
        }
    };

    let canonical_file = match std::fs::canonicalize(&file_part) {
        Ok(path) => path,
        Err(_) => {
            warn!("Video part not found or invalid: {:?}", file_part);
            span.set_status(Status::error(format!("Video part not found '{}'", part)));
            return HttpResponse::NotFound().finish();
        }
    };

    // Ensure the resolved path is still within the video directory
    if !canonical_file.starts_with(&canonical_base) {
        warn!("Directory traversal attempt detected: {:?}", part);
        span.set_status(Status::error("Invalid video path"));
        return HttpResponse::Forbidden().finish();
    }

    match NamedFile::open(&canonical_file) {
        Ok(file) => {
            span.set_status(Status::Ok);
            file.into_response(&request)
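The canonicalize-then-`starts_with` guard above is simple enough to demonstrate with the standard library alone; a sketch (directory layout hypothetical):

```rust
use std::fs;
use std::path::PathBuf;

fn main() -> std::io::Result<()> {
    // Hypothetical layout: video parts live under ./videos, secrets outside it.
    let base = fs::canonicalize("videos")?;

    for part in ["clip0001.ts", "../secret.key"] {
        let requested: PathBuf = base.join(part);
        match fs::canonicalize(&requested) {
            // Canonicalization resolves `..` and symlinks, so a path that
            // escapes the base directory no longer starts_with() it.
            Ok(resolved) if resolved.starts_with(&base) => {
                println!("serve {}", resolved.display())
            }
            Ok(_) => println!("403 Forbidden: {part}"),
            Err(_) => println!("404 Not Found: {part}"),
        }
    }
    Ok(())
}
```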
@@ -575,23 +678,13 @@ fn update_media_counts(media_dir: &Path) {
}

fn is_image(entry: &DirEntry) -> bool {
    entry
        .path()
        .extension()
        .and_then(|ext| ext.to_str())
        .map(|ext| ext.to_lowercase())
        .map(|ext| ext == "jpg" || ext == "jpeg" || ext == "png" || ext == "nef")
        .unwrap_or(false)
    use image_api::file_types;
    file_types::direntry_is_image(entry)
}

fn is_video(entry: &DirEntry) -> bool {
    entry
        .path()
        .extension()
        .and_then(|ext| ext.to_str())
        .map(|ext| ext.to_lowercase())
        .map(|ext| ext == "mp4" || ext == "mov")
        .unwrap_or(false)
    use image_api::file_types;
    file_types::direntry_is_video(entry)
}

fn main() -> std::io::Result<()> {
@@ -645,8 +738,31 @@ fn main() -> std::io::Result<()> {
    let user_dao = SqliteUserDao::new();
    let favorites_dao = SqliteFavoriteDao::new();
    let tag_dao = SqliteTagDao::default();
    let exif_dao = SqliteExifDao::new();
    let cors = Cors::default()
        .allowed_origin_fn(|origin, _req_head| {
            // Allow all origins in development, or check against CORS_ALLOWED_ORIGINS env var
            if let Ok(allowed_origins) = env::var("CORS_ALLOWED_ORIGINS") {
                allowed_origins
                    .split(',')
                    .any(|allowed| origin.as_bytes() == allowed.trim().as_bytes())
            } else {
                // Default: allow all origins if not configured
                true
            }
        })
        .allowed_methods(vec!["GET", "POST", "PUT", "DELETE", "OPTIONS"])
        .allowed_headers(vec![
            actix_web::http::header::AUTHORIZATION,
            actix_web::http::header::ACCEPT,
            actix_web::http::header::CONTENT_TYPE,
        ])
        .supports_credentials()
        .max_age(3600);

    App::new()
        .wrap(middleware::Logger::default())
        .wrap(cors)
        .service(web::resource("/login").route(web::post().to(login::<SqliteUserDao>)))
        .service(
            web::resource("/photos")
@@ -673,6 +789,9 @@ fn main() -> std::io::Result<()> {
            favorites_dao,
        ))))
        .app_data::<Data<Mutex<SqliteTagDao>>>(Data::new(Mutex::new(tag_dao)))
        .app_data::<Data<Mutex<Box<dyn ExifDao>>>>(Data::new(Mutex::new(Box::new(
            exif_dao,
        ))))
        .wrap(prometheus.clone())
})
.bind(dotenv::var("BIND_URL").unwrap())?
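An example `.env` entry for the origin check above (origins hypothetical); entries are comma-separated and trimmed before the exact byte comparison, and leaving the variable unset allows all origins:

```bash
# Hypothetical .env entries: restrict CORS outside development.
CORS_ALLOWED_ORIGINS=https://photos.example.com,https://admin.example.com
```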
@@ -692,62 +811,215 @@ fn run_migrations(

fn watch_files() {
    std::thread::spawn(|| {
        let (wtx, wrx) = channel();
        let mut watcher = RecommendedWatcher::new(wtx, Config::default()).unwrap();
        let base_str = dotenv::var("BASE_PATH").unwrap();
        let base_path = Path::new(&base_str);
        let base_path = PathBuf::from(&base_str);

        watcher
            .watch(base_path, RecursiveMode::Recursive)
            .context(format!("Unable to watch BASE_PATH: '{}'", base_str))
            .unwrap();
        // Get polling intervals from environment variables
        // Quick scan: Check recently modified files (default: 60 seconds)
        let quick_interval_secs = dotenv::var("WATCH_QUICK_INTERVAL_SECONDS")
            .ok()
            .and_then(|s| s.parse::<u64>().ok())
            .unwrap_or(60);

        // Full scan: Check all files regardless of modification time (default: 3600 seconds = 1 hour)
        let full_interval_secs = dotenv::var("WATCH_FULL_INTERVAL_SECONDS")
            .ok()
            .and_then(|s| s.parse::<u64>().ok())
            .unwrap_or(3600);

        info!("Starting optimized file watcher");
        info!(" Quick scan interval: {} seconds", quick_interval_secs);
        info!(" Full scan interval: {} seconds", full_interval_secs);
        info!(" Watching directory: {}", base_str);

        // Create EXIF DAO for tracking processed files
        let exif_dao = Arc::new(Mutex::new(
            Box::new(SqliteExifDao::new()) as Box<dyn ExifDao>
        ));

        let mut last_quick_scan = SystemTime::now();
        let mut last_full_scan = SystemTime::now();
        let mut scan_count = 0u64;

        loop {
            let ev = wrx.recv();
            if let Ok(Ok(event)) = ev {
                match event.kind {
                    EventKind::Create(create_kind) => {
                        info!(
                            "Creating thumbnails {:?} create event kind: {:?}",
                            event.paths, create_kind
                        );
                        create_thumbnails();
                    }
                    EventKind::Modify(kind) => {
                        debug!("All modified paths: {:?}", event.paths);
                        debug!("Modify kind: {:?}", kind);
            std::thread::sleep(Duration::from_secs(quick_interval_secs));

                        if let Some(orig) = event.paths.first() {
                            let image_base_path = PathBuf::from(env::var("BASE_PATH").unwrap());
                            let image_relative = orig.strip_prefix(&image_base_path).unwrap();
                            if let Ok(old_thumbnail) =
                                env::var("THUMBNAILS").map(PathBuf::from).map(|mut base| {
                                    base.push(image_relative);
                                    base
                                })
                            {
                                if let Err(e) = std::fs::remove_file(&old_thumbnail) {
                                    error!(
                                        "Error removing thumbnail: {}\n{}",
                                        old_thumbnail.display(),
                                        e
                                    );
                                } else {
                                    info!("Deleted moved thumbnail: {}", old_thumbnail.display());
            let now = SystemTime::now();
            let since_last_full = now
                .duration_since(last_full_scan)
                .unwrap_or(Duration::from_secs(0));

                                    create_thumbnails();
                                }
                            }
                        }
                    }
            let is_full_scan = since_last_full.as_secs() >= full_interval_secs;

                    EventKind::Remove(_) => {
                        update_media_counts(&PathBuf::from(env::var("BASE_PATH").unwrap()))
                    }
            if is_full_scan {
                info!("Running full scan (scan #{})", scan_count);
                process_new_files(&base_path, Arc::clone(&exif_dao), None);
                last_full_scan = now;
            } else {
                debug!(
                    "Running quick scan (checking files modified in last {} seconds)",
                    quick_interval_secs + 10
                );
                // Check files modified since last quick scan, plus 10 second buffer
                let check_since = last_quick_scan
                    .checked_sub(Duration::from_secs(10))
                    .unwrap_or(last_quick_scan);
                process_new_files(&base_path, Arc::clone(&exif_dao), Some(check_since));
            }

                    _ => {}
                }
            };
            last_quick_scan = now;
            scan_count += 1;

            // Update media counts
            update_media_counts(&base_path);
        }
    });
}

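The quick/full cadence introduced above reduces to one time comparison; a standalone sketch of that decision (an annotation, not part of the diff):

```rust
use std::time::{Duration, SystemTime};

// True when enough time has passed since the last full scan; otherwise the
// caller runs a quick scan limited to recently modified files.
fn due_for_full_scan(last_full: SystemTime, full_interval_secs: u64) -> bool {
    SystemTime::now()
        .duration_since(last_full)
        .unwrap_or(Duration::ZERO)
        .as_secs()
        >= full_interval_secs
}

fn main() {
    let last_full = SystemTime::now() - Duration::from_secs(4000);
    // With the default one-hour full interval, 4000s ago means a full scan is due.
    assert!(due_for_full_scan(last_full, 3600));
    assert!(!due_for_full_scan(SystemTime::now(), 3600));
}
```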
fn process_new_files(
|
||||
base_path: &Path,
|
||||
exif_dao: Arc<Mutex<Box<dyn ExifDao>>>,
|
||||
modified_since: Option<SystemTime>,
|
||||
) {
|
||||
let context = opentelemetry::Context::new();
|
||||
let thumbs = dotenv::var("THUMBNAILS").expect("THUMBNAILS not defined");
|
||||
let thumbnail_directory = Path::new(&thumbs);
|
||||
|
||||
// Collect all image and video files, optionally filtered by modification time
|
||||
let files: Vec<(PathBuf, String)> = WalkDir::new(base_path)
|
||||
.into_iter()
|
||||
.filter_map(|entry| entry.ok())
|
||||
.filter(|entry| entry.file_type().is_file())
|
||||
.filter(|entry| {
|
||||
// Filter by modification time if specified
|
||||
if let Some(since) = modified_since {
|
||||
if let Ok(metadata) = entry.metadata()
|
||||
&& let Ok(modified) = metadata.modified()
|
||||
{
|
||||
return modified >= since;
|
||||
}
|
||||
// If we can't get metadata, include the file to be safe
|
||||
return true;
|
||||
}
|
||||
true
|
||||
})
|
||||
.filter(|entry| is_image(entry) || is_video(entry))
|
||||
.filter_map(|entry| {
|
||||
let file_path = entry.path().to_path_buf();
|
||||
let relative_path = file_path
|
||||
.strip_prefix(base_path)
|
||||
.ok()?
|
||||
.to_str()?
|
||||
            .to_string();

            Some((file_path, relative_path))
        })
        .collect();

    if files.is_empty() {
        debug!("No files to process");
        return;
    }

    debug!("Found {} files to check", files.len());

    // Batch query: Get all EXIF data for these files in one query
    let file_paths: Vec<String> = files.iter().map(|(_, rel_path)| rel_path.clone()).collect();

    let existing_exif_paths: HashMap<String, bool> = {
        let mut dao = exif_dao.lock().expect("Unable to lock ExifDao");
        match dao.get_exif_batch(&context, &file_paths) {
            Ok(exif_records) => exif_records
                .into_iter()
                .map(|record| (record.file_path, true))
                .collect(),
            Err(e) => {
                error!("Error batch querying EXIF data: {:?}", e);
                HashMap::new()
            }
        }
    };

    let mut new_files_found = false;
    let mut files_needing_exif = Vec::new();

    // Check each file for missing thumbnail or EXIF data
    for (file_path, relative_path) in files {
        // Check if thumbnail exists
        let thumb_path = thumbnail_directory.join(&relative_path);
        let needs_thumbnail = !thumb_path.exists();

        // Check if EXIF data exists (for supported files)
        let needs_exif = if exif::supports_exif(&file_path) {
            !existing_exif_paths.contains_key(&relative_path)
        } else {
            false
        };

        if needs_thumbnail || needs_exif {
            new_files_found = true;

            if needs_thumbnail {
                info!("New file detected (missing thumbnail): {}", relative_path);
            }

            if needs_exif {
                files_needing_exif.push((file_path, relative_path));
            }
        }
    }

    // Process EXIF data for files that need it
    if !files_needing_exif.is_empty() {
        info!(
            "Processing EXIF data for {} files",
            files_needing_exif.len()
        );

        for (file_path, relative_path) in files_needing_exif {
            match exif::extract_exif_from_path(&file_path) {
                Ok(exif_data) => {
                    let timestamp = Utc::now().timestamp();
                    let insert_exif = InsertImageExif {
                        file_path: relative_path.clone(),
                        camera_make: exif_data.camera_make,
                        camera_model: exif_data.camera_model,
                        lens_model: exif_data.lens_model,
                        width: exif_data.width,
                        height: exif_data.height,
                        orientation: exif_data.orientation,
                        gps_latitude: exif_data.gps_latitude,
                        gps_longitude: exif_data.gps_longitude,
                        gps_altitude: exif_data.gps_altitude,
                        focal_length: exif_data.focal_length,
                        aperture: exif_data.aperture,
                        shutter_speed: exif_data.shutter_speed,
                        iso: exif_data.iso,
                        date_taken: exif_data.date_taken,
                        created_time: timestamp,
                        last_modified: timestamp,
                    };

                    let mut dao = exif_dao.lock().expect("Unable to lock ExifDao");
                    if let Err(e) = dao.store_exif(&context, insert_exif) {
                        error!("Failed to store EXIF data for {}: {:?}", relative_path, e);
                    } else {
                        debug!("EXIF data stored for {}", relative_path);
                    }
                }
                Err(e) => {
                    debug!(
                        "No EXIF data or error extracting from {}: {:?}",
                        file_path.display(),
                        e
                    );
                }
            }
        }
    }

    // Generate thumbnails for all files that need them
    if new_files_found {
        info!("Processing thumbnails for new files...");
        create_thumbnails();
    }
}
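A note on the pattern above: the single `get_exif_batch` call replaces one database query per file with one query plus O(1) in-memory lookups. Since only presence is checked, a `HashSet<String>` would carry the same information as the `HashMap<String, bool>` used here. A minimal, self-contained sketch of the idea (hypothetical data, not the project's types):

```rust
use std::collections::HashSet;

// One bulk query result -> presence set -> O(1) membership tests per file.
fn needs_processing(all_paths: &[String], already_stored: Vec<String>) -> Vec<String> {
    let stored: HashSet<String> = already_stored.into_iter().collect();
    all_paths
        .iter()
        .filter(|p| !stored.contains(*p))
        .cloned()
        .collect()
}

fn main() {
    let stored = vec!["a.jpg".to_string()];
    let todo = needs_processing(&["a.jpg".into(), "b.jpg".into()], stored);
    assert_eq!(todo, vec!["b.jpg".to_string()]);
}
```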
394
src/memories.rs
@@ -1,17 +1,22 @@
 use actix_web::web::Data;
 use actix_web::{HttpRequest, HttpResponse, Responder, get, web};
 use chrono::LocalResult::{Ambiguous, Single};
-use chrono::{DateTime, Datelike, FixedOffset, Local, LocalResult, NaiveDate, TimeZone, Utc};
+use chrono::{
+    DateTime, Datelike, FixedOffset, Local, LocalResult, NaiveDate, TimeZone, Timelike, Utc,
+};
 use log::{debug, trace, warn};
 use opentelemetry::KeyValue;
-use opentelemetry::trace::{Span, Status, Tracer};
+use opentelemetry::trace::{Span, Status, TraceContextExt, Tracer};
 use rayon::prelude::*;
 use serde::{Deserialize, Serialize};
+use std::collections::HashSet;
 use std::path::Path;
+use std::path::PathBuf;
 use std::sync::Mutex;
 use walkdir::WalkDir;

 use crate::data::Claims;
+use crate::database::ExifDao;
 use crate::files::is_image_or_video;
 use crate::otel::{extract_context_from_request, global_tracer};
 use crate::state::AppState;
@@ -76,13 +81,14 @@ impl PathExcluder {
         if !self.excluded_patterns.is_empty() {
             for component in path.components() {
                 if let Some(comp_str) = component.as_os_str().to_str()
-                    && self.excluded_patterns.iter().any(|pat| pat == comp_str) {
-                    debug!(
-                        "PathExcluder: excluded by component pattern: {:?} (component: {:?}, patterns: {:?})",
-                        path, comp_str, self.excluded_patterns
-                    );
-                    return true;
-                }
+                    && self.excluded_patterns.iter().any(|pat| pat == comp_str)
+                {
+                    debug!(
+                        "PathExcluder: excluded by component pattern: {:?} (component: {:?}, patterns: {:?})",
+                        path, comp_str, self.excluded_patterns
+                    );
+                    return true;
+                }
             }
         }

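The exclusion check above matches whole path components, not substrings: a file is skipped only if some directory component equals an excluded pattern exactly. A small sketch of that semantics (the pattern names are hypothetical):

```rust
use std::path::Path;

fn main() {
    let excluded = ["@eaDir", ".thumbnails"]; // hypothetical patterns
    let is_excluded = |p: &str| {
        Path::new(p)
            .components()
            .filter_map(|c| c.as_os_str().to_str())
            .any(|c| excluded.contains(&c))
    };
    assert!(is_excluded("/media/@eaDir/img.jpg")); // exact component match
    assert!(!is_excluded("/media/eaDir-backup/img.jpg")); // substring is not enough
}
```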
@@ -124,8 +130,12 @@ fn get_file_date_info(
     // Read file metadata once
     let meta = std::fs::metadata(path).ok()?;

-    // Extract metadata timestamps
-    let metadata_created = meta.created().ok().map(|t| {
+    // Get created timestamp (tries filename first, then metadata)
+    let path_str = path.to_str()?;
+    let created = get_created_timestamp_with_fallback(path_str, &meta, client_timezone);
+
+    // Get modified timestamp from metadata
+    let modified = meta.modified().ok().map(|t| {
         let utc: DateTime<Utc> = t.into();
         if let Some(tz) = client_timezone {
             utc.with_timezone(tz).timestamp()
@@ -134,16 +144,7 @@ fn get_file_date_info(
         }
     });

-    let metadata_modified = meta.modified().ok().map(|t| {
-        let utc: DateTime<Utc> = t.into();
-        if let Some(tz) = client_timezone {
-            utc.with_timezone(tz).timestamp()
-        } else {
-            utc.timestamp()
-        }
-    });
-
-    // Try to get date from filename
+    // Try to get date from filename for the NaiveDate
     if let Some(date_time) = path
         .file_name()
         .and_then(|filename| filename.to_str())
@@ -156,20 +157,13 @@ fn get_file_date_info(
             date_time.with_timezone(&Local).fixed_offset()
         };

-        // Use the timestamp from the filename date
-        let created_ts = date_in_timezone.timestamp();
-
         debug!(
             "File date from file {:?} > {:?} = {:?}",
             path.file_name(),
             date_time,
             date_in_timezone
         );
-        return Some((
-            date_in_timezone.date_naive(),
-            Some(created_ts),
-            metadata_modified,
-        ));
+        return Some((date_in_timezone.date_naive(), created, modified));
     }

     // Fall back to metadata if no date in filename
@@ -183,10 +177,57 @@ fn get_file_date_info(
     };

     trace!("Fallback metadata create date = {:?}", date_in_timezone);
-    Some((date_in_timezone, metadata_created, metadata_modified))
+    Some((date_in_timezone, created, modified))
 }

-fn extract_date_from_filename(filename: &str) -> Option<DateTime<FixedOffset>> {
+/// Convert Unix timestamp to NaiveDate in client timezone
+fn timestamp_to_naive_date(
+    timestamp: i64,
+    client_timezone: &Option<FixedOffset>,
+) -> Option<NaiveDate> {
+    let dt_utc = DateTime::<Utc>::from_timestamp(timestamp, 0)?;
+
+    let date = if let Some(tz) = client_timezone {
+        dt_utc.with_timezone(tz).date_naive()
+    } else {
+        dt_utc.with_timezone(&Local).date_naive()
+    };
+
+    Some(date)
+}
+
+/// Get created timestamp, trying filename parsing first, then falling back to metadata
+fn get_created_timestamp_with_fallback(
+    file_path: &str,
+    metadata: &std::fs::Metadata,
+    client_timezone: &Option<FixedOffset>,
+) -> Option<i64> {
+    // Try to extract date from filename first
+    if let Some(filename_date) = Path::new(file_path)
+        .file_name()
+        .and_then(|f| f.to_str())
+        .and_then(extract_date_from_filename)
+    {
+        let timestamp = if let Some(tz) = client_timezone {
+            filename_date.with_timezone(tz).timestamp()
+        } else {
+            filename_date.timestamp()
+        };
+        return Some(timestamp);
+    }
+
+    // Fall back to metadata
+    metadata.created().ok().map(|t| {
+        let utc: DateTime<Utc> = t.into();
+        if let Some(tz) = client_timezone {
+            utc.with_timezone(tz).timestamp()
+        } else {
+            utc.timestamp()
+        }
+    })
+}
+
+pub fn extract_date_from_filename(filename: &str) -> Option<DateTime<FixedOffset>> {
     let build_date_from_ymd_capture =
         |captures: &regex::Captures| -> Option<DateTime<FixedOffset>> {
             let year = captures.get(1)?.as_str().parse::<i32>().ok()?;
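Worth noting about `timestamp_to_naive_date`: the same instant can land on different calendar days depending on the client offset, which is exactly why the offset is threaded through. A quick check with a fixed epoch second:

```rust
use chrono::{DateTime, FixedOffset, Utc};

fn main() {
    // 1_700_000_000 is 2023-11-14 22:13:20 UTC; at UTC+05:30 that instant is
    // already 2023-11-15 03:43:20, so the photo buckets into the next day.
    let dt_utc = DateTime::<Utc>::from_timestamp(1_700_000_000, 0).unwrap();
    let offset = FixedOffset::east_opt(5 * 3600 + 30 * 60).unwrap();
    assert_eq!(dt_utc.date_naive().to_string(), "2023-11-14");
    assert_eq!(dt_utc.with_timezone(&offset).date_naive().to_string(), "2023-11-15");
}
```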
@@ -267,9 +308,9 @@ fn extract_date_from_filename(filename: &str) -> Option<DateTime<FixedOffset>> {
             .ok()
             .and_then(DateTime::from_timestamp_millis)
             .map(|naive_dt| naive_dt.fixed_offset())
-            {
-                return Some(date_time);
-            }
+        {
+            return Some(date_time);
+        }

         // Second timestamp (10 digits)
         if timestamp_str.len() >= 10
@@ -278,24 +319,167 @@ fn extract_date_from_filename(filename: &str) -> Option<DateTime<FixedOffset>> {
             .ok()
             .and_then(|timestamp_secs| DateTime::from_timestamp(timestamp_secs, 0))
             .map(|naive_dt| naive_dt.fixed_offset())
-            {
-                return Some(date_time);
-            }
+        {
+            return Some(date_time);
+        }
     }

     None
 }

+/// Collect memories from EXIF database
+fn collect_exif_memories(
+    exif_dao: &Data<Mutex<Box<dyn ExifDao>>>,
+    context: &opentelemetry::Context,
+    base_path: &str,
+    now: NaiveDate,
+    span_mode: MemoriesSpan,
+    years_back: u32,
+    client_timezone: &Option<FixedOffset>,
+    path_excluder: &PathExcluder,
+) -> Vec<(MemoryItem, NaiveDate)> {
+    // Query database for all files with date_taken
+    let exif_records = match exif_dao.lock() {
+        Ok(mut dao) => match dao.get_all_with_date_taken(context) {
+            Ok(records) => records,
+            Err(e) => {
+                warn!("Failed to query EXIF database: {:?}", e);
+                return Vec::new(); // Graceful fallback
+            }
+        },
+        Err(e) => {
+            warn!("Failed to lock EXIF DAO: {:?}", e);
+            return Vec::new();
+        }
+    };
+
+    // Parallel processing with Rayon
+    exif_records
+        .par_iter()
+        .filter_map(|(file_path, date_taken_ts)| {
+            // Build full path
+            let full_path = Path::new(base_path).join(file_path);
+
+            // Check exclusions
+            if path_excluder.is_excluded(&full_path) {
+                return None;
+            }
+
+            // Verify file exists
+            if !full_path.exists() || !full_path.is_file() {
+                warn!("EXIF record exists but file not found: {:?}", full_path);
+                return None;
+            }
+
+            // Convert timestamp to NaiveDate in client timezone
+            let file_date = timestamp_to_naive_date(*date_taken_ts, client_timezone)?;
+
+            // Check if matches memory criteria
+            if !is_memories_match(file_path, file_date, now, span_mode, years_back) {
+                return None;
+            }
+
+            // Get file metadata for created/modified timestamps
+            let metadata = std::fs::metadata(&full_path).ok()?;
+            let created =
+                get_created_timestamp_with_fallback(file_path, &metadata, client_timezone);
+            let modified = metadata.modified().ok().map(|t| {
+                let utc: DateTime<Utc> = t.into();
+                if let Some(tz) = client_timezone {
+                    utc.with_timezone(tz).timestamp()
+                } else {
+                    utc.timestamp()
+                }
+            });
+
+            Some((
+                MemoryItem {
+                    path: file_path.clone(),
+                    created,
+                    modified,
+                },
+                file_date,
+            ))
+        })
+        .collect()
+}
+
+/// Collect memories from file system scan (for files not in EXIF DB)
+fn collect_filesystem_memories(
+    base_path: &str,
+    path_excluder: &PathExcluder,
+    skip_paths: &HashSet<PathBuf>,
+    now: NaiveDate,
+    span_mode: MemoriesSpan,
+    years_back: u32,
+    client_timezone: &Option<FixedOffset>,
+) -> Vec<(MemoryItem, NaiveDate)> {
+    let base = Path::new(base_path);
+
+    let entries: Vec<_> = WalkDir::new(base)
+        .into_iter()
+        .filter_map(|e| e.ok())
+        .filter(|e| {
+            let path = e.path();
+
+            // Skip if already processed by EXIF query
+            if skip_paths.contains(path) {
+                return false;
+            }
+
+            // Check exclusions
+            if path_excluder.is_excluded(path) {
+                return false;
+            }
+
+            // Only process image/video files
+            e.file_type().is_file() && is_image_or_video(path)
+        })
+        .collect();
+
+    entries
+        .par_iter()
+        .filter_map(|entry| {
+            // Use existing get_file_date_info() for filename/metadata fallback
+            let (file_date, created, modified) = get_file_date_info(entry.path(), client_timezone)?;
+
+            if is_memories_match(
+                entry.path().to_str().unwrap_or("Unknown"),
+                file_date,
+                now,
+                span_mode,
+                years_back,
+            ) {
+                let path_relative = entry.path().strip_prefix(base).ok()?.to_str()?.to_string();
+
+                Some((
+                    MemoryItem {
+                        path: path_relative,
+                        created,
+                        modified,
+                    },
+                    file_date,
+                ))
+            } else {
+                None
+            }
+        })
+        .collect()
+}
+
 #[get("/memories")]
 pub async fn list_memories(
     _claims: Claims,
     request: HttpRequest,
     q: web::Query<MemoriesRequest>,
     app_state: Data<AppState>,
+    exif_dao: Data<Mutex<Box<dyn ExifDao>>>,
 ) -> impl Responder {
     let tracer = global_tracer();
-    let context = extract_context_from_request(&request);
-    let mut span = tracer.start_with_context("list_memories", &context);
+    let parent_context = extract_context_from_request(&request);
+    let mut span = tracer.start_with_context("list_memories", &parent_context);
+    let span_context =
+        opentelemetry::Context::new().with_remote_span_context(span.span_context().clone());

     let span_mode = q.span.unwrap_or(MemoriesSpan::Day);
     let years_back: u32 = 15;
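Note that `collect_exif_memories` fails soft: a poisoned lock or a query error logs a warning and returns an empty `Vec`, so the filesystem phase still serves results. A minimal sketch of that shape (hypothetical error type, not the project's code):

```rust
fn exif_phase() -> Result<Vec<String>, &'static str> {
    Err("db unavailable") // hypothetical failure
}

fn main() {
    // Degrade to an empty result instead of failing the whole request.
    let from_db = exif_phase().unwrap_or_else(|e| {
        eprintln!("warn: {e}; relying on the filesystem scan");
        Vec::new()
    });
    assert!(from_db.is_empty());
}
```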
@@ -326,60 +510,95 @@ pub async fn list_memories(
     // Build the path excluder from base and env-configured exclusions
     let path_excluder = PathExcluder::new(base, &app_state.excluded_dirs);

-    let entries: Vec<_> = WalkDir::new(base)
-        .into_iter()
-        .filter_map(|e| e.ok())
-        .filter(|e| {
-            let path = e.path();
+    // Phase 1: Query EXIF database
+    let exif_memories = collect_exif_memories(
+        &exif_dao,
+        &span_context,
+        &app_state.base_path,
+        now,
+        span_mode,
+        years_back,
+        &client_timezone,
+        &path_excluder,
+    );

-            // Skip paths that should be excluded
-            if path_excluder.is_excluded(path) {
-                return false;
-            }
-
-            true
-        })
-        .filter(|e| e.file_type().is_file() && is_image_or_video(e.path()))
+    // Build HashSet for deduplication
+    let exif_paths: HashSet<PathBuf> = exif_memories
+        .iter()
+        .map(|(item, _)| PathBuf::from(&app_state.base_path).join(&item.path))
+        .collect();

-    let mut memories_with_dates: Vec<(MemoryItem, NaiveDate)> = entries
-        .par_iter()
-        .filter_map(|entry| {
-            let path = entry.path();
+    // Phase 2: File system scan (skip EXIF files)
+    let fs_memories = collect_filesystem_memories(
+        &app_state.base_path,
+        &path_excluder,
+        &exif_paths,
+        now,
+        span_mode,
+        years_back,
+        &client_timezone,
+    );

-            // Get file date and timestamps in one operation
-            let (file_date, created, modified) = match get_file_date_info(path, &client_timezone) {
-                Some(info) => info,
-                None => {
-                    warn!("No date info found for file: {:?}", path);
-                    return None;
-                }
-            };
-
-            if is_memories_match(file_date, now, span_mode, years_back) {
-                return if let Ok(rel) = path.strip_prefix(base) {
-                    Some((
-                        MemoryItem {
-                            path: rel.to_string_lossy().to_string(),
-                            created,
-                            modified,
-                        },
-                        file_date,
-                    ))
-                } else {
-                    warn!("Failed to strip prefix from path: {:?}", path);
-                    None
-                };
-            }
-
-            None
-        })
-        .collect();
+    // Phase 3: Merge and sort
+    let mut memories_with_dates = exif_memories;
+    memories_with_dates.extend(fs_memories);

     match span_mode {
         // Sort by absolute time for a more 'overview'
         MemoriesSpan::Month => memories_with_dates.sort_by(|a, b| a.1.cmp(&b.1)),
-        _ => {
+        // For week span, sort by day of month, then time of day, then year (oldest first)
+        MemoriesSpan::Week => {
             memories_with_dates.sort_by(|a, b| {
                 // First, sort by day of month
                 let day_cmp = a.1.day().cmp(&b.1.day());
                 if day_cmp != std::cmp::Ordering::Equal {
                     return day_cmp;
                 }

+                // Then sort by time of day
+                match (a.0.created, b.0.created) {
+                    (Some(a_time), Some(b_time)) => {
+                        // Convert timestamps to DateTime
+                        let a_dt_utc = DateTime::<Utc>::from_timestamp(a_time, 0).unwrap();
+                        let b_dt_utc = DateTime::<Utc>::from_timestamp(b_time, 0).unwrap();
+
+                        // Extract time of day in the appropriate timezone
+                        let a_time_of_day = if let Some(ref tz) = client_timezone {
+                            let dt = a_dt_utc.with_timezone(tz);
+                            (dt.hour(), dt.minute(), dt.second())
+                        } else {
+                            let dt = a_dt_utc.with_timezone(&Local);
+                            (dt.hour(), dt.minute(), dt.second())
+                        };
+
+                        let b_time_of_day = if let Some(ref tz) = client_timezone {
+                            let dt = b_dt_utc.with_timezone(tz);
+                            (dt.hour(), dt.minute(), dt.second())
+                        } else {
+                            let dt = b_dt_utc.with_timezone(&Local);
+                            (dt.hour(), dt.minute(), dt.second())
+                        };
+
+                        // Compare time of day
+                        let time_cmp = a_time_of_day.cmp(&b_time_of_day);
+                        if time_cmp != std::cmp::Ordering::Equal {
+                            return time_cmp;
+                        }
+
+                        // Finally, sort by year (oldest first)
+                        a.1.year().cmp(&b.1.year())
+                    }
+                    (Some(_), None) => std::cmp::Ordering::Less,
+                    (None, Some(_)) => std::cmp::Ordering::Greater,
+                    (None, None) => {
+                        // If no timestamps, just sort by year (oldest first)
+                        a.1.year().cmp(&b.1.year())
+                    }
+                }
             });
         }
+        // For day span, sort by day of month then by time
+        MemoriesSpan::Day => {
+            memories_with_dates.sort_by(|a, b| {
+                let day_comparison = a.1.day().cmp(&b.1.day());

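The week-span comparator above effectively sorts on the key (day of month, time of day, year), so shots taken around the same wall-clock time interleave across years. A worked example with hypothetical keys, using tuple ordering to mirror the comparator:

```rust
fn main() {
    // (day of month, (hour, min, sec), year) — hypothetical photos:
    // A: 2019-06-07 09:15, B: 2021-06-07 08:40, C: 2018-06-08 07:00
    let mut keys = vec![
        (7, (9, 15, 0), 2019), // A
        (7, (8, 40, 0), 2021), // B
        (8, (7, 0, 0), 2018),  // C
    ];
    keys.sort(); // lexicographic tuple order = day, then time of day, then year
    assert_eq!(keys[0].2, 2021); // B leads: earlier time of day beats older year
    assert_eq!(keys[2].0, 8);    // C is last despite the earliest clock time
}
```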
@@ -422,6 +641,7 @@ pub async fn list_memories(
 }

 fn is_memories_match(
+    file_path: &str,
     file_date: NaiveDate,
     today: NaiveDate,
     span: MemoriesSpan,
@@ -433,8 +653,8 @@ fn is_memories_match(
     let years_diff = (today.year() - file_date.year()).unsigned_abs();
     if years_diff > years_back {
         warn!(
-            "File date is too far in the past: {:?} vs {:?}",
-            file_date, today
+            "File ({}) date is too far in the past: {:?} vs {:?}",
+            file_path, file_date, today
         );
         return false;
     }

167
src/tags.rs
@@ -87,6 +87,7 @@ async fn get_tags<D: TagDao>(
 async fn get_all_tags<D: TagDao>(
     _: Claims,
     tag_dao: web::Data<Mutex<D>>,
+    exif_dao: web::Data<Mutex<Box<dyn crate::database::ExifDao>>>,
     request: HttpRequest,
     query: web::Query<GetTagsRequest>,
 ) -> impl Responder {
@@ -100,14 +101,31 @@ async fn get_all_tags<D: TagDao>(
         .map(|tags| {
             span_context.span().set_status(Status::Ok);

-            HttpResponse::Ok().json(
-                tags.iter()
-                    .map(|(tag_count, tag)| TagWithTagCount {
-                        tag: tag.clone(),
-                        tag_count: *tag_count,
-                    })
-                    .collect::<Vec<TagWithTagCount>>(),
-            )
+            let tags_response = tags
+                .iter()
+                .map(|(tag_count, tag)| TagWithTagCount {
+                    tag: tag.clone(),
+                    tag_count: *tag_count,
+                })
+                .collect::<Vec<TagWithTagCount>>();
+
+            // Get camera makes from EXIF database
+            let camera_makes = exif_dao
+                .lock()
+                .expect("Unable to get ExifDao")
+                .get_camera_makes(&span_context)
+                .unwrap_or_else(|e| {
+                    log::warn!("Failed to get camera makes: {:?}", e);
+                    Vec::new()
+                })
+                .into_iter()
+                .map(|(make, count)| CameraMakeCount { make, count })
+                .collect::<Vec<CameraMakeCount>>();
+
+            HttpResponse::Ok().json(AllTagsResponse {
+                tags: tags_response,
+                camera_makes,
+            })
         })
         .into_http_internal_err()
 }
@@ -208,6 +226,18 @@ pub struct TagWithTagCount {
     pub tag: Tag,
 }

+#[derive(Serialize, Debug)]
+pub struct CameraMakeCount {
+    pub make: String,
+    pub count: i64,
+}
+
+#[derive(Serialize, Debug)]
+pub struct AllTagsResponse {
+    pub tags: Vec<TagWithTagCount>,
+    pub camera_makes: Vec<CameraMakeCount>,
+}
+
 #[derive(Insertable, Clone, Debug)]
 #[diesel(table_name = tags)]
 pub struct InsertTag {
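For orientation, a sketch of the JSON the new response type serializes to; `Tag`'s fields are not shown in this diff, so the sample leaves `tags` empty and only fills `camera_makes` (values hypothetical):

```rust
// Assumes the structs above plus serde_json are in scope.
let response = AllTagsResponse {
    tags: Vec::new(),
    camera_makes: vec![CameraMakeCount {
        make: "Canon".to_string(),
        count: 42,
    }],
};
// -> {"tags":[],"camera_makes":[{"make":"Canon","count":42}]}
println!("{}", serde_json::to_string(&response).unwrap());
```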
@@ -273,6 +303,16 @@ pub trait TagDao {
         exclude_tag_ids: Vec<i32>,
         context: &opentelemetry::Context,
     ) -> anyhow::Result<Vec<FileWithTagCount>>;
+    fn update_photo_name(
+        &mut self,
+        old_name: &str,
+        new_name: &str,
+        context: &opentelemetry::Context,
+    ) -> anyhow::Result<()>;
+    fn get_all_photo_names(
+        &mut self,
+        context: &opentelemetry::Context,
+    ) -> anyhow::Result<Vec<String>>;
 }

 pub struct SqliteTagDao {
@@ -467,31 +507,51 @@ impl TagDao for SqliteTagDao {
         trace_db_call(context, "query", "get_files_with_all_tags", |_| {
             use diesel::dsl::*;

-            let exclude_subquery = tagged_photo::table
-                .filter(tagged_photo::tag_id.eq_any(exclude_tag_ids.clone()))
-                .select(tagged_photo::photo_name)
-                .into_boxed();
+            // Create the placeholders for the IN clauses
+            let tag_placeholders = std::iter::repeat_n("?", tag_ids.len())
+                .collect::<Vec<_>>()
+                .join(",");
+            let exclude_placeholders = std::iter::repeat_n("?", exclude_tag_ids.len())
+                .collect::<Vec<_>>()
+                .join(",");

-            tagged_photo::table
-                .filter(tagged_photo::tag_id.eq_any(tag_ids.clone()))
-                .filter(tagged_photo::photo_name.ne_all(exclude_subquery))
-                .group_by(tagged_photo::photo_name)
-                .select((
-                    tagged_photo::photo_name,
-                    count_distinct(tagged_photo::tag_id),
-                ))
-                .having(count_distinct(tagged_photo::tag_id).ge(tag_ids.len() as i64))
-                .get_results::<(String, i64)>(&mut self.connection)
-                .map(|results| {
-                    results
-                        .into_iter()
-                        .map(|(file_name, tag_count)| FileWithTagCount {
-                            file_name,
-                            tag_count,
-                        })
-                        .collect()
-                })
-                .with_context(|| format!("Unable to get Tagged photos with ids: {:?}", tag_ids))
+            let query = sql_query(format!(
+                r#"
+            WITH filtered_photos AS (
+                SELECT photo_name
+                FROM tagged_photo tp
+                WHERE tp.tag_id IN ({})
+                AND tp.photo_name NOT IN (
+                    SELECT photo_name
+                    FROM tagged_photo
+                    WHERE tag_id IN ({})
+                )
+                GROUP BY photo_name
+                HAVING COUNT(DISTINCT tag_id) >= {}
+            )
+            SELECT
+                fp.photo_name as file_name,
+                COUNT(DISTINCT tp2.tag_id) as tag_count
+            FROM filtered_photos fp
+            JOIN tagged_photo tp2 ON fp.photo_name = tp2.photo_name
+            GROUP BY fp.photo_name"#,
+                tag_placeholders,
+                exclude_placeholders,
+                tag_ids.len()
+            ))
+            .into_boxed();
+
+            // Bind all parameters
+            let query = tag_ids
+                .into_iter()
+                .fold(query, |q, id| q.bind::<Integer, _>(id));
+            let query = exclude_tag_ids
+                .into_iter()
+                .fold(query, |q, id| q.bind::<Integer, _>(id));
+
+            query
+                .load::<FileWithTagCount>(&mut self.connection)
+                .with_context(|| "Unable to get tagged photos with all specified tags")
         })
     }

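One subtlety in the raw-SQL rewrite above: the boxed query's `bind` is positional, so the fold over `tag_ids` must run before the fold over `exclude_tag_ids` to line up with the first and second `IN (...)` lists. For hypothetical inputs `tag_ids = [1, 2]` and `exclude_tag_ids = [3]`, the placeholders and bind order work out as:

```rust
fn main() {
    // repeat_n("?", n) joined with "," builds the IN-clause placeholders.
    let tag_placeholders = std::iter::repeat_n("?", 2).collect::<Vec<_>>().join(",");
    let exclude_placeholders = std::iter::repeat_n("?", 1).collect::<Vec<_>>().join(",");
    assert_eq!(tag_placeholders, "?,?");   // -> "tp.tag_id IN (?,?)"
    assert_eq!(exclude_placeholders, "?"); // -> "WHERE tag_id IN (?)"
    // Binds are then applied positionally: [1, 2] first, then [3].
}
```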
@@ -546,6 +606,33 @@ impl TagDao for SqliteTagDao {
                 .with_context(|| "Unable to get tagged photos")
         })
     }
+
+    fn update_photo_name(
+        &mut self,
+        old_name: &str,
+        new_name: &str,
+        _context: &opentelemetry::Context,
+    ) -> anyhow::Result<()> {
+        use crate::database::schema::tagged_photo::dsl::*;
+
+        diesel::update(tagged_photo.filter(photo_name.eq(old_name)))
+            .set(photo_name.eq(new_name))
+            .execute(&mut self.connection)?;
+        Ok(())
+    }
+
+    fn get_all_photo_names(
+        &mut self,
+        _context: &opentelemetry::Context,
+    ) -> anyhow::Result<Vec<String>> {
+        use crate::database::schema::tagged_photo::dsl::*;
+
+        tagged_photo
+            .select(photo_name)
+            .distinct()
+            .load(&mut self.connection)
+            .with_context(|| "Unable to get photo names")
+    }
 }

 #[cfg(test)]
@@ -706,6 +793,22 @@ mod tests {
         ) -> anyhow::Result<Vec<FileWithTagCount>> {
             todo!()
         }
+
+        fn update_photo_name(
+            &mut self,
+            old_name: &str,
+            new_name: &str,
+            context: &opentelemetry::Context,
+        ) -> anyhow::Result<()> {
+            todo!()
+        }
+
+        fn get_all_photo_names(
+            &mut self,
+            context: &opentelemetry::Context,
+        ) -> anyhow::Result<Vec<String>> {
+            todo!()
+        }
     }

     #[actix_rt::test]