Go back to Django

Joakim Hellsén 2024-02-19 05:44:08 +01:00
commit 7eee113cdf
22 changed files with 1481 additions and 172 deletions


@@ -1,10 +1,11 @@
 {
   "$schema": "https://docs.renovatebot.com/renovate-schema.json",
-  "extends": ["config:recommended"],
+  "extends": [
+    "config:recommended"
+  ],
   "automerge": true,
   "configMigration": true,
   "dependencyDashboard": false,
   "osvVulnerabilityAlerts": true,
-  "timezone": "Europe/Stockholm",
-  "postUpdateOptions": ["gomodTidy"]
+  "timezone": "Europe/Stockholm"
 }

.github/workflows/docker-publish.yml (new file)

@@ -0,0 +1,64 @@
name: Test and Build Docker Image

on:
  push:
  pull_request:
  workflow_dispatch:
  schedule:
    - cron: "0 0 * * *"

jobs:
  test:
    runs-on: ubuntu-latest
    env:
      SECRET_KEY: 1234567890
      DEBUG: True
      ADMIN_EMAIL: 4153203+TheLovinator1@users.noreply.github.com
      EMAIL_HOST_USER: ${{ secrets.EMAIL_HOST_USER }}
      EMAIL_HOST_PASSWORD: ${{ secrets.EMAIL_HOST_PASSWORD }}
      POSTGRES_PASSWORD: githubtest
      POSTGRES_HOST: 127.0.0.1
      POSTGRES_USER: feedvault
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_PASSWORD: ${{ env.POSTGRES_PASSWORD }}
          POSTGRES_HOST: ${{ env.POSTGRES_HOST }}
          POSTGRES_USER: ${{ env.POSTGRES_USER }}
        ports:
          - 5432:5432
        options: --health-cmd pg_isready --health-interval 1s --health-timeout 5s --health-retries 5
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: 3.12
      - run: pipx install poetry
      - run: pipx inject poetry poetry-plugin-export
      - run: poetry install
      - run: poetry run python manage.py migrate
      # - run: poetry run python manage.py test

  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    if: github.event_name != 'pull_request'
    concurrency:
      group: ${{ github.workflow }}-${{ github.ref }}
      cancel-in-progress: true
    needs: test
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          context: .
          push: ${{ github.event_name != 'pull_request' }}
          tags: |
            ghcr.io/thelovinator1/feedvault:latest

.github/workflows/ruff.yml (new file)

@@ -0,0 +1,14 @@
name: Ruff

on:
  push:
  pull_request:
  schedule:
    - cron: "0 0 * * *" # Run every day
  workflow_dispatch:

jobs:
  ruff:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: chartboost/ruff-action@v1


@@ -28,6 +28,7 @@
     "feedvault",
     "gaierror",
     "giga",
+    "githubtest",
     "godotenv",
     "gofeed",
     "gomod",
@@ -63,6 +64,8 @@
     "pgtype",
     "PGUSER",
     "pgxpool",
+    "pipx",
+    "Plipp",
     "Prés",
     "pressly",
     "psql",

Dockerfile (new file)

@@ -0,0 +1,64 @@
# Stage 1: Build the requirements.txt using Poetry
FROM python:3.12-slim AS builder

# Set environment variables for Python
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PATH="${PATH}:/root/.local/bin"

# Install system dependencies
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
    curl \
    git \
    && rm -rf /var/lib/apt/lists/*

# Install Poetry
RUN curl -sSL https://install.python-poetry.org | python3 -

# Copy only the poetry.lock/pyproject.toml to leverage Docker cache
WORKDIR /app
COPY pyproject.toml poetry.lock /app/

# Install dependencies and create requirements.txt
RUN poetry self add poetry-plugin-export && poetry export --format=requirements.txt --output=requirements.txt --only=main --without-hashes

# Stage 2: Install dependencies and run the Django application
FROM python:3.12-slim AS runner

# Set environment variables for Python
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# Create a non-root user
RUN useradd -ms /bin/bash appuser

# Install system dependencies
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
    libpq-dev \
    git \
    netcat-openbsd \
    && rm -rf /var/lib/apt/lists/*

# Copy the generated requirements.txt from the builder stage
WORKDIR /app
COPY --from=builder /app/requirements.txt /app/

# Install application dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application code
COPY . /app/

# Change ownership of the application directory to the non-root user
RUN chown -R appuser:appuser /app

# Switch to the non-root user
USER appuser

# The port the application will listen on
EXPOSE 8000

# Run startup script
CMD ["./docker-entrypoint.sh"]


@@ -2,7 +2,7 @@
 
 _A seed vault for your feeds._
 
-FeedVault is an open-source web application written in Golang that allows users to archive and search their favorite RSS, Atom, and JSON feeds. With FeedVault, users can effortlessly add their favorite feeds, ensuring they have a centralized location for accessing and preserving valuable content.
+FeedVault is an open-source web application that allows users to archive and search their favorite RSS, Atom, and JSON feeds. With FeedVault, users can effortlessly add their favorite feeds, ensuring they have a centralized location for accessing and preserving valuable content.
 
 ## Features
@@ -34,13 +34,4 @@ Try to minimize the number of dependencies you add to the project. If you need t
 
 For any inquiries or support, please create an issue on GitHub.
 
-## Development
-
-- I use [goose](https://github.com/pressly/goose) and [sqlc](https://github.com/sqlc-dev/sqlc) for database migrations and queries.
-- To create a new migration, run `goose create <migration_name> sql`. Then, edit the file in `sql/schema/<date>_<migration_name>.sql` and run `goose up` to apply the migration.
-  - You will have to install `goose` first. See the [Goose documentation](https://pressly.github.io/goose/installation/).
-  - You will also have to install `sqlc`. See the [sqlc documentation](https://docs.sqlc.dev/en/latest/overview/install.html).
-  - You have to set some environment variables for this. See [.vscode/settings.json](.vscode/settings.json) for local development.
-- To generate new queries, run `sqlc generate`.
-
 Thank you for using FeedVault! Happy archiving!


@@ -6,7 +6,6 @@ services:
     user: "1000:1000"
     restart: always
     networks:
-      - redis
      - db
      - web
    environment:
@@ -15,8 +14,6 @@ services:
      - ADMIN_EMAIL=${ADMIN_EMAIL}
      - EMAIL_HOST_USER=${EMAIL_HOST_USER}
      - EMAIL_HOST_PASSWORD=${EMAIL_HOST_PASSWORD}
-      - REDIS_HOST=redis
-      - REDIS_PASSWORD=${REDIS_PASSWORD}
      - POSTGRES_HOST=feedvault_postgres
      - POSTGRES_PORT=5432
      - POSTGRES_USER=feedvault

docker-entrypoint.sh (new file)

@@ -0,0 +1,28 @@
#!/bin/sh

# Exit on error
set -e

# Debug
set -x

# 1. Wait for database
echo "Waiting for database"
while ! nc -z "$POSTGRES_HOST" "$POSTGRES_PORT"; do
  sleep 0.1
done
echo "Database started"

# 2. Apply database migrations
echo "Apply database migrations"
python manage.py migrate
echo "Apply database migrations done"

# https://docs.gunicorn.org/en/stable/design.html#how-many-workers
num_cores=$(nproc --all)
workers=$((2 * num_cores + 1))

# 3. Start server
echo "Starting server with $workers workers"
gunicorn --workers=$workers --bind=0.0.0.0:8000 feedvault.wsgi:application --log-level=info --access-logfile=- --error-logfile=- --forwarded-allow-ips="172.*,192.*" --proxy-allow-from="172.*,192.*"

echo "Bye, love you"

feeds/add_feeds.py (new file)

@@ -0,0 +1,296 @@
from __future__ import annotations

import datetime
import logging
from time import mktime, struct_time
from urllib.parse import ParseResult, urlparse

import feedparser
from django.utils import timezone
from feedparser import FeedParserDict

from feeds.models import Author, Domain, Entry, Feed, Generator, Publisher

logger: logging.Logger = logging.getLogger(__name__)


def get_domain(url: str | None) -> None | str:
    """Get the domain of a URL."""
    if not url:
        return None

    # Parse the URL.
    parsed_url: ParseResult = urlparse(url)
    if not parsed_url:
        logger.error("Error parsing URL: %s", url)
        return None

    # Get the domain.
    return str(parsed_url.netloc)
def get_author(parsed_feed: dict) -> Author:
    """Get the author of a feed.

    Args:
        parsed_feed: The parsed feed.

    Returns:
        The author of the feed. If the author doesn't exist, it will be created.
    """
    # A dictionary with details about the author of this entry.
    author_detail: dict = parsed_feed.get("author_detail", {})
    author = Author(
        name=author_detail.get("name", ""),
        href=author_detail.get("href", ""),
        email=author_detail.get("email", ""),
    )

    # Create the author if it doesn't exist.
    try:
        author: Author = Author.objects.get(name=author.name, email=author.email, href=author.href)
    except Author.DoesNotExist:
        author.save()
        logger.info("Created author: %s", author)

    return author
def def_generator(parsed_feed: dict) -> Generator:
    """Get the generator of a feed.

    Args:
        parsed_feed: The parsed feed.

    Returns:
        The generator of the feed. If the generator doesn't exist, it will be created.
    """
    generator_detail: dict = parsed_feed.get("generator_detail", {})
    generator = Generator(
        name=generator_detail.get("name", ""),
        href=generator_detail.get("href", ""),
        version=generator_detail.get("version", ""),
    )

    # Create the generator if it doesn't exist.
    try:
        generator: Generator = Generator.objects.get(
            name=generator.name,
            href=generator.href,
            version=generator.version,
        )
    except Generator.DoesNotExist:
        generator.save()
        logger.info("Created generator: %s", generator)

    return generator
def get_publisher(parsed_feed: dict) -> Publisher:
    """Get the publisher of a feed.

    Args:
        parsed_feed: The parsed feed.

    Returns:
        The publisher of the feed. If the publisher doesn't exist, it will be created.
    """
    publisher_detail: dict = parsed_feed.get("publisher_detail", {})
    publisher = Publisher(
        name=publisher_detail.get("name", ""),
        href=publisher_detail.get("href", ""),
        email=publisher_detail.get("email", ""),
    )

    # Create the publisher if it doesn't exist.
    try:
        publisher: Publisher = Publisher.objects.get(
            name=publisher.name,
            href=publisher.href,
            email=publisher.email,
        )
    except Publisher.DoesNotExist:
        publisher.save()
        logger.info("Created publisher: %s", publisher)

    return publisher
def parse_feed(url: str | None) -> dict | None:
    """Parse a feed.

    Args:
        url: The URL of the feed.

    Returns:
        The parsed feed.
    """
    # TODO(TheLovinator): Backup the feed URL to a cloudflare worker.  # noqa: TD003
    if not url:
        return None

    # Parse the feed.
    parsed_feed: dict = feedparser.parse(url)
    if not parsed_feed:
        return None

    return parsed_feed


def struct_time_to_datetime(struct_time: struct_time | None) -> datetime.datetime | None:
    """Convert a struct_time to a datetime."""
    if not struct_time:
        return None

    dt: datetime.datetime = datetime.datetime.fromtimestamp(mktime(struct_time), tz=datetime.timezone.utc)
    if not dt:
        logger.error("Error converting struct_time to datetime: %s", struct_time)
        return None

    return dt
def add_entry(feed: Feed, entry: FeedParserDict) -> Entry | None:
    """Add an entry to the database.

    Args:
        entry: The entry to add.
        feed: The feed the entry belongs to.
    """
    author: Author = get_author(parsed_feed=entry)
    publisher: Publisher = get_publisher(parsed_feed=entry)

    updated_parsed: datetime.datetime | None = struct_time_to_datetime(struct_time=entry.get("updated_parsed"))  # type: ignore  # noqa: PGH003
    published_parsed: datetime.datetime | None = struct_time_to_datetime(struct_time=entry.get("published_parsed"))  # type: ignore  # noqa: PGH003
    expired_parsed: datetime.datetime | None = struct_time_to_datetime(struct_time=entry.get("expired_parsed"))  # type: ignore  # noqa: PGH003
    created_parsed: datetime.datetime | None = struct_time_to_datetime(struct_time=entry.get("created_parsed"))  # type: ignore  # noqa: PGH003

    _entry = Entry(
        feed=feed,
        author=entry.get("author", ""),
        author_detail=author,
        comments=entry.get("comments", ""),
        content=entry.get("content", {}),
        contributors=entry.get("contributors", {}),
        created=entry.get("created", ""),
        created_parsed=created_parsed,
        enclosures=entry.get("enclosures", []),
        expired=entry.get("expired", ""),
        expired_parsed=expired_parsed,
        _id=entry.get("id", ""),
        license=entry.get("license", ""),
        link=entry.get("link", ""),
        links=entry.get("links", []),
        published=entry.get("published", ""),
        published_parsed=published_parsed,
        publisher=entry.get("publisher", ""),
        publisher_detail=publisher,
        source=entry.get("source", {}),
        summary=entry.get("summary", ""),
        summary_detail=entry.get("summary_detail", {}),
        tags=entry.get("tags", []),
        title=entry.get("title", ""),
        title_detail=entry.get("title_detail", {}),
        updated=entry.get("updated", ""),
        updated_parsed=updated_parsed,
    )

    # Save the entry.
    try:
        _entry.save()
    except Exception:
        logger.exception("Error saving entry for feed: %s", feed)
        return None

    logger.info("Created entry: %s", _entry)
    return _entry
def add_feed(url: str | None) -> None | Feed:
    """Add a feed to the database."""
    # Parse the feed.
    parsed_feed: dict | None = parse_feed(url=url)
    if not parsed_feed:
        return None

    domain_url: None | str = get_domain(url=url)
    if not domain_url:
        return None

    # Create the domain if it doesn't exist.
    domain: Domain
    domain, created = Domain.objects.get_or_create(url=domain_url)
    if created:
        logger.info("Created domain: %s", domain.url)
        domain.save()

    author: Author = get_author(parsed_feed=parsed_feed)
    generator: Generator = def_generator(parsed_feed=parsed_feed)
    publisher: Publisher = get_publisher(parsed_feed=parsed_feed)

    published_parsed: datetime.datetime | None = struct_time_to_datetime(struct_time=parsed_feed.get("published_parsed"))  # type: ignore  # noqa: PGH003
    updated_parsed: datetime.datetime | None = struct_time_to_datetime(struct_time=parsed_feed.get("updated_parsed"))  # type: ignore  # noqa: PGH003

    # Create the feed
    feed = Feed(
        feed_url=url,
        domain=domain,
        last_checked=timezone.now(),
        bozo=parsed_feed.get("bozo", 0),
        bozo_exception=parsed_feed.get("bozo_exception", ""),
        encoding=parsed_feed.get("encoding", ""),
        etag=parsed_feed.get("etag", ""),
        headers=parsed_feed.get("headers", {}),
        href=parsed_feed.get("href", ""),
        modified=parsed_feed.get("modified"),
        namespaces=parsed_feed.get("namespaces", {}),
        status=parsed_feed.get("status", 0),
        version=parsed_feed.get("version", ""),
        author=parsed_feed.get("author", ""),
        author_detail=author,
        cloud=parsed_feed.get("cloud", {}),
        contributors=parsed_feed.get("contributors", {}),
        docs=parsed_feed.get("docs", ""),
        errorreportsto=parsed_feed.get("errorreportsto", ""),
        generator=parsed_feed.get("generator", ""),
        generator_detail=generator,
        icon=parsed_feed.get("icon", ""),
        _id=parsed_feed.get("id", ""),
        image=parsed_feed.get("image", {}),
        info=parsed_feed.get("info", ""),
        language=parsed_feed.get("language", ""),
        license=parsed_feed.get("license", ""),
        link=parsed_feed.get("link", ""),
        links=parsed_feed.get("links", []),
        logo=parsed_feed.get("logo", ""),
        published=parsed_feed.get("published", ""),
        published_parsed=published_parsed,
        publisher=parsed_feed.get("publisher", ""),
        publisher_detail=publisher,
        rights=parsed_feed.get("rights", ""),
        rights_detail=parsed_feed.get("rights_detail", {}),
        subtitle=parsed_feed.get("subtitle", ""),
        subtitle_detail=parsed_feed.get("subtitle_detail", {}),
        tags=parsed_feed.get("tags", []),
        textinput=parsed_feed.get("textinput", {}),
        title=parsed_feed.get("title", ""),
        title_detail=parsed_feed.get("title_detail", {}),
        ttl=parsed_feed.get("ttl", ""),
        updated=parsed_feed.get("updated", ""),
        updated_parsed=updated_parsed,
    )

    # Save the feed.
    try:
        feed.save()
    except Exception:
        logger.exception("Error saving feed: %s", feed)
        return None

    entries = parsed_feed.get("entries", [])
    for entry in entries:
        added_entry: Entry | None = add_entry(feed=feed, entry=entry)
        if not added_entry:
            continue

    logger.info("Created feed: %s", feed)
    return feed
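The domain extraction in `get_domain` is plain `urllib.parse`; it can be checked in isolation, without the Django models, with a short standalone sketch (the example URL is hypothetical):

```python
from urllib.parse import ParseResult, urlparse

# The same netloc extraction get_domain() performs, without the Django models.
parsed: ParseResult = urlparse("https://example.com/feeds/all.atom.xml")
print(parsed.netloc)  # example.com
```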


@@ -1,4 +1,4 @@
-# Generated by Django 5.0.2 on 2024-02-18 20:59
+# Generated by Django 5.0.2 on 2024-02-19 02:47
 
 import django.db.models.deletion
 from django.db import migrations, models
@@ -12,13 +12,24 @@ class Migration(migrations.Migration):
     ]
 
     operations = [
+        migrations.CreateModel(
+            name='Author',
+            fields=[
+                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+                ('created_at', models.DateTimeField(auto_now_add=True)),
+                ('modified_at', models.DateTimeField(auto_now=True)),
+                ('name', models.TextField(blank=True)),
+                ('href', models.TextField(blank=True)),
+                ('email', models.TextField(blank=True)),
+            ],
+        ),
         migrations.CreateModel(
             name='Domain',
             fields=[
                 ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
-                ('name', models.CharField(max_length=255, unique=True)),
-                ('url', models.URLField()),
-                ('categories', models.JSONField()),
+                ('url', models.URLField(unique=True)),
+                ('name', models.CharField(max_length=255)),
+                ('categories', models.JSONField(blank=True, null=True)),
                 ('created_at', models.DateTimeField(auto_now_add=True)),
                 ('modified_at', models.DateTimeField(auto_now=True)),
                 ('hidden', models.BooleanField(default=False)),
@@ -27,55 +38,95 @@ class Migration(migrations.Migration):
             ],
         ),
         migrations.CreateModel(
-            name='Feed',
+            name='Links',
             fields=[
                 ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                 ('created_at', models.DateTimeField(auto_now_add=True)),
                 ('modified_at', models.DateTimeField(auto_now=True)),
+                ('rel', models.TextField(blank=True)),
+                ('type', models.TextField(blank=True)),
+                ('href', models.TextField(blank=True)),
+                ('title', models.TextField(blank=True)),
+            ],
+        ),
+        migrations.CreateModel(
+            name='Publisher',
+            fields=[
+                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+                ('created_at', models.DateTimeField(auto_now_add=True)),
+                ('modified_at', models.DateTimeField(auto_now=True)),
+                ('name', models.TextField(blank=True)),
+                ('href', models.TextField(blank=True)),
+                ('email', models.TextField(blank=True)),
+            ],
+        ),
+        migrations.CreateModel(
+            name='Generator',
+            fields=[
+                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+                ('created_at', models.DateTimeField(auto_now_add=True)),
+                ('modified_at', models.DateTimeField(auto_now=True)),
+                ('name', models.TextField(blank=True)),
+                ('href', models.TextField(blank=True)),
+                ('version', models.TextField(blank=True)),
+            ],
+            options={
+                'unique_together': {('name', 'version', 'href')},
+            },
+        ),
+        migrations.CreateModel(
+            name='Feed',
+            fields=[
+                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+                ('feed_url', models.URLField(unique=True)),
+                ('created_at', models.DateTimeField(auto_now_add=True)),
+                ('modified_at', models.DateTimeField(auto_now=True)),
+                ('last_checked', models.DateTimeField(blank=True, null=True)),
+                ('active', models.BooleanField(default=True)),
                 ('bozo', models.BooleanField()),
-                ('bozo_exception', models.TextField()),
-                ('encoding', models.TextField()),
-                ('etag', models.TextField()),
-                ('headers', models.JSONField()),
-                ('href', models.TextField()),
-                ('modified', models.DateTimeField()),
-                ('namespaces', models.JSONField()),
+                ('bozo_exception', models.TextField(blank=True)),
+                ('encoding', models.TextField(blank=True)),
+                ('etag', models.TextField(blank=True)),
+                ('headers', models.JSONField(blank=True, null=True)),
+                ('href', models.TextField(blank=True)),
+                ('modified', models.DateTimeField(blank=True, null=True)),
+                ('namespaces', models.JSONField(blank=True, null=True)),
                 ('status', models.IntegerField()),
-                ('version', models.CharField(max_length=50)),
-                ('author', models.TextField()),
-                ('author_detail', models.JSONField()),
-                ('cloud', models.JSONField()),
-                ('contributors', models.JSONField()),
-                ('docs', models.TextField()),
-                ('errorreportsto', models.TextField()),
-                ('generator', models.TextField()),
-                ('generator_detail', models.TextField()),
-                ('icon', models.TextField()),
-                ('_id', models.TextField()),
-                ('image', models.JSONField()),
-                ('info', models.TextField()),
-                ('info_detail', models.JSONField()),
-                ('language', models.TextField()),
-                ('license', models.TextField()),
-                ('link', models.TextField()),
-                ('links', models.JSONField()),
-                ('logo', models.TextField()),
-                ('published', models.TextField()),
-                ('published_parsed', models.DateTimeField()),
-                ('publisher', models.TextField()),
-                ('publisher_detail', models.JSONField()),
-                ('rights', models.TextField()),
-                ('rights_detail', models.JSONField()),
-                ('subtitle', models.TextField()),
-                ('subtitle_detail', models.JSONField()),
-                ('tags', models.JSONField()),
-                ('textinput', models.JSONField()),
-                ('title', models.TextField()),
-                ('title_detail', models.JSONField()),
-                ('ttl', models.TextField()),
-                ('updated', models.TextField()),
-                ('updated_parsed', models.DateTimeField()),
+                ('version', models.CharField(blank=True, max_length=255)),
+                ('author', models.TextField(blank=True)),
+                ('cloud', models.JSONField(blank=True, null=True)),
+                ('contributors', models.JSONField(blank=True, null=True)),
+                ('docs', models.TextField(blank=True)),
+                ('errorreportsto', models.TextField(blank=True)),
+                ('generator', models.TextField(blank=True)),
+                ('icon', models.TextField(blank=True)),
+                ('_id', models.TextField(blank=True)),
+                ('image', models.JSONField(blank=True, null=True)),
+                ('info', models.TextField(blank=True)),
+                ('info_detail', models.JSONField(blank=True, null=True)),
+                ('language', models.TextField(blank=True)),
+                ('license', models.TextField(blank=True)),
+                ('link', models.TextField(blank=True)),
+                ('links', models.JSONField(blank=True, null=True)),
+                ('logo', models.TextField(blank=True)),
+                ('published', models.TextField(blank=True)),
+                ('published_parsed', models.DateTimeField(blank=True, null=True)),
+                ('publisher', models.TextField(blank=True)),
+                ('rights', models.TextField(blank=True)),
+                ('rights_detail', models.JSONField(blank=True, null=True)),
+                ('subtitle', models.TextField(blank=True)),
+                ('subtitle_detail', models.JSONField(blank=True, null=True)),
+                ('tags', models.JSONField(blank=True, null=True)),
+                ('textinput', models.JSONField(blank=True, null=True)),
+                ('title', models.TextField(blank=True)),
+                ('title_detail', models.JSONField(blank=True, null=True)),
+                ('ttl', models.TextField(blank=True)),
+                ('updated', models.TextField(blank=True)),
+                ('updated_parsed', models.DateTimeField(blank=True, null=True)),
+                ('author_detail', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='feeds', to='feeds.author')),
                 ('domain', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='feeds.domain')),
+                ('generator_detail', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='feeds', to='feeds.generator')),
+                ('publisher_detail', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='feeds', to='feeds.publisher')),
             ],
         ),
         migrations.CreateModel(
@@ -84,33 +135,33 @@ class Migration(migrations.Migration):
                 ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                 ('created_at', models.DateTimeField(auto_now_add=True)),
                 ('modified_at', models.DateTimeField(auto_now=True)),
-                ('author', models.TextField()),
-                ('author_detail', models.JSONField()),
-                ('comments', models.TextField()),
-                ('content', models.JSONField()),
-                ('contributors', models.JSONField()),
-                ('created', models.TextField()),
-                ('created_parsed', models.DateTimeField()),
-                ('enclosures', models.JSONField()),
-                ('expired', models.TextField()),
-                ('expired_parsed', models.DateTimeField()),
-                ('_id', models.TextField()),
-                ('license', models.TextField()),
-                ('link', models.TextField()),
-                ('links', models.JSONField()),
-                ('published', models.TextField()),
-                ('published_parsed', models.DateTimeField()),
-                ('publisher', models.TextField()),
-                ('publisher_detail', models.JSONField()),
-                ('source', models.JSONField()),
-                ('summary', models.TextField()),
-                ('summary_detail', models.JSONField()),
-                ('tags', models.JSONField()),
-                ('title', models.TextField()),
-                ('title_detail', models.JSONField()),
-                ('updated', models.TextField()),
-                ('updated_parsed', models.DateTimeField()),
+                ('author', models.TextField(blank=True)),
+                ('comments', models.TextField(blank=True)),
+                ('content', models.JSONField(blank=True, null=True)),
+                ('contributors', models.JSONField(blank=True, null=True)),
+                ('created', models.TextField(blank=True)),
+                ('created_parsed', models.DateTimeField(blank=True, null=True)),
+                ('enclosures', models.JSONField(blank=True, null=True)),
+                ('expired', models.TextField(blank=True)),
+                ('expired_parsed', models.DateTimeField(blank=True, null=True)),
+                ('_id', models.TextField(blank=True)),
+                ('license', models.TextField(blank=True)),
+                ('link', models.TextField(blank=True)),
+                ('links', models.JSONField(blank=True, null=True)),
+                ('published', models.TextField(blank=True)),
+                ('published_parsed', models.DateTimeField(blank=True, null=True)),
+                ('publisher', models.TextField(blank=True)),
+                ('source', models.JSONField(blank=True, null=True)),
+                ('summary', models.TextField(blank=True)),
+                ('summary_detail', models.JSONField(blank=True, null=True)),
+                ('tags', models.JSONField(blank=True, null=True)),
+                ('title', models.TextField(blank=True)),
+                ('title_detail', models.JSONField(blank=True, null=True)),
+                ('updated', models.TextField(blank=True)),
+                ('updated_parsed', models.DateTimeField(blank=True, null=True)),
+                ('author_detail', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='entries', to='feeds.author')),
                 ('feed', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='feeds.feed')),
+                ('publisher_detail', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='entries', to='feeds.publisher')),
             ],
         ),
     ]


@@ -1,17 +1,21 @@
 from __future__ import annotations
 
+import logging
+import typing
 from typing import Literal
 
 from django.db import models
 from django.db.models import JSONField
 
+logger = logging.getLogger(__name__)
+
 
 class Domain(models.Model):
     """A domain that has one or more feeds."""
 
-    name = models.CharField(max_length=255, unique=True)
-    url = models.URLField()
-    categories = models.JSONField()
+    url = models.URLField(unique=True)
+    name = models.CharField(max_length=255)
+    categories = models.JSONField(null=True, blank=True)
     created_at = models.DateTimeField(auto_now_add=True)
     modified_at = models.DateTimeField(auto_now=True)
     hidden = models.BooleanField(default=False)
@@ -24,62 +28,154 @@ class Domain(models.Model):
         return self.name + if_hidden
 
 
+class Author(models.Model):
+    """An author of an entry."""
+
+    created_at = models.DateTimeField(auto_now_add=True)
+    modified_at = models.DateTimeField(auto_now=True)
+    name = models.TextField(blank=True)
+    href = models.TextField(blank=True)
+    email = models.TextField(blank=True)
+
+    def __str__(self) -> str:
+        """Return string representation of the author."""
+        return f"{self.name} - {self.email} - {self.href}"
+
+
+class Generator(models.Model):
+    """A generator of a feed."""
+
+    created_at = models.DateTimeField(auto_now_add=True)
+    modified_at = models.DateTimeField(auto_now=True)
+    name = models.TextField(blank=True)
+    href = models.TextField(blank=True)
+    version = models.TextField(blank=True)
+
+    class Meta:
+        """Meta information for the generator model."""
+
+        unique_together: typing.ClassVar[list[str]] = ["name", "version", "href"]
+
+    def __str__(self) -> str:
+        """Return string representation of the generator."""
+        return self.name
+
+
+class Links(models.Model):
+    """A link to a feed or entry."""
+
+    created_at = models.DateTimeField(auto_now_add=True)
+    modified_at = models.DateTimeField(auto_now=True)
+    rel = models.TextField(blank=True)
+    type = models.TextField(blank=True)
+    href = models.TextField(blank=True)
+    title = models.TextField(blank=True)
+
+    def __str__(self) -> str:
+        """Return string representation of the links."""
+        return self.href
+
+
+class Publisher(models.Model):
+    """The publisher of a feed or entry."""
+
+    created_at = models.DateTimeField(auto_now_add=True)
+    modified_at = models.DateTimeField(auto_now=True)
+    name = models.TextField(blank=True)
+    href = models.TextField(blank=True)
+    email = models.TextField(blank=True)
+
+    def __str__(self) -> str:
+        """Return string representation of the publisher."""
+        return self.name
+
+
 class Feed(models.Model):
     """A RSS/Atom/JSON feed."""
 
+    feed_url = models.URLField(unique=True)
     domain = models.ForeignKey(Domain, on_delete=models.CASCADE)
     created_at = models.DateTimeField(auto_now_add=True)
     modified_at = models.DateTimeField(auto_now=True)
+    last_checked = models.DateTimeField(null=True, blank=True)
+    active = models.BooleanField(default=True)
 
     # General data
     bozo = models.BooleanField()
-    bozo_exception = models.TextField()
-    encoding = models.TextField()
-    etag = models.TextField()
-    headers = JSONField()
-    href = models.TextField()
-    modified = models.DateTimeField()
-    namespaces = JSONField()
+    bozo_exception = models.TextField(blank=True)
+    encoding = models.TextField(blank=True)
+    etag = models.TextField(blank=True)
+    headers = JSONField(null=True, blank=True)
+    href = models.TextField(blank=True)
+    modified = models.DateTimeField(null=True, blank=True)
+    namespaces = JSONField(null=True, blank=True)
     status = models.IntegerField()
-    version = models.CharField(max_length=50)
+    version = models.CharField(max_length=255, blank=True)
 
     # Feed data
-    author = models.TextField()
-    author_detail = JSONField()
-    cloud = JSONField()
-    contributors = JSONField()
-    docs = models.TextField()
-    errorreportsto = models.TextField()
-    generator = models.TextField()
-    generator_detail = models.TextField()
-    icon = models.TextField()
-    _id = models.TextField()
-    image = JSONField()
-    info = models.TextField()
-    info_detail = JSONField()
-    language = models.TextField()
-    license = models.TextField()
-    link = models.TextField()
-    links = JSONField()
-    logo = models.TextField()
-    published = models.TextField()
-    published_parsed = models.DateTimeField()
-    publisher = models.TextField()
-    publisher_detail = JSONField()
-    rights = models.TextField()
-    rights_detail = JSONField()
-    subtitle = models.TextField()
-    subtitle_detail = JSONField()
-    tags = JSONField()
-    textinput = JSONField()
-    title = models.TextField()
-    title_detail = JSONField()
-    ttl = models.TextField()
-    updated = models.TextField()
-    updated_parsed = models.DateTimeField()
+    author = models.TextField(blank=True)
+    author_detail = models.ForeignKey(
+        Author,
+        on_delete=models.PROTECT,
+        null=True,
+        blank=True,
+        related_name="feeds",
+    )
+    cloud = JSONField(null=True, blank=True)
+    contributors = JSONField(null=True, blank=True)
+    docs = models.TextField(blank=True)
+    errorreportsto = models.TextField(blank=True)
+    generator = models.TextField(blank=True)
+    generator_detail = models.ForeignKey(
+        Generator,
+        on_delete=models.PROTECT,
+        null=True,
+        blank=True,
+        related_name="feeds",
+    )
+    icon = models.TextField(blank=True)
+    _id = models.TextField(blank=True)
+    image = JSONField(null=True, blank=True)
+    info = models.TextField(blank=True)
+    info_detail = JSONField(null=True, blank=True)
+    language = models.TextField(blank=True)
+    license = models.TextField(blank=True)
+    link = models.TextField(blank=True)
+    links = JSONField(null=True, blank=True)
+    logo = models.TextField(blank=True)
+    published = models.TextField(blank=True)
+    published_parsed = models.DateTimeField(null=True, blank=True)
+    publisher = models.TextField(blank=True)
+    publisher_detail = models.ForeignKey(
+        Publisher,
+        on_delete=models.PROTECT,
+        null=True,
+        blank=True,
+        related_name="feeds",
+    )
+    rights = models.TextField(blank=True)
+    rights_detail = JSONField(null=True, blank=True)
+    subtitle = models.TextField(blank=True)
+    subtitle_detail = JSONField(null=True, blank=True)
+    tags = JSONField(null=True, blank=True)
+    textinput = JSONField(null=True, blank=True)
+    title = models.TextField(blank=True)
+    title_detail = JSONField(null=True, blank=True)
+    ttl = models.TextField(blank=True)
+    updated = models.TextField(blank=True)
+    updated_parsed = models.DateTimeField(null=True, blank=True)
 
     def __str__(self) -> str:
         """Return string representation of the feed."""
-        return self.title_detail["value"] or "No title"
+        return f"{self.domain} - {self.title}"
def get_fields(self) -> list:
"""Return the fields of the feed."""
return [(field.name, field.value_from_object(self)) for field in Feed._meta.fields]
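Almost every text field on the model is declared `blank=True`, so values that feedparser does not return can be stored as empty strings instead of NULLs. A hypothetical helper sketches that normalization (`field_or_blank` is illustrative, not part of this commit):

```python
def field_or_blank(parsed: dict, key: str) -> str:
    """Return a string for a feedparser key, or "" when it is missing or None.

    Mirrors the blank=True convention of the Feed text fields above:
    absent keys collapse to empty strings rather than NULLs.
    """
    return str(parsed.get(key) or "")
```
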
class Entry(models.Model):
@@ -90,33 +186,49 @@ class Entry(models.Model):
    modified_at = models.DateTimeField(auto_now=True)

    # Entry data
    author = models.TextField(blank=True)
    author_detail = models.ForeignKey(
        Author,
        on_delete=models.PROTECT,
        null=True,
        blank=True,
        related_name="entries",
    )
    comments = models.TextField(blank=True)
    content = JSONField(null=True, blank=True)
    contributors = JSONField(null=True, blank=True)
    created = models.TextField(blank=True)
    created_parsed = models.DateTimeField(null=True, blank=True)
    enclosures = JSONField(null=True, blank=True)
    expired = models.TextField(blank=True)
    expired_parsed = models.DateTimeField(null=True, blank=True)
    _id = models.TextField(blank=True)
    license = models.TextField(blank=True)
    link = models.TextField(blank=True)
    links = JSONField(null=True, blank=True)
    published = models.TextField(blank=True)
    published_parsed = models.DateTimeField(null=True, blank=True)
    publisher = models.TextField(blank=True)
    publisher_detail = models.ForeignKey(
        Publisher,
        on_delete=models.PROTECT,
        null=True,
        blank=True,
        related_name="entries",
    )
    source = JSONField(null=True, blank=True)
    summary = models.TextField(blank=True)
    summary_detail = JSONField(null=True, blank=True)
    tags = JSONField(null=True, blank=True)
    title = models.TextField(blank=True)
    title_detail = JSONField(null=True, blank=True)
    updated = models.TextField(blank=True)
    updated_parsed = models.DateTimeField(null=True, blank=True)

    def __str__(self) -> str:
        """Return string representation of the entry."""
        return f"{self.feed} - {self.title}"

    def get_fields(self) -> list:
        """Return the fields of the entry."""
        return [(field.name, field.value_from_object(self)) for field in Entry._meta.fields]
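`get_fields()` walks `_meta.fields` to hand templates a flat list of `(name, value)` pairs. The same pattern can be sketched with stdlib dataclasses instead of Django's model meta (`EntryRow` is a made-up stand-in for the model):

```python
from dataclasses import dataclass, fields


@dataclass
class EntryRow:
    """Tiny stand-in for an Entry, holding two of its text fields."""

    title: str = ""
    link: str = ""


def get_fields(obj: EntryRow) -> list:
    """Mirror Entry.get_fields(): (field name, value) pairs in declaration order."""
    return [(f.name, getattr(obj, f.name)) for f in fields(obj)]
```
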
feeds/stats.py Normal file
@@ -0,0 +1,33 @@
from __future__ import annotations

import logging

from django.core.cache import cache
from django.db import connection

logger: logging.Logger = logging.getLogger(__name__)


def get_db_size() -> str:
    """Get the size of the database.

    Returns:
        str: The size of the database.
    """
    # Try to get the value from the cache first.
    db_size = cache.get("db_size")
    if db_size is not None:
        logger.debug("Got db_size from cache")
        return db_size

    with connection.cursor() as cursor:
        cursor.execute("SELECT pg_size_pretty(pg_database_size(current_database()))")
        row = cursor.fetchone()
        db_size = "0 MB" if row is None else str(row[0])

    # Store the value in the cache for 15 minutes.
    cache.set("db_size", db_size, 60 * 15)
    return db_size
@@ -10,4 +10,5 @@ urlpatterns: list[URLPattern] = [
    path(route="", view=views.IndexView.as_view(), name="index"),
    path(route="feed/<int:feed_id>/", view=views.FeedView.as_view(), name="feed"),
    path(route="feeds/", view=views.FeedsView.as_view(), name="feeds"),
    path(route="add", view=views.AddView.as_view(), name="add"),
]
@@ -1,35 +1,92 @@
from __future__ import annotations

from django.contrib import messages
from django.http import HttpRequest, HttpResponse
from django.shortcuts import get_object_or_404, render
from django.template import loader
from django.views import View
from django.views.generic.list import ListView

from feeds.add_feeds import add_feed
from feeds.models import Entry, Feed
from feeds.stats import get_db_size


class IndexView(View):
    """Index path."""

    def get(self, request: HttpRequest) -> HttpResponse:
        """Load the index page."""
        template = loader.get_template(template_name="index.html")
        context = {
            "db_size": get_db_size(),
            "amount_of_feeds": Feed.objects.count(),
        }
        return HttpResponse(content=template.render(context=context, request=request))


class FeedView(View):
    """A single feed."""

    def get(self, request: HttpRequest, *args, **kwargs) -> HttpResponse:  # noqa: ANN002, ANN003, ARG002
        """Load the feed page."""
        feed_id = kwargs.get("feed_id", None)
        if not feed_id:
            return HttpResponse(content="No id", status=400)

        feed = get_object_or_404(Feed, id=feed_id)
        entries = Entry.objects.filter(feed=feed).order_by("-created_parsed")[:100]
        context = {"feed": feed, "entries": entries, "db_size": get_db_size(), "amount_of_feeds": Feed.objects.count()}
        return render(request, "feed.html", context)


class FeedsView(ListView):
    """All feeds."""

    model = Feed
    paginate_by = 100
    template_name = "feeds.html"
    context_object_name = "feeds"

    def get_context_data(self, **kwargs) -> dict:  # noqa: ANN003
        """Get the context data."""
        context = super().get_context_data(**kwargs)
        context["db_size"] = get_db_size()
        context["amount_of_feeds"] = Feed.objects.count()
        return context


class AddView(View):
    """Add a feed."""

    def get(self, request: HttpRequest) -> HttpResponse:
        """Load the index page."""
        template = loader.get_template(template_name="index.html")
        context = {
            "db_size": get_db_size(),
            "amount_of_feeds": Feed.objects.count(),
        }
        return HttpResponse(content=template.render(context=context, request=request))

    def post(self, request: HttpRequest) -> HttpResponse:
        """Add a feed."""
        urls: str | None = request.POST.get("urls", None)
        if not urls:
            return HttpResponse(content="No urls", status=400)

        # Split the urls by newline and try to add each one.
        for url in urls.split("\n"):
            feed: None | Feed = add_feed(url)
            if not feed:
                messages.error(request, f"{url} - Failed to add")
                continue

            # Warn if feedparser flagged the feed as malformed (bozo).
            if feed.bozo:
                messages.warning(request, f"{feed.feed_url} - Bozo: {feed.bozo_exception}")

            messages.success(request, f"{feed.feed_url} added")

        # Render the index page.
        template = loader.get_template(template_name="index.html")
        return HttpResponse(content=template.render(context={}, request=request))
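`post()` splits the textarea payload on `"\n"`, which leaves a trailing `"\r"` on each URL when a browser submits CRLF line endings, and passes blank lines straight to `add_feed`. A hypothetical `parse_url_list()` helper (not part of this commit) that normalizes the input first:

```python
def parse_url_list(raw: str) -> list[str]:
    """Split a textarea submission into clean, non-empty URLs.

    splitlines() handles \n, \r\n and \r alike, and strip() drops stray
    whitespace, so CRLF form posts do not yield URLs with a trailing
    carriage return and empty lines are skipped entirely.
    """
    return [line.strip() for line in raw.splitlines() if line.strip()]
```
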
@@ -95,6 +95,31 @@ AUTH_PASSWORD_VALIDATORS = [
]

# A list containing the settings for all template engines to be used with Django.
TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        "DIRS": [BASE_DIR / "templates"],
        "OPTIONS": {
            "context_processors": [
                "django.template.context_processors.debug",
                "django.template.context_processors.request",
                "django.contrib.auth.context_processors.auth",
                "django.contrib.messages.context_processors.messages",
            ],
            "loaders": [
                (
                    "django.template.loaders.cached.Loader",
                    [
                        "django.template.loaders.filesystem.Loader",
                        "django.template.loaders.app_directories.Loader",
                    ],
                ),
            ],
        },
    },
]

# A list of all the people who get code error notifications. When DEBUG=False and a view raises an exception, Django
ADMINS: list[tuple[str, str]] = [("Joakim Hellsén", "django@feedvault.se")]
@@ -1,6 +1,7 @@
from django.contrib import admin
from django.urls import include, path

urlpatterns = [
    path("admin/", admin.site.urls),
    path("", include("feeds.urls")),
]
poetry.lock generated
@@ -14,6 +14,46 @@ files = [
[package.extras]
tests = ["mypy (>=0.800)", "pytest", "pytest-asyncio"]
[[package]]
name = "click"
version = "8.1.7"
description = "Composable command line interface toolkit"
optional = false
python-versions = ">=3.7"
files = [
{file = "click-8.1.7-py3-none-any.whl", hash = "sha256:ae74fb96c20a0277a1d615f1e4d73c8414f5a98db8b799a7931d1582f3390c28"},
{file = "click-8.1.7.tar.gz", hash = "sha256:ca9853ad459e787e2192211578cc907e7594e294c7ccc834310722b41b9ca6de"},
]
[package.dependencies]
colorama = {version = "*", markers = "platform_system == \"Windows\""}
[[package]]
name = "colorama"
version = "0.4.6"
description = "Cross-platform colored terminal text."
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
files = [
{file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
{file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
]
[[package]]
name = "cssbeautifier"
version = "1.15.1"
description = "CSS unobfuscator and beautifier."
optional = false
python-versions = "*"
files = [
{file = "cssbeautifier-1.15.1.tar.gz", hash = "sha256:9f7064362aedd559c55eeecf6b6bed65e05f33488dcbe39044f0403c26e1c006"},
]
[package.dependencies]
editorconfig = ">=0.12.2"
jsbeautifier = "*"
six = ">=1.13.0"
[[package]]
name = "django"
version = "5.0.2"
@@ -34,6 +74,40 @@ tzdata = {version = "*", markers = "sys_platform == \"win32\""}
argon2 = ["argon2-cffi (>=19.1.0)"]
bcrypt = ["bcrypt"]
[[package]]
name = "djlint"
version = "1.34.1"
description = "HTML Template Linter and Formatter"
optional = false
python-versions = ">=3.8.0,<4.0.0"
files = [
{file = "djlint-1.34.1-py3-none-any.whl", hash = "sha256:96ff1c464fb6f061130ebc88663a2ea524d7ec51f4b56221a2b3f0320a3cfce8"},
{file = "djlint-1.34.1.tar.gz", hash = "sha256:db93fa008d19eaadb0454edf1704931d14469d48508daba2df9941111f408346"},
]
[package.dependencies]
click = ">=8.0.1,<9.0.0"
colorama = ">=0.4.4,<0.5.0"
cssbeautifier = ">=1.14.4,<2.0.0"
html-tag-names = ">=0.1.2,<0.2.0"
html-void-elements = ">=0.1.0,<0.2.0"
jsbeautifier = ">=1.14.4,<2.0.0"
json5 = ">=0.9.11,<0.10.0"
pathspec = ">=0.12.0,<0.13.0"
PyYAML = ">=6.0,<7.0"
regex = ">=2023.0.0,<2024.0.0"
tqdm = ">=4.62.2,<5.0.0"
[[package]]
name = "editorconfig"
version = "0.12.4"
description = "EditorConfig File Locator and Interpreter for Python"
optional = false
python-versions = "*"
files = [
{file = "EditorConfig-0.12.4.tar.gz", hash = "sha256:24857fa1793917dd9ccf0c7810a07e05404ce9b823521c7dce22a4fb5d125f80"},
]
[[package]]
name = "feedparser"
version = "6.0.11"
@@ -48,6 +122,67 @@ files = [
[package.dependencies]
sgmllib3k = "*"
[[package]]
name = "html-tag-names"
version = "0.1.2"
description = "List of known HTML tag names"
optional = false
python-versions = ">=3.7,<4.0"
files = [
{file = "html-tag-names-0.1.2.tar.gz", hash = "sha256:04924aca48770f36b5a41c27e4d917062507be05118acb0ba869c97389084297"},
{file = "html_tag_names-0.1.2-py3-none-any.whl", hash = "sha256:eeb69ef21078486b615241f0393a72b41352c5219ee648e7c61f5632d26f0420"},
]
[[package]]
name = "html-void-elements"
version = "0.1.0"
description = "List of HTML void tag names."
optional = false
python-versions = ">=3.7,<4.0"
files = [
{file = "html-void-elements-0.1.0.tar.gz", hash = "sha256:931b88f84cd606fee0b582c28fcd00e41d7149421fb673e1e1abd2f0c4f231f0"},
{file = "html_void_elements-0.1.0-py3-none-any.whl", hash = "sha256:784cf39db03cdeb017320d9301009f8f3480f9d7b254d0974272e80e0cb5e0d2"},
]
[[package]]
name = "jsbeautifier"
version = "1.15.1"
description = "JavaScript unobfuscator and beautifier."
optional = false
python-versions = "*"
files = [
{file = "jsbeautifier-1.15.1.tar.gz", hash = "sha256:ebd733b560704c602d744eafc839db60a1ee9326e30a2a80c4adb8718adc1b24"},
]
[package.dependencies]
editorconfig = ">=0.12.2"
six = ">=1.13.0"
[[package]]
name = "json5"
version = "0.9.14"
description = "A Python implementation of the JSON5 data format."
optional = false
python-versions = "*"
files = [
{file = "json5-0.9.14-py2.py3-none-any.whl", hash = "sha256:740c7f1b9e584a468dbb2939d8d458db3427f2c93ae2139d05f47e453eae964f"},
{file = "json5-0.9.14.tar.gz", hash = "sha256:9ed66c3a6ca3510a976a9ef9b8c0787de24802724ab1860bc0153c7fdd589b02"},
]
[package.extras]
dev = ["hypothesis"]
[[package]]
name = "pathspec"
version = "0.12.1"
description = "Utility library for gitignore style pattern matching of file paths."
optional = false
python-versions = ">=3.8"
files = [
{file = "pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08"},
{file = "pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712"},
]
[[package]]
name = "python-dotenv"
version = "1.0.1"
@@ -62,6 +197,167 @@ files = [
[package.extras]
cli = ["click (>=5.0)"]
[[package]]
name = "pyyaml"
version = "6.0.1"
description = "YAML parser and emitter for Python"
optional = false
python-versions = ">=3.6"
files = [
{file = "PyYAML-6.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d858aa552c999bc8a8d57426ed01e40bef403cd8ccdd0fc5f6f04a00414cac2a"},
{file = "PyYAML-6.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fd66fc5d0da6d9815ba2cebeb4205f95818ff4b79c3ebe268e75d961704af52f"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
{file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f003ed9ad21d6a4713f0a9b5a7a0a79e08dd0f221aff4525a2be4c346ee60aab"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
{file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
{file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
{file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
{file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
{file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
{file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
{file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:afd7e57eddb1a54f0f1a974bc4391af8bcce0b444685d936840f125cf046d5bd"},
{file = "PyYAML-6.0.1-cp36-cp36m-win32.whl", hash = "sha256:fca0e3a251908a499833aa292323f32437106001d436eca0e6e7833256674585"},
{file = "PyYAML-6.0.1-cp36-cp36m-win_amd64.whl", hash = "sha256:f22ac1c3cac4dbc50079e965eba2c1058622631e526bd9afd45fedd49ba781fa"},
{file = "PyYAML-6.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b1275ad35a5d18c62a7220633c913e1b42d44b46ee12554e5fd39c70a243d6a3"},
{file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:18aeb1bf9a78867dc38b259769503436b7c72f7a1f1f4c93ff9a17de54319b27"},
{file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:596106435fa6ad000c2991a98fa58eeb8656ef2325d7e158344fb33864ed87e3"},
{file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:baa90d3f661d43131ca170712d903e6295d1f7a0f595074f151c0aed377c9b9c"},
{file = "PyYAML-6.0.1-cp37-cp37m-win32.whl", hash = "sha256:9046c58c4395dff28dd494285c82ba00b546adfc7ef001486fbf0324bc174fba"},
{file = "PyYAML-6.0.1-cp37-cp37m-win_amd64.whl", hash = "sha256:4fb147e7a67ef577a588a0e2c17b6db51dda102c71de36f8549b6816a96e1867"},
{file = "PyYAML-6.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1d4c7e777c441b20e32f52bd377e0c409713e8bb1386e1099c2415f26e479595"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
{file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c8098ddcc2a85b61647b2590f825f3db38891662cfc2fc776415143f599bb859"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
{file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
]
[[package]]
name = "regex"
version = "2023.12.25"
description = "Alternative regular expression module, to replace re."
optional = false
python-versions = ">=3.7"
files = [
{file = "regex-2023.12.25-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0694219a1d54336fd0445ea382d49d36882415c0134ee1e8332afd1529f0baa5"},
{file = "regex-2023.12.25-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b014333bd0217ad3d54c143de9d4b9a3ca1c5a29a6d0d554952ea071cff0f1f8"},
{file = "regex-2023.12.25-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d865984b3f71f6d0af64d0d88f5733521698f6c16f445bb09ce746c92c97c586"},
{file = "regex-2023.12.25-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1e0eabac536b4cc7f57a5f3d095bfa557860ab912f25965e08fe1545e2ed8b4c"},
{file = "regex-2023.12.25-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c25a8ad70e716f96e13a637802813f65d8a6760ef48672aa3502f4c24ea8b400"},
{file = "regex-2023.12.25-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a9b6d73353f777630626f403b0652055ebfe8ff142a44ec2cf18ae470395766e"},
{file = "regex-2023.12.25-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a9cc99d6946d750eb75827cb53c4371b8b0fe89c733a94b1573c9dd16ea6c9e4"},
{file = "regex-2023.12.25-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88d1f7bef20c721359d8675f7d9f8e414ec5003d8f642fdfd8087777ff7f94b5"},
{file = "regex-2023.12.25-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:cb3fe77aec8f1995611f966d0c656fdce398317f850d0e6e7aebdfe61f40e1cd"},
{file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:7aa47c2e9ea33a4a2a05f40fcd3ea36d73853a2aae7b4feab6fc85f8bf2c9704"},
{file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:df26481f0c7a3f8739fecb3e81bc9da3fcfae34d6c094563b9d4670b047312e1"},
{file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:c40281f7d70baf6e0db0c2f7472b31609f5bc2748fe7275ea65a0b4601d9b392"},
{file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:d94a1db462d5690ebf6ae86d11c5e420042b9898af5dcf278bd97d6bda065423"},
{file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ba1b30765a55acf15dce3f364e4928b80858fa8f979ad41f862358939bdd1f2f"},
{file = "regex-2023.12.25-cp310-cp310-win32.whl", hash = "sha256:150c39f5b964e4d7dba46a7962a088fbc91f06e606f023ce57bb347a3b2d4630"},
{file = "regex-2023.12.25-cp310-cp310-win_amd64.whl", hash = "sha256:09da66917262d9481c719599116c7dc0c321ffcec4b1f510c4f8a066f8768105"},
{file = "regex-2023.12.25-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:1b9d811f72210fa9306aeb88385b8f8bcef0dfbf3873410413c00aa94c56c2b6"},
{file = "regex-2023.12.25-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d902a43085a308cef32c0d3aea962524b725403fd9373dea18110904003bac97"},
{file = "regex-2023.12.25-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d166eafc19f4718df38887b2bbe1467a4f74a9830e8605089ea7a30dd4da8887"},
{file = "regex-2023.12.25-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c7ad32824b7f02bb3c9f80306d405a1d9b7bb89362d68b3c5a9be53836caebdb"},
{file = "regex-2023.12.25-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:636ba0a77de609d6510235b7f0e77ec494d2657108f777e8765efc060094c98c"},
{file = "regex-2023.12.25-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fda75704357805eb953a3ee15a2b240694a9a514548cd49b3c5124b4e2ad01b"},
{file = "regex-2023.12.25-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f72cbae7f6b01591f90814250e636065850c5926751af02bb48da94dfced7baa"},
{file = "regex-2023.12.25-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:db2a0b1857f18b11e3b0e54ddfefc96af46b0896fb678c85f63fb8c37518b3e7"},
{file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:7502534e55c7c36c0978c91ba6f61703faf7ce733715ca48f499d3dbbd7657e0"},
{file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:e8c7e08bb566de4faaf11984af13f6bcf6a08f327b13631d41d62592681d24fe"},
{file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:283fc8eed679758de38fe493b7d7d84a198b558942b03f017b1f94dda8efae80"},
{file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:f44dd4d68697559d007462b0a3a1d9acd61d97072b71f6d1968daef26bc744bd"},
{file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:67d3ccfc590e5e7197750fcb3a2915b416a53e2de847a728cfa60141054123d4"},
{file = "regex-2023.12.25-cp311-cp311-win32.whl", hash = "sha256:68191f80a9bad283432385961d9efe09d783bcd36ed35a60fb1ff3f1ec2efe87"},
{file = "regex-2023.12.25-cp311-cp311-win_amd64.whl", hash = "sha256:7d2af3f6b8419661a0c421584cfe8aaec1c0e435ce7e47ee2a97e344b98f794f"},
{file = "regex-2023.12.25-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:8a0ccf52bb37d1a700375a6b395bff5dd15c50acb745f7db30415bae3c2b0715"},
{file = "regex-2023.12.25-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c3c4a78615b7762740531c27cf46e2f388d8d727d0c0c739e72048beb26c8a9d"},
{file = "regex-2023.12.25-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ad83e7545b4ab69216cef4cc47e344d19622e28aabec61574b20257c65466d6a"},
{file = "regex-2023.12.25-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b7a635871143661feccce3979e1727c4e094f2bdfd3ec4b90dfd4f16f571a87a"},
{file = "regex-2023.12.25-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d498eea3f581fbe1b34b59c697512a8baef88212f92e4c7830fcc1499f5b45a5"},
{file = "regex-2023.12.25-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:43f7cd5754d02a56ae4ebb91b33461dc67be8e3e0153f593c509e21d219c5060"},
{file = "regex-2023.12.25-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:51f4b32f793812714fd5307222a7f77e739b9bc566dc94a18126aba3b92b98a3"},
{file = "regex-2023.12.25-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ba99d8077424501b9616b43a2d208095746fb1284fc5ba490139651f971d39d9"},
{file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:4bfc2b16e3ba8850e0e262467275dd4d62f0d045e0e9eda2bc65078c0110a11f"},
{file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8c2c19dae8a3eb0ea45a8448356ed561be843b13cbc34b840922ddf565498c1c"},
{file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:60080bb3d8617d96f0fb7e19796384cc2467447ef1c491694850ebd3670bc457"},
{file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b77e27b79448e34c2c51c09836033056a0547aa360c45eeeb67803da7b0eedaf"},
{file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:518440c991f514331f4850a63560321f833979d145d7d81186dbe2f19e27ae3d"},
{file = "regex-2023.12.25-cp312-cp312-win32.whl", hash = "sha256:e2610e9406d3b0073636a3a2e80db05a02f0c3169b5632022b4e81c0364bcda5"},
{file = "regex-2023.12.25-cp312-cp312-win_amd64.whl", hash = "sha256:cc37b9aeebab425f11f27e5e9e6cf580be7206c6582a64467a14dda211abc232"},
{file = "regex-2023.12.25-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:da695d75ac97cb1cd725adac136d25ca687da4536154cdc2815f576e4da11c69"},
{file = "regex-2023.12.25-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d126361607b33c4eb7b36debc173bf25d7805847346dd4d99b5499e1fef52bc7"},
{file = "regex-2023.12.25-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4719bb05094d7d8563a450cf8738d2e1061420f79cfcc1fa7f0a44744c4d8f73"},
{file = "regex-2023.12.25-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5dd58946bce44b53b06d94aa95560d0b243eb2fe64227cba50017a8d8b3cd3e2"},
{file = "regex-2023.12.25-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22a86d9fff2009302c440b9d799ef2fe322416d2d58fc124b926aa89365ec482"},
{file = "regex-2023.12.25-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2aae8101919e8aa05ecfe6322b278f41ce2994c4a430303c4cd163fef746e04f"},
{file = "regex-2023.12.25-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:e692296c4cc2873967771345a876bcfc1c547e8dd695c6b89342488b0ea55cd8"},
{file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:263ef5cc10979837f243950637fffb06e8daed7f1ac1e39d5910fd29929e489a"},
{file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:d6f7e255e5fa94642a0724e35406e6cb7001c09d476ab5fce002f652b36d0c39"},
{file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:88ad44e220e22b63b0f8f81f007e8abbb92874d8ced66f32571ef8beb0643b2b"},
{file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:3a17d3ede18f9cedcbe23d2daa8a2cd6f59fe2bf082c567e43083bba3fb00347"},
{file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d15b274f9e15b1a0b7a45d2ac86d1f634d983ca40d6b886721626c47a400bf39"},
{file = "regex-2023.12.25-cp37-cp37m-win32.whl", hash = "sha256:ed19b3a05ae0c97dd8f75a5d8f21f7723a8c33bbc555da6bbe1f96c470139d3c"},
{file = "regex-2023.12.25-cp37-cp37m-win_amd64.whl", hash = "sha256:a6d1047952c0b8104a1d371f88f4ab62e6275567d4458c1e26e9627ad489b445"},
{file = "regex-2023.12.25-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:b43523d7bc2abd757119dbfb38af91b5735eea45537ec6ec3a5ec3f9562a1c53"},
{file = "regex-2023.12.25-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:efb2d82f33b2212898f1659fb1c2e9ac30493ac41e4d53123da374c3b5541e64"},
{file = "regex-2023.12.25-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b7fca9205b59c1a3d5031f7e64ed627a1074730a51c2a80e97653e3e9fa0d415"},
{file = "regex-2023.12.25-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:086dd15e9435b393ae06f96ab69ab2d333f5d65cbe65ca5a3ef0ec9564dfe770"},
{file = "regex-2023.12.25-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e81469f7d01efed9b53740aedd26085f20d49da65f9c1f41e822a33992cb1590"},
{file = "regex-2023.12.25-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:34e4af5b27232f68042aa40a91c3b9bb4da0eeb31b7632e0091afc4310afe6cb"},
{file = "regex-2023.12.25-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9852b76ab558e45b20bf1893b59af64a28bd3820b0c2efc80e0a70a4a3ea51c1"},
{file = "regex-2023.12.25-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ff100b203092af77d1a5a7abe085b3506b7eaaf9abf65b73b7d6905b6cb76988"},
{file = "regex-2023.12.25-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:cc038b2d8b1470364b1888a98fd22d616fba2b6309c5b5f181ad4483e0017861"},
{file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:094ba386bb5c01e54e14434d4caabf6583334090865b23ef58e0424a6286d3dc"},
{file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:5cd05d0f57846d8ba4b71d9c00f6f37d6b97d5e5ef8b3c3840426a475c8f70f4"},
{file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:9aa1a67bbf0f957bbe096375887b2505f5d8ae16bf04488e8b0f334c36e31360"},
{file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:98a2636994f943b871786c9e82bfe7883ecdaba2ef5df54e1450fa9869d1f756"},
{file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:37f8e93a81fc5e5bd8db7e10e62dc64261bcd88f8d7e6640aaebe9bc180d9ce2"},
{file = "regex-2023.12.25-cp38-cp38-win32.whl", hash = "sha256:d78bd484930c1da2b9679290a41cdb25cc127d783768a0369d6b449e72f88beb"},
{file = "regex-2023.12.25-cp38-cp38-win_amd64.whl", hash = "sha256:b521dcecebc5b978b447f0f69b5b7f3840eac454862270406a39837ffae4e697"},
{file = "regex-2023.12.25-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:f7bc09bc9c29ebead055bcba136a67378f03d66bf359e87d0f7c759d6d4ffa31"},
{file = "regex-2023.12.25-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:e14b73607d6231f3cc4622809c196b540a6a44e903bcfad940779c80dffa7be7"},
{file = "regex-2023.12.25-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9eda5f7a50141291beda3edd00abc2d4a5b16c29c92daf8d5bd76934150f3edc"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc6bb9aa69aacf0f6032c307da718f61a40cf970849e471254e0e91c56ffca95"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:298dc6354d414bc921581be85695d18912bea163a8b23cac9a2562bbcd5088b1"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2f4e475a80ecbd15896a976aa0b386c5525d0ed34d5c600b6d3ebac0a67c7ddf"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:531ac6cf22b53e0696f8e1d56ce2396311254eb806111ddd3922c9d937151dae"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:22f3470f7524b6da61e2020672df2f3063676aff444db1daa283c2ea4ed259d6"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:89723d2112697feaa320c9d351e5f5e7b841e83f8b143dba8e2d2b5f04e10923"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0ecf44ddf9171cd7566ef1768047f6e66975788258b1c6c6ca78098b95cf9a3d"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:905466ad1702ed4acfd67a902af50b8db1feeb9781436372261808df7a2a7bca"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:4558410b7a5607a645e9804a3e9dd509af12fb72b9825b13791a37cd417d73a5"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:7e316026cc1095f2a3e8cc012822c99f413b702eaa2ca5408a513609488cb62f"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:3b1de218d5375cd6ac4b5493e0b9f3df2be331e86520f23382f216c137913d20"},
{file = "regex-2023.12.25-cp39-cp39-win32.whl", hash = "sha256:11a963f8e25ab5c61348d090bf1b07f1953929c13bd2309a0662e9ff680763c9"},
{file = "regex-2023.12.25-cp39-cp39-win_amd64.whl", hash = "sha256:e693e233ac92ba83a87024e1d32b5f9ab15ca55ddd916d878146f4e3406b5c91"},
{file = "regex-2023.12.25.tar.gz", hash = "sha256:29171aa128da69afdf4bde412d5bedc335f2ca8fcfe4489038577d05f16181e5"},
]
[[package]]
name = "ruff"
version = "0.2.2"
@@ -98,6 +394,17 @@ files = [
{file = "sgmllib3k-1.0.0.tar.gz", hash = "sha256:7868fb1c8bfa764c1ac563d3cf369c381d1325d36124933a726f29fcdaa812e9"},
]
[[package]]
name = "six"
version = "1.16.0"
description = "Python 2 and 3 compatibility utilities"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
files = [
{file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
{file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
]
[[package]]
name = "sqlparse"
version = "0.4.4"
@@ -114,6 +421,26 @@ dev = ["build", "flake8"]
doc = ["sphinx"]
test = ["pytest", "pytest-cov"]
[[package]]
name = "tqdm"
version = "4.66.2"
description = "Fast, Extensible Progress Meter"
optional = false
python-versions = ">=3.7"
files = [
{file = "tqdm-4.66.2-py3-none-any.whl", hash = "sha256:1ee4f8a893eb9bef51c6e35730cebf234d5d0b6bd112b0271e10ed7c24a02bd9"},
{file = "tqdm-4.66.2.tar.gz", hash = "sha256:6cd52cdf0fef0e0f543299cfc96fec90d7b8a7e88745f411ec33eb44d5ed3531"},
]
[package.dependencies]
colorama = {version = "*", markers = "platform_system == \"Windows\""}
[package.extras]
dev = ["pytest (>=6)", "pytest-cov", "pytest-timeout", "pytest-xdist"]
notebook = ["ipywidgets (>=6)"]
slack = ["slack-sdk"]
telegram = ["requests"]
[[package]]
name = "tzdata"
version = "2024.1"
@@ -128,4 +455,4 @@ files = [
[metadata]
lock-version = "2.0"
python-versions = "^3.12"
content-hash = "ea6acc2956864e24d940cf2e9258c5c6366b3ba82f8566d757e901b82c013e76"


@@ -13,6 +14,7 @@ feedparser = "^6.0.11"
[tool.poetry.group.dev.dependencies]
ruff = "^0.2.2"
djlint = "^1.34.1"
[build-system]
requires = ["poetry-core"]
@@ -47,3 +48,7 @@ convention = "google"
"D102", # Allow missing docstrings in tests
"PLR6301", # Checks for the presence of unused self parameter in methods definitions.
]
[tool.djlint]
format_attribute_template_tags = true

143 templates/base.html Normal file
@@ -0,0 +1,143 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="%s" />
<meta name="keywords" content="%s" />
<meta name="author" content="%s" />
<link rel="canonical" href="%s" />
{% if title %}
<title>{{ title }}</title>
{% else %}
<title>FeedVault</title>
{% endif %}
<style>
html {
max-width: 70ch;
padding: calc(1vmin + 0.5rem);
margin-inline: auto;
font-size: clamp(1em, 0.909em + 0.45vmin, 1.25em);
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto,
Helvetica, Arial, sans-serif;
color-scheme: light dark;
}
h1 {
font-size: 2.5rem;
font-weight: 600;
margin: 0;
}
.title {
text-align: center;
}
.search {
display: flex;
justify-content: center;
margin-top: 1rem;
margin-inline: auto;
}
.leftright {
display: flex;
justify-content: center;
}
.left {
margin-right: auto;
}
.right {
margin-left: auto;
}
textarea {
width: 100%;
height: 10rem;
resize: vertical;
}
.messages {
list-style-type: none;
}
.error {
color: red;
}
.success {
color: green;
}
</style>
</head>
<body>
{% if messages %}
<ul class="messages">
{% for message in messages %}
<li {% if message.tags %}class="{{ message.tags }}"{% endif %}>{{ message }}</li>
{% endfor %}
</ul>
{% endif %}
<div class="title">
<h1>
<a href="/">FeedVault</a>
</h1>
</div>
<div class="leftright">
<div class="left">
<small>Archive of
<a href="https://en.wikipedia.org/wiki/Web_feed">web feeds</a>.
{% if amount_of_feeds %}
{{ amount_of_feeds }} feeds.
{% else %}
0 feeds.
{% endif %}
~{{ db_size }}.
</small>
</div>
<div class="right">
<form action="#" method="get">
<input type="text" name="q" placeholder="Search" />
<button type="submit">Search</button>
</form>
</div>
</div>
<nav>
<small>
<div class="leftright">
<div class="left">
<a href="/">Home</a> | <a href="/feeds">Feeds</a> |
<a href="/api">API</a>
</div>
<div class="right">
<a href="https://github.com/TheLovinator1/FeedVault">GitHub</a> |
<a href="https://github.com/sponsors/TheLovinator1">Donate</a>
</div>
</div>
</small>
</nav>
<hr />
<main>
{% block content %}<!-- default content -->{% endblock %}
</main>
<hr />
<footer>
<small>
<div class="leftright">
<div class="left">
Made by <a href="">Joakim Hellsén</a>.
</div>
<div class="right">No rights reserved.</div>
</div>
<div class="leftright">
<div class="left">
<a href="mailto:hello@feedvault.se">hello@feedvault.se</a>
</div>
<div class="right">A birthday present for Plipp ❤️</div>
</div>
</small>
</footer>
</body>
</html>

27 templates/feed.html Normal file
@@ -0,0 +1,27 @@
{% extends "base.html" %}
{% block content %}
<h2>{{ feed.feed_url }}</h2>
<dl>
{% for field_name, field_value in feed.get_fields %}
{% if field_value %}
<dt>{{ field_name }}</dt>
<dd>
{{ field_value }}
</dd>
{% endif %}
{% endfor %}
</dl>
<h3>Entries</h3>
<dl>
{% for entry in entries %}
{% for field_name, field_value in entry.get_fields %}
{% if field_value %}
<dt>{{ field_name }}</dt>
<dd>
{{ field_value }}
</dd>
{% endif %}
{% endfor %}
{% endfor %}
</dl>
{% endblock %}

14 templates/feeds.html Normal file
@@ -0,0 +1,14 @@
{% extends "base.html" %}
{% block content %}
<h2>Feeds</h2>
<ul>
{% for feed in feeds %}
<li>{{ feed.feed_url }} - {{ feed.created_at|date }}</li>
<li>
<a href="/feed/{{ feed.id }}">View</a>
</li>
{% empty %}
<li>No feeds yet.</li>
{% endfor %}
</ul>
{% endblock %}

55 templates/index.html Normal file
@@ -0,0 +1,55 @@
{% extends "base.html" %}
{% block content %}
<h2>Feeds to archive</h2>
<p>
Enter the URLs of the feeds you wish to archive below. You can add as many as needed and access them later through the website or the API. You can also include links to .opml files, and the feeds they contain will be archived.
</p>
<form action="/add" method="post">
{% csrf_token %}
<textarea id="urls" name="urls" rows="5" cols="50" required></textarea>
<button type="submit">Add feeds</button>
</form>
<br>
<p>You can also upload .opml files containing the feeds you wish to archive:</p>
<form enctype="multipart/form-data" method="post" action="/upload_opml">
<input type="file" name="file" id="file" accept=".opml" required>
<button type="submit">Upload OPML</button>
</form>
<h2>FAQ</h2>
<details>
<summary>What are web feeds?</summary>
<p>
Web feeds are a way to distribute content on the web. They allow users to access updates from websites without having to visit them directly. Feeds are typically used for news websites, blogs, and other sites that frequently update content.
<br>
You can read more about web feeds on <a href="https://en.wikipedia.org/wiki/Web_feed">Wikipedia</a>.
</p>
<hr>
</details>
<details>
<summary>What is FeedVault?</summary>
<p>
FeedVault is a service that archives web feeds. It allows users to access and search for historical content from various websites. The service is designed to preserve the history of the web and provide a reliable source for accessing content that may no longer be available on the original websites.
</p>
<hr>
</details>
<details>
<summary>Why archive feeds?</summary>
<p>
Web feeds are a valuable source of information, and archiving them ensures that the content is preserved for future reference. By archiving feeds, we can ensure that historical content is available for research, analysis, and other purposes. Additionally, archiving feeds can help prevent the loss of valuable information due to website changes, outages, or other issues.
</p>
<hr>
</details>
<details>
<summary>How does it work?</summary>
<p>
FeedVault is written in Go and uses the <a href="https://github.com/mmcdole/gofeed">gofeed</a> library to parse feeds. The service periodically checks for new content in the feeds and stores it in a database. Users can access the archived feeds through the website or API.
</p>
<hr>
</details>
<details>
<summary>How can I access the archived feeds?</summary>
<p>
You can access the archived feeds through the website or API. The website provides a user interface for searching and browsing the feeds, while the API allows you to access the feeds programmatically. You can also download the feeds in various formats, such as JSON, XML, or RSS.
</p>
</details>
{% endblock %}