Compare commits: 171c5df068...main (13 commits)

Commits (SHA1): 2ee6b190b4, e47443739b, 8cbf986847, 37c9b83485, e1f41307f2, c0cbbb00ba, 8e81dcc864, 6c3c752c43, 1b352d2586, 406f8cef0b, 43ad7ff17e, 710e6749aa, 31455f075e

.gitignore (vendored): 3 lines changed
@@ -29,3 +29,6 @@ Thumbs.db
*.swp
*.bak
*.tmp

# Completed tasks are cached here to preserve them beyond the API's 90 day limit
Todoist-Completed-History.json
README.md: 28 lines changed
@@ -1,15 +1,15 @@
# Todoist Actual Backup

Todoist is a SaaS task manager. Todoist provides backups of current tasks, but does not include completed tasks. Nor does it provide a human-readable backup. This script exports everything to JSON and HTML.

This project provides a command-line tool to export all active and completed tasks from the Todoist API to a JSON file, including attachments and comments, and generates a human-readable HTML backup.

Todoist is a SaaS task manager. Todoist provides backups of current tasks, but they do not include completed tasks, subtask relationships, comments or attachments. Nor does it provide a human-readable backup in HTML. This Python script provides a command-line tool to export all available active and completed tasks from the Todoist API to a JSON file, including attachments, subtasks and comments, and generates a human-readable HTML backup.

## Features
- Exports all active and completed tasks from all projects (active and archived)
- Downloads attachments and references them in the JSON and HTML output
- Nests tasks under their respective projects, including all available fields
- Includes comments for each task
- Outputs both a JSON file for programmatic access and a styled HTML file for viewing in a browser
- Downloads attachments to `output/attachments/` and references them in the JSON and HTML output
- JSON and HTML files are named with the current date when the script is run
- Maintains `Todoist-Completed-History.json` so completed tasks older than Todoist's 90-day API window stay in future exports
- Reuses archived comments for completed tasks to avoid unnecessary API calls (assumes no new comments after completion)

## Setup
- Ensure you have Python 3.8 or newer installed. Check with `python --version` on the command line.
@@ -19,21 +19,29 @@ This project provides a command-line tool to export all active and completed tas
source .venv/bin/activate
pip install -r requirements.txt
```
- Get your API key from [Todoist](https://app.todoist.com/app/settings/integrations/developer)
- Optionally set your Todoist API key in the `TODOIST_KEY` environment variable. If the environment variable is not set, the script will prompt for it.

## Usage
1. Run `source .venv/bin/activate` if needed to enter the virtual environment.
2. Set your Todoist API key in the `TODOIST_KEY` environment variable.
2. To see usage instructions, run the script with no arguments or any argument other than `export`.
3. Run the script with the `export` argument:
```bash
python export_todoist.py export
```
This will create `Todoist-Actual-Backup-YYYY-MM-DD.json` and `Todoist-Actual-Backup-YYYY-MM-DD.html` in the current directory.
4. To see usage instructions, run the script with no arguments or any argument other than `export`.
This will create `output/Todoist-Actual-Backup-YYYY-MM-DD.json` and `output/Todoist-Actual-Backup-YYYY-MM-DD.html`, and it will update `output/attachments/` with any downloaded files while leaving `Todoist-Completed-History.json` in the project root.
Keep `Todoist-Completed-History.json` somewhere safe (e.g., in source control or a backup location); it is the only way the exporter can retain completions older than Todoist's 90-day API retention window.
## Requirements
- Python 3.8+
- [todoist-api-python](https://doist.github.io/todoist-api-python/)
- [Jinja2](https://palletsprojects.com/p/jinja/)

## License
MIT
## MIT License
Copyright © 2025 bagaag.com

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
export_todoist.py

@@ -1,12 +1,34 @@
import os
import sys
import json
import time
import getpass
import shutil
import copy
from collections import defaultdict
from urllib.parse import quote_plus
import requests
from datetime import datetime, timedelta
from todoist_api_python.api import TodoistAPI
from jinja2 import Environment, FileSystemLoader, select_autoescape

ATTACHMENTS_DIR = "attachments"
OUTPUT_DIR = "output"
ATTACHMENTS_DIR = os.path.join(OUTPUT_DIR, "attachments")
LEGACY_ATTACHMENTS_DIR = "attachments"
TODOIST_API_TOKEN: str | None = None
COMPLETED_HISTORY_FILE = "Todoist-Completed-History.json"
COMMENT_REQUEST_MIN_INTERVAL = 0.5  # seconds
COMMENT_MAX_ATTEMPTS = 8
PROJECTS_URL = "https://api.todoist.com/rest/v2/projects"
TASKS_URL = "https://api.todoist.com/rest/v2/tasks"
COMPLETED_TASKS_URL = "https://api.todoist.com/api/v1/tasks/completed/by_completion_date"
COMMENTS_URL = "https://api.todoist.com/api/v1/comments"


def json_serial(obj):
    if isinstance(obj, datetime):
        return obj.isoformat()
    return str(obj)
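The `json_serial` helper above is meant as a `default=` hook for the `json` module: datetimes become ISO 8601 strings and anything else JSON can't encode falls back to `str()`. A standalone sketch of how it plugs into `json.dumps`:

```python
import json
from datetime import datetime

def json_serial(obj):
    # Mirror of the helper above: datetimes serialize as ISO 8601,
    # anything else non-serializable falls back to str().
    if isinstance(obj, datetime):
        return obj.isoformat()
    return str(obj)

payload = {"completed_at": datetime(2025, 1, 2, 3, 4, 5), "id": 42}
encoded = json.dumps(payload, default=json_serial)
# encoded == '{"completed_at": "2025-01-02T03:04:05", "id": 42}'
```

The `default=` callable is only invoked for objects the encoder cannot handle natively, so plain strings and numbers pass through untouched.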


def usage():

@@ -26,89 +48,392 @@ def usage():
def get_api_key():
    key = os.environ.get("TODOIST_KEY")
    if not key:
        print("Error: TODOIST_KEY environment variable not set.")
        sys.exit(1)
        try:
            key = getpass.getpass("The TODOIST_KEY environment variable is not set. Enter TODOIST API key to continue: ").strip()
        except (EOFError, KeyboardInterrupt):
            print("\nError: TODOIST API key is required.")
            sys.exit(1)
        if not key:
            print("Error: TODOIST API key is required.")
            sys.exit(1)
        os.environ["TODOIST_KEY"] = key
    return key
def ensure_output_dir():
    if not os.path.exists(OUTPUT_DIR):
        os.makedirs(OUTPUT_DIR, exist_ok=True)


def ensure_attachments_dir():
    ensure_output_dir()
    if os.path.isdir(LEGACY_ATTACHMENTS_DIR) and LEGACY_ATTACHMENTS_DIR != ATTACHMENTS_DIR:
        try:
            if not os.path.exists(ATTACHMENTS_DIR):
                shutil.move(LEGACY_ATTACHMENTS_DIR, ATTACHMENTS_DIR)
            else:
                for name in os.listdir(LEGACY_ATTACHMENTS_DIR):
                    shutil.move(
                        os.path.join(LEGACY_ATTACHMENTS_DIR, name),
                        os.path.join(ATTACHMENTS_DIR, name),
                    )
                os.rmdir(LEGACY_ATTACHMENTS_DIR)
            print(f"Moved legacy attachments into {ATTACHMENTS_DIR}")
        except (OSError, shutil.Error) as exc:  # pylint: disable=broad-except
            print(f"Warning: failed to migrate legacy attachments: {exc}")
    if not os.path.exists(ATTACHMENTS_DIR):
        os.makedirs(ATTACHMENTS_DIR)
    os.makedirs(ATTACHMENTS_DIR, exist_ok=True)
def load_completed_history():
    if not os.path.exists(COMPLETED_HISTORY_FILE):
        return {}
    try:
        with open(COMPLETED_HISTORY_FILE, "r", encoding="utf-8") as handle:
            data = json.load(handle)
    except (OSError, json.JSONDecodeError) as exc:  # pylint: disable=broad-except
        print(f"Warning: failed to load completed history ({exc}). Starting fresh.")
        return {}
    if isinstance(data, dict):
        history = {}
        for key, value in data.items():
            if isinstance(value, list):
                history[str(key)] = value
        return history
    if isinstance(data, list):
        history = defaultdict(list)
        for item in data:
            if isinstance(item, dict):
                project_id = str(item.get("project_id", ""))
                if project_id:
                    history[project_id].append(item)
        return {key: value for key, value in history.items()}
    return {}


def save_completed_history(history):
    try:
        with open(COMPLETED_HISTORY_FILE, "w", encoding="utf-8") as handle:
            json.dump(history, handle, ensure_ascii=False, indent=2, default=json_serial)
    except OSError as exc:  # pylint: disable=broad-except
        print(f"Warning: failed to write completed history ({exc}).")
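The history file maps project IDs to lists of completed-task dicts, and the loader above tolerates both that shape and a legacy flat list of tasks. A standalone sketch of the normalization (the function name `normalize_history` is mine, for illustration):

```python
from collections import defaultdict

def normalize_history(data):
    # Dict form {project_id: [task, ...]} is kept, dropping non-list values.
    if isinstance(data, dict):
        return {str(k): v for k, v in data.items() if isinstance(v, list)}
    # Legacy flat-list form: group task dicts by their project_id.
    history = defaultdict(list)
    if isinstance(data, list):
        for item in data:
            if isinstance(item, dict) and item.get("project_id"):
                history[str(item["project_id"])].append(item)
    return dict(history)

legacy = [{"id": "1", "project_id": "p1"}, {"id": "2", "project_id": "p1"}]
normalized = normalize_history(legacy)
```

Either input shape yields the same `{project_id: [tasks]}` structure the rest of the script expects.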
def normalize_timestamp(value):
    if not value:
        return ""
    if isinstance(value, datetime):
        return value.isoformat()
    return str(value)


def make_completed_task_key_from_dict(task):
    task_id = str(task.get('id', '')) if isinstance(task, dict) else ""
    if not task_id:
        return None
    completed_at = normalize_timestamp(task.get('completed_at'))
    if not completed_at:
        completed_at = normalize_timestamp(task.get('updated_at'))
    return (task_id, completed_at)


def make_completed_task_key_from_api(task):
    task_id = getattr(task, "id", None)
    if not task_id:
        return None
    completed_at = normalize_timestamp(getattr(task, "completed_at", None))
    if not completed_at:
        completed_at = normalize_timestamp(getattr(task, "updated_at", None))
    return (str(task_id), completed_at)
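Both key builders above reduce a completed task to an `(id, completed_at)` tuple, falling back to `updated_at` when the completion timestamp is absent, so the same completion can be matched across API objects and cached history dicts. A condensed sketch of the dict variant:

```python
def make_key(task):
    # (task id, completion timestamp) identifies one completion;
    # fall back to updated_at when completed_at is missing.
    task_id = str(task.get("id", ""))
    if not task_id:
        return None
    stamp = str(task.get("completed_at") or task.get("updated_at") or "")
    return (task_id, stamp)

key_a = make_key({"id": 7, "completed_at": "2025-06-01T10:00:00Z"})
key_b = make_key({"id": 7, "updated_at": "2025-06-02T09:00:00Z"})
```

Using a tuple (rather than the id alone) lets a task that was completed, reopened, and completed again appear as two distinct history entries.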
def merge_completed_lists(history_tasks, new_tasks):
    merged = []
    index_by_key = {}

    def merge_task_dicts(primary, secondary, prefer_primary=True):
        for key, value in secondary.items():
            if key == 'comments':
                if (not primary.get('comments')) and value:
                    primary['comments'] = value
                continue
            if key == 'attachments':
                if (not primary.get('attachments')) and value:
                    primary['attachments'] = value
                continue
            if key not in primary or primary[key] in (None, "", [], {}):
                primary[key] = value
                continue
            if not prefer_primary:
                primary[key] = value
        return primary

    def add_or_merge(task, prefer_existing=True):
        key = make_completed_task_key_from_dict(task)
        if key is None:
            merged.append(task)
            return
        if key in index_by_key:
            idx = index_by_key[key]
            merge_task_dicts(merged[idx], task, prefer_primary=prefer_existing)
        else:
            merged.append(task)
            index_by_key[key] = len(merged) - 1

    for item in new_tasks:
        add_or_merge(item, prefer_existing=True)
    for item in history_tasks:
        add_or_merge(item, prefer_existing=True)

    def sort_key(task):
        completed_at = normalize_timestamp(task.get('completed_at'))
        updated_at = normalize_timestamp(task.get('updated_at'))
        return (completed_at, updated_at)

    merged.sort(key=sort_key, reverse=True)
    return merged
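The merge above gives freshly fetched data precedence but backfills fields the fresh copy lacks (or holds empty) from the cached history, which is how old comments and attachments survive the 90-day window. A minimal sketch of that field-merge rule in isolation:

```python
def merge_task_dicts(primary, secondary):
    # Keep primary's values; copy a field from secondary only when
    # primary lacks it or holds an empty value.
    for key, value in secondary.items():
        if key not in primary or primary[key] in (None, "", [], {}):
            primary[key] = value
    return primary

fresh = {"id": "1", "content": "Buy milk", "comments": []}
cached = {"id": "1", "content": "old text", "comments": [{"content": "done!"}]}
merged = merge_task_dicts(fresh, cached)
# content keeps the fresh value; the empty comments list is backfilled from cache
```

Membership in `(None, "", [], {})` treats all empty containers as "missing", so a cached non-empty list always wins over a fresh empty one.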
def _file_looks_like_html(path):
    try:
        with open(path, 'rb') as handle:
            prefix = handle.read(256)
    except OSError:
        return False
    if not prefix:
        return True
    snippet = prefix.lstrip().lower()
    return snippet.startswith(b"<!doctype") or snippet.startswith(b"<html")
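A failed attachment download can leave an HTML error page on disk where a binary file should be, so the helper above sniffs the first bytes. The core check, extracted as a sketch over a byte prefix:

```python
def looks_like_html(prefix: bytes) -> bool:
    # Empty files also return True, so the caller replaces them too.
    if not prefix:
        return True
    snippet = prefix.lstrip().lower()
    return snippet.startswith(b"<!doctype") or snippet.startswith(b"<html")

html_hit = looks_like_html(b"<!DOCTYPE html><html><body>Sign in</body></html>")
png_hit = looks_like_html(b"\x89PNG\r\n\x1a\n")
```

Lowercasing and stripping leading whitespace makes the check robust to `<!DOCTYPE HTML>` variants and indented responses.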
def download_attachment(url, filename):
    local_path = os.path.join(ATTACHMENTS_DIR, filename)
    if os.path.exists(local_path):
        return local_path
        if _file_looks_like_html(local_path) and not filename.lower().endswith(('.htm', '.html')):
            try:
                os.remove(local_path)
            except OSError:
                pass
        else:
            return local_path
    print(f"Downloading attachment {url}")
    r = requests.get(url, stream=True)
    if r.status_code == 200:
        with open(local_path, 'wb') as f:
            for chunk in r.iter_content(1024):
                f.write(chunk)
        return local_path
    else:
        headers = {}
    if TODOIST_API_TOKEN:
        headers["Authorization"] = f"Bearer {TODOIST_API_TOKEN}"
    try:
        response = requests.get(url, stream=True, headers=headers, timeout=30)
    except requests.RequestException as exc:  # pylint: disable=broad-except
        print(f"Failed to download attachment {url}: {exc}")
        return None
    if response.status_code != 200:
        print(f"Failed to download attachment {url}: HTTP {response.status_code}")
        return None
    content_type = (response.headers.get("Content-Type") or "").lower()
    first_chunk = b""
    try:
        with open(local_path, 'wb') as handle:
            for chunk in response.iter_content(chunk_size=8192):
                if not chunk:
                    continue
                if not first_chunk:
                    first_chunk = chunk
                handle.write(chunk)
    except OSError as exc:  # pylint: disable=broad-except
        print(f"Failed to save attachment (unknown): {exc}")
        return None
    looks_like_html = (
        "text/html" in content_type
        or (first_chunk and _file_looks_like_html(local_path))
    )
    if looks_like_html and not filename.lower().endswith(('.htm', '.html')):
        try:
            os.remove(local_path)
        except OSError:
            pass
        print(f"Skipped attachment {url}: received HTML response instead of file")
        return None
    print(f"Downloaded attachment {url}")
    return local_path
def _get_retry_delay(response, attempt, base_delay=5, max_delay=120):
    if response is not None:
        headers = getattr(response, "headers", {}) or {}
        retry_after = headers.get("Retry-After") or headers.get("retry-after")
        if retry_after:
            try:
                return max(1, int(float(retry_after)))
            except (TypeError, ValueError):
                pass
        reset_header = headers.get("X-RateLimit-Reset") or headers.get("x-rate-limit-reset")
        if reset_header:
            try:
                reset_timestamp = float(reset_header)
                return max(1, int(reset_timestamp - time.time()))
            except (TypeError, ValueError):
                pass
    return min(max_delay, base_delay * (2 ** attempt))
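The delay calculation above prefers whatever the server says (a `Retry-After` header, then a rate-limit reset timestamp) and only then falls back to capped exponential backoff. A sketch of the two most common branches:

```python
def retry_delay(headers, attempt, base_delay=5, max_delay=120):
    # Honor Retry-After when the server sends one...
    retry_after = headers.get("Retry-After")
    if retry_after:
        try:
            return max(1, int(float(retry_after)))
        except (TypeError, ValueError):
            pass
    # ...otherwise back off exponentially, capped at max_delay.
    return min(max_delay, base_delay * (2 ** attempt))

served = retry_delay({"Retry-After": "30"}, attempt=0)
backoff = retry_delay({}, attempt=3)
```

With `base_delay=5` the fallback sequence is 5, 10, 20, 40, ... seconds, clamped to 120.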
def execute_with_rate_limit(func, *args, max_attempts=5, request_desc=None, **kwargs):
    attempts = 0
    desc = request_desc or getattr(func, "__name__", "call")
    while True:
        try:
            print(f"  Calling {desc}")
            return func(*args, **kwargs)
        except Exception as error:  # pylint: disable=broad-except
            status_code = getattr(error, "status_code", None)
            response = getattr(error, "response", None)
            if status_code is None and response is not None:
                status_code = getattr(response, "status_code", None)
            if status_code == 429 and attempts < max_attempts:
                delay = _get_retry_delay(response, attempts)
                attempts += 1
                print(f"  Rate limit hit for {desc}. Waiting {delay} seconds before retry {attempts}/{max_attempts}...")
                if delay > 1:
                    print(f"  Waiting {delay} seconds due to rate limiting")
                time.sleep(delay)
                continue
            raise
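The wrapper above retries a callable only on HTTP 429 and re-raises everything else. A self-contained sketch with a fake flaky call standing in for the Todoist client (names here are illustrative, not from the library):

```python
import time

class RateLimited(Exception):
    def __init__(self):
        super().__init__("429 Too Many Requests")
        self.status_code = 429

def with_rate_limit(func, max_attempts=5, delay=0.01):
    # Retry only 429s, up to max_attempts; any other error propagates.
    attempts = 0
    while True:
        try:
            return func()
        except Exception as error:
            if getattr(error, "status_code", None) == 429 and attempts < max_attempts:
                attempts += 1
                time.sleep(delay)
                continue
            raise

calls = {"n": 0}
def flaky():
    # Fails with 429 twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimited()
    return "ok"

result = with_rate_limit(flaky)
```

Catching broad `Exception` and inspecting `status_code` mirrors the script's approach, since the client library's exception types may vary.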
def fetch_all_projects(api):
    ret = []
    projects_iter = api.get_projects()
    for projects in projects_iter:
        for project in projects:
            name = getattr(project, 'name', None)
            id = getattr(project, 'id', None)
            print(f"Found project {name} with ID {id}")
            ret.append(project)
    return ret


def fetch_all_completed_tasks(api, project_id):
    # Fetch all completed tasks for a project using get_completed_tasks_by_completion_date
    # The API only allows up to 3 months per call, so we fetch just once for the last 3 months
    all_completed = []
    since = (datetime.now() - timedelta(days=90)).replace(hour=0, minute=0, second=0, microsecond=0)
    until = datetime.now()
    projects_by_id = {}
    try:
        completed_iter = api.get_completed_tasks_by_completion_date(since=since, until=until)
        for completed_list in completed_iter:
            for task in completed_list:
                if hasattr(task, 'project_id') and str(task.project_id) == str(project_id):
                    all_completed.append(task)
    except Exception as e:
        print(f"Error fetching completed tasks for {since} to {until}: {e}")
    print(f"Found {len(all_completed)} completed tasks for project {project_id}")
    return all_completed


def fetch_all_tasks(api, project_id, completed=False):
    if completed:
        return fetch_all_completed_tasks(api, project_id)
    else:
        tasks = []
        try:
            tasks_iter = api.get_tasks(project_id=project_id)
            for batch in tasks_iter:
                for task in batch:
                    tasks.append(task)
        except Exception as e:
            print(f"Error fetching active tasks for project {project_id}: {e}")
        print(f"Found {len(tasks)} active tasks for project {project_id}")
        return tasks
    projects_iter = execute_with_rate_limit(
        api.get_projects,
        request_desc=f"GET {PROJECTS_URL}"
    )
    for batch in projects_iter:
        for project in batch:
            projects_by_id[str(getattr(project, "id", ""))] = project
    except Exception as error:  # pylint: disable=broad-except
        print(f"Error fetching projects: {error}")
    return list(projects_by_id.values())
def fetch_comments(api, task_id):
    comments = []


def fetch_active_tasks_by_project(api):
    tasks_by_project = defaultdict(list)
    try:
        comments_iter = api.get_comments(task_id=task_id)
        for batch in comments_iter:
            for comment in batch:
                comments.append(comment)
    except Exception:
        return []
    return comments
    tasks_iter = execute_with_rate_limit(
        api.get_tasks,
        request_desc=f"GET {TASKS_URL}"
    )
    for batch in tasks_iter:
        for task in batch:
            tasks_by_project[str(getattr(task, "project_id", ""))].append(task)
    except Exception as error:  # pylint: disable=broad-except
        print(f"Error fetching active tasks: {error}")
    print(f"Fetched active tasks for {len(tasks_by_project)} projects")
    return tasks_by_project
def process_task(api, task, completed=False):


def fetch_completed_tasks_by_project(api, since, until):
    tasks_by_project = defaultdict(list)
    try:
        query = f"?since={since.isoformat()}&until={until.isoformat()}"
        completed_iter = execute_with_rate_limit(
            api.get_completed_tasks_by_completion_date,
            request_desc=f"GET {COMPLETED_TASKS_URL}{query}",
            since=since,
            until=until,
        )
        for batch in completed_iter:
            for task in batch:
                tasks_by_project[str(getattr(task, "project_id", ""))].append(task)
    except Exception as error:  # pylint: disable=broad-except
        print(f"Error fetching completed tasks between {since} and {until}: {error}")
    print(f"Fetched completed tasks for {len(tasks_by_project)} projects")
    return tasks_by_project
def fetch_comments_by_task(api, project_ids, task_ids):
    comments_by_task = defaultdict(list)
    total_comments = 0
    last_comment_call = 0.0

    def throttled_get_comments(**kwargs):
        nonlocal last_comment_call
        elapsed = time.time() - last_comment_call
        if elapsed < COMMENT_REQUEST_MIN_INTERVAL:
            time.sleep(COMMENT_REQUEST_MIN_INTERVAL - elapsed)
        params = []
        for key, value in kwargs.items():
            if value is None:
                continue
            params.append(f"{key}={quote_plus(str(value))}")
        query = "&".join(params)
        desc = f"GET {COMMENTS_URL}{('?' + query) if query else ''}"
        result = execute_with_rate_limit(
            api.get_comments,
            max_attempts=COMMENT_MAX_ATTEMPTS,
            request_desc=desc,
            **kwargs,
        )
        last_comment_call = time.time()
        return result

    def handle_comment_error(scope, identifier, error):
        status_code = getattr(error, "status_code", None)
        response = getattr(error, "response", None)
        if status_code is None and response is not None:
            status_code = getattr(response, "status_code", None)
        if status_code == 404:
            print(f"  Comments not found for {scope} {identifier} (404). Skipping.")
            return False
        if status_code == 429:
            delay = _get_retry_delay(response, COMMENT_MAX_ATTEMPTS)
            print(
                f"  Rate limit while fetching comments for {scope} {identifier} after retries; waiting {delay} seconds before continuing."
            )
            if delay > 1:
                print(f"  Waiting {delay} seconds due to rate limiting")
            time.sleep(delay)
            return True
        print(f"  Error fetching comments for {scope} {identifier}: {error}")
        return False

    for project_id in project_ids:
        while True:
            try:
                comments_iter = throttled_get_comments(project_id=project_id)
                for batch in comments_iter:
                    for comment in batch:
                        task_id = str(getattr(comment, "task_id", ""))
                        if task_id:
                            comments_by_task[task_id].append(comment)
                            total_comments += 1
                break
            except Exception as error:  # pylint: disable=broad-except
                if not handle_comment_error("project", project_id, error):
                    break
    missing_task_ids = [task_id for task_id in task_ids if task_id not in comments_by_task]
    for task_id in missing_task_ids:
        while True:
            try:
                comments_iter = throttled_get_comments(task_id=task_id)
                for batch in comments_iter:
                    for comment in batch:
                        key = str(getattr(comment, "task_id", ""))
                        if key:
                            comments_by_task[key].append(comment)
                            total_comments += 1
                break
            except Exception as error:  # pylint: disable=broad-except
                if not handle_comment_error("task", task_id, error):
                    break
    print(
        f"Fetched {total_comments} comments mapped to {len(comments_by_task)} tasks"
    )
    return comments_by_task
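The `throttled_get_comments` closure above spaces comment requests at least `COMMENT_REQUEST_MIN_INTERVAL` apart by sleeping off the remainder since the previous call. A standalone sketch of that min-interval throttle (with a shorter interval so it runs quickly):

```python
import time

MIN_INTERVAL = 0.05  # seconds between calls; the script uses 0.5

def make_throttled(func):
    last_call = 0.0
    def wrapper(*args, **kwargs):
        nonlocal last_call
        # Sleep off whatever is left of the minimum spacing.
        elapsed = time.time() - last_call
        if elapsed < MIN_INTERVAL:
            time.sleep(MIN_INTERVAL - elapsed)
        result = func(*args, **kwargs)
        last_call = time.time()
        return result
    return wrapper

ping = make_throttled(lambda: "pong")
start = time.time()
for _ in range(3):
    ping()
duration = time.time() - start  # calls 2 and 3 each waited out the interval
```

Throttling proactively like this avoids most 429s before the retry machinery ever has to kick in.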
def process_task(task, comments_lookup):
    task_dict = task.__dict__.copy()
    task_id = getattr(task, "id", None) or getattr(task, "task_id", None)
    if task_id is not None:
        task_dict.setdefault("id", task_id)
    # Attachments (if any)
    attachments = []
    if hasattr(task, 'attachments') and task.attachments:

@@ -118,55 +443,199 @@ def process_task(api, task, completed=False):
        filename = att_dict.get('file_name') or os.path.basename(att_dict['file_url'])
        local_path = download_attachment(att_dict['file_url'], filename)
        if local_path:
            att_dict['local_file'] = os.path.relpath(local_path)
            att_dict['local_file'] = os.path.relpath(local_path, OUTPUT_DIR)
        attachments.append(att_dict)
    if attachments:
        task_dict['attachments'] = attachments
    # Comments
    comments = fetch_comments(api, task.id)
    if comments:
        task_dict['comments'] = [c.__dict__ for c in comments]
    comment_key = str(task_id) if task_id is not None else None
    if comment_key and comment_key in comments_lookup:
        serialized_comments = []
        for comment in comments_lookup[comment_key]:
            comment_dict = comment.__dict__.copy()
            attachment = getattr(comment, "attachment", None)
            if attachment:
                attachment_dict = attachment.__dict__.copy()
                file_url = attachment_dict.get("file_url")
                if file_url:
                    filename = attachment_dict.get("file_name") or os.path.basename(file_url)
                    local_path = download_attachment(file_url, filename)
                    if local_path:
                        attachment_dict['local_file'] = os.path.relpath(local_path, OUTPUT_DIR)
                comment_dict['attachment'] = attachment_dict
            serialized_comments.append(comment_dict)
        task_dict['comments'] = serialized_comments
    return task_dict
def build_task_hierarchy(task_dicts):
    task_lookup = {}
    order_lookup = {}
    for index, task in enumerate(task_dicts):
        task_id = task.get('id')
        if task_id is None:
            continue
        task_lookup[str(task_id)] = task
        order_lookup[str(task_id)] = index
        task.setdefault('subtasks', [])

    roots = []
    for task in task_dicts:
        task_id = task.get('id')
        if task_id is None:
            roots.append(task)
            continue
        parent_id = task.get('parent_id')
        if parent_id:
            parent = task_lookup.get(str(parent_id))
            if parent:
                parent.setdefault('subtasks', [])
                parent['subtasks'].append(task)
                continue
        roots.append(task)

    def sort_children(children):
        children.sort(key=lambda item: order_lookup.get(str(item.get('id')), 0))
        for child in children:
            child_children = child.get('subtasks') or []
            if child_children:
                sort_children(child_children)

    sort_children(roots)

    # Remove empty subtasks lists for cleanliness
    def prune(task):
        subtasks = task.get('subtasks')
        if subtasks:
            for sub in subtasks:
                prune(sub)
        else:
            task.pop('subtasks', None)

    for root in roots:
        prune(root)

    return roots
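The hierarchy builder above turns a flat task list into a tree in two passes: first index every task by id, then attach each task to its `parent_id` (or to the root list when it has no reachable parent). A condensed sketch of the core idea:

```python
def build_hierarchy(tasks):
    # Pass 1: index tasks by id, giving each an empty subtasks list.
    lookup = {str(t["id"]): dict(t, subtasks=[]) for t in tasks}
    # Pass 2: hang each task off its parent, or treat it as a root.
    roots = []
    for task in lookup.values():
        parent = lookup.get(str(task.get("parent_id")))
        if parent:
            parent["subtasks"].append(task)
        else:
            roots.append(task)
    return roots

flat = [
    {"id": 1, "content": "Plan trip", "parent_id": None},
    {"id": 2, "content": "Book flights", "parent_id": 1},
    {"id": 3, "content": "Book hotel", "parent_id": 1},
]
tree = build_hierarchy(flat)
```

The two-pass shape means parents can appear after their children in the input without breaking the nesting.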
def main():
    if len(sys.argv) != 2 or sys.argv[1] != "export":
        usage()
        return
    ensure_attachments_dir()
    api = TodoistAPI(get_api_key())
    token = get_api_key()
    global TODOIST_API_TOKEN  # pylint: disable=global-statement
    TODOIST_API_TOKEN = token
    api = TodoistAPI(token)
    projects = fetch_all_projects(api)
    since = (datetime.now() - timedelta(days=90)).replace(hour=0, minute=0, second=0, microsecond=0)
    until = datetime.now()
    active_tasks_by_project = fetch_active_tasks_by_project(api)
    completed_tasks_by_project = fetch_completed_tasks_by_project(api, since=since, until=until)
    completed_history = load_completed_history()
    history_by_key = {}
    for task_list in completed_history.values():
        for stored_task in task_list:
            key = make_completed_task_key_from_dict(stored_task)
            if key:
                history_by_key[key] = stored_task

    active_comment_project_ids = sorted(
        pid
        for pid, tasks in active_tasks_by_project.items()
        if pid and tasks
    )
    completed_task_ids_for_comments: set[str] = set()
    skipped_completed_history = {}
    for task_list in completed_tasks_by_project.values():
        for task in task_list:
            key = make_completed_task_key_from_api(task)
            if key is None:
                continue
            history_entry = history_by_key.get(key)
            if history_entry:
                skipped_completed_history[key] = history_entry
            else:
                completed_task_ids_for_comments.add(key[0])

    comments_by_task = fetch_comments_by_task(
        api,
        active_comment_project_ids,
        sorted(completed_task_ids_for_comments),
    )
    updated_history = {}
    data = []
    for project in projects:
        project_dict = project.__dict__.copy()
        project_id = project.id
        # Active tasks
        active_tasks = fetch_all_tasks(api, project_id, completed=False)
        # Completed tasks
        completed_tasks = fetch_all_tasks(api, project_id, completed=True)
        project_dict['tasks'] = [process_task(api, t, completed=False) for t in active_tasks]
        project_dict['completed_tasks'] = [process_task(api, t, completed=True) for t in completed_tasks]
        project_id = str(getattr(project, "id", ""))
        active_tasks = active_tasks_by_project.get(project_id, [])
        completed_tasks = completed_tasks_by_project.get(project_id, [])

        processed_active = [process_task(t, comments_by_task) for t in active_tasks]
        processed_completed = [process_task(t, comments_by_task) for t in completed_tasks]

        for task in processed_completed:
            key = make_completed_task_key_from_dict(task)
            history_entry = skipped_completed_history.get(key) if key else None
            if history_entry:
                if (not task.get('comments')) and history_entry.get('comments'):
                    task['comments'] = copy.deepcopy(history_entry['comments'])
                if (not task.get('attachments')) and history_entry.get('attachments'):
                    task['attachments'] = copy.deepcopy(history_entry['attachments'])

        # Build hierarchy for active tasks
        project_dict['tasks'] = build_task_hierarchy(processed_active)

        # Map task IDs to names for parent lookups
        name_lookup = {}
        for task in active_tasks + completed_tasks:
            task_id = getattr(task, "id", None)
            if task_id:
                name_lookup[str(task_id)] = getattr(task, "content", "")

        for task in processed_completed:
            parent_id = task.get('parent_id')
            if parent_id:
                parent_name = name_lookup.get(str(parent_id))
                if parent_name:
                    task['parent_task'] = {
                        "id": str(parent_id),
                        "content": parent_name,
                    }

        historical = completed_history.get(project_id, [])
        merged_completed = merge_completed_lists(historical, processed_completed)
        project_dict['completed_tasks'] = merged_completed
        updated_history[project_id] = merged_completed
        data.append(project_dict)
    for project_id, tasks in completed_history.items():
        if project_id not in updated_history:
            updated_history[project_id] = tasks
    save_completed_history(updated_history)
    # Write JSON
    today = datetime.now().strftime("%Y-%m-%d")
    json_filename = f"Todoist-Actual-Backup-{today}.json"
    def json_serial(obj):
        if isinstance(obj, datetime):
            return obj.isoformat()
        return str(obj)
    with open(json_filename, "w", encoding="utf-8") as f:
    json_output_path = os.path.join(OUTPUT_DIR, json_filename)
    with open(json_output_path, "w", encoding="utf-8") as f:
        json.dump(data, f, ensure_ascii=False, indent=2, default=json_serial)
    print(f"Exported data to {json_filename}")
    print(f"Exported data to {json_output_path}")
    # Write HTML
    env = Environment(
        loader=FileSystemLoader(os.path.dirname(__file__)),
        autoescape=select_autoescape(['html', 'xml'])
    )
    # Add markdown filter
    try:
        import markdown
        env.filters['markdown'] = lambda text: markdown.markdown(text or "")
    except ImportError:
        env.filters['markdown'] = lambda text: text or ""
    template = env.get_template("todoist_backup_template.html")
    html_filename = f"Todoist-Actual-Backup-{today}.html"
    with open(html_filename, "w", encoding="utf-8") as f:
    html_output_path = os.path.join(OUTPUT_DIR, html_filename)
    with open(html_output_path, "w", encoding="utf-8") as f:
        f.write(template.render(projects=data, date=today))
    print(f"Generated HTML backup at {html_filename}")
    print(f"Generated HTML backup at {html_output_path}")


if __name__ == "__main__":
    main()
@@ -1,3 +1,4 @@
todoist-api-python
Jinja2
requests
markdown
@@ -2,26 +2,122 @@
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Todoist Backup - {{ date }}</title>
    <title>Todoist Actual Backup - {{ date }}</title>
    <style>
        body { font-family: Arial, sans-serif; background: #f8f9fa; color: #222; margin: 0; padding: 0; }
        .container { max-width: 900px; margin: 2em auto; background: #fff; padding: 2em; border-radius: 8px; box-shadow: 0 2px 8px #0001; }
        .container { max-width: 960px; margin: 2em auto; background: #fff; padding: 2em; border-radius: 8px; box-shadow: 0 2px 8px #0001; }
        h1, h2, h3 { color: #2d72d9; }
        .project { margin-bottom: 2em; }
        .task-list { margin: 0 0 1em 1em; }
        .task { border-bottom: 1px solid #eee; padding: 0.5em 0; }
        .completed { color: #888; }
        .attachments { margin: 0.5em 0 0.5em 1em; }
        .comments { margin: 0.5em 0 0.5em 1em; font-size: 0.95em; color: #444; }
        .field { font-weight: bold; }
        a.attachment-link { color: #2d72d9; text-decoration: underline; }
        .meta { color: #666; font-size: 0.95em; }
        nav ul { list-style: none; padding: 0; margin: 0; }
        nav li { margin: 0.25em 0; }
        nav a { text-decoration: none; color: #2d72d9; }
        nav a:hover { text-decoration: underline; }
        .project { margin-bottom: 3em; }
        .task-list { margin: 0 0 1em 0; }
        .task { border-bottom: 1px solid #eee; padding: 0 0 0.75em; }
        .task:last-child { border-bottom: none; }
        .task.level-0 { margin-left: 0; }
        .task.level-1 { margin-left: 1.5em; }
        .task.level-2 { margin-left: 3em; }
        .task.level-3 { margin-left: 4.5em; }
        .task-name { font-weight: 600; }
        .task-desc { margin: 0.35em 0; color: #555; }
        .meta { color: #777; font-size: 0.9em; display: inline-block; margin-top: 0.25em; }
        .field-name { font-style: italic; }
        .attachments ul,
        .comments ul { margin: 0.5em 0 0 1.2em; }
        .attachments li,
        .comments li { margin-bottom: 0.35em; }
        .attachment-link { color: #2d72d9; }
        .attachment-link:hover { text-decoration: underline; }
        .comments { margin-top: 0.5em; }
        .comment-attachment { margin-top: 0.25em; }
        .task.completed { background: #f3f6ff; padding: 0.75em; border-radius: 6px; margin-bottom: 0.75em; }
        .task.completed .task-name p { margin-top: 0; }
    </style>
</head>
<body>
<div class="container">
    <h1>Todoist Backup ({{ date }})</h1>
    <!-- Table of Contents -->
    <h1>Todoist Actual Backup ({{ date }})</h1>

    {% macro render_task(task, level=0) %}
    <div class="task level-{{ level }}">
        <div class="task-name">{{ task.content | markdown | safe }}</div>
        {% if task.description %}
        <div class="task-desc">{{ task.description | markdown | safe }}</div>
        {% endif %}
        <span class="meta">
            {% set meta_fields = [] %}
            {% if task.id is not none %}
                {% set _ = meta_fields.append('ID: ' ~ task.id) %}
            {% endif %}
            {% if task.due and task.due.date %}
                {% set due_dt = task.due.date %}
                {% if due_dt.__class__.__name__ == 'datetime' or due_dt.__class__.__name__ == 'date' %}
                    {% set due_fmt = due_dt.strftime('%Y-%m-%d') %}
                {% else %}
                    {% set due_str = due_dt|string %}
                    {% if 'T' in due_str %}
                        {% set due_fmt = due_str[:10] %}
                    {% else %}
                        {% set due_fmt = due_str %}
                    {% endif %}
                {% endif %}
                {% set _ = meta_fields.append('Due: ' ~ due_fmt) %}
            {% endif %}
            {% if task.due and task.due.is_recurring %}
                {% if task.due.string %}
                    {% set _ = meta_fields.append('Recurring: ' ~ task.due.string) %}
                {% endif %}
            {% endif %}
            {% if task.priority is not none %}
                {% set _ = meta_fields.append('Priority: ' ~ task.priority) %}
            {% endif %}
            {{ meta_fields|join(' | ') }}
        </span><br>
        {% if task.attachments %}
        <div class="attachments">
            <span class="field-name">Attachments:</span>
            <ul>
                {% for att in task.attachments %}
                <li><a class="attachment-link" href="{{ att.local_file }}" download>{{ att.file_name or att.local_file }}</a></li>
                {% endfor %}
            </ul>
        </div>
        {% endif %}
        {% if task.comments %}
        <div class="comments">
            <span class="field-name">Comments:</span>
            <ul>
                {% for comment in task.comments %}
                <li>
                    {{ comment.content | markdown | safe }}
                    <span class="meta">({{ comment.posted_at }})</span>
                    {% set attachment = comment.attachment %}
                    {% if attachment and (attachment.local_file or attachment.file_url) %}
                    <div class="comment-attachment">
                        Attachment:
                        {% if attachment.local_file %}
                        <a class="attachment-link" href="{{ attachment.local_file }}" download>{{ attachment.file_name or attachment.local_file }}</a>
                        {% elif attachment.file_url %}
                        <a class="attachment-link" href="{{ attachment.file_url }}" target="_blank">{{ attachment.file_name or attachment.file_url }}</a>
                        {% endif %}
                    </div>
                    {% endif %}
                </li>
                {% endfor %}
            </ul>
        </div>
        {% endif %}
        {% if task.subtasks %}
        <div class="subtasks">
            {% for child in task.subtasks %}
            {{ render_task(child, level + 1) }}
            {% endfor %}
        </div>
        {% endif %}
    </div>
    {% endmacro %}

    <nav style="margin-bottom:2em;">
        <h2 style="font-size:1.2em;">Projects</h2>
        <ul>
@@ -30,50 +126,79 @@
            {% endfor %}
        </ul>
    </nav>

    {% for project in projects %}
    <div class="project" id="project-{{ project.id }}">
        <h2>{{ project.name }} {% if project.is_archived %}<span class="meta">[Archived]</span>{% endif %}</h2>
        <div class="meta">
            <span>ID: {{ project.id }}</span> | <span>Color: {{ project.color }}</span> | <span>Created: {{ project.created_at }}</span>
        </div>

        <h3>Active Tasks</h3>
        <div class="task-list">
            {% for task in project.tasks %}
            <div class="task">
                <span class="field">Content:</span> {{ task.content }}<br>
                <span class="meta">ID: {{ task.id }} | Due: {{ task.due }} | Priority: {{ task.priority }}</span><br>
                {% if task.attachments %}
                <div class="attachments">
                    <span class="field">Attachments:</span>
                    <ul>
                        {% for att in task.attachments %}
                        <li><a class="attachment-link" href="{{ att.local_file }}" download>{{ att.file_name or att.local_file }}</a></li>
                        {% endfor %}
                    </ul>
                </div>
                {% endif %}
                {% if task.comments %}
                <div class="comments">
                    <span class="field">Comments:</span>
                    <ul>
                        {% for comment in task.comments %}
                        <li>{{ comment.content }} <span class="meta">({{ comment.posted_at }})</span></li>
                        {% endfor %}
                    </ul>
                </div>
                {% endif %}
            </div>
            {{ render_task(task, 0) }}
            {% else %}
            <p class="meta">No active tasks.</p>
            {% endfor %}
        </div>

        <h3>Completed Tasks</h3>
        <div class="task-list">
            {% for task in project.completed_tasks %}
            <div class="task completed">
                <span class="field">Content:</span> {{ task.content }}<br>
                <span class="meta">ID: {{ task.id }} | Due: {{ task.due }} | Priority: {{ task.priority }}</span><br>
                <div class="task-name">{{ task.content | markdown | safe }}</div>
                {% if task.description %}
                <div class="task-desc">{{ task.description | markdown | safe }}</div>
                {% endif %}
                <span class="meta">
                    {% set meta_fields = [] %}
                    {% if task.id is not none %}
                        {% set _ = meta_fields.append('ID: ' ~ task.id) %}
                    {% endif %}
                    {% if task.due and task.due.date %}
                        {% set due_dt = task.due.date %}
                        {% if due_dt.__class__.__name__ == 'datetime' or due_dt.__class__.__name__ == 'date' %}
                            {% set due_fmt = due_dt.strftime('%Y-%m-%d') %}
                        {% else %}
                            {% set due_str = due_dt|string %}
                            {% if 'T' in due_str %}
                                {% set due_fmt = due_str[:10] %}
                            {% else %}
                                {% set due_fmt = due_str %}
                            {% endif %}
                        {% endif %}
                        {% set _ = meta_fields.append('Due: ' ~ due_fmt) %}
                    {% endif %}
                    {% if task.due and task.due.is_recurring %}
                        {% if task.due.string %}
                            {% set _ = meta_fields.append('Recurring: ' ~ task.due.string) %}
                        {% endif %}
                    {% endif %}
                    {% if task.priority is not none %}
                        {% set _ = meta_fields.append('Priority: ' ~ task.priority) %}
                    {% endif %}
                    {% if task.completed_at %}
                        {% if task.completed_at.__class__.__name__ == 'datetime' or task.completed_at.__class__.__name__ == 'date' %}
                            {% set completed_fmt = task.completed_at.strftime('%Y-%m-%d') %}
                        {% else %}
                            {% set completed_str = task.completed_at|string %}
                            {% if 'T' in completed_str %}
                                {% set completed_fmt = completed_str[:10] %}
                            {% else %}
                                {% set completed_fmt = completed_str %}
                            {% endif %}
                        {% endif %}
                        {% set _ = meta_fields.append('Completed: ' ~ completed_fmt) %}
                    {% endif %}
                    {{ meta_fields|join(' | ') }}
                </span><br>
                {% if task.parent_task %}
                <div class="meta">Parent task: {{ task.parent_task.content | markdown | safe }}</div>
                {% endif %}
                {% if task.attachments %}
                <div class="attachments">
                    <span class="field">Attachments:</span>
                    <span class="field-name">Attachments:</span>
                    <ul>
                        {% for att in task.attachments %}
                        <li><a class="attachment-link" href="{{ att.local_file }}" download>{{ att.file_name or att.local_file }}</a></li>
@@ -83,15 +208,31 @@
                {% endif %}
                {% if task.comments %}
                <div class="comments">
                    <span class="field">Comments:</span>
                    <span class="field-name">Comments:</span>
                    <ul>
                        {% for comment in task.comments %}
                        <li>{{ comment.content }} <span class="meta">({{ comment.posted_at }})</span></li>
                        <li>
                            {{ comment.content | markdown | safe }}
                            <span class="meta">({{ comment.posted_at }})</span>
                            {% set attachment = comment.attachment %}
                            {% if attachment and (attachment.local_file or attachment.file_url) %}
                            <div class="comment-attachment">
                                Attachment:
                                {% if attachment.local_file %}
                                <a class="attachment-link" href="{{ attachment.local_file }}" download>{{ attachment.file_name or attachment.local_file }}</a>
                                {% elif attachment.file_url %}
                                <a class="attachment-link" href="{{ attachment.file_url }}" target="_blank">{{ attachment.file_name or attachment.file_url }}</a>
                                {% endif %}
                            </div>
                            {% endif %}
                        </li>
                        {% endfor %}
                    </ul>
                </div>
                {% endif %}
            </div>
            {% else %}
            <p class="meta">No completed tasks in this period.</p>
            {% endfor %}
        </div>
    </div>