Over the past six months, I have been following anime on the Dimension Library, but it seems there were some issues recently that caused the anime updates to stop. So, I forced myself to create this anime tracking tool, Xiao Xiang's Nest.
The days spent building the nest were really bumpy and confusing, but I was genuinely happy as each problem was solved little by little. I hope this joy can be conveyed through this little article; I actually enjoyed the process a lot.
The materials for this little nest are:
- A server with 1 core, 1GB RAM, 10GB disk, 500GB monthly traffic, and 50Mbps bandwidth for $3 per month, installed with the Baota panel.
- A .top domain purchased from Porkbun, $1.6 for the first year and $4.9 for renewal.
- A brave heart.
The structure of this little nest is:
- Using Rclone to mount OneDrive for storing anime.
- qBittorrent automatically downloads anime via RSS.
- Configured qBittorrent to automatically upload completed downloads to OneDrive.
- Finally, using AList to mount OneDrive for web display.
The process of building this little nest is:
Obtain a Microsoft E5 account and configure OneDrive#
Reference
Free Microsoft E5 Developer Account to Use Office Genuine and OneDrive 5T Storage Space! PDF
The Microsoft E5 developer account gets you 5T of OneDrive storage for free, 26 times over: the main account and all 25 sub-accounts each have 5T.
Apply for an E5 Developer Account#
Go to the official website and click join now to apply at https://developer.microsoft.com/en-us/microsoft-365/dev-program, just follow the prompts.
After the application is completed, you will be redirected to this interface.
Here you can see the developer account you just applied for, “username@domain.onmicrosoft.com”.
Configure Sub-Accounts and OneDrive#
(Actually, you can use the main account directly, but I feel creating a sub-account is better.)
Now enter the Microsoft 365 admin center user management page at https://admin.microsoft.com/#/users.
Click Add a user to add a sub-account.
Then, for Assign product licenses, I am not sure exactly what to fill in; my sub-account was actually created automatically, and this is how I saw it filled in.
Just click Next until the addition is complete.
After adding the sub-account, click on the sub-account, and in the pop-up account management page, click OneDrive, find Storage used and click Edit to set the space to 5T. The default space when creating a user is 1T, and you can also click Manage default storage to modify it.
Now, log in to OneDrive using the sub-account, and you will see the 5T space.
About E5 Renewal Issues#
The first subscription of the E5 developer account will receive 90 days of subscription time, and there are no hard conditions for renewal. The key is whether Microsoft's algorithm thinks we are engaged in development activities.
The subsequent use of rclone and alist to upload and download anime daily will require calling the API of the application we created, so the automatic renewal success rate should still be quite high.
If automatic renewal fails, there is still a 30-day data retention period, and you can apply for another E5 account to transfer the data, or you can also contact me for help with storage. It seems that after 60 days of renewal failure, you can apply for an E5 account again.
Disable Microsoft Authenticator Verification#
After the account is created, you may see this interface when logging in.
How to Disable Forced Microsoft Authenticator Verification Login in Office365
Refer to the above, but the settings interface seems to have changed, so I explored it again.
In the Microsoft 365 admin center, click Show all in the menu, find Identity.
Clicking here will take you to the Microsoft Entra Admin Center.
Find Identity - Overview - Properties, then at the bottom of the page click [Manage Security Defaults] to set it to Disabled.
Now, when you log in again, the Microsoft Authenticator verification login will not appear.
Rclone Mount OneDrive#
Reference
Using Rclone to Mount OneDrive or Google Drive on Linux and Set to Start Automatically - Zhihu PDF
Using Rclone to Mount OneDrive on Debian/Ubuntu - Rat's Blog PDF
Rclone Advanced Usage Tutorial – Self-Built Private API Mounting OneDrive | AnYun.net – AnYun.ORG PDF
You can use the built-in API of Rclone (leave id and key blank), or you can build your own API to mount. Building your own API should be more helpful for E5 renewal.
Download Rclone for Windows to obtain the token; download link: https://rclone.org/downloads/.
Install Rclone on Linux: curl https://rclone.org/install.sh | bash
Build a Private API#
Building a private API mainly requires obtaining Client ID: client_id and Client Secret: client_secret.
Log in to https://portal.azure.com/#home, be careful to distinguish between the main account and sub-accounts, use the account you want to mount OneDrive.
Search for App registrations, click Services - App registrations, and click New registration.
Name it rclone (you can name it whatever you want).
Supported account types: Accounts in any organizational directory (...) and personal Microsoft accounts (...),
Redirect URI: Web http://localhost.
After registration, you will get the Application (client) ID (client_id).
Then select "Certificates & secrets", click "New client secret", you can leave the description blank, choose the longest duration, and click "Add".
After that, you will get the client secret client_secret.
(Note: client_secret is the "value" not the "secret ID", it will disappear after adding the password, please record it in time.)
Set API Permissions
Click API permissions, then click "Microsoft Graph", and add the permissions Files.Read, Files.ReadWrite, Files.Read.All, Files.ReadWrite.All, offline_access, and User.Read (you can search to add them).
After clicking Update permissions, confirm the permissions again.
Using Rclone to Mount OneDrive with a Self-Built API#
Obtain Token
Download rclone on your local computer. For example, on Windows, unzip it, enter the folder containing rclone.exe, type cmd in the Explorer address bar, and press Enter to open a command prompt in the current path.
Use the Client ID: client_id and Client Secret: client_secret obtained in the previous step, replace Client_ID and Client_secret in the following command and execute.
rclone authorize "onedrive" "Client_ID" "Client_secret"
After logging in and authorizing in the Microsoft login window that pops up in the browser (be careful to distinguish the logged-in account), the command prompt window will display the token, copy and save it.
Rclone Connection Configuration
On Linux, enter the command rclone config and follow the prompts to set it up.
(TIPS: Since RCLONE is updated from time to time, the menu options may have changed slightly by the time you read this tutorial, but the general idea will not change, do not blindly copy operations.)
[root@xxxxxxx ~]# rclone config
2023/11/10 11:51:37 NOTICE: Config file "/root/.config/rclone/rclone.conf" not found - using defaults
No remotes found, make a new one?
n) New remote
s) Set configuration password
q) Quit config
n/s/q> n # Enter n to create a connection
Enter name for new remote.
name> onedrive # Enter the name, here I use onedrive (you can fill it in freely)
Option Storage.
Type of storage to configure.
Choose a number from below, or type in your own value.
1 / 1Fichier
\ (fichier)
2 / Akamai NetStorage
\ (netstorage)
3 / Alias for an existing remote
\ (alias)
4 / Amazon Drive
\ (amazon cloud drive)
....
31 / Microsoft OneDrive
\ (onedrive)
....
Storage> 31 # Choose 31, Microsoft OneDrive, be careful that this serial number may change at any time, fill it in carefully.
Option client_id.
OAuth Client Id.
Leave blank normally.
Enter a value. Press Enter to leave empty.
client_id> xxxx-xxx #!!! Fill in the client ID of the self-built API: client_id
Option client_secret.
OAuth Client Secret.
Leave blank normally.
Enter a value. Press Enter to leave empty.
client_secret> xxxx-xxx #!!! Fill in the client secret of the self-built API: client_secret
Option region.
Choose national cloud region for OneDrive.
Choose a number from below, or type in your own string value.
Press Enter for the default (global).
1 / Microsoft Cloud Global
\ (global)
2 / Microsoft Cloud for US Government
\ (us)
3 / Microsoft Cloud Germany
\ (de)
4 / Azure and Office 365 operated by Vnet Group in China
\ (cn)
region> 1 # Just press Enter or fill in 1.
Edit advanced config?
y) Yes
n) No (default)
y/n> n # Just press Enter or fill in n.
Use web browser to automatically authenticate rclone with remote?
* Say Y if the machine running rclone has a web browser you can use
* Say N if running rclone on a (remote) machine without web browser access
If not sure try Y. If Y failed, try N.
y) Yes (default)
n) No
y/n> n # Fill in n.
Option config_token.
For this to work, you will need rclone available on a machine that has
a web browser available.
For more help and alternate methods see: https://rclone.org/remote_setup/
Execute the following on the machine with the web browser (same rclone version recommended):
rclone authorize "onedrive"
Then paste the result.
Enter a value.
config_token> # Fill in the token obtained earlier.
Option config_type.
Type of connection
Choose a number from below, or type in an existing string value.
Press Enter for the default (onedrive).
1 / OneDrive Personal or Business
\ (onedrive)
2 / Root Sharepoint site
\ (sharepoint)
/ Sharepoint site name or URL
3 | E.g. mysite or https://contoso.sharepoint.com/sites/mysite
\ (url)
4 / Search for a Sharepoint site
\ (search)
5 / Type in driveID (advanced)
\ (driveid)
6 / Type in SiteID (advanced)
\ (siteid)
/ Sharepoint server-relative path (advanced)
7 | E.g. /teams/hr
\ (path)
config_type> 1 # Fill in 1 to select OneDrive.
Option config_driveid.
Select drive you want to use
Choose a number from below, or type in your own string value.
Press Enter for the default (xxxxxx).
1 / OneDrive (business)
\ (xxxxxx)
config_driveid> 1 # Choose 1 as prompted.
Drive OK?
Found drive "root" of type "business"
URL: https://xxxx-my.sharepoint.com/personal/xxx_xxxx_onmicrosoft_com/Documents
y) Yes (default)
n) No
y/n> y # Fill in y.
Configuration complete.
Options:
- type: onedrive
- token: {"access_token":xxxxxxxxx}
- drive_id: xxxxxx
- drive_type: business
Keep this "onedrive" remote?
y) Yes this is OK (default)
e) Edit this remote
d) Delete this remote
y/e/d> y # Fill in y.
Current remotes:
Name Type
==== ====
onedrive onedrive
e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q> q # Choose q to exit.
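The wizard above writes everything to /root/.config/rclone/rclone.conf. With the secrets redacted, the result should look roughly like this (the section name matches the remote name you chose; all values here are placeholders):

```ini
[onedrive]
type = onedrive
client_id = xxxx-xxx
client_secret = xxxx-xxx
token = {"access_token":"xxxx","token_type":"Bearer","refresh_token":"xxxx","expiry":"2023-11-10T12:00:00Z"}
drive_id = xxxxxx
drive_type = business
```

If you ever need to move the configuration to another machine, copying this one file over is enough.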
Mount OneDrive
Create a mount directory, choose a location according to personal preference, I chose /home/onedrive.
# Create a local folder, the path is up to you, i.e., the following LocalFolder
mkdir /home/onedrive
# Mount as a disk, replace the parameters DriveName, Folder, LocalFolder according to the instructions.
rclone mount DriveName:Folder LocalFolder --copy-links --no-gzip-encoding --no-check-certificate --allow-other --allow-non-empty --umask 000
# My command is
rclone mount onedrive:/ /home/onedrive --copy-links --no-gzip-encoding --no-check-certificate --allow-other --allow-non-empty --umask 000
DriveName is the name filled in during initialization, Folder is the folder in OneDrive, and LocalFolder is the local folder on the VPS.
If you encounter the message NOTICE: One drive root 'test': poll-interval is not supported by this remote during mounting, you can ignore it.
If there is an error containing "fusermount3", you need to install fuse3: yum install fuse3.
After executing the mount command, enter df -h in a new terminal window to check.
Once the mount is successful, press Ctrl+C to stop (or close the window running the mount command), and start setting it to run on boot.
Run on Boot#
Applicable to Linux distributions that use the systemctl command.
# Paste the mount command you ran manually above, minus the leading "rclone"; everything else stays the same.
command="mount onedrive:/ /home/onedrive --copy-links --no-gzip-encoding --no-check-certificate --allow-other --allow-non-empty --umask 000"
# The following is a complete command, copy it all to the SSH client to run.
cat > /etc/systemd/system/rclone.service <<EOF
[Unit]
Description=Rclone
After=network-online.target
[Service]
Type=simple
ExecStart=$(command -v rclone) ${command}
Restart=on-abort
User=root
[Install]
WantedBy=default.target
EOF
Start: systemctl start rclone
Set to run on boot: systemctl enable rclone
Restart: systemctl restart rclone
Stop: systemctl stop rclone
Status: systemctl status rclone
Restart, and if everything is fine, the rclone mount of OneDrive is complete.
Install and Configure qBittorrent-nox#
Reference
qBittorrent + Rclone to Automatically Upload to OneDrive, Google Drive, etc., and Automatically Delete Local Files – Geek Xuan PDF
The version of qb installed via yum is too old (AutoBangumi cannot connect to it, which troubled me for a long time), and compiling from source seems too troublesome and memory-hungry, so I used the standalone executable build of qb mentioned in the article above:
Releases · userdocs/qbittorrent-nox-static.
Install qBittorrent-nox#
# Download
wget "https://github.com/userdocs/qbittorrent-nox-static/releases/download/release-4.6.0_v2.0.9/x86_64-qbittorrent-nox"
# Move to the program directory and rename it to qbittorrent-nox
mv ./x86_64-qbittorrent-nox /usr/bin/qbittorrent-nox
# Grant execution permissions
chmod a+x /usr/bin/qbittorrent-nox
Initialize qBittorrent
After installation, run the command qbittorrent-nox, enter y, and press Enter to accept the legal notice. Then access http://server public IP address:8080 to control qb, and make sure port 8080 is open.
Log in with the initial username admin and initial password adminadmin, then immediately set a new username and password and save.
If everything is normal, stop qb in the terminal with Ctrl + C, and start setting it to run on boot.
Set to Run on Boot
cat > /etc/systemd/system/qbittorrent.service << EOF
[Unit]
Description=qBittorrent Daemon Service
After=network.target
[Service]
LimitNOFILE=512000
User=root
ExecStart=/usr/bin/qbittorrent-nox
ExecStop=/usr/bin/killall -w qbittorrent-nox
[Install]
WantedBy=multi-user.target
EOF
Start: systemctl start qbittorrent
Set to run on boot: systemctl enable qbittorrent
Now the installation of qBittorrent-nox is basically complete, and here are some settings based on personal circumstances.
Configure qBittorrent-nox
Open the listening port of qb.
Downloads - Preallocate disk space for all files to prevent insufficient download space.
Speed - Global speed limit, limit the download speed to prevent the automatic upload to OneDrive set later from lagging behind the download speed.
BitTorrent - Torrent queue, set the maximum active download number to 1.
BTTracker: https://github.com/XIU2/TrackersListCollection/blob/master/README-ZH.md
Automatically Upload After Download Completion#
Prepare Automatic Upload Script
Automatic upload script: qBittorrent + Rclone to Automatically Upload to OneDrive, Google Drive, etc., and Automatically Delete Local Files – Geek Xuan
To save the anime in the appropriate directory, I made a few modifications: qb_auto.sh.
Modify the configuration in the script according to your own information, then upload the script to the server; I saved it as /root/qbauto/qb_auto.sh.
Also give the script execute permission: chmod a+x /root/qbauto/qb_auto.sh
#!/bin/bash
torrent_name=$1
content_dir=$2
root_dir=$3
save_dir=$4
files_num=$5
torrent_size=$6
file_hash=$7
qb_version="4.6.0" # Change to your qbit version
qb_username="xxxxxx" # qbit username
qb_password="xxxxxxxxx" # qbit password
qb_web_url="http://localhost:8080" # qbit webui address
leeching_mode="true" # Leeching mode, true to automatically delete local seeds and files after download completion
log_dir="/root/qbauto" # Log output directory
rclone_dest="onedrive" # rclone configured storage name
rclone_parallel="32" # Number of parallel rclone transfers (--transfers); rclone's default is 4
auto_del_flag="rclone" # Add tags or categories to identify uploaded seeds, v4.0.4+ version adds tag "rclone", older versions identify by adding category "rclone".
# Just modify the above parameters.
# 2023-11-15, added by Sakiko
# The save path on the cloud disk, extracts the save path on the cloud disk from the download path on the host.
# For example, if the download path is /root/Downloads/Sakiko/Bangumi/xxx, it will extract /Sakiko/Bangumi/xxx and save it to the cloud disk.
# You can modify "/root/Downloads" according to your download path.
rclone_dest_save_dir=${save_dir#*/root/Downloads}
if [ ! -d ${log_dir} ]
then
mkdir -p ${log_dir}
fi
version=$(echo $qb_version | grep -P -o "([0-9]\.){2}[0-9]" | sed s/\\.//g)
function qb_login(){
if [ ${version} -gt 404 ]
then
qb_v="1"
cookie=$(curl -i --header "Referer: ${qb_web_url}" --data "username=${qb_username}&password=${qb_password}" "${qb_web_url}/api/v2/auth/login" | grep -P -o 'SID=\S{32}')
if [ -n "${cookie}" ]  # quote the variable, otherwise an empty cookie still tests true
then
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Login successful! cookie:${cookie}" >> ${log_dir}/autodel.log
else
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Login failed!" >> ${log_dir}/autodel.log
fi
elif [[ ${version} -le 404 && ${version} -ge 320 ]]
then
qb_v="2"
cookie=$(curl -i --header "Referer: ${qb_web_url}" --data "username=${qb_username}&password=${qb_password}" "${qb_web_url}/login" | grep -P -o 'SID=\S{32}')
if [ -n "${cookie}" ]  # quote the variable, otherwise an empty cookie still tests true
then
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Login successful! cookie:${cookie}" >> ${log_dir}/autodel.log
else
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Login failed" >> ${log_dir}/autodel.log
fi
elif [[ ${version} -ge 310 && ${version} -lt 320 ]]
then
qb_v="3"
echo "Old version, please upgrade in time."
exit
else
qb_v="0"
exit
fi
}
function qb_del(){
if [ ${leeching_mode} == "true" ]
then
if [ ${qb_v} == "1" ]
then
curl -X POST -d "hashes=${file_hash}&deleteFiles=true" "${qb_web_url}/api/v2/torrents/delete" --cookie ${cookie}
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Deletion successful! Torrent name:${torrent_name}" >> ${log_dir}/qb.log
elif [ ${qb_v} == "2" ]
then
curl -X POST -d "hashes=${file_hash}&deleteFiles=true" "${qb_web_url}/api/v2/torrents/delete" --cookie ${cookie}
else
curl -X POST -d "hashes=${file_hash}&deleteFiles=true" "${qb_web_url}/api/v2/torrents/delete" --cookie ${cookie}
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Deletion successful! Torrent file:${torrent_name}" >> ${log_dir}/qb.log
echo "qb_v=${qb_v}" >> ${log_dir}/qb.log
fi
else
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Do not automatically delete uploaded seeds" >> ${log_dir}/qb.log
fi
}
# 2023-11-15, added by Sakiko
# function rclone_copy(){
# if [ ${type} == "file" ]
# then
# rclone_copy_cmd=$(rclone -v copy --transfers ${rclone_parallel} --log-file ${log_dir}/qbauto_copy.log "${content_dir}" ${rclone_dest}:/qbit/)
# elif [ ${type} == "dir" ]
# then
# rclone_copy_cmd=$(rclone -v copy --transfers ${rclone_parallel} --log-file ${log_dir}/qbauto_copy.log "${content_dir}"/ ${rclone_dest}:/qbit/"${torrent_name}")
# fi
# }
# 2023-11-15, added by Sakiko
# Modify the rclone_copy function so that when uploading to the cloud disk, it saves to the custom automatically extracted download path rclone_dest_save_dir.
function rclone_copy(){
if [ ${type} == "file" ]
then
rclone_copy_cmd=$(rclone -v copy --transfers ${rclone_parallel} --log-file ${log_dir}/qbauto_copy.log "${content_dir}" ${rclone_dest}:"${rclone_dest_save_dir}"/)
elif [ ${type} == "dir" ]
then
rclone_copy_cmd=$(rclone -v copy --transfers ${rclone_parallel} --log-file ${log_dir}/qbauto_copy.log "${content_dir}"/ ${rclone_dest}:"${rclone_dest_save_dir}"/"${torrent_name}")
fi
}
function qb_add_auto_del_tags(){
if [ ${qb_v} == "1" ]
then
curl -X POST -d "hashes=${file_hash}&tags=${auto_del_flag}" "${qb_web_url}/api/v2/torrents/addTags" --cookie "${cookie}"
elif [ ${qb_v} == "2" ]
then
curl -X POST -d "hashes=${file_hash}&category=${auto_del_flag}" "${qb_web_url}/command/setCategory" --cookie ${cookie}
else
echo "qb_v=${qb_v}" >> ${log_dir}/qb.log
fi
}
if [ -f "${content_dir}" ]
then
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Type: File" >> ${log_dir}/qb.log
type="file"
rclone_copy
qb_login
qb_add_auto_del_tags
qb_del
# rm -rf ${content_dir}
elif [ -d "${content_dir}" ]
then
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Type: Directory" >> ${log_dir}/qb.log
type="dir"
rclone_copy
qb_login
qb_add_auto_del_tags
qb_del
# rm -rf ${content_dir}
else
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Unknown type, cancel upload" >> ${log_dir}/qb.log
fi
echo "Torrent name: ${torrent_name}" >> ${log_dir}/qb.log
echo "Content path: ${content_dir}" >> ${log_dir}/qb.log
echo "Root directory: ${root_dir}" >> ${log_dir}/qb.log
echo "Save path: ${save_dir}" >> ${log_dir}/qb.log
echo "Number of files: ${files_num}" >> ${log_dir}/qb.log
echo "File size: ${torrent_size}Bytes" >> ${log_dir}/qb.log
echo "HASH:${file_hash}" >> ${log_dir}/qb.log
echo "Cookie:${cookie}" >> ${log_dir}/qb.log
echo -e "-------------------------------------------------------------\n" >> ${log_dir}/qb.log
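The path-extraction trick at the top of the script is just bash prefix stripping; you can sanity-check it in isolation before wiring the script into qb (the path below is a made-up example, assuming /root/Downloads as the download root):

```shell
#!/bin/bash
# ${var#pattern} removes the shortest prefix matching the pattern,
# so everything up to and including "/root/Downloads" is stripped.
save_dir="/root/Downloads/Sakiko/Bangumi/SomeShow/Season 1"
rclone_dest_save_dir=${save_dir#*/root/Downloads}
echo "${rclone_dest_save_dir}"
```

This prints /Sakiko/Bangumi/SomeShow/Season 1, which is exactly the path the script then hands to rclone as the destination on OneDrive.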
Configure qBittorrent for Automatic Upload
Go to the qBittorrent settings, under Downloads, find "Run external program on torrent completion", check it, and fill in the command below. /root/qbauto/qb_auto.sh is the path of my script; change it to yours, then save.
bash /root/qbauto/qb_auto.sh "%N" "%F" "%R" "%D" "%C" "%Z" "%I"
At this point, the automatic upload function has been realized, and you can try downloading a magnet link.
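If the upload does not trigger, it helps to first confirm what qBittorrent actually passes. A throwaway debug script (my own addition, any path works) that just prints its positional parameters in the same order qb_auto.sh expects can be pointed at temporarily instead:

```shell
#!/bin/bash
# Hypothetical debug helper: print the seven values qBittorrent expands
# from "%N" "%F" "%R" "%D" "%C" "%Z" "%I" before handing them to a script.
printf 'torrent_name=%s\n' "$1"
printf 'content_dir=%s\n'  "$2"
printf 'root_dir=%s\n'     "$3"
printf 'save_dir=%s\n'     "$4"
printf 'files_num=%s\n'    "$5"
printf 'torrent_size=%s\n' "$6"
printf 'file_hash=%s\n'    "$7"
```

Run it once by hand (e.g. bash debug.sh A B C D 5 123 deadbeef) to see the mapping, then swap qb_auto.sh back in.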
I need to apologize for leeching all the time; I'm really sorry. The hard drive is just too small 🥺, adding a hard drive costs an extra dollar a month 😢.
Download Anime via RSS#
Actually, I am also using AutoBangumi for downloads now. However, the author tweeted that they are worried about legal issues, so I feel a bit guilty; it's better to use it more discreetly.
AB is not necessary; qb also has RSS subscription functionality, but it will be a bit more troublesome.
Mainly using the following anime download sites:
1. Mikan Project
Website: https://mikanani.me/ https://mikanime.tv/
2. Anime Garden
Website: https://share.dmhy.org/
【Website Favorites】Anime Download Site Recommendations
Download Anime via Mikan Project
Enter the anime page through the homepage link or search, and choose your favorite subtitle group (I personally like ANI).
Then click the RSS icon and copy the address on the redirected page.
Open your qb, click on the RSS in the upper right corner, then click New RSS Subscription, enter the link you just copied, and confirm.
After adding the RSS subscription, click RSS Downloader in the upper right corner, then click the plus sign to add a download rule and name it.
Next, click the rule to configure it: set the save path, and tick the correct feed to apply the rule to.
Also, set "Must Not Contain" to \d+-\d to prevent it from downloading batch collections (Mikan's [ANi] releases generally don't include collections, but it's safer to add it anyway).
After clicking save, the matching episodes will appear on the right and start downloading.
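The \d+-\d filter is aimed at episode-range titles like "01-12". A quick check with grep -P (close enough to qBittorrent's Perl-style regex for a sanity test) against some made-up titles shows what the rule would exclude:

```shell
#!/bin/bash
# Made-up titles: single episodes should pass, batch collections should be excluded.
titles=(
  "[ANi] Some Show - 05 [1080P][WEB-DL]"
  "[Sub] Some Show 01-12 Fin [1080P]"
)
for t in "${titles[@]}"; do
  if printf '%s\n' "$t" | grep -qP '\d+-\d'; then
    echo "EXCLUDED: $t"
  else
    echo "KEPT: $t"
  fi
done
```

The single episode is kept (the hyphen in "- 05" is not between two digits) while "01-12" is excluded, which is the behavior we want from the rule.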
After the download completes, the automatic upload script uploads it to the OneDrive directory /Sakiko/Bangumi/Dragon Hunter Ragnar/Season 1, based on the save path I set.
The RSS subscription source update interval is set to 30 minutes by default, which can be set in Options - RSS. It will check for anime updates and download every 30 minutes.
Simplified and Traditional Filtering
Some subtitle groups on Mikan release both Simplified and Traditional versions, so set a "Must Contain" filter for Simplified. More complex filtering can be done with regular expressions.
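Likewise, you can preview a "must contain" keyword against sample titles before trusting it. Here 简 (Simplified) is the keyword, and the titles are invented examples:

```shell
#!/bin/bash
# Keep only releases whose title advertises Simplified (简) subtitles.
titles=(
  "[Sub] Show - 05 [简体内嵌][1080P]"
  "[Sub] Show - 05 [繁體內嵌][1080P]"
)
for t in "${titles[@]}"; do
  case "$t" in
    *简*) echo "MATCH: $t" ;;
    *)    echo "SKIP:  $t" ;;
  esac
done
```

Only the first title matches; the Traditional (繁體) release is skipped.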
Anime Garden Anime Download
Anime Garden requires flexible use of search to filter anime and obtain RSS.
Install and Configure AList to Mount OneDrive#
Installation#
Follow the official tutorial for installation; I used Docker for installation.
Official tutorial: https://alist.nn.ci/zh/guide/install/docker.html
# docker-cli
docker run -d --restart=always -v /etc/alist:/opt/alist/data -p 5244:5244 -e PUID=0 -e PGID=0 -e UMASK=022 --name="alist" xhofe/alist:latest
# After installation, you need to manually set a password.
# Generate a random password.
docker exec -it alist ./alist admin random
# Manually set a password, `NEW_PASSWORD` is the password you need to set.
docker exec -it alist ./alist admin set NEW_PASSWORD
After installation, access http://server public IP address:5244 to log in to AList, with the username admin and the password set earlier.
After logging in, click Manage at the bottom of the page to change the username and password again; you can also set the icon and title.
Mount OneDrive#
Official tutorial: https://alist.nn.ci/zh/guide/drivers/onedrive.html
In the AList management page, select Storage, click Add, and choose OneDrive to see the following page.
Among so many options, the main things you need to obtain are Client ID, Client Secret, Refresh Token.
These are the same kind of credentials as in the Rclone section; you need to register an application to obtain them.
There is one small difference: following the official tutorial https://alist.nn.ci/zh/guide/drivers/onedrive.html, set the redirect URI of the app registration to https://alist.nn.ci/tool/onedrive/callback.
After registering the application, you will get the Client ID and Client Secret, then fill in the form at https://alist.nn.ci/tool/onedrive/request to obtain the Refresh Token.
Now you can start filling in the OneDrive content.
Driver: onedrive
Mount Path: /onedrive (the directory it appears under in the AList interface)
Serial Number: default 0
Remarks: empty
Cache Expiration Time: default 30
Web Proxy: default off
WebDAV Policy: default 302 redirect
Download Proxy URL: default empty
Sorting: default empty
Sorting Method: default empty
Extraction Folder: default empty
Enable Signature: default off
Root Folder Path: / (mounts the root of OneDrive to the /onedrive mount path set above)
Region: Global
Is SharePoint: No
Client ID: fill in the one obtained earlier
Client Secret: fill in the one obtained earlier
Redirect URI: https://alist.nn.ci/tool/onedrive/callback
Refresh Token: fill in the one obtained earlier
Site ID: default empty
Chunk Size: default 5
Click Add to complete.
Now you can see the OneDrive folder on the homepage.
Guest Configuration#
Now your AList can only be viewed after logging in. By configuring a guest account, you can view it without logging in.
In the AList management page, select Users, and then enable the guest account.
By setting the basic path, you can make a specified directory the homepage. Now the directory that guests see is Sakiko.
As for the introduction below, just upload a readme.md in the /Sakiko directory in OneDrive.
So far I have still been accessing via IP + port number; configuring a reverse proxy will allow access via the domain name.
Configure Reverse Proxy to Map Domain Name#
Refer to the official AList tutorial: https://alist.nn.ci/zh/guide/install/reverse-proxy.html.
If you are using the Baota panel, it is very simple; you can imitate the setup for qbittorrent.
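If you are not on Baota, a minimal hand-written nginx site works too. This is my own sketch, assuming AList listens on port 5244 and alist.example.com stands in for your domain; for HTTPS and other web servers, follow the official reverse-proxy page linked above:

```nginx
server {
    listen 80;
    server_name alist.example.com;   # replace with your own domain

    location / {
        proxy_pass http://127.0.0.1:5244;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        client_max_body_size 20000m;  # let large uploads through the proxy
    }
}
```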
The final effect looks like this: you can watch online or download, and the speed is quite fast, because the data does not consume the server's traffic but is downloaded directly from OneDrive (presumably the WebDAV policy's 302 redirect at work, which is great).
Others#
Restart When qb is Not Downloading#
I have always set the server to restart at midnight on Monday via the Baota panel, but if qb has download tasks, it may cause problems.
So, I added a check in the restart script to see if there are any files in the download directory. If there are no files, it means there are no download tasks, and then restart.
#!/bin/bash
folder="/root/Downloads"
files=$(find "$folder" -type f)
echo "$(date +%Y-%m-%d)"
if [ -n "$files" ]; then
echo "Suspected downloading:"
echo "$files"
echo "Cancel restart"
else
echo "Status normal, proceeding to restart"
reboot
fi
Solve Slow Opening of Alist Page#
In the Alist admin backend, under Settings - Global, replace the polyfill script in the custom head content, following the Bilibili video "Alibaba Cloud CDN Solves Slow Opening and Loading of the Alist Page":
Original header: <script src="https://polyfill.io/v3/polyfill.min.js?features=String.prototype.replaceAll"></script>
Alibaba Cloud: <script src="https://polyfill.alicdn.com/v3/polyfill.min.js?features=String.prototype.replaceAll"></script>
Postscript#
Ah, I finally finished writing it. There are bound to be many baka mistakes in it, and I hope everyone can point them out for me.
The inspiration for this little nest comes from the Dimension Library curator's My Anime Tracking Solution. I was a bit shocked when I saw it at that time. In fact, before this period, I didn't even know what Docker was. This little nest and this little article really taught me a lot, allowing me to configure a domain name for xlog and also get a MetaMask wallet, finally bringing me a little closer to my aspirations.