Mirror of https://github.com/yt-dlp/yt-dlp.git (synced 2025-07-01 14:00:59 +02:00)

Compare commits: master ... 2024.10.07
No commits in common: "master" and "2024.10.07" have entirely different histories.

.github/ISSUE_TEMPLATE/1_broken_site.yml (39 changes)
@@ -2,11 +2,13 @@ name: Broken site support
 description: Report issue with yt-dlp on a supported site
 labels: [triage, site-bug]
 body:
-  - type: markdown
+  - type: checkboxes
     attributes:
-      value: |
-        > [!IMPORTANT]
-        > Not providing the required (*) information or removing the template will result in your issue being closed and ignored.
+      label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
+      description: Fill all fields even if you think it is irrelevant for the issue
+      options:
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
+          required: true
   - type: checkboxes
     id: checklist
     attributes:
@@ -22,7 +24,9 @@ body:
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766), [the FAQ](https://github.com/yt-dlp/yt-dlp/wiki/FAQ), and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%3Aissue%20-label%3Aspam%20%20) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
         - label: I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and I'm willing to share it if required
   - type: input
@@ -43,8 +47,6 @@ body:
     id: verbose
     attributes:
       label: Provide verbose output that clearly demonstrates the problem
-      description: |
-        This is mandatory unless absolutely impossible to provide. If you are unable to provide the output, please explain why.
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
@@ -61,18 +63,25 @@ body:
       placeholder: |
         [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
-        [debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
-        [debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
-        [debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
+        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
+        [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
+        [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Request Handlers: urllib, requests, websockets, curl_cffi
-        [debug] Loaded 1838 extractors
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
         yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
         [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
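For readers unfamiliar with GitHub issue forms, a minimal sketch (not the full template) of how the two preamble variants in the first hunk fit together: the `checkboxes` element used on the 2024.10.07 side blocks submission until its required option is ticked, while the `markdown` element that replaces it on master only renders a notice.

```yaml
# Sketch only: both preamble variants, condensed from the diff above
body:
  # 2024.10.07 side: a required checkbox the reporter must tick before submitting
  - type: checkboxes
    attributes:
      label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
      description: Fill all fields even if you think it is irrelevant for the issue
      options:
        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
          required: true
  # master side: a rendered notice instead; nothing to tick
  - type: markdown
    attributes:
      value: |
        > [!IMPORTANT]
        > Not providing the required (*) information or removing the template will result in your issue being closed and ignored.
```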
.github/ISSUE_TEMPLATE/2_site_support_request.yml

@@ -2,11 +2,13 @@ name: Site support request
 description: Request support for a new site
 labels: [triage, site-request]
 body:
-  - type: markdown
+  - type: checkboxes
     attributes:
-      value: |
-        > [!IMPORTANT]
-        > Not providing the required (*) information or removing the template will result in your issue being closed and ignored.
+      label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
+      description: Fill all fields even if you think it is irrelevant for the issue
+      options:
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
+          required: true
   - type: checkboxes
     id: checklist
     attributes:
@@ -22,7 +24,9 @@ body:
           required: true
         - label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%3Aissue%20-label%3Aspam%20%20) for similar requests **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
         - label: I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and am willing to share it if required
   - type: input
@@ -55,8 +59,6 @@ body:
     id: verbose
     attributes:
       label: Provide verbose output that clearly demonstrates the problem
-      description: |
-        This is mandatory unless absolutely impossible to provide. If you are unable to provide the output, please explain why.
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
@@ -73,18 +75,25 @@ body:
       placeholder: |
         [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
-        [debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
-        [debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
-        [debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
+        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
+        [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
+        [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Request Handlers: urllib, requests, websockets, curl_cffi
-        [debug] Loaded 1838 extractors
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
         yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
         [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/3_site_feature_request.yml

@@ -1,12 +1,14 @@
 name: Site feature request
-description: Request new functionality for a site supported by yt-dlp
+description: Request a new functionality for a supported site
 labels: [triage, site-enhancement]
 body:
-  - type: markdown
+  - type: checkboxes
     attributes:
-      value: |
-        > [!IMPORTANT]
-        > Not providing the required (*) information or removing the template will result in your issue being closed and ignored.
+      label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
+      description: Fill all fields even if you think it is irrelevant for the issue
+      options:
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
+          required: true
   - type: checkboxes
     id: checklist
     attributes:
@@ -20,7 +22,9 @@ body:
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%3Aissue%20-label%3Aspam%20%20) for similar requests **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
         - label: I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and I'm willing to share it if required
   - type: input
@@ -51,8 +55,6 @@ body:
     id: verbose
     attributes:
       label: Provide verbose output that clearly demonstrates the problem
-      description: |
-        This is mandatory unless absolutely impossible to provide. If you are unable to provide the output, please explain why.
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
@@ -69,18 +71,25 @@ body:
       placeholder: |
         [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
-        [debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
-        [debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
-        [debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
+        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
+        [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
+        [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Request Handlers: urllib, requests, websockets, curl_cffi
-        [debug] Loaded 1838 extractors
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
         yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
         [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/4_bug_report.yml (43 changes)

@@ -2,11 +2,13 @@ name: Core bug report
 description: Report a bug unrelated to any particular site or extractor
 labels: [triage, bug]
 body:
-  - type: markdown
+  - type: checkboxes
     attributes:
-      value: |
-        > [!IMPORTANT]
-        > Not providing the required (*) information or removing the template will result in your issue being closed and ignored.
+      label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
+      description: Fill all fields even if you think it is irrelevant for the issue
+      options:
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
+          required: true
   - type: checkboxes
     id: checklist
     attributes:
@@ -18,7 +20,13 @@ body:
           required: true
         - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766), [the FAQ](https://github.com/yt-dlp/yt-dlp/wiki/FAQ), and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%3Aissue%20-label%3Aspam%20%20) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
+          required: true
+        - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
+          required: true
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
   - type: textarea
     id: description
@@ -32,8 +40,6 @@ body:
     id: verbose
     attributes:
       label: Provide verbose output that clearly demonstrates the problem
-      description: |
-        This is mandatory unless absolutely impossible to provide. If you are unable to provide the output, please explain why.
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
@@ -50,18 +56,25 @@ body:
       placeholder: |
         [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
-        [debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
-        [debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
-        [debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
+        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
+        [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
+        [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Request Handlers: urllib, requests, websockets, curl_cffi
-        [debug] Loaded 1838 extractors
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
         yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
         [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/5_feature_request.yml (41 changes)

@@ -1,12 +1,14 @@
 name: Feature request
-description: Request a new feature unrelated to any particular site or extractor
+description: Request a new functionality unrelated to any particular site or extractor
 labels: [triage, enhancement]
 body:
-  - type: markdown
+  - type: checkboxes
     attributes:
-      value: |
-        > [!IMPORTANT]
-        > Not providing the required (*) information or removing the template will result in your issue being closed and ignored.
+      label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
+      description: Fill all fields even if you think it is irrelevant for the issue
+      options:
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
+          required: true
   - type: checkboxes
     id: checklist
     attributes:
@@ -20,7 +22,9 @@ body:
           required: true
         - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%3Aissue%20-label%3Aspam%20%20) for similar requests **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
   - type: textarea
     id: description
@@ -34,8 +38,6 @@ body:
     id: verbose
     attributes:
       label: Provide verbose output that clearly demonstrates the problem
-      description: |
-        This is mandatory unless absolutely impossible to provide. If you are unable to provide the output, please explain why.
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
         - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
@@ -50,16 +52,23 @@ body:
       placeholder: |
         [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
-        [debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
-        [debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
-        [debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
+        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
+        [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
+        [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Request Handlers: urllib, requests, websockets, curl_cffi
-        [debug] Loaded 1838 extractors
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
         yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
         [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/6_question.yml (41 changes)

@@ -1,12 +1,14 @@
 name: Ask question
-description: Ask a question about using yt-dlp
+description: Ask yt-dlp related question
 labels: [question]
 body:
-  - type: markdown
+  - type: checkboxes
     attributes:
-      value: |
-        > [!IMPORTANT]
-        > Not providing the required (*) information or removing the template will result in your issue being closed and ignored.
+      label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
+      description: Fill all fields even if you think it is irrelevant for the issue
+      options:
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
+          required: true
   - type: markdown
     attributes:
       value: |
@@ -26,7 +28,9 @@ body:
           required: true
         - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766), [the FAQ](https://github.com/yt-dlp/yt-dlp/wiki/FAQ), and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%3Aissue%20-label%3Aspam%20%20) for similar questions **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
   - type: textarea
     id: question
@@ -40,8 +44,6 @@ body:
     id: verbose
     attributes:
      label: Provide verbose output that clearly demonstrates the problem
-      description: |
-        This is mandatory unless absolutely impossible to provide. If you are unable to provide the output, please explain why.
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
         - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
@@ -56,16 +58,23 @@ body:
       placeholder: |
         [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
-        [debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
-        [debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
-        [debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
+        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
+        [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
+        [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Request Handlers: urllib, requests, websockets, curl_cffi
-        [debug] Loaded 1838 extractors
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
         yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
         [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/config.yml (7 changes)

@@ -1,5 +1,8 @@
 blank_issues_enabled: false
 contact_links:
-  - name: Get help on Discord
+  - name: Get help from the community on Discord
     url: https://discord.gg/H5MNcFW63r
-    about: Join the yt-dlp Discord server for support and discussion
+    about: Join the yt-dlp Discord for community-powered support!
+  - name: Matrix Bridge to the Discord server
+    url: https://matrix.to/#/#yt-dlp:matrix.org
+    about: For those who do not want to use Discord
.github/ISSUE_TEMPLATE_tmpl/1_broken_site.yml

@@ -18,7 +18,9 @@ body:
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766), [the FAQ](https://github.com/yt-dlp/yt-dlp/wiki/FAQ), and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%%3Aissue%%20-label%%3Aspam%%20%%20) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
         - label: I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and I'm willing to share it if required
   - type: input
.github/ISSUE_TEMPLATE_tmpl/2_site_support_request.yml

@@ -18,7 +18,9 @@ body:
           required: true
         - label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%%3Aissue%%20-label%%3Aspam%%20%%20) for similar requests **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
         - label: I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and am willing to share it if required
   - type: input
.github/ISSUE_TEMPLATE_tmpl/3_site_feature_request.yml

@@ -1,5 +1,5 @@
 name: Site feature request
-description: Request new functionality for a site supported by yt-dlp
+description: Request a new functionality for a supported site
 labels: [triage, site-enhancement]
 body:
   %(no_skip)s
@@ -16,7 +16,9 @@ body:
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%%3Aissue%%20-label%%3Aspam%%20%%20) for similar requests **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
         - label: I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and I'm willing to share it if required
   - type: input
.github/ISSUE_TEMPLATE_tmpl/4_bug_report.yml (8 changes)

@@ -14,7 +14,13 @@ body:
           required: true
         - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766), [the FAQ](https://github.com/yt-dlp/yt-dlp/wiki/FAQ), and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%%3Aissue%%20-label%%3Aspam%%20%%20) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
+          required: true
+        - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
+          required: true
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
   - type: textarea
     id: description
.github/ISSUE_TEMPLATE_tmpl/5_feature_request.yml

@@ -1,5 +1,5 @@
 name: Feature request
-description: Request a new feature unrelated to any particular site or extractor
+description: Request a new functionality unrelated to any particular site or extractor
 labels: [triage, enhancement]
 body:
   %(no_skip)s
@@ -16,7 +16,9 @@ body:
           required: true
         - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%%3Aissue%%20-label%%3Aspam%%20%%20) for similar requests **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
   - type: textarea
     id: description
.github/ISSUE_TEMPLATE_tmpl/6_question.yml (6 changes)

@@ -1,5 +1,5 @@
 name: Ask question
-description: Ask a question about using yt-dlp
+description: Ask yt-dlp related question
 labels: [question]
 body:
   %(no_skip)s
@@ -22,7 +22,9 @@ body:
           required: true
         - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766), [the FAQ](https://github.com/yt-dlp/yt-dlp/wiki/FAQ), and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=is%%3Aissue%%20-label%%3Aspam%%20%%20) for similar questions **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
+          required: true
+        - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
   - type: textarea
     id: question
.github/PULL_REQUEST_TEMPLATE.md (37 changes)

@@ -1,17 +1,14 @@
-<!--
 **IMPORTANT**: PRs without the template will be CLOSED

-Due to the high volume of pull requests, it may be a while before your PR is reviewed.
-Please try to keep your pull request focused on a single bugfix or new feature.
-Pull requests with a vast scope and/or very large diff will take much longer to review.
-It is recommended for new contributors to stick to smaller pull requests, so you can receive much more immediate feedback as you familiarize yourself with the codebase.
-
-PLEASE AVOID FORCE-PUSHING after opening a PR, as it makes reviewing more difficult.
--->
-
 ### Description of your *pull request* and other information

-ADD DETAILED DESCRIPTION HERE
+<!--
+
+Explanation of your *pull request* in arbitrary form goes here. Please **make sure the description explains the purpose and effect** of your *pull request* and is worded well enough to be understood. Provide as much **context and examples** as possible
+
+-->
+
+ADD DESCRIPTION HERE

 Fixes #

@@ -19,22 +16,24 @@ Fixes #
 <details open><summary>Template</summary> <!-- OPEN is intentional -->

 <!--
+
 # PLEASE FOLLOW THE GUIDE BELOW

 - You will be asked some questions, please read them **carefully** and answer honestly
 - Put an `x` into all the boxes `[ ]` relevant to your *pull request* (like [x])
-- Use *Preview* tab to see what your *pull request* will actually look like
+- Use *Preview* tab to see how your *pull request* will actually look like

 -->

 ### Before submitting a *pull request* make sure you have:
 - [ ] At least skimmed through [contributing guidelines](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#developer-instructions) including [yt-dlp coding conventions](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#yt-dlp-coding-conventions)
 - [ ] [Searched](https://github.com/yt-dlp/yt-dlp/search?q=is%3Apr&type=Issues) the bugtracker for similar pull requests

-### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check those that apply and remove the others:
-- [ ] I am the original author of the code in this PR, and I am willing to release it under [Unlicense](http://unlicense.org/)
-- [ ] I am not the original author of the code in this PR, but it is in the public domain or released under [Unlicense](http://unlicense.org/) (provide reliable evidence)
+### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check all of the following options that apply:
+- [ ] I am the original author of this code and I am willing to release it under [Unlicense](http://unlicense.org/)
+- [ ] I am not the original author of this code but it is in public domain or released under [Unlicense](http://unlicense.org/) (provide reliable evidence)

-### What is the purpose of your *pull request*? Check those that apply and remove the others:
+### What is the purpose of your *pull request*?
 - [ ] Fix or improvement to an extractor (Make sure to add/update tests)
 - [ ] New extractor ([Piracy websites will not be accepted](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy))
 - [ ] Core bug fix/improvement
.github/workflows/build.yml (83 changes)

@@ -72,7 +72,7 @@ on:
         default: true
         type: boolean
       windows:
-        description: yt-dlp.exe, yt-dlp_win.zip
+        description: yt-dlp.exe, yt-dlp_min.exe, yt-dlp_win.zip
         default: true
         type: boolean
       windows32:
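For context, the hunk above sits inside the workflow's manually triggerable inputs, where each build target is an opt-in boolean. A minimal sketch of that surrounding structure is below; the enclosing `workflow_dispatch` block and the `windows32` description are assumptions, since only the lines shown in the hunk are confirmed by the diff.

```yaml
# Sketch only: assumed surroundings of the hunk above
on:
  workflow_dispatch:
    inputs:
      windows:
        description: yt-dlp.exe, yt-dlp_win.zip   # master; 2024.10.07 also lists yt-dlp_min.exe
        default: true
        type: boolean
      windows32:
        description: yt-dlp_x86.exe               # assumed description, not shown in the hunk
        default: true
        type: boolean
```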
@@ -192,31 +192,29 @@ jobs:
         with:
           path: ./repo
       - name: Virtualized Install, Prepare & Build
-        uses: yt-dlp/run-on-arch-action@v3
+        uses: yt-dlp/run-on-arch-action@v2
         with:
           # Ref: https://github.com/uraimo/run-on-arch-action/issues/55
           env: |
             GITHUB_WORKFLOW: build
           githubToken: ${{ github.token }} # To cache image
           arch: ${{ matrix.architecture }}
-          distro: ubuntu20.04 # Standalone executable should be built on minimum supported OS
+          distro: ubuntu18.04 # Standalone executable should be built on minimum supported OS
           dockerRunArgs: --volume "${PWD}/repo:/repo"
           install: | # Installing Python 3.10 from the Deadsnakes repo raises errors
             apt update
-            apt -y install zlib1g-dev libffi-dev python3.9 python3.9-dev python3.9-distutils python3-pip \
-                python3-secretstorage # Cannot build cryptography wheel in virtual armv7 environment
-            python3.9 -m pip install -U pip wheel 'setuptools>=71.0.2'
-            # XXX: Keep this in sync with pyproject.toml (it can't be accessed at this stage) and exclude secretstorage
-            python3.9 -m pip install -U Pyinstaller mutagen pycryptodomex brotli certifi cffi \
-                'requests>=2.32.2,<3' 'urllib3>=1.26.17,<3' 'websockets>=13.0'
+            apt -y install zlib1g-dev libffi-dev python3.8 python3.8-dev python3.8-distutils python3-pip
+            python3.8 -m pip install -U pip setuptools wheel
+            # Cannot access any files from the repo directory at this stage
+            python3.8 -m pip install -U Pyinstaller mutagen pycryptodomex websockets brotli certifi secretstorage cffi

           run: |
             cd repo
-            python3.9 devscripts/install_deps.py -o --include build
-            python3.9 devscripts/install_deps.py --include pyinstaller # Cached versions may be out of date
-            python3.9 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
-            python3.9 devscripts/make_lazy_extractors.py
-            python3.9 -m bundle.pyinstaller
+            python3.8 devscripts/install_deps.py -o --include build
+            python3.8 devscripts/install_deps.py --include pyinstaller --include secretstorage # Cached version may be out of date
+            python3.8 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
+            python3.8 devscripts/make_lazy_extractors.py
+            python3.8 -m bundle.pyinstaller

             if ${{ vars.UPDATE_TO_VERIFICATION && 'true' || 'false' }}; then
               arch="${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}"
@@ -242,7 +240,7 @@ jobs:
     permissions:
       contents: read
       actions: write # For cleaning up cache
-    runs-on: macos-13
+    runs-on: macos-12

     steps:
       - uses: actions/checkout@v4
@@ -256,7 +254,7 @@ jobs:
         with:
           path: |
             ~/yt-dlp-build-venv
-          key: cache-reqs-${{ github.job }}-${{ github.ref }}
+          key: cache-reqs-${{ github.job }}

       - name: Install Requirements
         run: |
@@ -331,21 +329,24 @@ jobs:
         if: steps.restore-cache.outputs.cache-hit == 'true'
         env:
           GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-          cache_key: cache-reqs-${{ github.job }}-${{ github.ref }}
+          cache_key: cache-reqs-${{ github.job }}
+          repository: ${{ github.repository }}
+          branch: ${{ github.ref }}
         run: |
-          gh cache delete "${cache_key}"
+          gh extension install actions/gh-actions-cache
+          gh actions-cache delete "${cache_key}" -R "${repository}" -B "${branch}" --confirm

       - name: Cache requirements
         uses: actions/cache/save@v4
         with:
           path: |
             ~/yt-dlp-build-venv
-          key: cache-reqs-${{ github.job }}-${{ github.ref }}
+          key: cache-reqs-${{ github.job }}

   macos_legacy:
     needs: process
     if: inputs.macos_legacy
-    runs-on: macos-13
+    runs-on: macos-12

     steps:
       - uses: actions/checkout@v4
@ -402,13 +403,13 @@ jobs:
|
|||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
- uses: actions/setup-python@v5
|
- uses: actions/setup-python@v5
|
||||||
with:
|
with: # 3.8 is used for Win7 support
|
||||||
python-version: "3.10"
|
python-version: "3.8"
|
||||||
- name: Install Requirements
|
- name: Install Requirements
|
||||||
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
|
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
|
||||||
python devscripts/install_deps.py -o --include build
|
python devscripts/install_deps.py -o --include build
|
||||||
python devscripts/install_deps.py --include curl-cffi
|
python devscripts/install_deps.py --include curl-cffi
|
||||||
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-6.13.0-py3-none-any.whl"
|
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-6.10.0-py3-none-any.whl"
|
||||||
|
|
||||||
- name: Prepare
|
- name: Prepare
|
||||||
run: |
|
run: |
|
||||||
@ -418,12 +419,22 @@ jobs:
|
|||||||
run: |
|
run: |
|
||||||
python -m bundle.pyinstaller
|
python -m bundle.pyinstaller
|
||||||
python -m bundle.pyinstaller --onedir
|
python -m bundle.pyinstaller --onedir
|
||||||
|
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_real.exe
|
||||||
Compress-Archive -Path ./dist/yt-dlp/* -DestinationPath ./dist/yt-dlp_win.zip
|
Compress-Archive -Path ./dist/yt-dlp/* -DestinationPath ./dist/yt-dlp_win.zip
|
||||||
|
|
||||||
|
- name: Install Requirements (py2exe)
|
||||||
|
run: |
|
||||||
|
python devscripts/install_deps.py --include py2exe
|
||||||
|
- name: Build (py2exe)
|
||||||
|
run: |
|
||||||
|
python -m bundle.py2exe
|
||||||
|
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_min.exe
|
||||||
|
Move-Item ./dist/yt-dlp_real.exe ./dist/yt-dlp.exe
|
||||||
|
|
||||||
- name: Verify --update-to
|
- name: Verify --update-to
|
||||||
if: vars.UPDATE_TO_VERIFICATION
|
if: vars.UPDATE_TO_VERIFICATION
|
||||||
run: |
|
run: |
|
||||||
foreach ($name in @("yt-dlp")) {
|
foreach ($name in @("yt-dlp","yt-dlp_min")) {
|
||||||
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
|
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
|
||||||
$version = & "./dist/${name}.exe" --version
|
$version = & "./dist/${name}.exe" --version
|
||||||
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
|
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||||
@ -439,6 +450,7 @@ jobs:
|
|||||||
name: build-bin-${{ github.job }}
|
name: build-bin-${{ github.job }}
|
||||||
path: |
|
path: |
|
||||||
dist/yt-dlp.exe
|
dist/yt-dlp.exe
|
||||||
|
dist/yt-dlp_min.exe
|
||||||
dist/yt-dlp_win.zip
|
dist/yt-dlp_win.zip
|
||||||
compression-level: 0
|
compression-level: 0
|
||||||
|
|
||||||
@ -451,13 +463,13 @@ jobs:
|
|||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
- uses: actions/setup-python@v5
|
- uses: actions/setup-python@v5
|
||||||
with:
|
with:
|
||||||
python-version: "3.10"
|
python-version: "3.8"
|
||||||
architecture: "x86"
|
architecture: "x86"
|
||||||
- name: Install Requirements
|
- name: Install Requirements
|
||||||
run: |
|
run: |
|
||||||
python devscripts/install_deps.py -o --include build
|
python devscripts/install_deps.py -o --include build
|
||||||
python devscripts/install_deps.py
|
python devscripts/install_deps.py
|
||||||
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-6.13.0-py3-none-any.whl"
|
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-6.10.0-py3-none-any.whl"
|
||||||
|
|
||||||
- name: Prepare
|
- name: Prepare
|
||||||
run: |
|
run: |
|
||||||
@ -501,8 +513,7 @@ jobs:
|
|||||||
- windows32
|
- windows32
|
||||||
runs-on: ubuntu-latest
|
runs-on: ubuntu-latest
|
||||||
steps:
|
steps:
|
||||||
- name: Download artifacts
|
- uses: actions/download-artifact@v4
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
with:
|
||||||
path: artifact
|
path: artifact
|
||||||
pattern: build-bin-*
|
pattern: build-bin-*
|
||||||
@ -526,29 +537,13 @@ jobs:
|
|||||||
lock 2022.08.18.36 .+ Python 3\.6
|
lock 2022.08.18.36 .+ Python 3\.6
|
||||||
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
|
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
|
||||||
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
||||||
lock 2024.10.22 py2exe .+
|
|
||||||
lock 2024.10.22 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
|
|
||||||
lock 2024.10.22 (?!\w+_exe).+ Python 3\.8
|
|
||||||
lock 2024.10.22 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
|
|
||||||
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
|
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
|
||||||
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
|
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
|
||||||
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
||||||
lockV2 yt-dlp/yt-dlp 2024.10.22 py2exe .+
|
|
||||||
lockV2 yt-dlp/yt-dlp 2024.10.22 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
|
|
||||||
lockV2 yt-dlp/yt-dlp 2024.10.22 (?!\w+_exe).+ Python 3\.8
|
|
||||||
lockV2 yt-dlp/yt-dlp 2024.10.22 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
|
|
||||||
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
|
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
|
||||||
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
||||||
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 py2exe .+
|
|
||||||
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
|
|
||||||
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 (?!\w+_exe).+ Python 3\.8
|
|
||||||
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
|
|
||||||
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
|
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
|
||||||
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
||||||
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.045052 py2exe .+
|
|
||||||
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
|
|
||||||
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 (?!\w+_exe).+ Python 3\.8
|
|
||||||
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
|
|
||||||
EOF
|
EOF
|
||||||
|
|
||||||
- name: Sign checksum files
|
- name: Sign checksum files
|
||||||
|
6
.github/workflows/codeql.yml
vendored
6
.github/workflows/codeql.yml
vendored
@ -33,7 +33,7 @@ jobs:
|
|||||||
|
|
||||||
# Initializes the CodeQL tools for scanning.
|
# Initializes the CodeQL tools for scanning.
|
||||||
- name: Initialize CodeQL
|
- name: Initialize CodeQL
|
||||||
uses: github/codeql-action/init@v3
|
uses: github/codeql-action/init@v2
|
||||||
with:
|
with:
|
||||||
languages: ${{ matrix.language }}
|
languages: ${{ matrix.language }}
|
||||||
# If you wish to specify custom queries, you can do so here or in a config file.
|
# If you wish to specify custom queries, you can do so here or in a config file.
|
||||||
@ -47,7 +47,7 @@ jobs:
|
|||||||
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
|
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
|
||||||
# If this step fails, then you should remove it and run the build manually (see below)
|
# If this step fails, then you should remove it and run the build manually (see below)
|
||||||
- name: Autobuild
|
- name: Autobuild
|
||||||
uses: github/codeql-action/autobuild@v3
|
uses: github/codeql-action/autobuild@v2
|
||||||
|
|
||||||
# ℹ️ Command-line programs to run using the OS shell.
|
# ℹ️ Command-line programs to run using the OS shell.
|
||||||
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
|
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
|
||||||
@ -60,6 +60,6 @@ jobs:
|
|||||||
# ./location_of_script_within_repo/buildscript.sh
|
# ./location_of_script_within_repo/buildscript.sh
|
||||||
|
|
||||||
- name: Perform CodeQL Analysis
|
- name: Perform CodeQL Analysis
|
||||||
uses: github/codeql-action/analyze@v3
|
uses: github/codeql-action/analyze@v2
|
||||||
with:
|
with:
|
||||||
category: "/language:${{matrix.language}}"
|
category: "/language:${{matrix.language}}"
|
||||||
|
16
.github/workflows/core.yml
vendored
16
.github/workflows/core.yml
vendored
@ -6,7 +6,7 @@ on:
|
|||||||
- devscripts/**
|
- devscripts/**
|
||||||
- test/**
|
- test/**
|
||||||
- yt_dlp/**.py
|
- yt_dlp/**.py
|
||||||
- '!yt_dlp/extractor/**.py'
|
- '!yt_dlp/extractor/*.py'
|
||||||
- yt_dlp/extractor/__init__.py
|
- yt_dlp/extractor/__init__.py
|
||||||
- yt_dlp/extractor/common.py
|
- yt_dlp/extractor/common.py
|
||||||
- yt_dlp/extractor/extractors.py
|
- yt_dlp/extractor/extractors.py
|
||||||
@ -16,7 +16,7 @@ on:
|
|||||||
- devscripts/**
|
- devscripts/**
|
||||||
- test/**
|
- test/**
|
||||||
- yt_dlp/**.py
|
- yt_dlp/**.py
|
||||||
- '!yt_dlp/extractor/**.py'
|
- '!yt_dlp/extractor/*.py'
|
||||||
- yt_dlp/extractor/__init__.py
|
- yt_dlp/extractor/__init__.py
|
||||||
- yt_dlp/extractor/common.py
|
- yt_dlp/extractor/common.py
|
||||||
- yt_dlp/extractor/extractors.py
|
- yt_dlp/extractor/extractors.py
|
||||||
@ -36,20 +36,16 @@ jobs:
|
|||||||
fail-fast: false
|
fail-fast: false
|
||||||
matrix:
|
matrix:
|
||||||
os: [ubuntu-latest]
|
os: [ubuntu-latest]
|
||||||
# CPython 3.9 is in quick-test
|
# CPython 3.8 is in quick-test
|
||||||
python-version: ['3.10', '3.11', '3.12', '3.13', pypy-3.10]
|
python-version: ['3.9', '3.10', '3.11', '3.12', pypy-3.8, pypy-3.10]
|
||||||
include:
|
include:
|
||||||
# atleast one of each CPython/PyPy tests must be in windows
|
# atleast one of each CPython/PyPy tests must be in windows
|
||||||
- os: windows-latest
|
- os: windows-latest
|
||||||
python-version: '3.9'
|
python-version: '3.8'
|
||||||
- os: windows-latest
|
|
||||||
python-version: '3.10'
|
|
||||||
- os: windows-latest
|
- os: windows-latest
|
||||||
python-version: '3.12'
|
python-version: '3.12'
|
||||||
- os: windows-latest
|
- os: windows-latest
|
||||||
python-version: '3.13'
|
python-version: pypy-3.9
|
||||||
- os: windows-latest
|
|
||||||
python-version: pypy-3.10
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
- name: Set up Python ${{ matrix.python-version }}
|
- name: Set up Python ${{ matrix.python-version }}
|
||||||
|
6
.github/workflows/download.yml
vendored
6
.github/workflows/download.yml
vendored
@ -28,13 +28,13 @@ jobs:
|
|||||||
fail-fast: true
|
fail-fast: true
|
||||||
matrix:
|
matrix:
|
||||||
os: [ubuntu-latest]
|
os: [ubuntu-latest]
|
||||||
python-version: ['3.10', '3.11', '3.12', '3.13', pypy-3.10]
|
python-version: ['3.10', '3.11', '3.12', pypy-3.8, pypy-3.10]
|
||||||
include:
|
include:
|
||||||
# atleast one of each CPython/PyPy tests must be in windows
|
# atleast one of each CPython/PyPy tests must be in windows
|
||||||
- os: windows-latest
|
- os: windows-latest
|
||||||
python-version: '3.9'
|
python-version: '3.8'
|
||||||
- os: windows-latest
|
- os: windows-latest
|
||||||
python-version: pypy-3.10
|
python-version: pypy-3.9
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
- name: Set up Python ${{ matrix.python-version }}
|
- name: Set up Python ${{ matrix.python-version }}
|
||||||
|
8
.github/workflows/quick-test.yml
vendored
8
.github/workflows/quick-test.yml
vendored
@ -10,10 +10,10 @@ jobs:
|
|||||||
runs-on: ubuntu-latest
|
runs-on: ubuntu-latest
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
- name: Set up Python 3.9
|
- name: Set up Python 3.8
|
||||||
uses: actions/setup-python@v5
|
uses: actions/setup-python@v5
|
||||||
with:
|
with:
|
||||||
python-version: '3.9'
|
python-version: '3.8'
|
||||||
- name: Install test requirements
|
- name: Install test requirements
|
||||||
run: python3 ./devscripts/install_deps.py -o --include test
|
run: python3 ./devscripts/install_deps.py -o --include test
|
||||||
- name: Run tests
|
- name: Run tests
|
||||||
@ -29,7 +29,7 @@ jobs:
|
|||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
- uses: actions/setup-python@v5
|
- uses: actions/setup-python@v5
|
||||||
with:
|
with:
|
||||||
python-version: '3.9'
|
python-version: '3.8'
|
||||||
- name: Install dev dependencies
|
- name: Install dev dependencies
|
||||||
run: python3 ./devscripts/install_deps.py -o --include static-analysis
|
run: python3 ./devscripts/install_deps.py -o --include static-analysis
|
||||||
- name: Make lazy extractors
|
- name: Make lazy extractors
|
||||||
@ -38,5 +38,3 @@ jobs:
|
|||||||
run: ruff check --output-format github .
|
run: ruff check --output-format github .
|
||||||
- name: Run autopep8
|
- name: Run autopep8
|
||||||
run: autopep8 --diff .
|
run: autopep8 --diff .
|
||||||
- name: Check file mode
|
|
||||||
run: git ls-files --format="%(objectmode) %(path)" yt_dlp/ | ( ! grep -v "^100644" )
|
|
||||||
|
17
.github/workflows/release-master.yml
vendored
17
.github/workflows/release-master.yml
vendored
@ -28,20 +28,3 @@ jobs:
|
|||||||
actions: write # For cleaning up cache
|
actions: write # For cleaning up cache
|
||||||
id-token: write # mandatory for trusted publishing
|
id-token: write # mandatory for trusted publishing
|
||||||
secrets: inherit
|
secrets: inherit
|
||||||
|
|
||||||
publish_pypi:
|
|
||||||
needs: [release]
|
|
||||||
if: vars.MASTER_PYPI_PROJECT != ''
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
permissions:
|
|
||||||
id-token: write # mandatory for trusted publishing
|
|
||||||
steps:
|
|
||||||
- name: Download artifacts
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
path: dist
|
|
||||||
name: build-pypi
|
|
||||||
- name: Publish to PyPI
|
|
||||||
uses: pypa/gh-action-pypi-publish@release/v1
|
|
||||||
with:
|
|
||||||
verbose: true
|
|
||||||
|
17
.github/workflows/release-nightly.yml
vendored
17
.github/workflows/release-nightly.yml
vendored
@ -41,20 +41,3 @@ jobs:
|
|||||||
actions: write # For cleaning up cache
|
actions: write # For cleaning up cache
|
||||||
id-token: write # mandatory for trusted publishing
|
id-token: write # mandatory for trusted publishing
|
||||||
secrets: inherit
|
secrets: inherit
|
||||||
|
|
||||||
publish_pypi:
|
|
||||||
needs: [release]
|
|
||||||
if: vars.NIGHTLY_PYPI_PROJECT != ''
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
permissions:
|
|
||||||
id-token: write # mandatory for trusted publishing
|
|
||||||
steps:
|
|
||||||
- name: Download artifacts
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
path: dist
|
|
||||||
name: build-pypi
|
|
||||||
- name: Publish to PyPI
|
|
||||||
uses: pypa/gh-action-pypi-publish@release/v1
|
|
||||||
with:
|
|
||||||
verbose: true
|
|
||||||
|
18
.github/workflows/release.yml
vendored
18
.github/workflows/release.yml
vendored
@ -2,6 +2,10 @@ name: Release
|
|||||||
on:
|
on:
|
||||||
workflow_call:
|
workflow_call:
|
||||||
inputs:
|
inputs:
|
||||||
|
prerelease:
|
||||||
|
required: false
|
||||||
|
default: true
|
||||||
|
type: boolean
|
||||||
source:
|
source:
|
||||||
required: false
|
required: false
|
||||||
default: ''
|
default: ''
|
||||||
@ -14,10 +18,6 @@ on:
|
|||||||
required: false
|
required: false
|
||||||
default: ''
|
default: ''
|
||||||
type: string
|
type: string
|
||||||
prerelease:
|
|
||||||
required: false
|
|
||||||
default: true
|
|
||||||
type: boolean
|
|
||||||
workflow_dispatch:
|
workflow_dispatch:
|
||||||
inputs:
|
inputs:
|
||||||
source:
|
source:
|
||||||
@ -278,17 +278,7 @@ jobs:
|
|||||||
make clean-cache
|
make clean-cache
|
||||||
python -m build --no-isolation .
|
python -m build --no-isolation .
|
||||||
|
|
||||||
- name: Upload artifacts
|
|
||||||
if: github.event_name != 'workflow_dispatch'
|
|
||||||
uses: actions/upload-artifact@v4
|
|
||||||
with:
|
|
||||||
name: build-pypi
|
|
||||||
path: |
|
|
||||||
dist/*
|
|
||||||
compression-level: 0
|
|
||||||
|
|
||||||
- name: Publish to PyPI
|
- name: Publish to PyPI
|
||||||
if: github.event_name == 'workflow_dispatch'
|
|
||||||
uses: pypa/gh-action-pypi-publish@release/v1
|
uses: pypa/gh-action-pypi-publish@release/v1
|
||||||
with:
|
with:
|
||||||
verbose: true
|
verbose: true
|
||||||
|
41
.github/workflows/signature-tests.yml
vendored
41
.github/workflows/signature-tests.yml
vendored
@ -1,41 +0,0 @@
|
|||||||
name: Signature Tests
|
|
||||||
on:
|
|
||||||
push:
|
|
||||||
paths:
|
|
||||||
- .github/workflows/signature-tests.yml
|
|
||||||
- test/test_youtube_signature.py
|
|
||||||
- yt_dlp/jsinterp.py
|
|
||||||
pull_request:
|
|
||||||
paths:
|
|
||||||
- .github/workflows/signature-tests.yml
|
|
||||||
- test/test_youtube_signature.py
|
|
||||||
- yt_dlp/jsinterp.py
|
|
||||||
permissions:
|
|
||||||
contents: read
|
|
||||||
|
|
||||||
concurrency:
|
|
||||||
group: signature-tests-${{ github.event.pull_request.number || github.ref }}
|
|
||||||
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
tests:
|
|
||||||
name: Signature Tests
|
|
||||||
runs-on: ${{ matrix.os }}
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
os: [ubuntu-latest, windows-latest]
|
|
||||||
python-version: ['3.9', '3.10', '3.11', '3.12', '3.13', pypy-3.10, pypy-3.11]
|
|
||||||
steps:
|
|
||||||
- uses: actions/checkout@v4
|
|
||||||
- name: Set up Python ${{ matrix.python-version }}
|
|
||||||
uses: actions/setup-python@v5
|
|
||||||
with:
|
|
||||||
python-version: ${{ matrix.python-version }}
|
|
||||||
- name: Install test requirements
|
|
||||||
run: python3 ./devscripts/install_deps.py --only-optional --include test
|
|
||||||
- name: Run tests
|
|
||||||
timeout-minutes: 15
|
|
||||||
run: |
|
|
||||||
python3 -m yt_dlp -v || true # Print debug head
|
|
||||||
python3 ./devscripts/run_tests.py test/test_youtube_signature.py
|
|
3
.gitignore
vendored
3
.gitignore
vendored
@ -92,7 +92,6 @@ updates_key.pem
|
|||||||
*.class
|
*.class
|
||||||
*.isorted
|
*.isorted
|
||||||
*.stackdump
|
*.stackdump
|
||||||
uv.lock
|
|
||||||
|
|
||||||
# Generated
|
# Generated
|
||||||
AUTHORS
|
AUTHORS
|
||||||
@ -105,8 +104,6 @@ README.txt
|
|||||||
*.zsh
|
*.zsh
|
||||||
*.spec
|
*.spec
|
||||||
test/testdata/sigs/player-*.js
|
test/testdata/sigs/player-*.js
|
||||||
test/testdata/thumbnails/empty.webp
|
|
||||||
test/testdata/thumbnails/foo\ %d\ bar/foo_%d.*
|
|
||||||
|
|
||||||
# Binary
|
# Binary
|
||||||
/youtube-dl
|
/youtube-dl
|
||||||
|
@ -37,18 +37,14 @@ Bugs and suggestions should be reported at: [yt-dlp/yt-dlp/issues](https://githu
|
|||||||
**Please include the full output of yt-dlp when run with `-vU`**, i.e. **add** `-vU` flag to **your command line**, copy the **whole** output and post it in the issue body wrapped in \`\`\` for better formatting. It should look similar to this:
|
**Please include the full output of yt-dlp when run with `-vU`**, i.e. **add** `-vU` flag to **your command line**, copy the **whole** output and post it in the issue body wrapped in \`\`\` for better formatting. It should look similar to this:
|
||||||
```
|
```
|
||||||
$ yt-dlp -vU <your command line>
|
$ yt-dlp -vU <your command line>
|
||||||
[debug] Command-line config: ['-vU', 'https://www.example.com/']
|
[debug] Command-line config: ['-v', 'demo.com']
|
||||||
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
|
[debug] Encodings: locale UTF-8, fs utf-8, out utf-8, pref UTF-8
|
||||||
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
|
[debug] yt-dlp version 2021.09.25 (zip)
|
||||||
[debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
|
[debug] Python version 3.8.10 (CPython 64bit) - Linux-5.4.0-74-generic-x86_64-with-glibc2.29
|
||||||
[debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
|
[debug] exe versions: ffmpeg 4.2.4, ffprobe 4.2.4
|
||||||
[debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
|
|
||||||
[debug] Proxy map: {}
|
[debug] Proxy map: {}
|
||||||
[debug] Request Handlers: urllib, requests, websockets, curl_cffi
|
Current Build Hash 25cc412d1d3c0725a1f2f5b7e4682f6fb40e6d15f7024e96f7afd572e9919535
|
||||||
[debug] Loaded 1838 extractors
|
yt-dlp is up to date (2021.09.25)
|
||||||
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
|
|
||||||
Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
|
|
||||||
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
|
|
||||||
...
|
...
|
||||||
```
|
```
|
||||||
**Do not post screenshots of verbose logs; only plain text is acceptable.**
|
**Do not post screenshots of verbose logs; only plain text is acceptable.**
|
||||||
@ -237,7 +233,7 @@ After you have ensured this site is distributing its content legally, you can fo
|
|||||||
# * MD5 checksum; start the string with 'md5:', e.g.
|
# * MD5 checksum; start the string with 'md5:', e.g.
|
||||||
# 'description': 'md5:098f6bcd4621d373cade4e832627b4f6',
|
# 'description': 'md5:098f6bcd4621d373cade4e832627b4f6',
|
||||||
# * A regular expression; start the string with 're:', e.g.
|
# * A regular expression; start the string with 're:', e.g.
|
||||||
# 'thumbnail': r're:https?://.*\.jpg$',
|
# 'thumbnail': r're:^https?://.*\.jpg$',
|
||||||
# * A count of elements in a list; start the string with 'count:', e.g.
|
# * A count of elements in a list; start the string with 'count:', e.g.
|
||||||
# 'tags': 'count:10',
|
# 'tags': 'count:10',
|
||||||
# * Any Python type, e.g.
|
# * Any Python type, e.g.
|
||||||
@ -272,7 +268,7 @@ After you have ensured this site is distributing its content legally, you can fo
|
|||||||
|
|
||||||
You can use `hatch fmt` to automatically fix problems. Rules that the linter/formatter enforces should not be disabled with `# noqa` unless a maintainer requests it. The only exception allowed is for old/printf-style string formatting in GraphQL query templates (use `# noqa: UP031`).
|
You can use `hatch fmt` to automatically fix problems. Rules that the linter/formatter enforces should not be disabled with `# noqa` unless a maintainer requests it. The only exception allowed is for old/printf-style string formatting in GraphQL query templates (use `# noqa: UP031`).
|
||||||
|
|
||||||
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython >=3.9 and PyPy >=3.10. Backward compatibility is not required for even older versions of Python.
|
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.8 and above. Backward compatibility is not required for even older versions of Python.
|
||||||
1. When the tests pass, [add](https://git-scm.com/docs/git-add) the new files, [commit](https://git-scm.com/docs/git-commit) them and [push](https://git-scm.com/docs/git-push) the result, like this:
|
1. When the tests pass, [add](https://git-scm.com/docs/git-add) the new files, [commit](https://git-scm.com/docs/git-commit) them and [push](https://git-scm.com/docs/git-push) the result, like this:
|
||||||
|
|
||||||
```shell
|
```shell
|
||||||
@ -306,9 +302,10 @@ Extractors are very fragile by nature since they depend on the layout of the sou
|
|||||||
For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L119-L440) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
|
For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L119-L440) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
|
||||||
|
|
||||||
- `id` (media identifier)
|
- `id` (media identifier)
|
||||||
|
- `title` (media title)
|
||||||
- `url` (media download URL) or `formats`
|
- `url` (media download URL) or `formats`
|
||||||
|
|
||||||
The aforementioned metadata fields are the critical data without which extraction does not make any sense. If any of them fail to be extracted, then the extractor is considered broken. All other metadata extraction should be completely non-fatal.
|
The aforementioned metafields are the critical data that the extraction does not make any sense without and if any of them fail to be extracted then the extractor is considered completely broken. While all extractors must return a `title`, they must also allow it's extraction to be non-fatal.
|
||||||
|
|
||||||
For pornographic sites, appropriate `age_limit` must also be returned.
|
For pornographic sites, appropriate `age_limit` must also be returned.
|
||||||
|
|
||||||
|
106
CONTRIBUTORS
106
CONTRIBUTORS
@ -678,109 +678,3 @@ coreywright
|
|||||||
eric321
|
eric321
|
||||||
poyhen
|
poyhen
|
||||||
tetra-fox
|
tetra-fox
|
||||||
444995
|
|
||||||
63427083
|
|
||||||
allendema
|
|
||||||
DarkZeros
|
|
||||||
DTrombett
|
|
||||||
imranh2
|
|
||||||
KarboniteKream
|
|
||||||
mikkovedru
|
|
||||||
pktiuk
|
|
||||||
rubyevadestaxes
|
|
||||||
avagordon01
|
|
||||||
CounterPillow
|
|
||||||
JoseAngelB
|
|
||||||
KBelmin
|
|
||||||
kesor
|
|
||||||
MellowKyler
|
|
||||||
Wesley107772
|
|
||||||
a13ssandr0
|
|
||||||
ChocoLZS
|
|
||||||
doe1080
|
|
||||||
hugovdev
|
|
||||||
jshumphrey
|
|
||||||
julionc
|
|
||||||
manavchaudhary1
|
|
||||||
powergold1
|
|
||||||
Sakura286
|
|
||||||
SamDecrock
|
|
||||||
stratus-ss
|
|
||||||
subrat-lima
|
|
||||||
gitninja1234
|
|
||||||
jkruse
|
|
||||||
xiaomac
|
|
||||||
wesson09
|
|
||||||
Crypto90
|
|
||||||
MutantPiggieGolem1
|
|
||||||
Sanceilaks
|
|
||||||
Strkmn
|
|
||||||
0x9fff00
|
|
||||||
4ft35t
|
|
||||||
7x11x13
|
|
||||||
b5i
|
|
||||||
cotko
|
|
||||||
d3d9
|
|
||||||
Dioarya
|
|
||||||
finch71
|
|
||||||
hexahigh
|
|
||||||
InvalidUsernameException
|
|
||||||
jixunmoe
|
|
||||||
knackku
|
|
||||||
krandor
|
|
||||||
kvk-2015
|
|
||||||
lonble
|
|
||||||
msm595
|
|
||||||
n10dollar
|
|
||||||
NecroRomnt
|
|
||||||
pjrobertson
|
|
||||||
subsense
|
|
||||||
test20140
|
|
||||||
arantius
|
|
||||||
entourage8
|
|
||||||
lfavole
|
|
||||||
mp3butcher
|
|
||||||
slipinthedove
|
|
||||||
YoshiTabletopGamer
|
|
||||||
Arc8ne
|
|
||||||
benfaerber
|
|
||||||
chrisellsworth
|
|
||||||
fries1234
|
|
||||||
Kenshin9977
|
|
||||||
MichaelDeBoey
|
|
||||||
msikma
|
|
||||||
pedro
|
|
||||||
pferreir
|
|
||||||
red-acid
|
|
||||||
refack
|
|
||||||
rysson
|
|
||||||
somini
|
|
||||||
thedenv
|
|
||||||
vallovic
|
|
||||||
arabcoders
|
|
||||||
mireq
|
|
||||||
mlabeeb03
|
|
||||||
1271
|
|
||||||
CasperMcFadden95
|
|
||||||
Kicer86
|
|
||||||
Kiritomo
|
|
||||||
leeblackc
|
|
||||||
meGAmeS1
|
|
||||||
NeonMan
|
|
||||||
pj47x
|
|
||||||
troex
|
|
||||||
WouterGordts
|
|
||||||
baierjan
|
|
||||||
GeoffreyFrogeye
|
|
||||||
Pawka
|
|
||||||
v3DJG6GL
|
|
||||||
yozel
|
|
||||||
brian6932
|
|
||||||
iednod55
|
|
||||||
maxbin123
|
|
||||||
nullpos
|
|
||||||
anlar
|
|
||||||
eason1478
|
|
||||||
ceandreasen
|
|
||||||
chauhantirth
|
|
||||||
helpimnotdrowning
|
|
||||||
|
722
Changelog.md
722
Changelog.md
@ -4,728 +4,6 @@
|
|||||||
# To create a release, dispatch the https://github.com/yt-dlp/yt-dlp/actions/workflows/release.yml workflow on master
|
# To create a release, dispatch the https://github.com/yt-dlp/yt-dlp/actions/workflows/release.yml workflow on master
|
||||||
-->
|
-->
|
||||||
|
|
||||||
### 2025.06.30
|
|
||||||
|
|
||||||
#### Core changes
|
|
||||||
- **jsinterp**: [Fix `extract_object`](https://github.com/yt-dlp/yt-dlp/commit/958153a226214c86879e36211ac191bf78289578) ([#13580](https://github.com/yt-dlp/yt-dlp/issues/13580)) by [seproDev](https://github.com/seproDev)
|
|
||||||
|
|
||||||
#### Extractor changes
|
|
||||||
- **bilibilispacevideo**: [Extract hidden-mode collections as playlists](https://github.com/yt-dlp/yt-dlp/commit/99b85ac102047446e6adf5b62bfc3c8d80b53778) ([#13533](https://github.com/yt-dlp/yt-dlp/issues/13533)) by [c-basalt](https://github.com/c-basalt)
|
|
||||||
- **hotstar**
|
|
||||||
- [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/b5bd057fe86550f3aa67f2fc8790d1c6a251c57b) ([#13530](https://github.com/yt-dlp/yt-dlp/issues/13530)) by [bashonly](https://github.com/bashonly), [chauhantirth](https://github.com/chauhantirth) (With fixes in [e9f1576](https://github.com/yt-dlp/yt-dlp/commit/e9f157669e24953a88d15ce22053649db7a8e81e) by [bashonly](https://github.com/bashonly))
|
|
||||||
- [Fix metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/0a6b1044899f452cd10b6c7a6b00fa985a9a8b97) ([#13560](https://github.com/yt-dlp/yt-dlp/issues/13560)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Raise for login required](https://github.com/yt-dlp/yt-dlp/commit/5e292baad62c749b6c340621ab2d0f904165ddfb) ([#10405](https://github.com/yt-dlp/yt-dlp/issues/10405)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- series: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/4bd9a7ade7e0508b9795b3e72a69eeb40788b62b) ([#13564](https://github.com/yt-dlp/yt-dlp/issues/13564)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **jiocinema**: [Remove extractors](https://github.com/yt-dlp/yt-dlp/commit/7e2504f941a11ea2b0dba00de3f0295cdc253e79) ([#13565](https://github.com/yt-dlp/yt-dlp/issues/13565)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **kick**: [Support subscriber-only content](https://github.com/yt-dlp/yt-dlp/commit/b16722ede83377f77ea8352dcd0a6ca8e83b8f0f) ([#13550](https://github.com/yt-dlp/yt-dlp/issues/13550)) by [helpimnotdrowning](https://github.com/helpimnotdrowning)
|
|
||||||
- **niconico**: live: [Fix extractor and downloader](https://github.com/yt-dlp/yt-dlp/commit/06c1a8cdffe14050206683253726875144192ef5) ([#13158](https://github.com/yt-dlp/yt-dlp/issues/13158)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **sauceplus**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/35fc33fbc51c7f5392fb2300f65abf6cf107ef90) ([#13567](https://github.com/yt-dlp/yt-dlp/issues/13567)) by [bashonly](https://github.com/bashonly), [ceandreasen](https://github.com/ceandreasen)
|
|
||||||
- **sproutvideo**: [Support browser impersonation](https://github.com/yt-dlp/yt-dlp/commit/11b9416e10cff7513167d76d6c47774fcdd3e26a) ([#13589](https://github.com/yt-dlp/yt-dlp/issues/13589)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **youtube**: [Fix premium formats extraction](https://github.com/yt-dlp/yt-dlp/commit/2ba5391cd68ed4f2415c827d2cecbcbc75ace10b) ([#13586](https://github.com/yt-dlp/yt-dlp/issues/13586)) by [bashonly](https://github.com/bashonly)
|
|
||||||
|
|
||||||
#### Misc. changes
|
|
||||||
- **ci**: [Add signature tests](https://github.com/yt-dlp/yt-dlp/commit/1b883846347addeab12663fd74317fd544341a1c) ([#13582](https://github.com/yt-dlp/yt-dlp/issues/13582)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **cleanup**: Miscellaneous: [b018784](https://github.com/yt-dlp/yt-dlp/commit/b0187844988e557c7e1e6bb1aabd4c1176768d86) by [bashonly](https://github.com/bashonly)
|
|
||||||
|
|
||||||
### 2025.06.25
|
|
||||||
|
|
||||||
#### Extractor changes
|
|
||||||
- [Add `_search_nuxt_json` helper](https://github.com/yt-dlp/yt-dlp/commit/51887484e46ab6015c041cb1ab626a55f25a03bd) ([#13386](https://github.com/yt-dlp/yt-dlp/issues/13386)) by [bashonly](https://github.com/bashonly), [Grub4K](https://github.com/Grub4K)
|
|
||||||
- **brightcove**: new: [Improve metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/e6bd4a3da295b760ab20b39c18ce8934d312c2bf) ([#13461](https://github.com/yt-dlp/yt-dlp/issues/13461)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **huya**: live: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/2600849badb0d08c55b58dcc77a13af6ba423da6) ([#13520](https://github.com/yt-dlp/yt-dlp/issues/13520)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **hypergryph**: [Improve metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/1722c55400ff30bb5aee5dd7a262f0b7e9ce2f0e) ([#13415](https://github.com/yt-dlp/yt-dlp/issues/13415)) by [doe1080](https://github.com/doe1080), [eason1478](https://github.com/eason1478)
|
|
||||||
- **lsm**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/c57412d1f9cf0124adc972a47858ac42b740c61d) ([#13126](https://github.com/yt-dlp/yt-dlp/issues/13126)) by [Caesim404](https://github.com/Caesim404)
|
|
||||||
- **mave**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/1838a1ce5d4ade80770ba9162eaffc9a1607dc70) ([#13380](https://github.com/yt-dlp/yt-dlp/issues/13380)) by [anlar](https://github.com/anlar)
|
|
||||||
- **sportdeutschland**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/a4ce4327c9836691d3b6b00e44a90b6741601ed8) ([#13519](https://github.com/yt-dlp/yt-dlp/issues/13519)) by [DTrombett](https://github.com/DTrombett)
|
|
||||||
- **sproutvideo**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/5b559d0072b7164daf06bacdc41c6f11283452c8) ([#13544](https://github.com/yt-dlp/yt-dlp/issues/13544)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **tv8.it**: [Support slugless URLs](https://github.com/yt-dlp/yt-dlp/commit/3bd30291601c47fa4a257983473884103ecab0c7) ([#13478](https://github.com/yt-dlp/yt-dlp/issues/13478)) by [DTrombett](https://github.com/DTrombett)
|
|
||||||
- **youtube**
|
|
||||||
- [Check any `ios` m3u8 formats prior to download](https://github.com/yt-dlp/yt-dlp/commit/8f94b76cbf7bbd9dfd8762c63cdea04f90f1297f) ([#13524](https://github.com/yt-dlp/yt-dlp/issues/13524)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Improve player context payloads](https://github.com/yt-dlp/yt-dlp/commit/ff6f94041aeee19c5559e1c1cd693960a1c1dd14) ([#13539](https://github.com/yt-dlp/yt-dlp/issues/13539)) by [bashonly](https://github.com/bashonly)
|
|
||||||
|
|
||||||
#### Misc. changes
|
|
||||||
- **test**: `traversal`: [Fix morsel tests for Python 3.14](https://github.com/yt-dlp/yt-dlp/commit/73bf10211668e4a59ccafd790e06ee82d9fea9ea) ([#13471](https://github.com/yt-dlp/yt-dlp/issues/13471)) by [Grub4K](https://github.com/Grub4K)
|
|
||||||
|
|
||||||
### 2025.06.09
|
|
||||||
|
|
||||||
#### Extractor changes
|
|
||||||
- [Improve JSON LD thumbnails extraction](https://github.com/yt-dlp/yt-dlp/commit/85c8a405e3651dc041b758f4744d4fb3c4c55e01) ([#13368](https://github.com/yt-dlp/yt-dlp/issues/13368)) by [bashonly](https://github.com/bashonly), [doe1080](https://github.com/doe1080)
|
|
||||||
- **10play**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/6d265388c6e943419ac99e9151cf75a3265f980f) ([#13349](https://github.com/yt-dlp/yt-dlp/issues/13349)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **adobepass**
|
|
||||||
- [Add Fubo MSO](https://github.com/yt-dlp/yt-dlp/commit/eee90acc47d7f8de24afaa8b0271ccaefdf6e88c) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [maxbin123](https://github.com/maxbin123)
|
|
||||||
- [Always add newer user-agent when required](https://github.com/yt-dlp/yt-dlp/commit/0ee1102268cf31b07f8a8318a47424c66b2f7378) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Fix Philo MSO authentication](https://github.com/yt-dlp/yt-dlp/commit/943083edcd3df45aaa597a6967bc6c95b720f54c) ([#13335](https://github.com/yt-dlp/yt-dlp/issues/13335)) by [Sipherdrakon](https://github.com/Sipherdrakon)
|
|
||||||
- [Rework to require software statement](https://github.com/yt-dlp/yt-dlp/commit/711c5d5d098fee2992a1a624b1c4b30364b91426) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly), [maxbin123](https://github.com/maxbin123)
|
|
||||||
- [Validate login URL before sending credentials](https://github.com/yt-dlp/yt-dlp/commit/89c1b349ad81318d9d3bea76c01c891696e58d38) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **aenetworks**
|
|
||||||
- [Fix playlist extractors](https://github.com/yt-dlp/yt-dlp/commit/f37d599a697e82fe68b423865897d55bae34f373) ([#13408](https://github.com/yt-dlp/yt-dlp/issues/13408)) by [Sipherdrakon](https://github.com/Sipherdrakon)
|
|
||||||
- [Fix provider-locked content extraction](https://github.com/yt-dlp/yt-dlp/commit/6693d6603358ae6beca834dbd822a7917498b813) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [maxbin123](https://github.com/maxbin123)
|
|
||||||
- **bilibilibangumi**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/13e55162719528d42d2133e16b65ff59a667a6e4) ([#13416](https://github.com/yt-dlp/yt-dlp/issues/13416)) by [c-basalt](https://github.com/c-basalt)
|
|
||||||
- **brightcove**: new: [Adapt to new AdobePass requirement](https://github.com/yt-dlp/yt-dlp/commit/98f8eec956e3b16cb66a3d49cc71af3807db795e) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **cu.ntv.co.jp**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/aa863ddab9b1d104678e9cf39bb76f5b14fca660) ([#13302](https://github.com/yt-dlp/yt-dlp/issues/13302)) by [doe1080](https://github.com/doe1080), [nullpos](https://github.com/nullpos)
|
|
||||||
- **go**: [Fix provider-locked content extraction](https://github.com/yt-dlp/yt-dlp/commit/2e5bf002dad16f5ce35aa2023d392c9e518fcd8f) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly), [maxbin123](https://github.com/maxbin123)
|
|
||||||
- **nbc**: [Rework and adapt extractors to new AdobePass flow](https://github.com/yt-dlp/yt-dlp/commit/2d7949d5642bc37d1e71bf00c9a55260e5505d58) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **nobelprize**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/97ddfefeb4faba6e61cd80996c16952b8eab16f3) ([#13205](https://github.com/yt-dlp/yt-dlp/issues/13205)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **odnoklassniki**: [Detect and raise when login is required](https://github.com/yt-dlp/yt-dlp/commit/148a1eb4c59e127965396c7a6e6acf1979de459e) ([#13361](https://github.com/yt-dlp/yt-dlp/issues/13361)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **patreon**: [Fix m3u8 formats extraction](https://github.com/yt-dlp/yt-dlp/commit/e0d6c0822930f6e63f574d46d946a58b73ecd10c) ([#13266](https://github.com/yt-dlp/yt-dlp/issues/13266)) by [bashonly](https://github.com/bashonly) (With fixes in [1a8a03e](https://github.com/yt-dlp/yt-dlp/commit/1a8a03ea8d827107319a18076ee3505090667c5a))
|
|
||||||
- **podchaser**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/538eb305673c26bff6a2b12f1c96375fe02ce41a) ([#13271](https://github.com/yt-dlp/yt-dlp/issues/13271)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **sr**: mediathek: [Improve metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/e3c605a61f4cc2de9059f37434fa108c3c20f58e) ([#13294](https://github.com/yt-dlp/yt-dlp/issues/13294)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **stacommu**: [Avoid partial stream formats](https://github.com/yt-dlp/yt-dlp/commit/5d96527be80dc1ed1702d9cd548ff86de570ad70) ([#13412](https://github.com/yt-dlp/yt-dlp/issues/13412)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **startrek**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/a8bf0011bde92b3f1324a98bfbd38932fd3ebe18) ([#13188](https://github.com/yt-dlp/yt-dlp/issues/13188)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **svt**: play: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/e1b6062f8c4a3fa33c65269d48d09ec78de765a2) ([#13329](https://github.com/yt-dlp/yt-dlp/issues/13329)) by [barsnick](https://github.com/barsnick), [bashonly](https://github.com/bashonly)
|
|
||||||
- **telecinco**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/03dba2012d9bd3f402fa8c2f122afba89bbd22a4) ([#13379](https://github.com/yt-dlp/yt-dlp/issues/13379)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **theplatform**: [Improve metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/ed108b3ea481c6a4b5215a9302ba92d74baa2425) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **toutiao**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/f8051e3a61686c5db1de5f5746366ecfbc3ad20c) ([#13246](https://github.com/yt-dlp/yt-dlp/issues/13246)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **turner**: [Adapt extractors to new AdobePass flow](https://github.com/yt-dlp/yt-dlp/commit/0daddc780d3ac5bebc3a3ec5b884d9243cbc0745) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **twitcasting**: [Fix password-protected livestream support](https://github.com/yt-dlp/yt-dlp/commit/52f9729c9a92ad4656d746ff0b1acecb87b3e96d) ([#13097](https://github.com/yt-dlp/yt-dlp/issues/13097)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **twitter**: broadcast: [Support events URLs](https://github.com/yt-dlp/yt-dlp/commit/7794374de8afb20499b023107e2abfd4e6b93ee4) ([#13248](https://github.com/yt-dlp/yt-dlp/issues/13248)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **umg**: de: [Rework extractor](https://github.com/yt-dlp/yt-dlp/commit/4e7c1ea346b510280218b47e8653dbbca3a69870) ([#13373](https://github.com/yt-dlp/yt-dlp/issues/13373)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **vice**: [Mark extractors as broken](https://github.com/yt-dlp/yt-dlp/commit/6121559e027a04574690799c1776bc42bb51af31) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **vimeo**: [Extract subtitles from player subdomain](https://github.com/yt-dlp/yt-dlp/commit/c723c4e5e78263df178dbe69844a3d05f3ef9e35) ([#13350](https://github.com/yt-dlp/yt-dlp/issues/13350)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **watchespn**: [Fix provider-locked content extraction](https://github.com/yt-dlp/yt-dlp/commit/b094747e93cfb0a2c53007120e37d0d84d41f030) ([#13131](https://github.com/yt-dlp/yt-dlp/issues/13131)) by [maxbin123](https://github.com/maxbin123)
|
|
||||||
- **weverse**: [Support login with oauth refresh tokens](https://github.com/yt-dlp/yt-dlp/commit/3fe72e9eea38d9a58211cde42cfaa577ce020e2c) ([#13284](https://github.com/yt-dlp/yt-dlp/issues/13284)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **youtube**
|
|
||||||
- [Add `tv_simply` player client](https://github.com/yt-dlp/yt-dlp/commit/1fd0e88b67db53ad163393d6965f68e908fa70e3) ([#13389](https://github.com/yt-dlp/yt-dlp/issues/13389)) by [gamer191](https://github.com/gamer191)
|
|
||||||
- [Extract srt subtitles](https://github.com/yt-dlp/yt-dlp/commit/231349786e8c42089c2e079ec94c0ea866c37999) ([#13411](https://github.com/yt-dlp/yt-dlp/issues/13411)) by [gamer191](https://github.com/gamer191)
|
|
||||||
- [Fix `--mark-watched` support](https://github.com/yt-dlp/yt-dlp/commit/b5be29fa58ec98226e11621fd9c58585bcff6879) ([#13222](https://github.com/yt-dlp/yt-dlp/issues/13222)) by [brian6932](https://github.com/brian6932), [iednod55](https://github.com/iednod55)
|
|
||||||
- [Fix automatic captions for some client combinations](https://github.com/yt-dlp/yt-dlp/commit/53ea743a9c158f8ca2d75a09ca44ba68606042d8) ([#13268](https://github.com/yt-dlp/yt-dlp/issues/13268)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Improve signature extraction debug output](https://github.com/yt-dlp/yt-dlp/commit/d30a49742cfa22e61c47df4ac0e7334d648fb85d) ([#13327](https://github.com/yt-dlp/yt-dlp/issues/13327)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Rework nsig function name extraction](https://github.com/yt-dlp/yt-dlp/commit/9e38b273b7ac942e7e9fc05a651ed810ab7d30ba) ([#13403](https://github.com/yt-dlp/yt-dlp/issues/13403)) by [Grub4K](https://github.com/Grub4K)
|
|
||||||
- [nsig code improvements and cleanup](https://github.com/yt-dlp/yt-dlp/commit/f7bbf5a617f9ab54ef51eaef99be36e175b5e9c3) ([#13280](https://github.com/yt-dlp/yt-dlp/issues/13280)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **zdf**: [Fix language extraction and format sorting](https://github.com/yt-dlp/yt-dlp/commit/db162b76f6bdece50babe2e0cacfe56888c2e125) ([#13313](https://github.com/yt-dlp/yt-dlp/issues/13313)) by [InvalidUsernameException](https://github.com/InvalidUsernameException)
|
|
||||||
|
|
||||||
#### Misc. changes
|
|
||||||
- **build**
|
|
||||||
- [Exclude `pkg_resources` from being collected](https://github.com/yt-dlp/yt-dlp/commit/cc749a8a3b8b6e5c05318868c72a403f376a1b38) ([#13320](https://github.com/yt-dlp/yt-dlp/issues/13320)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Fix macOS requirements caching](https://github.com/yt-dlp/yt-dlp/commit/201812100f315c6727a4418698d5b4e8a79863d4) ([#13328](https://github.com/yt-dlp/yt-dlp/issues/13328)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **cleanup**: Miscellaneous: [339614a](https://github.com/yt-dlp/yt-dlp/commit/339614a173c74b42d63e858c446a9cae262a13af) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **test**: postprocessors: [Remove binary thumbnail test data](https://github.com/yt-dlp/yt-dlp/commit/a9b370069838e84d44ac7ad095d657003665885a) ([#13341](https://github.com/yt-dlp/yt-dlp/issues/13341)) by [bashonly](https://github.com/bashonly)
|
|
||||||
|
|
||||||
### 2025.05.22
|
|
||||||
|
|
||||||
#### Core changes
|
|
||||||
- **cookies**: [Fix Linux desktop environment detection](https://github.com/yt-dlp/yt-dlp/commit/e491fd4d090db3af52a82863fb0553dd5e17fb85) ([#13197](https://github.com/yt-dlp/yt-dlp/issues/13197)) by [mbway](https://github.com/mbway)
|
|
||||||
- **jsinterp**: [Fix increment/decrement evaluation](https://github.com/yt-dlp/yt-dlp/commit/167d7a9f0ffd1b4fe600193441bdb7358db2740b) ([#13238](https://github.com/yt-dlp/yt-dlp/issues/13238)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
|
|
||||||
|
|
||||||
#### Extractor changes
|
|
||||||
- **1tv**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/41c0a1fb89628696f8bb88e2b9f3a68f355b8c26) ([#13168](https://github.com/yt-dlp/yt-dlp/issues/13168)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **amcnetworks**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/464c84fedf78eef822a431361155f108b5df96d7) ([#13147](https://github.com/yt-dlp/yt-dlp/issues/13147)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **bitchute**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/1d0f6539c47e5d5c68c3c47cdb7075339e2885ac) ([#13081](https://github.com/yt-dlp/yt-dlp/issues/13081)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **cartoonnetwork**: [Remove extractor](https://github.com/yt-dlp/yt-dlp/commit/7dbb47f84f0ee1266a3a01f58c9bc4c76d76794a) ([#13148](https://github.com/yt-dlp/yt-dlp/issues/13148)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **iprima**: [Fix login support](https://github.com/yt-dlp/yt-dlp/commit/a7d9a5eb79ceeecb851389f3f2c88597871ca3f2) ([#12937](https://github.com/yt-dlp/yt-dlp/issues/12937)) by [baierjan](https://github.com/baierjan)
|
|
||||||
- **jiosaavn**
|
|
||||||
- artist: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/586b557b124f954d3f625360ebe970989022ad97) ([#12803](https://github.com/yt-dlp/yt-dlp/issues/12803)) by [subrat-lima](https://github.com/subrat-lima)
|
|
||||||
- playlist, show: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/317f4b8006c2c0f0f64f095b1485163ad97c9053) ([#12803](https://github.com/yt-dlp/yt-dlp/issues/12803)) by [subrat-lima](https://github.com/subrat-lima)
|
|
||||||
- show: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/6839276496d8814cf16f58b637e45663467928e6) ([#12803](https://github.com/yt-dlp/yt-dlp/issues/12803)) by [subrat-lima](https://github.com/subrat-lima)
|
|
||||||
- **lrtradio**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/abf58dcd6a09e14eec4ea82ae12f79a0337cb383) ([#13200](https://github.com/yt-dlp/yt-dlp/issues/13200)) by [Pawka](https://github.com/Pawka)
|
|
||||||
- **nebula**: [Support `--mark-watched`](https://github.com/yt-dlp/yt-dlp/commit/20f288bdc2173c7cc58d709d25ca193c1f6001e7) ([#13120](https://github.com/yt-dlp/yt-dlp/issues/13120)) by [GeoffreyFrogeye](https://github.com/GeoffreyFrogeye)
|
|
||||||
- **niconico**
|
|
||||||
- [Fix error handling](https://github.com/yt-dlp/yt-dlp/commit/f569be4602c2a857087e495d5d7ed6060cd97abe) ([#13236](https://github.com/yt-dlp/yt-dlp/issues/13236)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- live: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/7a7b85c9014d96421e18aa7ea5f4c1bee5ceece0) ([#13045](https://github.com/yt-dlp/yt-dlp/issues/13045)) by [doe1080](https://github.com/doe1080)
|
|
||||||
- **nytimesarticle**: [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/b26bc32579c00ef579d75a835807ccc87d20ee0a) ([#13104](https://github.com/yt-dlp/yt-dlp/issues/13104)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **once**: [Remove extractor](https://github.com/yt-dlp/yt-dlp/commit/f475e8b529d18efdad603ffda02a56e707fe0e2c) ([#13164](https://github.com/yt-dlp/yt-dlp/issues/13164)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **picarto**: vod: [Support `/profile/` video URLs](https://github.com/yt-dlp/yt-dlp/commit/31e090cb787f3504ec25485adff9a2a51d056734) ([#13227](https://github.com/yt-dlp/yt-dlp/issues/13227)) by [subrat-lima](https://github.com/subrat-lima)
|
|
||||||
- **playsuisse**: [Improve metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/d880e060803ae8ed5a047e578cca01e1f0e630ce) ([#12466](https://github.com/yt-dlp/yt-dlp/issues/12466)) by [v3DJG6GL](https://github.com/v3DJG6GL)
|
|
||||||
- **sprout**: [Remove extractor](https://github.com/yt-dlp/yt-dlp/commit/cbcfe6378dde33a650e3852ab17ad4503b8e008d) ([#13149](https://github.com/yt-dlp/yt-dlp/issues/13149)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **svtpage**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/ea8498ed534642dd7e925961b97b934987142fd3) ([#12957](https://github.com/yt-dlp/yt-dlp/issues/12957)) by [diman8](https://github.com/diman8)
|
|
||||||
- **twitch**: [Support `--live-from-start`](https://github.com/yt-dlp/yt-dlp/commit/00b1bec55249cf2ad6271d36492c51b34b6459d1) ([#13202](https://github.com/yt-dlp/yt-dlp/issues/13202)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **vimeo**: event: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/545c1a5b6f2fe88722b41aef0e7485bf3be3f3f9) ([#13216](https://github.com/yt-dlp/yt-dlp/issues/13216)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **wat.tv**: [Improve error handling](https://github.com/yt-dlp/yt-dlp/commit/f123cc83b3aea45053f5fa1d9141048b01fc2774) ([#13111](https://github.com/yt-dlp/yt-dlp/issues/13111)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **weverse**: [Fix live extraction](https://github.com/yt-dlp/yt-dlp/commit/5328eda8820cc5f21dcf917684d23fbdca41831d) ([#13084](https://github.com/yt-dlp/yt-dlp/issues/13084)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **xinpianchang**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/83fabf352489d52843f67e6e9cc752db86d27e6e) ([#13245](https://github.com/yt-dlp/yt-dlp/issues/13245)) by [garret1317](https://github.com/garret1317)
|
|
||||||
- **youtube**
|
|
||||||
- [Add PO token support for subtitles](https://github.com/yt-dlp/yt-dlp/commit/32ed5f107c6c641958d1cd2752e130de4db55a13) ([#13234](https://github.com/yt-dlp/yt-dlp/issues/13234)) by [bashonly](https://github.com/bashonly), [coletdjnz](https://github.com/coletdjnz)
|
|
||||||
- [Add `web_embedded` client for age-restricted videos](https://github.com/yt-dlp/yt-dlp/commit/0feec6dc131f488428bf881519e7c69766fbb9ae) ([#13089](https://github.com/yt-dlp/yt-dlp/issues/13089)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Add a PO Token Provider Framework](https://github.com/yt-dlp/yt-dlp/commit/2685654a37141cca63eda3a92da0e2706e23ccfd) ([#12840](https://github.com/yt-dlp/yt-dlp/issues/12840)) by [coletdjnz](https://github.com/coletdjnz)
|
|
||||||
- [Extract `media_type` for all videos](https://github.com/yt-dlp/yt-dlp/commit/ded11ebc9afba6ba33923375103e9be2d7c804e7) ([#13136](https://github.com/yt-dlp/yt-dlp/issues/13136)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Fix `--live-from-start` support for premieres](https://github.com/yt-dlp/yt-dlp/commit/8f303afb43395be360cafd7ad4ce2b6e2eedfb8a) ([#13079](https://github.com/yt-dlp/yt-dlp/issues/13079)) by [arabcoders](https://github.com/arabcoders)
|
|
||||||
- [Fix geo-restriction error handling](https://github.com/yt-dlp/yt-dlp/commit/c7e575e31608c19c5b26c10a4229db89db5fc9a8) ([#13217](https://github.com/yt-dlp/yt-dlp/issues/13217)) by [yozel](https://github.com/yozel)
|
|
||||||
|
|
||||||
#### Misc. changes
|
|
||||||
- **build**
|
|
||||||
- [Bump PyInstaller to v6.13.0](https://github.com/yt-dlp/yt-dlp/commit/17cf9088d0d535e4a7feffbf02bd49cd9dae5ab9) ([#13082](https://github.com/yt-dlp/yt-dlp/issues/13082)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Bump run-on-arch-action to v3](https://github.com/yt-dlp/yt-dlp/commit/9064d2482d1fe722bbb4a49731fe0711c410d1c8) ([#13088](https://github.com/yt-dlp/yt-dlp/issues/13088)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **cleanup**: Miscellaneous: [7977b32](https://github.com/yt-dlp/yt-dlp/commit/7977b329ed97b216e37bd402f4935f28c00eac9e) by [bashonly](https://github.com/bashonly)
|
|
||||||
|
|
||||||
### 2025.04.30
|
|
||||||
|
|
||||||
#### Important changes
|
|
||||||
- **New option `--preset-alias`/`-t` has been added**
|
|
||||||
This provides convenient predefined aliases for common use cases. Available presets include `mp4`, `mp3`, `mkv`, `aac`, and `sleep`. See [the README](https://github.com/yt-dlp/yt-dlp/blob/master/README.md#preset-aliases) for more details.
|
|
||||||
|
|
||||||
#### Core changes
|
|
||||||
- [Add `--preset-alias` option](https://github.com/yt-dlp/yt-dlp/commit/88eb1e7a9a2720ac89d653c0d0e40292388823bb) ([#12839](https://github.com/yt-dlp/yt-dlp/issues/12839)) by [Grub4K](https://github.com/Grub4K), [seproDev](https://github.com/seproDev)
|
|
||||||
- **utils**
|
|
||||||
- `_yield_json_ld`: [Make function less fatal](https://github.com/yt-dlp/yt-dlp/commit/45f01de00e1bc076b7f676a669736326178647b1) ([#12855](https://github.com/yt-dlp/yt-dlp/issues/12855)) by [seproDev](https://github.com/seproDev)
- `url_or_none`: [Support WebSocket URLs](https://github.com/yt-dlp/yt-dlp/commit/a473e592337edb8ca40cde52c1fcaee261c54df9) ([#12848](https://github.com/yt-dlp/yt-dlp/issues/12848)) by [doe1080](https://github.com/doe1080)
#### Extractor changes
- **abematv**: [Fix thumbnail extraction](https://github.com/yt-dlp/yt-dlp/commit/f5736bb35bde62348caebf7b188668655e316deb) ([#12859](https://github.com/yt-dlp/yt-dlp/issues/12859)) by [Kiritomo](https://github.com/Kiritomo)
- **atresplayer**: [Rework extractor](https://github.com/yt-dlp/yt-dlp/commit/839d64325356310e6de6cd9cad28fb546619ca63) ([#11424](https://github.com/yt-dlp/yt-dlp/issues/11424)) by [meGAmeS1](https://github.com/meGAmeS1), [seproDev](https://github.com/seproDev)
- **bpb**: [Fix formats extraction](https://github.com/yt-dlp/yt-dlp/commit/80736b9c90818adee933a155079b8535bc06819f) ([#13015](https://github.com/yt-dlp/yt-dlp/issues/13015)) by [bashonly](https://github.com/bashonly)
- **cda**: [Fix formats extraction](https://github.com/yt-dlp/yt-dlp/commit/9032f981362ea0be90626fab51ec37934feded6d) ([#12975](https://github.com/yt-dlp/yt-dlp/issues/12975)) by [bashonly](https://github.com/bashonly)
- **cdafolder**: [Extend `_VALID_URL`](https://github.com/yt-dlp/yt-dlp/commit/cb271d445bc2d866c9a3404b1d8f59bcb77447df) ([#12919](https://github.com/yt-dlp/yt-dlp/issues/12919)) by [fireattack](https://github.com/fireattack), [Kicer86](https://github.com/Kicer86)
- **crowdbunker**: [Make format extraction non-fatal](https://github.com/yt-dlp/yt-dlp/commit/4ebf41309d04a6e196944f1c0f5f0154cff0055a) ([#12836](https://github.com/yt-dlp/yt-dlp/issues/12836)) by [seproDev](https://github.com/seproDev)
- **dacast**: [Support tokenized URLs](https://github.com/yt-dlp/yt-dlp/commit/e7e3b7a55c456da4a5a812b4fefce4dce8e6a616) ([#12979](https://github.com/yt-dlp/yt-dlp/issues/12979)) by [bashonly](https://github.com/bashonly)
- **dzen.ru**: [Rework extractors](https://github.com/yt-dlp/yt-dlp/commit/a3f2b54c2535d862de6efa9cfaa6ca9a2b2f7dd6) ([#12852](https://github.com/yt-dlp/yt-dlp/issues/12852)) by [seproDev](https://github.com/seproDev)
- **generic**: [Fix MPD extraction for `file://` URLs](https://github.com/yt-dlp/yt-dlp/commit/34a061a295d156934417c67ee98070b94943006b) ([#12978](https://github.com/yt-dlp/yt-dlp/issues/12978)) by [bashonly](https://github.com/bashonly)
- **getcourseru**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/741fd809bc4d301c19b53877692ae510334a6750) ([#12943](https://github.com/yt-dlp/yt-dlp/issues/12943)) by [troex](https://github.com/troex)
- **ivoox**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/7faa18b83dcfc74a1a1e2034e6b0369c495ca645) ([#12768](https://github.com/yt-dlp/yt-dlp/issues/12768)) by [NeonMan](https://github.com/NeonMan), [seproDev](https://github.com/seproDev)
- **kika**: [Add playlist extractor](https://github.com/yt-dlp/yt-dlp/commit/3c1c75ecb8ab352f422b59af46fff2be992e4115) ([#12832](https://github.com/yt-dlp/yt-dlp/issues/12832)) by [1100101](https://github.com/1100101)
- **linkedin**
- [Support feed URLs](https://github.com/yt-dlp/yt-dlp/commit/73a26f9ee68610e33c0b4407b77355f2ab7afd0e) ([#12927](https://github.com/yt-dlp/yt-dlp/issues/12927)) by [seproDev](https://github.com/seproDev)
- events: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/b37ff4de5baf4e4e70c6a0ec34e136a279ad20af) ([#12926](https://github.com/yt-dlp/yt-dlp/issues/12926)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
- **loco**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/f5a37ea40e20865b976ffeeff13eeae60292eb23) ([#12934](https://github.com/yt-dlp/yt-dlp/issues/12934)) by [seproDev](https://github.com/seproDev)
- **lrtradio**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/74e90dd9b8f9c1a5c48a2515126654f4d398d687) ([#12801](https://github.com/yt-dlp/yt-dlp/issues/12801)) by [subrat-lima](https://github.com/subrat-lima)
- **manyvids**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/77aa15e98f34c4ad425aabf39dd1ee37b48f772c) ([#10907](https://github.com/yt-dlp/yt-dlp/issues/10907)) by [pj47x](https://github.com/pj47x)
- **mixcloud**: [Refactor extractor](https://github.com/yt-dlp/yt-dlp/commit/db6d1f145ad583e0220637726029f8f2fa6200a0) ([#12830](https://github.com/yt-dlp/yt-dlp/issues/12830)) by [seproDev](https://github.com/seproDev), [WouterGordts](https://github.com/WouterGordts)
- **mlbtv**: [Fix device ID caching](https://github.com/yt-dlp/yt-dlp/commit/36da6360e130197df927ee93409519ce3f4075f5) ([#12980](https://github.com/yt-dlp/yt-dlp/issues/12980)) by [bashonly](https://github.com/bashonly)
- **niconico**
- [Fix login support](https://github.com/yt-dlp/yt-dlp/commit/25cd7c1ecbb6cbf21dd3a6e59608e4af94715ecc) ([#13008](https://github.com/yt-dlp/yt-dlp/issues/13008)) by [doe1080](https://github.com/doe1080)
- [Remove DMC formats support](https://github.com/yt-dlp/yt-dlp/commit/7d05aa99c65352feae1cd9a3ff8784b64bfe382a) ([#12916](https://github.com/yt-dlp/yt-dlp/issues/12916)) by [doe1080](https://github.com/doe1080)
- live: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/1d45e30537bf83e069184a440703e4c43b2e0198) ([#12809](https://github.com/yt-dlp/yt-dlp/issues/12809)) by [Snack-X](https://github.com/Snack-X)
- **panopto**: [Fix formats extraction](https://github.com/yt-dlp/yt-dlp/commit/9d26daa04ad5108257bc5e30f7f040c7f1fe7a5a) ([#12925](https://github.com/yt-dlp/yt-dlp/issues/12925)) by [seproDev](https://github.com/seproDev)
- **parti**: [Add extractors](https://github.com/yt-dlp/yt-dlp/commit/425017531fbc3369becb5a44013e26f26efabf45) ([#12769](https://github.com/yt-dlp/yt-dlp/issues/12769)) by [benfaerber](https://github.com/benfaerber)
- **raiplay**: [Fix DRM detection](https://github.com/yt-dlp/yt-dlp/commit/dce82346245e35a46fda836ca2089805d2347935) ([#12971](https://github.com/yt-dlp/yt-dlp/issues/12971)) by [DTrombett](https://github.com/DTrombett)
- **reddit**: [Support `--ignore-no-formats-error`](https://github.com/yt-dlp/yt-dlp/commit/28f04e8a5e383ff531db646190b4be45554610d6) ([#12993](https://github.com/yt-dlp/yt-dlp/issues/12993)) by [bashonly](https://github.com/bashonly)
- **royalive**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/e1847535e28788414a25546a45bebcada2f34558) ([#12817](https://github.com/yt-dlp/yt-dlp/issues/12817)) by [CasperMcFadden95](https://github.com/CasperMcFadden95)
- **rtve**: [Rework extractors](https://github.com/yt-dlp/yt-dlp/commit/f07ee91c71920ab1187a7ea756720e81aa406a9d) ([#10388](https://github.com/yt-dlp/yt-dlp/issues/10388)) by [meGAmeS1](https://github.com/meGAmeS1), [seproDev](https://github.com/seproDev)
- **rumble**: [Improve format extraction](https://github.com/yt-dlp/yt-dlp/commit/58d0c83457b93b3c9a81eb6bc5a4c65f25e949df) ([#12838](https://github.com/yt-dlp/yt-dlp/issues/12838)) by [seproDev](https://github.com/seproDev)
- **tokfmpodcast**: [Fix formats extraction](https://github.com/yt-dlp/yt-dlp/commit/91832111a12d87499294a0f430829b8c2254c339) ([#12842](https://github.com/yt-dlp/yt-dlp/issues/12842)) by [selfisekai](https://github.com/selfisekai)
- **tv2dk**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/a3e91df30a45943f40759d2c1e0b6c2ca4b2a263) ([#12945](https://github.com/yt-dlp/yt-dlp/issues/12945)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
- **tvp**: vod: [Improve `_VALID_URL`](https://github.com/yt-dlp/yt-dlp/commit/4e69a626cce51428bc1d66dc606a56d9498b03a5) ([#12923](https://github.com/yt-dlp/yt-dlp/issues/12923)) by [seproDev](https://github.com/seproDev)
- **tvw**: tvchannels: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/ed8ad1b4d6b9d7a1426ff5192ff924f3371e4721) ([#12721](https://github.com/yt-dlp/yt-dlp/issues/12721)) by [fries1234](https://github.com/fries1234)
- **twitcasting**: [Fix livestream extraction](https://github.com/yt-dlp/yt-dlp/commit/de271a06fd6d20d4f55597ff7f90e4d913de0a52) ([#12977](https://github.com/yt-dlp/yt-dlp/issues/12977)) by [bashonly](https://github.com/bashonly)
- **twitch**: clips: [Fix uploader metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/1ae6bff564a65af41e94f1a4727892471ecdd05a) ([#13022](https://github.com/yt-dlp/yt-dlp/issues/13022)) by [1271](https://github.com/1271)
- **twitter**
- [Fix extraction when logged-in](https://github.com/yt-dlp/yt-dlp/commit/1cf39ddf3d10b6512daa7dd139e5f6c0dc548bbc) ([#13024](https://github.com/yt-dlp/yt-dlp/issues/13024)) by [bashonly](https://github.com/bashonly)
- spaces: [Improve metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/70599e53b736bb75922b737e6e0d4f76e419bb20) ([#12911](https://github.com/yt-dlp/yt-dlp/issues/12911)) by [doe1080](https://github.com/doe1080)
- **vimeo**: [Extract from mobile API](https://github.com/yt-dlp/yt-dlp/commit/22ac81a0692019ac833cf282e4ef99718e9ef3fa) ([#13034](https://github.com/yt-dlp/yt-dlp/issues/13034)) by [bashonly](https://github.com/bashonly)
- **vk**
- [Fix chapters extraction](https://github.com/yt-dlp/yt-dlp/commit/5361a7c6e2933c919716e0cb1e3116c28c40419f) ([#12821](https://github.com/yt-dlp/yt-dlp/issues/12821)) by [seproDev](https://github.com/seproDev)
- [Fix uploader extraction](https://github.com/yt-dlp/yt-dlp/commit/2381881fe58a723853350a6ab750a5efc9f10c85) ([#12985](https://github.com/yt-dlp/yt-dlp/issues/12985)) by [seproDev](https://github.com/seproDev)
- **youtube**
- [Add context to video request rate limit error](https://github.com/yt-dlp/yt-dlp/commit/26feac3dd142536ad08ad1ed731378cb88e63602) ([#12958](https://github.com/yt-dlp/yt-dlp/issues/12958)) by [coletdjnz](https://github.com/coletdjnz)
- [Add extractor arg to skip "initial_data" request](https://github.com/yt-dlp/yt-dlp/commit/ed6c6d7eefbc78fa72e4e60ad6edaa3ee2acc715) ([#12865](https://github.com/yt-dlp/yt-dlp/issues/12865)) by [leeblackc](https://github.com/leeblackc)
- [Add warning on video captcha challenge](https://github.com/yt-dlp/yt-dlp/commit/f484c51599a6cd01eb078ea7dc9bbba942967774) ([#12939](https://github.com/yt-dlp/yt-dlp/issues/12939)) by [coletdjnz](https://github.com/coletdjnz)
- [Cache signature timestamps](https://github.com/yt-dlp/yt-dlp/commit/61c9a938b390b8334ee3a879fe2d93f714e30138) ([#13047](https://github.com/yt-dlp/yt-dlp/issues/13047)) by [bashonly](https://github.com/bashonly)
- [Detect and warn when account cookies are rotated](https://github.com/yt-dlp/yt-dlp/commit/8cb08028f5be2acb9835ce1670b196b9b077052f) ([#13014](https://github.com/yt-dlp/yt-dlp/issues/13014)) by [coletdjnz](https://github.com/coletdjnz)
- [Detect player JS variants for any locale](https://github.com/yt-dlp/yt-dlp/commit/c2d6659d1069f8cff97e1fd61d1c59e949e1e63d) ([#13003](https://github.com/yt-dlp/yt-dlp/issues/13003)) by [bashonly](https://github.com/bashonly)
- [Do not strictly deprioritize `missing_pot` formats](https://github.com/yt-dlp/yt-dlp/commit/74fc2ae12c24eb6b4e02c6360c89bd05f3c8f740) ([#13061](https://github.com/yt-dlp/yt-dlp/issues/13061)) by [bashonly](https://github.com/bashonly)
- [Improve warning for SABR-only/SSAP player responses](https://github.com/yt-dlp/yt-dlp/commit/fd8394bc50301ac5e930aa65aa71ab1b8372b8ab) ([#13049](https://github.com/yt-dlp/yt-dlp/issues/13049)) by [bashonly](https://github.com/bashonly)
- tab: [Extract continuation from empty page](https://github.com/yt-dlp/yt-dlp/commit/72ba4879304c2082fecbb472e6cc05ee2d154a3b) ([#12938](https://github.com/yt-dlp/yt-dlp/issues/12938)) by [coletdjnz](https://github.com/coletdjnz)
- **zdf**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/7be14109a6bd493a2e881da4f9e30adaf3e7e5d5) ([#12779](https://github.com/yt-dlp/yt-dlp/issues/12779)) by [bashonly](https://github.com/bashonly), [InvalidUsernameException](https://github.com/InvalidUsernameException)
#### Downloader changes
- **niconicodmc**: [Remove downloader](https://github.com/yt-dlp/yt-dlp/commit/8d127b18f81131453eaba05d3bb810d9b73adb75) ([#12916](https://github.com/yt-dlp/yt-dlp/issues/12916)) by [doe1080](https://github.com/doe1080)
#### Networking changes
- [Add PATCH request shortcut](https://github.com/yt-dlp/yt-dlp/commit/ceab4d5ed63a1f135a1816fe967c9d9a1ec7e6e8) ([#12884](https://github.com/yt-dlp/yt-dlp/issues/12884)) by [doe1080](https://github.com/doe1080)
#### Misc. changes
- **ci**: [Add file mode test to code check](https://github.com/yt-dlp/yt-dlp/commit/3690e91265d1d0bbeffaf6a9b8cc9baded1367bd) ([#13036](https://github.com/yt-dlp/yt-dlp/issues/13036)) by [Grub4K](https://github.com/Grub4K)
- **cleanup**: Miscellaneous: [505b400](https://github.com/yt-dlp/yt-dlp/commit/505b400795af557bdcfd9d4fa7e9133b26ef431c) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
### 2025.03.31
#### Core changes
- [Add `--compat-options 2024`](https://github.com/yt-dlp/yt-dlp/commit/22e34adbd741e1c7072015debd615dc3fb71c401) ([#12789](https://github.com/yt-dlp/yt-dlp/issues/12789)) by [seproDev](https://github.com/seproDev)
#### Extractor changes
- **francaisfacile**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/bb321cfdc3fd4400598ddb12a15862bc2ac8fc10) ([#12787](https://github.com/yt-dlp/yt-dlp/issues/12787)) by [mlabeeb03](https://github.com/mlabeeb03)
- **generic**: [Validate response before checking m3u8 live status](https://github.com/yt-dlp/yt-dlp/commit/9a1ec1d36e172d252714cef712a6d091e0a0c4f2) ([#12784](https://github.com/yt-dlp/yt-dlp/issues/12784)) by [bashonly](https://github.com/bashonly)
- **microsoftlearnepisode**: [Extract more formats](https://github.com/yt-dlp/yt-dlp/commit/d63696f23a341ee36a3237ccb5d5e14b34c2c579) ([#12799](https://github.com/yt-dlp/yt-dlp/issues/12799)) by [bashonly](https://github.com/bashonly)
- **mlbtv**: [Fix radio-only extraction](https://github.com/yt-dlp/yt-dlp/commit/f033d86b96b36f8c5289dd7c3304f42d4d9f6ff4) ([#12792](https://github.com/yt-dlp/yt-dlp/issues/12792)) by [bashonly](https://github.com/bashonly)
- **on24**: [Support `mainEvent` URLs](https://github.com/yt-dlp/yt-dlp/commit/e465b078ead75472fcb7b86f6ccaf2b5d3bc4c21) ([#12800](https://github.com/yt-dlp/yt-dlp/issues/12800)) by [bashonly](https://github.com/bashonly)
- **sbs**: [Fix subtitles extraction](https://github.com/yt-dlp/yt-dlp/commit/29560359120f28adaaac67c86fa8442eb72daa0d) ([#12785](https://github.com/yt-dlp/yt-dlp/issues/12785)) by [bashonly](https://github.com/bashonly)
- **stvr**: [Rename extractor from RTVS to STVR](https://github.com/yt-dlp/yt-dlp/commit/5fc521cbd0ce7b2410d0935369558838728e205d) ([#12788](https://github.com/yt-dlp/yt-dlp/issues/12788)) by [mireq](https://github.com/mireq)
- **twitch**: clips: [Extract portrait formats](https://github.com/yt-dlp/yt-dlp/commit/61046c31612b30c749cbdae934b7fe26abe659d7) ([#12763](https://github.com/yt-dlp/yt-dlp/issues/12763)) by [DmitryScaletta](https://github.com/DmitryScaletta)
- **youtube**
- [Add `player_js_variant` extractor-arg](https://github.com/yt-dlp/yt-dlp/commit/07f04005e40ebdb368920c511e36e98af0077ed3) ([#12767](https://github.com/yt-dlp/yt-dlp/issues/12767)) by [bashonly](https://github.com/bashonly)
- tab: [Fix playlist continuation extraction](https://github.com/yt-dlp/yt-dlp/commit/6a6d97b2cbc78f818de05cc96edcdcfd52caa259) ([#12777](https://github.com/yt-dlp/yt-dlp/issues/12777)) by [coletdjnz](https://github.com/coletdjnz)
#### Misc. changes
- **cleanup**: Miscellaneous: [5e457af](https://github.com/yt-dlp/yt-dlp/commit/5e457af57fae9645b1b8fa0ed689229c8fb9656b) by [bashonly](https://github.com/bashonly)
### 2025.03.27
#### Core changes
- **jsinterp**: [Fix nested attributes and object extraction](https://github.com/yt-dlp/yt-dlp/commit/a8b9ff3c2a0ae25735e580173becc78545b92572) ([#12760](https://github.com/yt-dlp/yt-dlp/issues/12760)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
#### Extractor changes
- **youtube**: [Make signature and nsig extraction more robust](https://github.com/yt-dlp/yt-dlp/commit/48be862b32648bff5b3e553e40fca4dcc6e88b28) ([#12761](https://github.com/yt-dlp/yt-dlp/issues/12761)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
### 2025.03.26
#### Extractor changes
- **youtube**
- [Fix signature and nsig extraction for player `4fcd6e4a`](https://github.com/yt-dlp/yt-dlp/commit/a550dfc904a02843a26369ae50dbb7c0febfb30e) ([#12748](https://github.com/yt-dlp/yt-dlp/issues/12748)) by [seproDev](https://github.com/seproDev)
- [Only cache nsig code on successful decoding](https://github.com/yt-dlp/yt-dlp/commit/ecee97b4fa90d51c48f9154c3a6d5a8ffe46cd5c) ([#12750](https://github.com/yt-dlp/yt-dlp/issues/12750)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
### 2025.03.25
#### Core changes
- [Fix attribute error on failed VT init](https://github.com/yt-dlp/yt-dlp/commit/b872ffec50fd50f790a5a490e006a369a28a3df3) ([#12696](https://github.com/yt-dlp/yt-dlp/issues/12696)) by [Grub4K](https://github.com/Grub4K)
- **utils**: `js_to_json`: [Make function less fatal](https://github.com/yt-dlp/yt-dlp/commit/9491b44032b330e05bd5eaa546187005d1e8538e) ([#12715](https://github.com/yt-dlp/yt-dlp/issues/12715)) by [seproDev](https://github.com/seproDev)
#### Extractor changes
- [Fix sorting of HLS audio formats by `GROUP-ID`](https://github.com/yt-dlp/yt-dlp/commit/86ab79e1a5182092321102adf6ca34195803b878) ([#12714](https://github.com/yt-dlp/yt-dlp/issues/12714)) by [bashonly](https://github.com/bashonly)
- **17live**: vod: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/3396eb50dcd245b49c0f4aecd6e80ec914095d16) ([#12723](https://github.com/yt-dlp/yt-dlp/issues/12723)) by [subrat-lima](https://github.com/subrat-lima)
- **9now.com.au**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/9d5e6de2e7a47226d1f72c713ad45c88ba01db68) ([#12702](https://github.com/yt-dlp/yt-dlp/issues/12702)) by [bashonly](https://github.com/bashonly)
- **chzzk**: video: [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/e2dfccaf808b406d5bcb7dd04ae9ce420752dd6f) ([#12692](https://github.com/yt-dlp/yt-dlp/issues/12692)) by [bashonly](https://github.com/bashonly), [dirkf](https://github.com/dirkf)
- **deezer**: [Remove extractors](https://github.com/yt-dlp/yt-dlp/commit/be5af3f9e91747768c2b41157851bfbe14c663f7) ([#12704](https://github.com/yt-dlp/yt-dlp/issues/12704)) by [seproDev](https://github.com/seproDev)
- **generic**: [Fix MPD base URL parsing](https://github.com/yt-dlp/yt-dlp/commit/5086d4aed6aeb3908c62f49e2d8f74cc0cb05110) ([#12718](https://github.com/yt-dlp/yt-dlp/issues/12718)) by [fireattack](https://github.com/fireattack)
- **streaks**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/801afeac91f97dc0b58cd39cc7e8c50f619dc4e1) ([#12679](https://github.com/yt-dlp/yt-dlp/issues/12679)) by [doe1080](https://github.com/doe1080)
- **tver**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/66e0bab814e4a52ef3e12d81123ad992a29df50e) ([#12659](https://github.com/yt-dlp/yt-dlp/issues/12659)) by [arabcoders](https://github.com/arabcoders), [bashonly](https://github.com/bashonly)
- **viki**: [Remove extractors](https://github.com/yt-dlp/yt-dlp/commit/fe4f14b8369038e7c58f7de546d76de1ce3a91ce) ([#12703](https://github.com/yt-dlp/yt-dlp/issues/12703)) by [seproDev](https://github.com/seproDev)
- **vrsquare**: [Add extractors](https://github.com/yt-dlp/yt-dlp/commit/b7fbb5a0a16a8e8d3e29c29e26ebed677d0d6ea3) ([#12515](https://github.com/yt-dlp/yt-dlp/issues/12515)) by [doe1080](https://github.com/doe1080)
- **youtube**
- [Fix PhantomJS nsig fallback](https://github.com/yt-dlp/yt-dlp/commit/4054a2b623bd1e277b49d2e9abc3d112a4b1c7be) ([#12728](https://github.com/yt-dlp/yt-dlp/issues/12728)) by [bashonly](https://github.com/bashonly)
- [Fix signature and nsig extraction for player `363db69b`](https://github.com/yt-dlp/yt-dlp/commit/b9c979461b244713bf42691a5bc02834e2ba4b2c) ([#12725](https://github.com/yt-dlp/yt-dlp/issues/12725)) by [bashonly](https://github.com/bashonly)
#### Networking changes
- **Request Handler**: curl_cffi: [Support `curl_cffi` 0.10.x](https://github.com/yt-dlp/yt-dlp/commit/9bf23902ceb948b9685ce1dab575491571720fc6) ([#12670](https://github.com/yt-dlp/yt-dlp/issues/12670)) by [Grub4K](https://github.com/Grub4K)
#### Misc. changes
- **cleanup**: Miscellaneous: [9dde546](https://github.com/yt-dlp/yt-dlp/commit/9dde546e7ee3e1515d88ee3af08b099351455dc0) by [seproDev](https://github.com/seproDev)
### 2025.03.21
#### Core changes
- [Fix external downloader availability when using `--ffmpeg-location`](https://github.com/yt-dlp/yt-dlp/commit/9f77e04c76e36e1cbbf49bc9eb385fa6ef804b67) ([#12318](https://github.com/yt-dlp/yt-dlp/issues/12318)) by [Kenshin9977](https://github.com/Kenshin9977)
- [Load plugins on demand](https://github.com/yt-dlp/yt-dlp/commit/4445f37a7a66b248dbd8376c43137e6e441f138e) ([#11305](https://github.com/yt-dlp/yt-dlp/issues/11305)) by [coletdjnz](https://github.com/coletdjnz), [Grub4K](https://github.com/Grub4K), [pukkandan](https://github.com/pukkandan) (With fixes in [c034d65](https://github.com/yt-dlp/yt-dlp/commit/c034d655487be668222ef9476a16f374584e49a7))
- [Support emitting ConEmu progress codes](https://github.com/yt-dlp/yt-dlp/commit/f7a1f2d8132967a62b0f6d5665c6d2dde2d42c09) ([#10649](https://github.com/yt-dlp/yt-dlp/issues/10649)) by [Grub4K](https://github.com/Grub4K)
#### Extractor changes
- **azmedien**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/26a502fc727d0e91b2db6bf4a112823bcc672e85) ([#12375](https://github.com/yt-dlp/yt-dlp/issues/12375)) by [goggle](https://github.com/goggle)
- **bilibiliplaylist**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/f5fb2229e66cf59d5bf16065bc041b42a28354a0) ([#12690](https://github.com/yt-dlp/yt-dlp/issues/12690)) by [bashonly](https://github.com/bashonly)
- **bunnycdn**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/3a1583ca75fb523cbad0e5e174387ea7b477d175) ([#11586](https://github.com/yt-dlp/yt-dlp/issues/11586)) by [Grub4K](https://github.com/Grub4K), [seproDev](https://github.com/seproDev)
- **canalsurmas**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/01a8be4c23f186329d85f9c78db34a55f3294ac5) ([#12497](https://github.com/yt-dlp/yt-dlp/issues/12497)) by [Arc8ne](https://github.com/Arc8ne)
- **cda**: [Fix login support](https://github.com/yt-dlp/yt-dlp/commit/be0d819e1103195043f6743650781f0d4d343f6d) ([#12552](https://github.com/yt-dlp/yt-dlp/issues/12552)) by [rysson](https://github.com/rysson)
- **cultureunplugged**: [Extend `_VALID_URL`](https://github.com/yt-dlp/yt-dlp/commit/3042afb5fe342d3a00de76704cd7de611acc350e) ([#12486](https://github.com/yt-dlp/yt-dlp/issues/12486)) by [seproDev](https://github.com/seproDev)
- **dailymotion**: [Improve embed detection](https://github.com/yt-dlp/yt-dlp/commit/ad60137c141efa5023fbc0ac8579eaefe8b3d8cc) ([#12464](https://github.com/yt-dlp/yt-dlp/issues/12464)) by [seproDev](https://github.com/seproDev)
- **gem.cbc.ca**: [Fix login support](https://github.com/yt-dlp/yt-dlp/commit/eb1417786a3027b1e7290ec37ef6aaece50ebed0) ([#12414](https://github.com/yt-dlp/yt-dlp/issues/12414)) by [bashonly](https://github.com/bashonly)
- **globo**: [Fix subtitles extraction](https://github.com/yt-dlp/yt-dlp/commit/0e1697232fcbba7551f983fd1ba93bb445cbb08b) ([#12270](https://github.com/yt-dlp/yt-dlp/issues/12270)) by [pedro](https://github.com/pedro)
- **instagram**
- [Add `app_id` extractor-arg](https://github.com/yt-dlp/yt-dlp/commit/a90641c8363fa0c10800b36eb6b01ee22d3a9409) ([#12359](https://github.com/yt-dlp/yt-dlp/issues/12359)) by [chrisellsworth](https://github.com/chrisellsworth)
- [Fix extraction of older private posts](https://github.com/yt-dlp/yt-dlp/commit/a59abe0636dc49b22a67246afe35613571b86f05) ([#12451](https://github.com/yt-dlp/yt-dlp/issues/12451)) by [bashonly](https://github.com/bashonly)
- [Improve error handling](https://github.com/yt-dlp/yt-dlp/commit/480125560a3b9972d29ae0da850aba8109e6bd41) ([#12410](https://github.com/yt-dlp/yt-dlp/issues/12410)) by [bashonly](https://github.com/bashonly)
- story: [Support `--no-playlist`](https://github.com/yt-dlp/yt-dlp/commit/65c3c58c0a67463a150920203cec929045c95a24) ([#12397](https://github.com/yt-dlp/yt-dlp/issues/12397)) by [fireattack](https://github.com/fireattack)
- **jamendo**: [Fix thumbnail extraction](https://github.com/yt-dlp/yt-dlp/commit/89a68c4857ddbaf937ff22f12648baaf6b5af840) ([#12622](https://github.com/yt-dlp/yt-dlp/issues/12622)) by [bashonly](https://github.com/bashonly), [JChris246](https://github.com/JChris246)
- **ketnet**: [Remove extractor](https://github.com/yt-dlp/yt-dlp/commit/bbada3ec0779422cde34f1ce3dcf595da463b493) ([#12628](https://github.com/yt-dlp/yt-dlp/issues/12628)) by [MichaelDeBoey](https://github.com/MichaelDeBoey)
- **lbry**
- [Make m3u8 format extraction non-fatal](https://github.com/yt-dlp/yt-dlp/commit/9807181cfbf87bfa732f415c30412bdbd77cbf81) ([#12463](https://github.com/yt-dlp/yt-dlp/issues/12463)) by [bashonly](https://github.com/bashonly)
- [Raise appropriate error for non-media files](https://github.com/yt-dlp/yt-dlp/commit/7126b472601814b7fd8c9de02069e8fff1764891) ([#12462](https://github.com/yt-dlp/yt-dlp/issues/12462)) by [bashonly](https://github.com/bashonly)
- **loco**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/983095485c731240aae27c950cb8c24a50827b56) ([#12667](https://github.com/yt-dlp/yt-dlp/issues/12667)) by [DTrombett](https://github.com/DTrombett)
- **magellantv**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/172d5fcd778bf2605db7647ebc56b29ed18d24ac) ([#12505](https://github.com/yt-dlp/yt-dlp/issues/12505)) by [seproDev](https://github.com/seproDev)
- **mitele**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/7223d29569a48a35ad132a508c115973866838d3) ([#12689](https://github.com/yt-dlp/yt-dlp/issues/12689)) by [bashonly](https://github.com/bashonly)
- **msn**: [Rework extractor](https://github.com/yt-dlp/yt-dlp/commit/4815dac131d42c51e12c1d05232db0bbbf607329) ([#12513](https://github.com/yt-dlp/yt-dlp/issues/12513)) by [seproDev](https://github.com/seproDev), [thedenv](https://github.com/thedenv)
- **n1**: [Fix extraction of newer articles](https://github.com/yt-dlp/yt-dlp/commit/9d70abe4de401175cbbaaa36017806f16b2df9af) ([#12514](https://github.com/yt-dlp/yt-dlp/issues/12514)) by [u-spec-png](https://github.com/u-spec-png)
- **nbcstations**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/ebac65aa9e0bf9a97c24d00f7977900d2577364b) ([#12534](https://github.com/yt-dlp/yt-dlp/issues/12534)) by [refack](https://github.com/refack)
- **niconico**
- [Fix format sorting](https://github.com/yt-dlp/yt-dlp/commit/7508e34f203e97389f1d04db92140b13401dd724) ([#12442](https://github.com/yt-dlp/yt-dlp/issues/12442)) by [xpadev-net](https://github.com/xpadev-net)
- live: [Fix thumbnail extraction](https://github.com/yt-dlp/yt-dlp/commit/c2e6e1d5f77f3b720a6266f2869eb750d20e5dc1) ([#12419](https://github.com/yt-dlp/yt-dlp/issues/12419)) by [bashonly](https://github.com/bashonly)
- **openrec**: [Fix `_VALID_URL`](https://github.com/yt-dlp/yt-dlp/commit/17504f253564cfad86244de2b6346d07d2300ca5) ([#12608](https://github.com/yt-dlp/yt-dlp/issues/12608)) by [fireattack](https://github.com/fireattack)
- **pinterest**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/bd0a66816934de70312eea1e71c59c13b401dc3a) ([#12538](https://github.com/yt-dlp/yt-dlp/issues/12538)) by [mikf](https://github.com/mikf)
- **playsuisse**: [Fix login support](https://github.com/yt-dlp/yt-dlp/commit/6933f5670cea9c3e2fb16c1caa1eda54d13122c5) ([#12444](https://github.com/yt-dlp/yt-dlp/issues/12444)) by [bashonly](https://github.com/bashonly)
- **reddit**: [Truncate title](https://github.com/yt-dlp/yt-dlp/commit/d9a53cc1e6fd912daf500ca4f19e9ca88994dbf9) ([#12567](https://github.com/yt-dlp/yt-dlp/issues/12567)) by [seproDev](https://github.com/seproDev)
- **rtp**: [Rework extractor](https://github.com/yt-dlp/yt-dlp/commit/8eb9c1bf3b9908cca22ef043602aa24fb9f352c6) ([#11638](https://github.com/yt-dlp/yt-dlp/issues/11638)) by [pferreir](https://github.com/pferreir), [red-acid](https://github.com/red-acid), [seproDev](https://github.com/seproDev), [somini](https://github.com/somini), [vallovic](https://github.com/vallovic)
- **softwhiteunderbelly**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/652827d5a076c9483c36654ad2cf3fe46219baf4) ([#12281](https://github.com/yt-dlp/yt-dlp/issues/12281)) by [benfaerber](https://github.com/benfaerber)
- **soop**: [Fix timestamp extraction](https://github.com/yt-dlp/yt-dlp/commit/8305df00012ff8138a6ff95279d06b54ac607f63) ([#12609](https://github.com/yt-dlp/yt-dlp/issues/12609)) by [msikma](https://github.com/msikma)
- **soundcloud**
- [Extract tags](https://github.com/yt-dlp/yt-dlp/commit/9deed13d7cce6d3647379e50589c92de89227509) ([#12420](https://github.com/yt-dlp/yt-dlp/issues/12420)) by [bashonly](https://github.com/bashonly)
- [Fix thumbnail extraction](https://github.com/yt-dlp/yt-dlp/commit/6deeda5c11f34f613724fa0627879f0d607ba1b4) ([#12447](https://github.com/yt-dlp/yt-dlp/issues/12447)) by [bashonly](https://github.com/bashonly)
- **tiktok**
- [Improve error handling](https://github.com/yt-dlp/yt-dlp/commit/99ea2978757a431eeb2a265b3395ccbe4ce202cf) ([#12445](https://github.com/yt-dlp/yt-dlp/issues/12445)) by [bashonly](https://github.com/bashonly)
- [Truncate title](https://github.com/yt-dlp/yt-dlp/commit/83b119dadb0f267f1fb66bf7ed74c097349de79e) ([#12566](https://github.com/yt-dlp/yt-dlp/issues/12566)) by [seproDev](https://github.com/seproDev)
- **tv8.it**: [Add live and playlist extractors](https://github.com/yt-dlp/yt-dlp/commit/2ee3a0aff9be2be3bea60640d3d8a0febaf0acb6) ([#12569](https://github.com/yt-dlp/yt-dlp/issues/12569)) by [DTrombett](https://github.com/DTrombett)
- **tvw**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/42b7440963866e31ff84a5b89030d1c596fa2e6e) ([#12271](https://github.com/yt-dlp/yt-dlp/issues/12271)) by [fries1234](https://github.com/fries1234)
- **twitter**
- [Fix syndication token generation](https://github.com/yt-dlp/yt-dlp/commit/b8b47547049f5ebc3dd680fc7de70ed0ca9c0d70) ([#12537](https://github.com/yt-dlp/yt-dlp/issues/12537)) by [bashonly](https://github.com/bashonly)
- [Truncate title](https://github.com/yt-dlp/yt-dlp/commit/06f6de78db2eceeabd062ab1a3023e0ff9d4df53) ([#12560](https://github.com/yt-dlp/yt-dlp/issues/12560)) by [seproDev](https://github.com/seproDev)
- **vk**: [Improve metadata extraction](https://github.com/yt-dlp/yt-dlp/commit/05c8023a27dd37c49163c0498bf98e3e3c1cb4b9) ([#12510](https://github.com/yt-dlp/yt-dlp/issues/12510)) by [seproDev](https://github.com/seproDev)
- **vrtmax**: [Rework extractor](https://github.com/yt-dlp/yt-dlp/commit/df9ebeec00d658693252978d1ffb885e67aa6ab6) ([#12479](https://github.com/yt-dlp/yt-dlp/issues/12479)) by [bergoid](https://github.com/bergoid), [MichaelDeBoey](https://github.com/MichaelDeBoey), [seproDev](https://github.com/seproDev)
- **weibo**: [Support playlists](https://github.com/yt-dlp/yt-dlp/commit/0bb39788626002a8a67e925580227952c563c8b9) ([#12284](https://github.com/yt-dlp/yt-dlp/issues/12284)) by [4ft35t](https://github.com/4ft35t)
- **wsj**: [Support opinion URLs and impersonation](https://github.com/yt-dlp/yt-dlp/commit/7f3006eb0c0659982bb956d71b0bc806bcb0a5f2) ([#12431](https://github.com/yt-dlp/yt-dlp/issues/12431)) by [refack](https://github.com/refack)
- **youtube**
- [Fix nsig and signature extraction for player `643afba4`](https://github.com/yt-dlp/yt-dlp/commit/9b868518a15599f3d7ef5a1c730dda164c30da9b) ([#12684](https://github.com/yt-dlp/yt-dlp/issues/12684)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
- [Player client maintenance](https://github.com/yt-dlp/yt-dlp/commit/3380febe9984c21c79c3147c1d390a4cf339bc4c) ([#12603](https://github.com/yt-dlp/yt-dlp/issues/12603)) by [seproDev](https://github.com/seproDev)
- [Split into package](https://github.com/yt-dlp/yt-dlp/commit/4432a9390c79253ac830702b226d2e558b636725) ([#12557](https://github.com/yt-dlp/yt-dlp/issues/12557)) by [coletdjnz](https://github.com/coletdjnz)
- [Warn on DRM formats](https://github.com/yt-dlp/yt-dlp/commit/e67d786c7cc87bd449d22e0ddef08306891c1173) ([#12593](https://github.com/yt-dlp/yt-dlp/issues/12593)) by [coletdjnz](https://github.com/coletdjnz)
- [Warn on missing formats due to SSAP](https://github.com/yt-dlp/yt-dlp/commit/79ec2fdff75c8c1bb89b550266849ad4dec48dd3) ([#12483](https://github.com/yt-dlp/yt-dlp/issues/12483)) by [coletdjnz](https://github.com/coletdjnz)
#### Networking changes
- [Add `keep_header_casing` extension](https://github.com/yt-dlp/yt-dlp/commit/7d18fed8f1983fe6de4ddc810dfb2761ba5744ac) ([#11652](https://github.com/yt-dlp/yt-dlp/issues/11652)) by [coletdjnz](https://github.com/coletdjnz), [Grub4K](https://github.com/Grub4K)
- [Always add unsupported suffix on version mismatch](https://github.com/yt-dlp/yt-dlp/commit/95f8df2f796d0048119615200758199aedcd7cf4) ([#12626](https://github.com/yt-dlp/yt-dlp/issues/12626)) by [Grub4K](https://github.com/Grub4K)
#### Misc. changes
- **cleanup**: Miscellaneous: [f36e4b6](https://github.com/yt-dlp/yt-dlp/commit/f36e4b6e65cb8403791aae2f520697115cb88dec) by [dirkf](https://github.com/dirkf), [gamer191](https://github.com/gamer191), [Grub4K](https://github.com/Grub4K), [seproDev](https://github.com/seproDev)
- **test**: [Show all differences for `expect_value` and `expect_dict`](https://github.com/yt-dlp/yt-dlp/commit/a3e0c7d3b267abdf3933b709704a28d43bb46503) ([#12334](https://github.com/yt-dlp/yt-dlp/issues/12334)) by [Grub4K](https://github.com/Grub4K)
### 2025.02.19
#### Core changes
- **jsinterp**
- [Add `js_number_to_string`](https://github.com/yt-dlp/yt-dlp/commit/0d9f061d38c3a4da61972e2adad317079f2f1c84) ([#12110](https://github.com/yt-dlp/yt-dlp/issues/12110)) by [Grub4K](https://github.com/Grub4K)
- [Improve zeroise](https://github.com/yt-dlp/yt-dlp/commit/4ca8c44a073d5aa3a3e3112c35b2b23d6ce25ac6) ([#12313](https://github.com/yt-dlp/yt-dlp/issues/12313)) by [seproDev](https://github.com/seproDev)
#### Extractor changes
- **acast**: [Support shows.acast.com URLs](https://github.com/yt-dlp/yt-dlp/commit/57c717fee4bfbc9309845bbb48901b72e4b69304) ([#12223](https://github.com/yt-dlp/yt-dlp/issues/12223)) by [barsnick](https://github.com/barsnick)
- **cwtv**
- [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/18a28514e306e822eab4f3a79c76d515bf076406) ([#12207](https://github.com/yt-dlp/yt-dlp/issues/12207)) by [arantius](https://github.com/arantius)
- movie: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/03c3d705778c07739e0034b51490877cffdc0983) ([#12227](https://github.com/yt-dlp/yt-dlp/issues/12227)) by [bashonly](https://github.com/bashonly)
- **digiview**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/f53553087d3fde9dcd61d6e9f98caf09db1d8ef2) ([#9902](https://github.com/yt-dlp/yt-dlp/issues/9902)) by [lfavole](https://github.com/lfavole)
- **dropbox**: [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/861aeec449c8f3c062d962945b234ff0341f61f3) ([#12228](https://github.com/yt-dlp/yt-dlp/issues/12228)) by [bashonly](https://github.com/bashonly)
- **francetv**
- site
- [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/817483ccc68aed6049ed9c4a2ffae44ca82d2b1c) ([#12236](https://github.com/yt-dlp/yt-dlp/issues/12236)) by [bashonly](https://github.com/bashonly)
- [Fix livestream extraction](https://github.com/yt-dlp/yt-dlp/commit/1295bbedd45fa8d9bc3f7a194864ae280297848e) ([#12316](https://github.com/yt-dlp/yt-dlp/issues/12316)) by [bashonly](https://github.com/bashonly)
- **francetvinfo.fr**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/5c4c2ddfaa47988b4d50c1ad4988badc0b4f30c2) ([#12402](https://github.com/yt-dlp/yt-dlp/issues/12402)) by [bashonly](https://github.com/bashonly)
- **gem.cbc.ca**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/5271ef48c6f61c145e03e18e960995d2e651d205) ([#12404](https://github.com/yt-dlp/yt-dlp/issues/12404)) by [bashonly](https://github.com/bashonly), [dirkf](https://github.com/dirkf)
- **generic**: [Extract `live_status` for DASH manifest URLs](https://github.com/yt-dlp/yt-dlp/commit/19edaa44fcd375f54e63d6227b092f5252d3e889) ([#12256](https://github.com/yt-dlp/yt-dlp/issues/12256)) by [mp3butcher](https://github.com/mp3butcher)
- **globo**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/f8d0161455f00add65585ca1a476a7b5d56f5f96) ([#11795](https://github.com/yt-dlp/yt-dlp/issues/11795)) by [slipinthedove](https://github.com/slipinthedove), [YoshiTabletopGamer](https://github.com/YoshiTabletopGamer)
- **goplay**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/d59f14a0a7a8b55e6bf468237def62b73ab4a517) ([#12237](https://github.com/yt-dlp/yt-dlp/issues/12237)) by [alard](https://github.com/alard)
- **pbs**: [Support www.thirteen.org URLs](https://github.com/yt-dlp/yt-dlp/commit/9fb8ab2ff67fb699f60cce09163a580976e90c0e) ([#11191](https://github.com/yt-dlp/yt-dlp/issues/11191)) by [rohieb](https://github.com/rohieb)
- **reddit**: [Bypass gated subreddit warning](https://github.com/yt-dlp/yt-dlp/commit/6ca23ffaa4663cb552f937f0b1e9769b66db11bd) ([#12335](https://github.com/yt-dlp/yt-dlp/issues/12335)) by [bashonly](https://github.com/bashonly)
- **twitter**: [Fix syndication token generation](https://github.com/yt-dlp/yt-dlp/commit/14cd7f3443c6da4d49edaefcc12da9dee86e243e) ([#12107](https://github.com/yt-dlp/yt-dlp/issues/12107)) by [Grub4K](https://github.com/Grub4K), [pjrobertson](https://github.com/pjrobertson)
- **youtube**
- [Retry on more critical requests](https://github.com/yt-dlp/yt-dlp/commit/d48e612609d012abbea3785be4d26d78a014abb2) ([#12339](https://github.com/yt-dlp/yt-dlp/issues/12339)) by [coletdjnz](https://github.com/coletdjnz)
- [nsig workaround for `tce` player JS](https://github.com/yt-dlp/yt-dlp/commit/ec17fb16e8d69d4e3e10fb73bf3221be8570dfee) ([#12401](https://github.com/yt-dlp/yt-dlp/issues/12401)) by [bashonly](https://github.com/bashonly)
- **zdf**: [Extract more metadata](https://github.com/yt-dlp/yt-dlp/commit/241ace4f104d50fdf7638f9203927aefcf57a1f7) ([#9565](https://github.com/yt-dlp/yt-dlp/issues/9565)) by [StefanLobbenmeier](https://github.com/StefanLobbenmeier) (With fixes in [e7882b6](https://github.com/yt-dlp/yt-dlp/commit/e7882b682b959e476d8454911655b3e9b14c79b2) by [bashonly](https://github.com/bashonly))
#### Downloader changes
- **hls**
- [Fix `BYTERANGE` logic](https://github.com/yt-dlp/yt-dlp/commit/10b7ff68e98f17655e31952f6e17120b2d7dda96) ([#11972](https://github.com/yt-dlp/yt-dlp/issues/11972)) by [entourage8](https://github.com/entourage8)
- [Support `--write-pages` for m3u8 media playlists](https://github.com/yt-dlp/yt-dlp/commit/be69468752ff598cacee57bb80533deab2367a5d) ([#12333](https://github.com/yt-dlp/yt-dlp/issues/12333)) by [bashonly](https://github.com/bashonly)
- [Support `hls_media_playlist_data` format field](https://github.com/yt-dlp/yt-dlp/commit/c987be0acb6872c6561f28aa28171e803393d851) ([#12322](https://github.com/yt-dlp/yt-dlp/issues/12322)) by [bashonly](https://github.com/bashonly)
#### Misc. changes
- [Improve Issue/PR templates](https://github.com/yt-dlp/yt-dlp/commit/517ddf3c3f12560ab93e3d36244dc82db9f97818) ([#11499](https://github.com/yt-dlp/yt-dlp/issues/11499)) by [seproDev](https://github.com/seproDev) (With fixes in [4ecb833](https://github.com/yt-dlp/yt-dlp/commit/4ecb833472c90e078567b561fb7c089f1aa9587b) by [bashonly](https://github.com/bashonly))
- **cleanup**: Miscellaneous: [4985a40](https://github.com/yt-dlp/yt-dlp/commit/4985a4041770eaa0016271809a1fd950dc809a55) by [dirkf](https://github.com/dirkf), [Grub4K](https://github.com/Grub4K), [StefanLobbenmeier](https://github.com/StefanLobbenmeier)
- **docs**: [Add note to `supportedsites.md`](https://github.com/yt-dlp/yt-dlp/commit/01a63629a21781458dcbd38779898e117678f5ff) ([#12382](https://github.com/yt-dlp/yt-dlp/issues/12382)) by [seproDev](https://github.com/seproDev)
- **test**: download: [Validate and sort info dict fields](https://github.com/yt-dlp/yt-dlp/commit/208163447408c78673b08c172beafe5c310fb167) ([#12299](https://github.com/yt-dlp/yt-dlp/issues/12299)) by [bashonly](https://github.com/bashonly), [pzhlkj6612](https://github.com/pzhlkj6612)
### 2025.01.26
#### Core changes
- [Fix float comparison values in format filters](https://github.com/yt-dlp/yt-dlp/commit/f7d071e8aa3bf67ed7e0f881e749ca9ab50b3f8f) ([#11880](https://github.com/yt-dlp/yt-dlp/issues/11880)) by [bashonly](https://github.com/bashonly), [Dioarya](https://github.com/Dioarya)
- **utils**: `sanitize_path`: [Fix some incorrect behavior](https://github.com/yt-dlp/yt-dlp/commit/fc12e724a3b4988cfc467d2981887dde48c26b69) ([#11923](https://github.com/yt-dlp/yt-dlp/issues/11923)) by [Grub4K](https://github.com/Grub4K)
#### Extractor changes
- **1tv**: [Support sport1tv.ru domain](https://github.com/yt-dlp/yt-dlp/commit/61ae5dc34ac775d6c122575e21ef2153b1273a2b) ([#11889](https://github.com/yt-dlp/yt-dlp/issues/11889)) by [kvk-2015](https://github.com/kvk-2015)
- **abematv**: [Support season extraction](https://github.com/yt-dlp/yt-dlp/commit/c709cc41cbc16edc846e0a431cfa8508396d4cb6) ([#11771](https://github.com/yt-dlp/yt-dlp/issues/11771)) by [middlingphys](https://github.com/middlingphys)
- **bilibili**
- [Support space `/lists/` URLs](https://github.com/yt-dlp/yt-dlp/commit/465167910407449354eb48e9861efd0819f53eb5) ([#11964](https://github.com/yt-dlp/yt-dlp/issues/11964)) by [c-basalt](https://github.com/c-basalt)
- [Support space video list extraction without login](https://github.com/yt-dlp/yt-dlp/commit/78912ed9c81f109169b828c397294a6cf8eacf41) ([#12089](https://github.com/yt-dlp/yt-dlp/issues/12089)) by [grqz](https://github.com/grqz)
- **bilibilidynamic**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/9676b05715b61c8c5dd5598871e60d8807fb1a86) ([#11838](https://github.com/yt-dlp/yt-dlp/issues/11838)) by [finch71](https://github.com/finch71), [grqz](https://github.com/grqz)
- **bluesky**: [Prefer source format](https://github.com/yt-dlp/yt-dlp/commit/ccda63934df7de2823f0834218c4254c7c4d2e4c) ([#12154](https://github.com/yt-dlp/yt-dlp/issues/12154)) by [0x9fff00](https://github.com/0x9fff00)
- **crunchyroll**: [Remove extractors](https://github.com/yt-dlp/yt-dlp/commit/ff44ed53061e065804da6275d182d7928cc03a5e) ([#12195](https://github.com/yt-dlp/yt-dlp/issues/12195)) by [seproDev](https://github.com/seproDev)
- **dropout**: [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/164368610456e2d96b279f8b120dea08f7b1d74f) ([#12102](https://github.com/yt-dlp/yt-dlp/issues/12102)) by [bashonly](https://github.com/bashonly)
- **eggs**: [Add extractors](https://github.com/yt-dlp/yt-dlp/commit/20c765d02385a105c8ef13b6f7a737491d29c19a) ([#11904](https://github.com/yt-dlp/yt-dlp/issues/11904)) by [seproDev](https://github.com/seproDev), [subsense](https://github.com/subsense)
- **funimation**: [Remove extractors](https://github.com/yt-dlp/yt-dlp/commit/cdcf1e86726b8fa44f7e7126bbf1c18e1798d25c) ([#12167](https://github.com/yt-dlp/yt-dlp/issues/12167)) by [doe1080](https://github.com/doe1080)
- **goodgame**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/e7cc02b14d8d323f805d14325a9c95593a170d28) ([#12173](https://github.com/yt-dlp/yt-dlp/issues/12173)) by [NecroRomnt](https://github.com/NecroRomnt)
- **lbry**: [Support signed URLs](https://github.com/yt-dlp/yt-dlp/commit/de30f652ffb7623500215f5906844f2ae0d92c7b) ([#12138](https://github.com/yt-dlp/yt-dlp/issues/12138)) by [seproDev](https://github.com/seproDev)
- **naver**: [Fix m3u8 formats extraction](https://github.com/yt-dlp/yt-dlp/commit/b3007c44cdac38187fc6600de76959a7079a44d1) ([#12037](https://github.com/yt-dlp/yt-dlp/issues/12037)) by [kclauhk](https://github.com/kclauhk)
- **nest**: [Add extractors](https://github.com/yt-dlp/yt-dlp/commit/1ef3ee7500c4ab8c26f7fdc5b0ad1da4d16eec8e) ([#11747](https://github.com/yt-dlp/yt-dlp/issues/11747)) by [pabs3](https://github.com/pabs3), [seproDev](https://github.com/seproDev)
- **niconico**: series: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/bc88b904cd02314da41ce1b2fdf046d0680fe965) ([#11822](https://github.com/yt-dlp/yt-dlp/issues/11822)) by [test20140](https://github.com/test20140)
- **nrk**
- [Extract more formats](https://github.com/yt-dlp/yt-dlp/commit/89198bb23b4d03e0473ac408bfb50d67c2f71165) ([#12069](https://github.com/yt-dlp/yt-dlp/issues/12069)) by [hexahigh](https://github.com/hexahigh)
- [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/45732e2590a1bd0bc9608f5eb68c59341ca84f02) ([#12193](https://github.com/yt-dlp/yt-dlp/issues/12193)) by [hexahigh](https://github.com/hexahigh)
- **patreon**: [Extract attachment filename as `alt_title`](https://github.com/yt-dlp/yt-dlp/commit/e2e73b5c65593ec0a5e685663e6ec0f4aaffc1f1) ([#12000](https://github.com/yt-dlp/yt-dlp/issues/12000)) by [msm595](https://github.com/msm595)
- **pbs**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/13825ab77815ee6e1603abbecbb9f3795057b93c) ([#12024](https://github.com/yt-dlp/yt-dlp/issues/12024)) by [dirkf](https://github.com/dirkf), [krandor](https://github.com/krandor), [n10dollar](https://github.com/n10dollar)
- **piramidetv**: [Add extractors](https://github.com/yt-dlp/yt-dlp/commit/af2c821d74049b519895288aca23cee81fc4b049) ([#10777](https://github.com/yt-dlp/yt-dlp/issues/10777)) by [HobbyistDev](https://github.com/HobbyistDev), [kclauhk](https://github.com/kclauhk), [seproDev](https://github.com/seproDev)
- **redgifs**: [Support `/ifr/` URLs](https://github.com/yt-dlp/yt-dlp/commit/4850ce91d163579fa615c3c0d44c9bd64682c22b) ([#11805](https://github.com/yt-dlp/yt-dlp/issues/11805)) by [invertico](https://github.com/invertico)
- **rtvslo.si**: show: [Extract more metadata](https://github.com/yt-dlp/yt-dlp/commit/3fc46086562857d5493cbcff687f76e4e4ed303f) ([#12136](https://github.com/yt-dlp/yt-dlp/issues/12136)) by [cotko](https://github.com/cotko)
- **senategov**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/68221ecc87c6a3f3515757bac2a0f9674a38e3f2) ([#9361](https://github.com/yt-dlp/yt-dlp/issues/9361)) by [Grabien](https://github.com/Grabien), [seproDev](https://github.com/seproDev)
- **soundcloud**
- [Extract more metadata](https://github.com/yt-dlp/yt-dlp/commit/6d304133ab32bcd1eb78ff1467f1a41dd9b66c33) ([#11945](https://github.com/yt-dlp/yt-dlp/issues/11945)) by [7x11x13](https://github.com/7x11x13)
- user: [Add `/comments` page support](https://github.com/yt-dlp/yt-dlp/commit/7bfb4f72e490310d2681c7f4815218a2ebbc73ee) ([#11999](https://github.com/yt-dlp/yt-dlp/issues/11999)) by [7x11x13](https://github.com/7x11x13)
- **subsplash**: [Add extractors](https://github.com/yt-dlp/yt-dlp/commit/5d904b077d2f58ae44bdf208d2dcfcc3ff8347f5) ([#11054](https://github.com/yt-dlp/yt-dlp/issues/11054)) by [seproDev](https://github.com/seproDev), [subrat-lima](https://github.com/subrat-lima)
- **theatercomplextownppv**: [Support `live` URLs](https://github.com/yt-dlp/yt-dlp/commit/797d2472a299692e01ad1500e8c3b7bc1daa7fe4) ([#11720](https://github.com/yt-dlp/yt-dlp/issues/11720)) by [bashonly](https://github.com/bashonly)
- **vimeo**: [Fix thumbnail extraction](https://github.com/yt-dlp/yt-dlp/commit/9ff330948c92f6b2e1d9c928787362ab19cd6c62) ([#12142](https://github.com/yt-dlp/yt-dlp/issues/12142)) by [jixunmoe](https://github.com/jixunmoe)
- **vimp**: Playlist: [Add support for tags](https://github.com/yt-dlp/yt-dlp/commit/d4f5be1735c8feaeb3308666e0b878e9782f529d) ([#11688](https://github.com/yt-dlp/yt-dlp/issues/11688)) by [FestplattenSchnitzel](https://github.com/FestplattenSchnitzel)
- **weibo**: [Extend `_VALID_URL`](https://github.com/yt-dlp/yt-dlp/commit/a567f97b62ae9f6d6f5a9376c361512ab8dceda2) ([#12088](https://github.com/yt-dlp/yt-dlp/issues/12088)) by [4ft35t](https://github.com/4ft35t)
- **xhamster**: [Various improvements](https://github.com/yt-dlp/yt-dlp/commit/3b99a0f0e07f0120ab416f34a8f5ab75d4fdf1d1) ([#11738](https://github.com/yt-dlp/yt-dlp/issues/11738)) by [knackku](https://github.com/knackku)
- **xiaohongshu**: [Extract more formats](https://github.com/yt-dlp/yt-dlp/commit/f9f24ae376a9eaca777816479a4a29f6f0ce7681) ([#12147](https://github.com/yt-dlp/yt-dlp/issues/12147)) by [seproDev](https://github.com/seproDev)
- **youtube**
- [Download `tv` client Innertube config](https://github.com/yt-dlp/yt-dlp/commit/326fb1ffaf4e8349f1fe8ba2a81839652e044bff) ([#12168](https://github.com/yt-dlp/yt-dlp/issues/12168)) by [coletdjnz](https://github.com/coletdjnz)
- [Extract `media_type` for livestreams](https://github.com/yt-dlp/yt-dlp/commit/421bc72103d1faed473a451299cd17d6abb433bb) ([#11605](https://github.com/yt-dlp/yt-dlp/issues/11605)) by [nosoop](https://github.com/nosoop)
- [Restore convenience workarounds](https://github.com/yt-dlp/yt-dlp/commit/f0d4b8a5d6354b294bc9631cf15a7160b7bad5de) ([#12181](https://github.com/yt-dlp/yt-dlp/issues/12181)) by [bashonly](https://github.com/bashonly)
- [Update `ios` player client](https://github.com/yt-dlp/yt-dlp/commit/de82acf8769282ce321a86737ecc1d4bef0e82a7) ([#12155](https://github.com/yt-dlp/yt-dlp/issues/12155)) by [b5i](https://github.com/b5i)
- [Use different PO token for GVS and Player](https://github.com/yt-dlp/yt-dlp/commit/6b91d232e316efa406035915532eb126fbaeea38) ([#12090](https://github.com/yt-dlp/yt-dlp/issues/12090)) by [coletdjnz](https://github.com/coletdjnz)
- tab: [Improve shorts title extraction](https://github.com/yt-dlp/yt-dlp/commit/76ac023ff02f06e8c003d104f02a03deeddebdcd) ([#11997](https://github.com/yt-dlp/yt-dlp/issues/11997)) by [bashonly](https://github.com/bashonly), [d3d9](https://github.com/d3d9)
- **zdf**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/bb69f5dab79fb32c4ec0d50e05f7fa26d05d54ba) ([#11041](https://github.com/yt-dlp/yt-dlp/issues/11041)) by [InvalidUsernameException](https://github.com/InvalidUsernameException)
#### Misc. changes
- **cleanup**: Miscellaneous: [3b45319](https://github.com/yt-dlp/yt-dlp/commit/3b4531934465580be22937fecbb6e1a3a9e2334f) by [bashonly](https://github.com/bashonly), [lonble](https://github.com/lonble), [pjrobertson](https://github.com/pjrobertson), [seproDev](https://github.com/seproDev)
### 2025.01.15
#### Extractor changes
- **youtube**: [Do not use `web_creator` as a default client](https://github.com/yt-dlp/yt-dlp/commit/c8541f8b13e743fcfa06667530d13fee8686e22a) ([#12087](https://github.com/yt-dlp/yt-dlp/issues/12087)) by [bashonly](https://github.com/bashonly)
### 2025.01.12
#### Core changes
- [Fix filename sanitization with `--no-windows-filenames`](https://github.com/yt-dlp/yt-dlp/commit/8346b549150003df988538e54c9d8bc4de568979) ([#11988](https://github.com/yt-dlp/yt-dlp/issues/11988)) by [bashonly](https://github.com/bashonly)
- [Validate retries values are non-negative](https://github.com/yt-dlp/yt-dlp/commit/1f4e1e85a27c5b43e34d7706cfd88ffce1b56a4a) ([#11927](https://github.com/yt-dlp/yt-dlp/issues/11927)) by [Strkmn](https://github.com/Strkmn)
#### Extractor changes
- **drtalks**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/1f489f4a45691cac3f9e787d22a3a8a086229ba6) ([#10831](https://github.com/yt-dlp/yt-dlp/issues/10831)) by [pzhlkj6612](https://github.com/pzhlkj6612), [seproDev](https://github.com/seproDev)
- **plvideo**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/3c14e9191f3035b9a729d1d87bc0381f42de57cf) ([#10657](https://github.com/yt-dlp/yt-dlp/issues/10657)) by [Sanceilaks](https://github.com/Sanceilaks), [seproDev](https://github.com/seproDev)
- **vine**: [Remove extractors](https://github.com/yt-dlp/yt-dlp/commit/e2ef4fece6c9742d1733e3bae408c4787765f78c) ([#11700](https://github.com/yt-dlp/yt-dlp/issues/11700)) by [allendema](https://github.com/allendema)
- **xiaohongshu**: [Extend `_VALID_URL`](https://github.com/yt-dlp/yt-dlp/commit/763ed06ee69f13949397897bd42ff2ec3dc3d384) ([#11806](https://github.com/yt-dlp/yt-dlp/issues/11806)) by [HobbyistDev](https://github.com/HobbyistDev)
- **youtube**
- [Fix DASH formats incorrectly skipped in some situations](https://github.com/yt-dlp/yt-dlp/commit/0b6b7742c2e7f2a1fcb0b54ef3dd484bab404b3f) ([#11910](https://github.com/yt-dlp/yt-dlp/issues/11910)) by [coletdjnz](https://github.com/coletdjnz)
- [Refactor cookie auth](https://github.com/yt-dlp/yt-dlp/commit/75079f4e3f7dce49b61ef01da7adcd9876a0ca3b) ([#11989](https://github.com/yt-dlp/yt-dlp/issues/11989)) by [coletdjnz](https://github.com/coletdjnz)
- [Use `tv` instead of `mweb` client by default](https://github.com/yt-dlp/yt-dlp/commit/712d2abb32f59b2d246be2901255f84f1a4c30b3) ([#12059](https://github.com/yt-dlp/yt-dlp/issues/12059)) by [coletdjnz](https://github.com/coletdjnz)
#### Misc. changes
- **cleanup**: Miscellaneous: [dade5e3](https://github.com/yt-dlp/yt-dlp/commit/dade5e35c89adaad04408bfef766820dbca06ebe) by [grqz](https://github.com/grqz), [Grub4K](https://github.com/Grub4K), [seproDev](https://github.com/seproDev)
### 2024.12.23
#### Core changes
- [Don't sanitize filename on Unix when `--no-windows-filenames`](https://github.com/yt-dlp/yt-dlp/commit/6fc85f617a5850307fd5b258477070e6ee177796) ([#9591](https://github.com/yt-dlp/yt-dlp/issues/9591)) by [pukkandan](https://github.com/pukkandan)
- **update**
- [Check 64-bitness when upgrading ARM builds](https://github.com/yt-dlp/yt-dlp/commit/b91c3925c2059970daa801cb131c0c2f4f302e72) ([#11819](https://github.com/yt-dlp/yt-dlp/issues/11819)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Fix endless update loop for `linux_exe` builds](https://github.com/yt-dlp/yt-dlp/commit/3d3ee458c1fe49dd5ebd7651a092119d23eb7000) ([#11827](https://github.com/yt-dlp/yt-dlp/issues/11827)) by [bashonly](https://github.com/bashonly)
|
|
||||||
|
|
||||||
#### Extractor changes
|
|
||||||
- **soundcloud**: [Various fixes](https://github.com/yt-dlp/yt-dlp/commit/d298693b1b266d198e8eeecb90ea17c4a031268f) ([#11820](https://github.com/yt-dlp/yt-dlp/issues/11820)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- **youtube**
|
|
||||||
- [Add age-gate workaround for some embeddable videos](https://github.com/yt-dlp/yt-dlp/commit/09a6c687126f04e243fcb105a828787efddd1030) ([#11821](https://github.com/yt-dlp/yt-dlp/issues/11821)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Fix `uploader_id` extraction](https://github.com/yt-dlp/yt-dlp/commit/1a8851b689763e5173b96f70f8a71df0e4a44b66) ([#11818](https://github.com/yt-dlp/yt-dlp/issues/11818)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Player client maintenance](https://github.com/yt-dlp/yt-dlp/commit/65cf46cddd873fd229dbb0fc0689bca4c201c6b6) ([#11893](https://github.com/yt-dlp/yt-dlp/issues/11893)) by [bashonly](https://github.com/bashonly)
|
|
||||||
- [Skip iOS formats that require PO Token](https://github.com/yt-dlp/yt-dlp/commit/9f42e68a74f3f00b0253fe70763abd57cac4237b) ([#11890](https://github.com/yt-dlp/yt-dlp/issues/11890)) by [coletdjnz](https://github.com/coletdjnz)
|
|
||||||
|
|
||||||
### 2024.12.13

#### Extractor changes
- **patreon**: campaign: [Support /c/ URLs](https://github.com/yt-dlp/yt-dlp/commit/bc262bcad4d3683ceadf61a7eb87e233e72adef3) ([#11756](https://github.com/yt-dlp/yt-dlp/issues/11756)) by [bashonly](https://github.com/bashonly)
- **soundcloud**: [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/f4d3e9e6dc25077b79849a31a2f67f93fdc01e62) ([#11777](https://github.com/yt-dlp/yt-dlp/issues/11777)) by [bashonly](https://github.com/bashonly)
- **youtube**
    - [Fix `release_date` extraction](https://github.com/yt-dlp/yt-dlp/commit/d5e2a379f2adcb28bc48c7d9e90716d7278f89d2) ([#11759](https://github.com/yt-dlp/yt-dlp/issues/11759)) by [MutantPiggieGolem1](https://github.com/MutantPiggieGolem1)
    - [Fix signature function extraction for `2f1832d2`](https://github.com/yt-dlp/yt-dlp/commit/5460cd91891bf613a2065e2fc278d9903c37a127) ([#11801](https://github.com/yt-dlp/yt-dlp/issues/11801)) by [bashonly](https://github.com/bashonly)
    - [Prioritize original language over auto-dubbed audio](https://github.com/yt-dlp/yt-dlp/commit/dc3c4fddcc653989dae71fc563d82a308fc898cc) ([#11803](https://github.com/yt-dlp/yt-dlp/issues/11803)) by [bashonly](https://github.com/bashonly)
    - search_url: [Fix playlist searches](https://github.com/yt-dlp/yt-dlp/commit/f6c73aad5f1a67544bea137ebd9d1e22e0e56567) ([#11782](https://github.com/yt-dlp/yt-dlp/issues/11782)) by [Crypto90](https://github.com/Crypto90)

#### Misc. changes
- **cleanup**: [Make more playlist entries lazy](https://github.com/yt-dlp/yt-dlp/commit/54216696261bc07cacd9a837c501d9e0b7fed09e) ([#11763](https://github.com/yt-dlp/yt-dlp/issues/11763)) by [seproDev](https://github.com/seproDev)

### 2024.12.06

#### Core changes
- **cookies**: [Add `--cookies-from-browser` support for MS Store Firefox](https://github.com/yt-dlp/yt-dlp/commit/354cb4026cf2191e1a130ec2a627b95cabfbc60a) ([#11731](https://github.com/yt-dlp/yt-dlp/issues/11731)) by [wesson09](https://github.com/wesson09)

#### Extractor changes
- **bilibili**: [Fix HD formats extraction](https://github.com/yt-dlp/yt-dlp/commit/fca3eb5f8be08d5fab2e18b45b7281a12e566725) ([#11734](https://github.com/yt-dlp/yt-dlp/issues/11734)) by [grqz](https://github.com/grqz)
- **soundcloud**: [Fix formats extraction](https://github.com/yt-dlp/yt-dlp/commit/2feb28028ee48f2185d2d95076e62accb09b9e2e) ([#11742](https://github.com/yt-dlp/yt-dlp/issues/11742)) by [bashonly](https://github.com/bashonly)
- **youtube**
    - [Fix `n` sig extraction for player `3bb1f723`](https://github.com/yt-dlp/yt-dlp/commit/a95ee6d8803fca9157adecf63732ab58bf87fd88) ([#11750](https://github.com/yt-dlp/yt-dlp/issues/11750)) by [bashonly](https://github.com/bashonly) (With fixes in [4bd2655](https://github.com/yt-dlp/yt-dlp/commit/4bd2655398aed450456197a6767639114a24eac2))
    - [Fix signature function extraction](https://github.com/yt-dlp/yt-dlp/commit/4c85ccd1366c88cf93982f8350f58eed17355981) ([#11751](https://github.com/yt-dlp/yt-dlp/issues/11751)) by [bashonly](https://github.com/bashonly)
    - [Player client maintenance](https://github.com/yt-dlp/yt-dlp/commit/2e49c789d3eebc39af8910705d65a98bca0e4c4f) ([#11724](https://github.com/yt-dlp/yt-dlp/issues/11724)) by [bashonly](https://github.com/bashonly)

### 2024.12.03

#### Core changes
- [Add `playlist_webpage_url` field](https://github.com/yt-dlp/yt-dlp/commit/7d6c259a03bc4707a319e5e8c6eff0278707874b) ([#11613](https://github.com/yt-dlp/yt-dlp/issues/11613)) by [seproDev](https://github.com/seproDev)

#### Extractor changes
- [Handle fragmented formats in `_remove_duplicate_formats`](https://github.com/yt-dlp/yt-dlp/commit/e0500cbf796323551bbabe5b8ed8c75a511ba47a) ([#11637](https://github.com/yt-dlp/yt-dlp/issues/11637)) by [Grub4K](https://github.com/Grub4K)
- **bilibili**
    - [Always try to extract HD formats](https://github.com/yt-dlp/yt-dlp/commit/dc1687648077c5bf64863b307ecc5ab7e029bd8d) ([#10559](https://github.com/yt-dlp/yt-dlp/issues/10559)) by [grqz](https://github.com/grqz)
    - [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/239f5f36fe04603bec59c8b975f6a792f10246db) ([#11667](https://github.com/yt-dlp/yt-dlp/issues/11667)) by [grqz](https://github.com/grqz) (With fixes in [f05a1cd](https://github.com/yt-dlp/yt-dlp/commit/f05a1cd1492fc98dc8d80d2081d632a1879913d2) by [bashonly](https://github.com/bashonly), [grqz](https://github.com/grqz))
    - [Fix subtitles and chapters extraction](https://github.com/yt-dlp/yt-dlp/commit/a13a336aa6f906812701abec8101b73b73db8ff7) ([#11708](https://github.com/yt-dlp/yt-dlp/issues/11708)) by [xiaomac](https://github.com/xiaomac)
- **chaturbate**: [Fix support for non-public streams](https://github.com/yt-dlp/yt-dlp/commit/4b5eec0aaa7c02627f27a386591b735b90e681a8) ([#11624](https://github.com/yt-dlp/yt-dlp/issues/11624)) by [jkruse](https://github.com/jkruse)
- **dacast**: [Fix HLS AES formats extraction](https://github.com/yt-dlp/yt-dlp/commit/0a0d80800b9350d1a4c4b18d82cfb77ffbc3c507) ([#11644](https://github.com/yt-dlp/yt-dlp/issues/11644)) by [bashonly](https://github.com/bashonly)
- **dropbox**: [Fix password-protected video extraction](https://github.com/yt-dlp/yt-dlp/commit/00dcde728635633eee969ad4d498b9f233c4a94e) ([#11636](https://github.com/yt-dlp/yt-dlp/issues/11636)) by [bashonly](https://github.com/bashonly)
- **duoplay**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/62cba8a1bedbfc0ddde7267ae57b72bf5f7ea7b1) ([#11588](https://github.com/yt-dlp/yt-dlp/issues/11588)) by [bashonly](https://github.com/bashonly), [glensc](https://github.com/glensc)
- **facebook**: [Support more groups URLs](https://github.com/yt-dlp/yt-dlp/commit/e0f1ae813b36e783e2348ba2a1566e12f5cd8f6e) ([#11576](https://github.com/yt-dlp/yt-dlp/issues/11576)) by [grqz](https://github.com/grqz)
- **instagram**: [Support `share` URLs](https://github.com/yt-dlp/yt-dlp/commit/360aed810ad85db950df586282d256516c98cd2d) ([#11677](https://github.com/yt-dlp/yt-dlp/issues/11677)) by [grqz](https://github.com/grqz)
- **microsoftembed**: [Make format extraction non fatal](https://github.com/yt-dlp/yt-dlp/commit/2bea7936323ca4b6f3b9b1fdd892566223e30efa) ([#11654](https://github.com/yt-dlp/yt-dlp/issues/11654)) by [seproDev](https://github.com/seproDev)
- **mitele**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/cd0f934604587ed793e9177f6a127e5dcf99a7dd) ([#11683](https://github.com/yt-dlp/yt-dlp/issues/11683)) by [DarkZeros](https://github.com/DarkZeros)
- **stripchat**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/16336c51d0848a6868a4fa04e749fa03548b4913) ([#11596](https://github.com/yt-dlp/yt-dlp/issues/11596)) by [gitninja1234](https://github.com/gitninja1234)
- **tiktok**: [Deprioritize animated thumbnails](https://github.com/yt-dlp/yt-dlp/commit/910ecc422930bca14e2abe4986f5f92359e3cea8) ([#11645](https://github.com/yt-dlp/yt-dlp/issues/11645)) by [bashonly](https://github.com/bashonly)
- **vk**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/c038a7b187ba24360f14134842a7a2cf897c33b1) ([#11715](https://github.com/yt-dlp/yt-dlp/issues/11715)) by [bashonly](https://github.com/bashonly)
- **youtube**
    - [Adjust player clients for site changes](https://github.com/yt-dlp/yt-dlp/commit/0d146c1e36f467af30e87b7af651bdee67b73500) ([#11663](https://github.com/yt-dlp/yt-dlp/issues/11663)) by [bashonly](https://github.com/bashonly)
    - tab: [Fix playlists tab extraction](https://github.com/yt-dlp/yt-dlp/commit/fe70f20aedf528fdee332131bc9b6710e54e6f10) ([#11615](https://github.com/yt-dlp/yt-dlp/issues/11615)) by [seproDev](https://github.com/seproDev)

#### Networking changes
- **Request Handler**: websockets: [Support websockets 14.0+](https://github.com/yt-dlp/yt-dlp/commit/c7316373c0a886f65a07a51e50ee147bb3294c85) ([#11616](https://github.com/yt-dlp/yt-dlp/issues/11616)) by [coletdjnz](https://github.com/coletdjnz)

#### Misc. changes
- **cleanup**
    - [Bump ruff to 0.8.x](https://github.com/yt-dlp/yt-dlp/commit/d8fb3490863653182864d2a53522f350d67a9ff8) ([#11608](https://github.com/yt-dlp/yt-dlp/issues/11608)) by [seproDev](https://github.com/seproDev)
    - Miscellaneous
        - [ccf0a6b](https://github.com/yt-dlp/yt-dlp/commit/ccf0a6b86b7f68a75463804fe485ec240b8635f0) by [bashonly](https://github.com/bashonly), [pzhlkj6612](https://github.com/pzhlkj6612)
        - [2b67ac3](https://github.com/yt-dlp/yt-dlp/commit/2b67ac300ac8b44368fb121637d1743cea8c5b6b) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)

### 2024.11.18

#### Important changes
- **Login with OAuth is no longer supported for YouTube**
Due to a change made by the site, yt-dlp is no longer able to support OAuth login for YouTube. [Read more](https://github.com/yt-dlp/yt-dlp/issues/11462#issuecomment-2471703090)

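Until another login mechanism is available, passing your own session cookies is one commonly suggested workaround; the commands below are only an illustration (any browser supported by `--cookies-from-browser`, or an exported Netscape-format cookies file via `--cookies`, can be used):

```
# Reuse the cookies of a browser profile that is signed in to YouTube
yt-dlp --cookies-from-browser firefox "https://www.youtube.com/watch?v=EXAMPLE"

# Or point yt-dlp at an exported cookies.txt file
yt-dlp --cookies cookies.txt "https://www.youtube.com/watch?v=EXAMPLE"
```
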
#### Core changes
- [Catch broken Cryptodome installations](https://github.com/yt-dlp/yt-dlp/commit/b83ca24eb72e1e558b0185bd73975586c0bc0546) ([#11486](https://github.com/yt-dlp/yt-dlp/issues/11486)) by [seproDev](https://github.com/seproDev)
- **utils**
    - [Fix `join_nonempty`, add `**kwargs` to `unpack`](https://github.com/yt-dlp/yt-dlp/commit/39d79c9b9cf23411d935910685c40aa1a2fdb409) ([#11559](https://github.com/yt-dlp/yt-dlp/issues/11559)) by [Grub4K](https://github.com/Grub4K)
    - `subs_list_to_dict`: [Add `lang` default parameter](https://github.com/yt-dlp/yt-dlp/commit/c014fbcddcb4c8f79d914ac5bb526758b540ea33) ([#11508](https://github.com/yt-dlp/yt-dlp/issues/11508)) by [Grub4K](https://github.com/Grub4K)

#### Extractor changes
- [Allow `ext` override for thumbnails](https://github.com/yt-dlp/yt-dlp/commit/eb64ae7d5def6df2aba74fb703e7f168fb299865) ([#11545](https://github.com/yt-dlp/yt-dlp/issues/11545)) by [bashonly](https://github.com/bashonly)
- **adobepass**: [Fix provider requests](https://github.com/yt-dlp/yt-dlp/commit/85fdc66b6e01d19a94b4f39b58e3c0cf23600902) ([#11472](https://github.com/yt-dlp/yt-dlp/issues/11472)) by [bashonly](https://github.com/bashonly)
- **archive.org**: [Fix comments extraction](https://github.com/yt-dlp/yt-dlp/commit/f2a4983df7a64c4e93b56f79dbd16a781bd90206) ([#11527](https://github.com/yt-dlp/yt-dlp/issues/11527)) by [jshumphrey](https://github.com/jshumphrey)
- **bandlab**: [Add extractors](https://github.com/yt-dlp/yt-dlp/commit/6365e92589e4bc17b8fffb0125a716d144ad2137) ([#11535](https://github.com/yt-dlp/yt-dlp/issues/11535)) by [seproDev](https://github.com/seproDev)
- **chaturbate**
    - [Extract from API and support impersonation](https://github.com/yt-dlp/yt-dlp/commit/720b3dc453c342bc2e8df7dbc0acaab4479de46c) ([#11555](https://github.com/yt-dlp/yt-dlp/issues/11555)) by [powergold1](https://github.com/powergold1) (With fixes in [7cecd29](https://github.com/yt-dlp/yt-dlp/commit/7cecd299e4a5ef1f0f044b2fedc26f17e41f15e3) by [seproDev](https://github.com/seproDev))
    - [Support alternate domains](https://github.com/yt-dlp/yt-dlp/commit/a9f85670d03ab993dc589f21a9ffffcad61392d5) ([#10595](https://github.com/yt-dlp/yt-dlp/issues/10595)) by [manavchaudhary1](https://github.com/manavchaudhary1)
- **cloudflarestream**: [Avoid extraction via videodelivery.net](https://github.com/yt-dlp/yt-dlp/commit/2db8c2e7d57a1784b06057c48e3e91023720d195) ([#11478](https://github.com/yt-dlp/yt-dlp/issues/11478)) by [hugovdev](https://github.com/hugovdev)
- **ctvnews**
    - [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/f351440f1dc5b3dfbfc5737b037a869d946056fe) ([#11534](https://github.com/yt-dlp/yt-dlp/issues/11534)) by [bashonly](https://github.com/bashonly), [jshumphrey](https://github.com/jshumphrey)
    - [Fix playlist ID extraction](https://github.com/yt-dlp/yt-dlp/commit/f9d98509a898737c12977b2e2117277bada2c196) ([#8892](https://github.com/yt-dlp/yt-dlp/issues/8892)) by [qbnu](https://github.com/qbnu)
- **digitalconcerthall**: [Support login with access/refresh tokens](https://github.com/yt-dlp/yt-dlp/commit/f7257588bdff5f0b0452635a66b253a783c97357) ([#11571](https://github.com/yt-dlp/yt-dlp/issues/11571)) by [bashonly](https://github.com/bashonly)
- **facebook**: [Fix formats extraction](https://github.com/yt-dlp/yt-dlp/commit/bacc31b05a04181b63100c481565256b14813a5e) ([#11513](https://github.com/yt-dlp/yt-dlp/issues/11513)) by [bashonly](https://github.com/bashonly)
- **gamedevtv**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/be3579aaf0c3b71a0a3195e1955415d5e4d6b3d8) ([#11368](https://github.com/yt-dlp/yt-dlp/issues/11368)) by [bashonly](https://github.com/bashonly), [stratus-ss](https://github.com/stratus-ss)
- **goplay**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/6b43a8d84b881d769b480ba6e20ec691e9d1b92d) ([#11466](https://github.com/yt-dlp/yt-dlp/issues/11466)) by [bashonly](https://github.com/bashonly), [SamDecrock](https://github.com/SamDecrock)
- **kenh14**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/eb15fd5a32d8b35ef515f7a3d1158c03025648ff) ([#3996](https://github.com/yt-dlp/yt-dlp/issues/3996)) by [krichbanana](https://github.com/krichbanana), [pzhlkj6612](https://github.com/pzhlkj6612)
- **litv**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/e079ffbda66de150c0a9ebef05e89f61bb4d5f76) ([#11071](https://github.com/yt-dlp/yt-dlp/issues/11071)) by [jiru](https://github.com/jiru)
- **mixchmovie**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/0ec9bfed4d4a52bfb4f8733da1acf0aeeae21e6b) ([#10897](https://github.com/yt-dlp/yt-dlp/issues/10897)) by [Sakura286](https://github.com/Sakura286)
- **patreon**: [Fix comments extraction](https://github.com/yt-dlp/yt-dlp/commit/1d253b0a27110d174c40faf8fb1c999d099e0cde) ([#11530](https://github.com/yt-dlp/yt-dlp/issues/11530)) by [bashonly](https://github.com/bashonly), [jshumphrey](https://github.com/jshumphrey)
- **pialive**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/d867f99622ef7fba690b08da56c39d739b822bb7) ([#10811](https://github.com/yt-dlp/yt-dlp/issues/10811)) by [ChocoLZS](https://github.com/ChocoLZS)
- **radioradicale**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/70c55cb08f780eab687e881ef42bb5c6007d290b) ([#5607](https://github.com/yt-dlp/yt-dlp/issues/5607)) by [a13ssandr0](https://github.com/a13ssandr0), [pzhlkj6612](https://github.com/pzhlkj6612)
- **reddit**: [Improve error handling](https://github.com/yt-dlp/yt-dlp/commit/7ea2787920cccc6b8ea30791993d114fbd564434) ([#11573](https://github.com/yt-dlp/yt-dlp/issues/11573)) by [bashonly](https://github.com/bashonly)
- **redgifsuser**: [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/d215fba7edb69d4fa665f43663756fd260b1489f) ([#11531](https://github.com/yt-dlp/yt-dlp/issues/11531)) by [jshumphrey](https://github.com/jshumphrey)
- **rutube**: [Rework extractors](https://github.com/yt-dlp/yt-dlp/commit/e398217aae19bb25f91797bfbe8a3243698d7f45) ([#11480](https://github.com/yt-dlp/yt-dlp/issues/11480)) by [seproDev](https://github.com/seproDev)
- **sonylivseries**: [Add `sort_order` extractor-arg](https://github.com/yt-dlp/yt-dlp/commit/2009cb27e17014787bf63eaa2ada51293d54f22a) ([#11569](https://github.com/yt-dlp/yt-dlp/issues/11569)) by [bashonly](https://github.com/bashonly)
- **soop**: [Fix thumbnail extraction](https://github.com/yt-dlp/yt-dlp/commit/c699bafc5038b59c9afe8c2e69175fb66424c832) ([#11545](https://github.com/yt-dlp/yt-dlp/issues/11545)) by [bashonly](https://github.com/bashonly)
- **spankbang**: [Support browser impersonation](https://github.com/yt-dlp/yt-dlp/commit/8388ec256f7753b02488788e3cfa771f6e1db247) ([#11542](https://github.com/yt-dlp/yt-dlp/issues/11542)) by [jshumphrey](https://github.com/jshumphrey)
- **spreaker**
    - [Support episode pages and access keys](https://github.com/yt-dlp/yt-dlp/commit/c39016f66df76d14284c705736ca73db8055d8de) ([#11489](https://github.com/yt-dlp/yt-dlp/issues/11489)) by [julionc](https://github.com/julionc)
    - [Support podcast and feed pages](https://github.com/yt-dlp/yt-dlp/commit/c6737310619022248f5d0fd13872073cac168453) ([#10968](https://github.com/yt-dlp/yt-dlp/issues/10968)) by [subrat-lima](https://github.com/subrat-lima)
- **youtube**
    - [Player client maintenance](https://github.com/yt-dlp/yt-dlp/commit/637d62a3a9fc723d68632c1af25c30acdadeeb85) ([#11528](https://github.com/yt-dlp/yt-dlp/issues/11528)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
    - [Remove broken OAuth support](https://github.com/yt-dlp/yt-dlp/commit/52c0ffe40ad6e8404d93296f575007b05b04c686) ([#11558](https://github.com/yt-dlp/yt-dlp/issues/11558)) by [bashonly](https://github.com/bashonly)
    - tab: [Fix podcasts tab extraction](https://github.com/yt-dlp/yt-dlp/commit/37cd7660eaff397c551ee18d80507702342b0c2b) ([#11567](https://github.com/yt-dlp/yt-dlp/issues/11567)) by [seproDev](https://github.com/seproDev)

#### Misc. changes
- **build**
    - [Bump PyInstaller version pin to `>=6.11.1`](https://github.com/yt-dlp/yt-dlp/commit/f9c8deb4e5887ff5150e911ac0452e645f988044) ([#11507](https://github.com/yt-dlp/yt-dlp/issues/11507)) by [bashonly](https://github.com/bashonly)
    - [Enable attestations for trusted publishing](https://github.com/yt-dlp/yt-dlp/commit/f13df591d4d7ca8e2f31b35c9c91e69ba9e9b013) ([#11420](https://github.com/yt-dlp/yt-dlp/issues/11420)) by [bashonly](https://github.com/bashonly)
    - [Pin `websockets` version to >=13.0,<14](https://github.com/yt-dlp/yt-dlp/commit/240a7d43c8a67ffb86d44dc276805aa43c358dcc) ([#11488](https://github.com/yt-dlp/yt-dlp/issues/11488)) by [bashonly](https://github.com/bashonly)
- **cleanup**
    - [Deprecate more compat functions](https://github.com/yt-dlp/yt-dlp/commit/f95a92b3d0169a784ee15a138fbe09d82b2754a1) ([#11439](https://github.com/yt-dlp/yt-dlp/issues/11439)) by [seproDev](https://github.com/seproDev)
    - [Remove dead extractors](https://github.com/yt-dlp/yt-dlp/commit/10fc719bc7f1eef469389c5219102266ef411f29) ([#11566](https://github.com/yt-dlp/yt-dlp/issues/11566)) by [doe1080](https://github.com/doe1080)
    - Miscellaneous: [da252d9](https://github.com/yt-dlp/yt-dlp/commit/da252d9d322af3e2178ac5eae324809502a0a862) by [bashonly](https://github.com/bashonly), [Grub4K](https://github.com/Grub4K), [seproDev](https://github.com/seproDev)

### 2024.11.04

#### Important changes
- **Beginning with this release, yt-dlp's Python dependencies *must* be installed using the `default` group**
If you're installing yt-dlp with pip/pipx or requiring yt-dlp in your own Python project, you'll need to specify `yt-dlp[default]` if you want to also install yt-dlp's optional dependencies (which were previously included by default). [Read more](https://github.com/yt-dlp/yt-dlp/pull/11255)
- **The minimum *required* Python version has been raised to 3.9**
Python 3.8 reached its end-of-life on 2024.10.07, and yt-dlp has now removed support for it. As an unfortunate side effect, the official `yt-dlp.exe` and `yt-dlp_x86.exe` binaries are no longer supported on Windows 7. [Read more](https://github.com/yt-dlp/yt-dlp/issues/10086)

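As a rough illustration of the `default` dependency group mentioned above (exact package manager and environment details will vary):

```
# pip: request yt-dlp's optional dependencies explicitly
python3 -m pip install -U "yt-dlp[default]"

# pipx: the extras syntax works the same way
pipx install "yt-dlp[default]"
```
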
#### Core changes
- [Allow thumbnails with `.jpe` extension](https://github.com/yt-dlp/yt-dlp/commit/5bc5fb2835ea59bdf326bd12176d74d2c7348a95) ([#11408](https://github.com/yt-dlp/yt-dlp/issues/11408)) by [bashonly](https://github.com/bashonly)
- [Expand paths in `--plugin-dirs`](https://github.com/yt-dlp/yt-dlp/commit/914af9a0cf51c9a3f74aa88d952bee8334c67511) ([#11334](https://github.com/yt-dlp/yt-dlp/issues/11334)) by [bashonly](https://github.com/bashonly)
- [Fix `--netrc` empty string parsing for Python <=3.10](https://github.com/yt-dlp/yt-dlp/commit/88402b714ec124633933737bc156b172a3dec3d6) ([#11414](https://github.com/yt-dlp/yt-dlp/issues/11414)) by [bashonly](https://github.com/bashonly), [Grub4K](https://github.com/Grub4K)
- [Populate format sorting fields before dependent fields](https://github.com/yt-dlp/yt-dlp/commit/5c880ef42e9c2b2fc412f6d69dad37d34fb75a62) ([#11353](https://github.com/yt-dlp/yt-dlp/issues/11353)) by [Grub4K](https://github.com/Grub4K)
- [Prioritize AV1](https://github.com/yt-dlp/yt-dlp/commit/3945677a75e94a1fecc085432d791e1c21220cd3) ([#11153](https://github.com/yt-dlp/yt-dlp/issues/11153)) by [seproDev](https://github.com/seproDev)
- [Remove Python 3.8 support](https://github.com/yt-dlp/yt-dlp/commit/d784464399b600ba9516bbcec6286f11d68974dd) ([#11321](https://github.com/yt-dlp/yt-dlp/issues/11321)) by [bashonly](https://github.com/bashonly)
- **aes**: [Fix GCM pad length calculation](https://github.com/yt-dlp/yt-dlp/commit/beae2db127d3b5017cbcf685da9de7a9ef496541) ([#11438](https://github.com/yt-dlp/yt-dlp/issues/11438)) by [seproDev](https://github.com/seproDev)
- **cookies**: [Support chrome table version 24](https://github.com/yt-dlp/yt-dlp/commit/4613096f2e6eab9dcbac0e98b6cec760bbc99375) ([#11425](https://github.com/yt-dlp/yt-dlp/issues/11425)) by [kesor](https://github.com/kesor), [seproDev](https://github.com/seproDev)
- **utils**
    - [Allow partial application for more functions](https://github.com/yt-dlp/yt-dlp/commit/b6dc2c49e8793c6dfa21275e61caf49ec1148b81) ([#11391](https://github.com/yt-dlp/yt-dlp/issues/11391)) by [bashonly](https://github.com/bashonly), [Grub4K](https://github.com/Grub4K) (With fixes in [422195e](https://github.com/yt-dlp/yt-dlp/commit/422195ec70a00b0d2002b238cacbae7790c57fdf) by [Grub4K](https://github.com/Grub4K))
    - [Fix `find_element` by class](https://github.com/yt-dlp/yt-dlp/commit/f93c16395cea1fe9ffc3c594d3e019c3b214544c) ([#11402](https://github.com/yt-dlp/yt-dlp/issues/11402)) by [bashonly](https://github.com/bashonly)
    - [Fix and improve `find_element` and `find_elements`](https://github.com/yt-dlp/yt-dlp/commit/b103aca24d35b72b405c340357dc01a0ed534281) ([#11443](https://github.com/yt-dlp/yt-dlp/issues/11443)) by [bashonly](https://github.com/bashonly), [Grub4K](https://github.com/Grub4K)

#### Extractor changes
- [Resolve `language` to ISO639-2 for ISM formats](https://github.com/yt-dlp/yt-dlp/commit/21cdcf03a237a0c4979c941d5a5385cae44c7906) ([#11359](https://github.com/yt-dlp/yt-dlp/issues/11359)) by [bashonly](https://github.com/bashonly)
- **ardmediathek**: [Extract chapters](https://github.com/yt-dlp/yt-dlp/commit/59f8dd8239c31f00b708da53b39b1e2e9409b6e6) ([#11442](https://github.com/yt-dlp/yt-dlp/issues/11442)) by [iw0nderhow](https://github.com/iw0nderhow)
- **bfmtv**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/754940e9a558565d6bd3c0c529802569b1d0ae4e) ([#11444](https://github.com/yt-dlp/yt-dlp/issues/11444)) by [seproDev](https://github.com/seproDev)
- **bluesky**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/5c7a5aaab27e9c3cb367b663a6136ca58866e547) ([#11055](https://github.com/yt-dlp/yt-dlp/issues/11055)) by [MellowKyler](https://github.com/MellowKyler), [seproDev](https://github.com/seproDev)
- **ccma**: [Support new 3cat.cat domain](https://github.com/yt-dlp/yt-dlp/commit/330335386d4f7603d92d6796798375336005275e) ([#11222](https://github.com/yt-dlp/yt-dlp/issues/11222)) by [JoseAngelB](https://github.com/JoseAngelB)
- **chzzk**: video: [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/9c6534da81e485b2325b3489ee4128943e6d3e4b) ([#11228](https://github.com/yt-dlp/yt-dlp/issues/11228)) by [hui1601](https://github.com/hui1601)
- **cnn**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/9acf79c91a8c6c55ca972747c6858e784e2da351) ([#10185](https://github.com/yt-dlp/yt-dlp/issues/10185)) by [kylegustavo](https://github.com/kylegustavo), [seproDev](https://github.com/seproDev)
- **dailymotion**
    - [Improve embed extraction](https://github.com/yt-dlp/yt-dlp/commit/a403dcf9be20b49cbb3017328f4aaa352fb6d685) ([#10843](https://github.com/yt-dlp/yt-dlp/issues/10843)) by [bashonly](https://github.com/bashonly), [pzhlkj6612](https://github.com/pzhlkj6612)
    - [Support shortened URLs](https://github.com/yt-dlp/yt-dlp/commit/d1358231371f20fa23020fa9176be3b56119873e) ([#11374](https://github.com/yt-dlp/yt-dlp/issues/11374)) by [bashonly](https://github.com/bashonly), [seproDev](https://github.com/seproDev)
- **facebook**: [Fix formats extraction](https://github.com/yt-dlp/yt-dlp/commit/ec9b25043f399de6a591d8370d32bf0e66c117f2) ([#11343](https://github.com/yt-dlp/yt-dlp/issues/11343)) by [kclauhk](https://github.com/kclauhk)
- **generic**: [Do not impersonate by default](https://github.com/yt-dlp/yt-dlp/commit/c29f5a7fae93a08f3cfbb6127b2faa75145b06a0) ([#11336](https://github.com/yt-dlp/yt-dlp/issues/11336)) by [bashonly](https://github.com/bashonly)
- **nfl**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/838f4385de8300a4dd4e7ffbbf0e5b7b85fb52c2) ([#11409](https://github.com/yt-dlp/yt-dlp/issues/11409)) by [bashonly](https://github.com/bashonly)
- **niconicouser**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/6abef74232c0fc695cd803c18ae446cacb129389) ([#11324](https://github.com/yt-dlp/yt-dlp/issues/11324)) by [Wesley107772](https://github.com/Wesley107772)
- **soundcloud**: [Extract artists](https://github.com/yt-dlp/yt-dlp/commit/f101e5d34c97c608156ad5396714c2a2edca966a) ([#11377](https://github.com/yt-dlp/yt-dlp/issues/11377)) by [seproDev](https://github.com/seproDev)
- **tumblr**: [Support more URLs](https://github.com/yt-dlp/yt-dlp/commit/b03267bf0675eeb8df5baf1daac7cf67840c91a5) ([#6057](https://github.com/yt-dlp/yt-dlp/issues/6057)) by [selfisekai](https://github.com/selfisekai), [seproDev](https://github.com/seproDev)
- **twitter**: [Remove cookies migration workaround](https://github.com/yt-dlp/yt-dlp/commit/76802f461332d444e596437c42374fa237fa5174) ([#11392](https://github.com/yt-dlp/yt-dlp/issues/11392)) by [bashonly](https://github.com/bashonly)
- **vimeo**: [Fix API retries](https://github.com/yt-dlp/yt-dlp/commit/57212a5f97ce367590aaa5c3e9a135eead8f81f7) ([#11351](https://github.com/yt-dlp/yt-dlp/issues/11351)) by [bashonly](https://github.com/bashonly)
- **yle_areena**: [Support live events](https://github.com/yt-dlp/yt-dlp/commit/a6783a3b9905e547f6c1d4df9d7c7999feda8afa) ([#11358](https://github.com/yt-dlp/yt-dlp/issues/11358)) by [bashonly](https://github.com/bashonly), [CounterPillow](https://github.com/CounterPillow)
- **youtube**: [Adjust OAuth refresh token handling](https://github.com/yt-dlp/yt-dlp/commit/d569a8845254d90ce13ad74ae76695e8d6441068) ([#11414](https://github.com/yt-dlp/yt-dlp/issues/11414)) by [bashonly](https://github.com/bashonly)

#### Misc. changes
- **build**
    - [Disable attestations for trusted publishing](https://github.com/yt-dlp/yt-dlp/commit/428ffb75aa3534b275cf54de42693a4d261519da) ([#11418](https://github.com/yt-dlp/yt-dlp/issues/11418)) by [bashonly](https://github.com/bashonly)
    - [Move optional dependencies to the `default` group](https://github.com/yt-dlp/yt-dlp/commit/87884f15580910e4e0fe0e1db73508debc657471) ([#11255](https://github.com/yt-dlp/yt-dlp/issues/11255)) by [bashonly](https://github.com/bashonly)
    - [Use Ubuntu 20.04 and Python 3.9 for Linux ARM builds](https://github.com/yt-dlp/yt-dlp/commit/dd2e24446954246a2ec4d4a7e95531f52a14b351) ([#8638](https://github.com/yt-dlp/yt-dlp/issues/8638)) by [bashonly](https://github.com/bashonly)
- **cleanup**
    - Miscellaneous
        - [ea9e35d](https://github.com/yt-dlp/yt-dlp/commit/ea9e35d85fba5eab341cdcaf1eaed69b57f7e465) by [bashonly](https://github.com/bashonly)
        - [c998238](https://github.com/yt-dlp/yt-dlp/commit/c998238c2e76c62d1d29962c6e8ebe916cc7913b) by [bashonly](https://github.com/bashonly), [KBelmin](https://github.com/KBelmin)
        - [197d0b0](https://github.com/yt-dlp/yt-dlp/commit/197d0b03b6a3c8fe4fa5ace630eeffec629bf72c) by [avagordon01](https://github.com/avagordon01), [bashonly](https://github.com/bashonly), [grqz](https://github.com/grqz), [Grub4K](https://github.com/Grub4K), [seproDev](https://github.com/seproDev)
- **devscripts**: `make_changelog`: [Parse full commit message for fixes](https://github.com/yt-dlp/yt-dlp/commit/0a3991edae0e10f2ea41ece9fdea5e48f789f1de) ([#11366](https://github.com/yt-dlp/yt-dlp/issues/11366)) by [bashonly](https://github.com/bashonly), [Grub4K](https://github.com/Grub4K)

### 2024.10.22

#### Important changes
- **Following this release, yt-dlp's Python dependencies *must* be installed using the `default` group**
If you're installing yt-dlp with pip/pipx or requiring yt-dlp in your own Python project, you'll need to specify `yt-dlp[default]` if you want to also install yt-dlp's optional dependencies (which were previously included by default). [Read more](https://github.com/yt-dlp/yt-dlp/pull/11255)
- **py2exe is no longer supported**
This release's `yt-dlp_min.exe` will be the last, and it's actually a PyInstaller-bundled executable so that yt-dlp users updating their py2exe build with `-U` will be automatically migrated. [Read more](https://github.com/yt-dlp/yt-dlp/issues/10087)

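For users maintaining their own builds, the PyInstaller bundle is the documented replacement for py2exe; a sketch of the build commands follows (the `pyinstaller` dependency-group name is an assumption here — check `devscripts/install_deps.py --help` and the README build section of your checkout):

```
python3 devscripts/install_deps.py --include pyinstaller  # group name assumed; installs build dependencies
python3 devscripts/make_lazy_extractors.py                # optional: speeds up startup of the built binary
python3 -m bundle.pyinstaller                             # supported PyInstaller entry point
```
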
#### Core changes
- [Add extractor helpers](https://github.com/yt-dlp/yt-dlp/commit/d710a6ca7c622705c0c8c8a3615916f531137d5d) ([#10653](https://github.com/yt-dlp/yt-dlp/issues/10653)) by [Grub4K](https://github.com/Grub4K)
- [Add option `--plugin-dirs`](https://github.com/yt-dlp/yt-dlp/commit/0f593dca9fa995d88eb763170a932da61c8f24dc) ([#11277](https://github.com/yt-dlp/yt-dlp/issues/11277)) by [coletdjnz](https://github.com/coletdjnz), [imranh2](https://github.com/imranh2)
- **cookies**: [Fix compatibility for Python <=3.9 in traceback](https://github.com/yt-dlp/yt-dlp/commit/c5f0f58efd8c3930de8202c15a5c53b1b635bd51) by [Grub4K](https://github.com/Grub4K)
- **utils**
    - `Popen`: [Reset PyInstaller environment](https://github.com/yt-dlp/yt-dlp/commit/fbc66e3ab35743cc847a21223c67d88bb463cd9c) ([#11258](https://github.com/yt-dlp/yt-dlp/issues/11258)) by [bashonly](https://github.com/bashonly), [Grub4K](https://github.com/Grub4K)
    - `sanitize_path`: [Reimplement function](https://github.com/yt-dlp/yt-dlp/commit/85b87c991af25dcb35630fa94580fd418e78ee33) ([#11198](https://github.com/yt-dlp/yt-dlp/issues/11198)) by [Grub4K](https://github.com/Grub4K)

#### Extractor changes
- **adobepass**: [Use newer user-agent for provider redirect request](https://github.com/yt-dlp/yt-dlp/commit/dcfeea4dd5e5686821350baa6c7767a011944867) ([#11250](https://github.com/yt-dlp/yt-dlp/issues/11250)) by [bashonly](https://github.com/bashonly)
- **afreecatv**: [Adapt extractors to new sooplive.co.kr domain](https://github.com/yt-dlp/yt-dlp/commit/46fe60ff19395698a87113b2944453779e04ab9d) ([#11266](https://github.com/yt-dlp/yt-dlp/issues/11266)) by [63427083](https://github.com/63427083), [bashonly](https://github.com/bashonly)
- **cda**: [Support folders](https://github.com/yt-dlp/yt-dlp/commit/c4d95f67ddc522297bb1fea875255cf94b34d595) ([#10786](https://github.com/yt-dlp/yt-dlp/issues/10786)) by [pktiuk](https://github.com/pktiuk)
- **cwtv**: [Fix extraction](https://github.com/yt-dlp/yt-dlp/commit/9d43dcb2c5c38f443f84dfc126cd32720e1a1ad6) ([#11230](https://github.com/yt-dlp/yt-dlp/issues/11230)) by [bashonly](https://github.com/bashonly)
- **drtv**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/f4338714241b11d9d43768ae71a25f5e952f677d) ([#11141](https://github.com/yt-dlp/yt-dlp/issues/11141)) by [444995](https://github.com/444995)
- **funk**: [Extend `_VALID_URL`](https://github.com/yt-dlp/yt-dlp/commit/8de431ec97a4b62b73df8f686b6e21e462775336) ([#11269](https://github.com/yt-dlp/yt-dlp/issues/11269)) by [seproDev](https://github.com/seproDev)
- **gem.cbc.ca**: [Fix formats extraction](https://github.com/yt-dlp/yt-dlp/commit/40054cb4a7ebbea30d335d444e6f58b298a3baa0) ([#11196](https://github.com/yt-dlp/yt-dlp/issues/11196)) by [DavidSkrundz](https://github.com/DavidSkrundz)
- **generic**: [Impersonate browser by default](https://github.com/yt-dlp/yt-dlp/commit/edfd095b1917701c5046bd51f9542897c17d41a7) ([#11206](https://github.com/yt-dlp/yt-dlp/issues/11206)) by [Grub4K](https://github.com/Grub4K)
- **imgur**
    - [Fix thumbnail extraction](https://github.com/yt-dlp/yt-dlp/commit/87408ccfd772ddf31a8323d8151c24f9577cbc9f) ([#11298](https://github.com/yt-dlp/yt-dlp/issues/11298)) by [seproDev](https://github.com/seproDev)
    - [Support new URL format](https://github.com/yt-dlp/yt-dlp/commit/5af774d7a36c00bea618c7047c9326532cd3f616) ([#11075](https://github.com/yt-dlp/yt-dlp/issues/11075)) by [Deer-Spangle](https://github.com/Deer-Spangle)
- **patreon**: campaign: [Stricter URL matching](https://github.com/yt-dlp/yt-dlp/commit/babb70960595e2146f06f81affc29c7e713e34e2) ([#11235](https://github.com/yt-dlp/yt-dlp/issues/11235)) by [bashonly](https://github.com/bashonly)
- **reddit**: [Detect and raise when login is required](https://github.com/yt-dlp/yt-dlp/commit/cba7868502f04175fecf9ab3e363296aee7ebec2) ([#11202](https://github.com/yt-dlp/yt-dlp/issues/11202)) by [pzhlkj6612](https://github.com/pzhlkj6612)
- **substack**: [Resolve podcast file extensions](https://github.com/yt-dlp/yt-dlp/commit/3148c1822f66533998278f0a1cf842b9bea1526a) ([#11275](https://github.com/yt-dlp/yt-dlp/issues/11275)) by [bashonly](https://github.com/bashonly)
- **telecinco**: [Fix extractors](https://github.com/yt-dlp/yt-dlp/commit/0b7ec08816fb196cd41d392f8331b4eb8366c4f8) ([#11142](https://github.com/yt-dlp/yt-dlp/issues/11142)) by [bashonly](https://github.com/bashonly), [DarkZeros](https://github.com/DarkZeros)
- **tubitv**: [Strip extra whitespace from titles](https://github.com/yt-dlp/yt-dlp/commit/e68b4c19af122876561a41f2dd8093fae7b417c7) ([#10795](https://github.com/yt-dlp/yt-dlp/issues/10795)) by [allendema](https://github.com/allendema)
- **tver**: [Support series URLs](https://github.com/yt-dlp/yt-dlp/commit/ceaea731b6e314dbbdfb2e358d7677785ed0b4fc) ([#9507](https://github.com/yt-dlp/yt-dlp/issues/9507)) by [pzhlkj6612](https://github.com/pzhlkj6612), [vvto33](https://github.com/vvto33)
- **twitter**: spaces: [Allow extraction when not logged in](https://github.com/yt-dlp/yt-dlp/commit/679c68240a26481ea7c07cc0c014745631ea8481) ([#11289](https://github.com/yt-dlp/yt-dlp/issues/11289)) by [rubyevadestaxes](https://github.com/rubyevadestaxes)
- **weverse**: [Fix extractor](https://github.com/yt-dlp/yt-dlp/commit/5310fa87f6cb7f66bf42e2520878952fbf6b1652) ([#11215](https://github.com/yt-dlp/yt-dlp/issues/11215)) by [bashonly](https://github.com/bashonly)
- **youtube**
    - [Fix `comment_count` extraction](https://github.com/yt-dlp/yt-dlp/commit/7af1ddaaf2a6a0a750373a9ab53c7770af4f9fe4) ([#11274](https://github.com/yt-dlp/yt-dlp/issues/11274)) by [bashonly](https://github.com/bashonly)
    - [Remove broken `android_producer` client](https://github.com/yt-dlp/yt-dlp/commit/fed53d70bdb7d3e37ef63dd7fcf0ef74356167fd) ([#11297](https://github.com/yt-dlp/yt-dlp/issues/11297)) by [bashonly](https://github.com/bashonly)
    - [Remove broken age-restriction workaround](https://github.com/yt-dlp/yt-dlp/commit/ec2f4bf0823a13043f98f5bd0bf6677837bf09dc) ([#11297](https://github.com/yt-dlp/yt-dlp/issues/11297)) by [bashonly](https://github.com/bashonly)
    - [Support logging in with OAuth](https://github.com/yt-dlp/yt-dlp/commit/b8635c1d4779da195e71aa281f73aaad702c935e) ([#11001](https://github.com/yt-dlp/yt-dlp/issues/11001)) by [coletdjnz](https://github.com/coletdjnz)

#### Misc. changes
- **build**
    - [Migrate `py2exe` builds to `win_exe`](https://github.com/yt-dlp/yt-dlp/commit/a886cf3e900f4a2ec00af705f883539269545609) ([#11256](https://github.com/yt-dlp/yt-dlp/issues/11256)) by [bashonly](https://github.com/bashonly)
    - [Use `macos-13` image for macOS builds](https://github.com/yt-dlp/yt-dlp/commit/64d84d75ca8c19ec06558cc7c511f5f4f7a822bc) ([#11236](https://github.com/yt-dlp/yt-dlp/issues/11236)) by [bashonly](https://github.com/bashonly)
    - `make_lazy_extractors`: [Force running without plugins](https://github.com/yt-dlp/yt-dlp/commit/1a830394a21a81a3e9918f9e175abc9fbb21f089) ([#11205](https://github.com/yt-dlp/yt-dlp/issues/11205)) by [Grub4K](https://github.com/Grub4K)
- **cleanup**: Miscellaneous: [67adeb7](https://github.com/yt-dlp/yt-dlp/commit/67adeb7bab00662ba55d473e405b301abb42fe61) by [bashonly](https://github.com/bashonly), [DTrombett](https://github.com/DTrombett), [grqz](https://github.com/grqz), [Grub4K](https://github.com/Grub4K), [KarboniteKream](https://github.com/KarboniteKream), [mikkovedru](https://github.com/mikkovedru), [seproDev](https://github.com/seproDev)
- **test**: [Allow running tests explicitly](https://github.com/yt-dlp/yt-dlp/commit/16eb28026a2ddf5608d0a628ef15949b8d3805a9) ([#11203](https://github.com/yt-dlp/yt-dlp/issues/11203)) by [Grub4K](https://github.com/Grub4K)

### 2024.10.07

#### Core changes

Makefile
@@ -18,11 +18,10 @@ pypi-files: AUTHORS Changelog.md LICENSE README.md README.txt supportedsites \
 tar pypi-files lazy-extractors install uninstall

 clean-test:
-rm -rf tmp/ *.annotations.xml *.aria2 *.description *.dump *.frag \
+rm -rf test/testdata/sigs/player-*.js tmp/ *.annotations.xml *.aria2 *.description *.dump *.frag \
 *.frag.aria2 *.frag.urls *.info.json *.live_chat.json *.meta *.part* *.tmp *.temp *.unknown_video *.ytdl \
 *.3gp *.ape *.ass *.avi *.desktop *.f4v *.flac *.flv *.gif *.jpeg *.jpg *.lrc *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 *.mp4 \
-*.mpg *.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.ssa *.swf *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp \
-test/testdata/sigs/player-*.js test/testdata/thumbnails/empty.webp "test/testdata/thumbnails/foo %d bar/foo_%d."*
+*.mpg *.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.ssa *.swf *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
 clean-dist:
 rm -rf yt-dlp.1.temp.md yt-dlp.1 README.txt MANIFEST build/ dist/ .coverage cover/ yt-dlp.tar.gz completions/ \
 yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS
README.md
@@ -4,8 +4,9 @@
 [](#readme)

 [](#installation "Installation")
-[](https://pypi.org/project/yt-dlp "PyPI")
+[](https://pypi.org/project/yt-dlp "PyPi")
 [](Collaborators.md#collaborators "Donate")
+[](https://matrix.to/#/#yt-dlp:matrix.org "Matrix")
 [](https://discord.gg/H5MNcFW63r "Discord")
 [](supportedsites.md "Supported Sites")
 [](LICENSE "License")
@@ -44,7 +45,6 @@ yt-dlp is a feature-rich command-line audio/video downloader with support for [t
 * [Post-processing Options](#post-processing-options)
 * [SponsorBlock Options](#sponsorblock-options)
 * [Extractor Options](#extractor-options)
-* [Preset Aliases](#preset-aliases)
 * [CONFIGURATION](#configuration)
 * [Configuration file encoding](#configuration-file-encoding)
 * [Authentication with netrc](#authentication-with-netrc)
@@ -81,7 +81,7 @@ yt-dlp is a feature-rich command-line audio/video downloader with support for [t
 [](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp.exe)
 [](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp)
 [](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_macos)
 [](https://pypi.org/project/yt-dlp)
 [](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp.tar.gz)
 [](#release-files)
 [](https://github.com/yt-dlp/yt-dlp/releases)
@@ -98,14 +98,15 @@ You can install yt-dlp using [the binaries](#release-files), [pip](https://pypi.
 File|Description
 :---|:---
 [yt-dlp](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp)|Platform-independent [zipimport](https://docs.python.org/3/library/zipimport.html) binary. Needs Python (recommended for **Linux/BSD**)
-[yt-dlp.exe](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp.exe)|Windows (Win8+) standalone x64 binary (recommended for **Windows**)
+[yt-dlp.exe](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp.exe)|Windows (Win7 SP1+) standalone x64 binary (recommended for **Windows**)
 [yt-dlp_macos](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_macos)|Universal MacOS (10.15+) standalone executable (recommended for **MacOS**)

 #### Alternatives

 File|Description
 :---|:---
-[yt-dlp_x86.exe](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_x86.exe)|Windows (Win8+) standalone x86 (32-bit) binary
+[yt-dlp_x86.exe](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_x86.exe)|Windows (Win7 SP1+) standalone x86 (32-bit) binary
+[yt-dlp_min.exe](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_min.exe)|Windows (Win7 SP1+) standalone x64 binary built with `py2exe`<br/> ([Not recommended](#standalone-py2exe-builds-windows))
 [yt-dlp_linux](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux)|Linux standalone x64 binary
 [yt-dlp_linux_armv7l](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux_armv7l)|Linux standalone armv7l (32-bit) binary
 [yt-dlp_linux_aarch64](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp_linux_aarch64)|Linux standalone aarch64 (64-bit) binary
@@ -172,11 +173,11 @@ python3 -m pip install -U --pre "yt-dlp[default]"
 ```

 ## DEPENDENCIES
-Python versions 3.9+ (CPython) and 3.10+ (PyPy) are supported. Other versions and implementations may or may not work correctly.
+Python versions 3.8+ (CPython and PyPy) are supported. Other versions and implementations may or may not work correctly.

 <!-- Python 3.5+ uses VC++14 and it is already embedded in the binary created
 <!x-- https://www.microsoft.com/en-us/download/details.aspx?id=26999 --x>
-On Windows, [Microsoft Visual C++ 2010 SP1 Redistributable Package (x86)](https://download.microsoft.com/download/1/6/5/165255E7-1014-4D0A-B094-B6A430A6BFFC/vcredist_x86.exe) is also necessary to run yt-dlp. You probably already have this, but if the executable throws an error due to missing `MSVCR100.dll` you need to install it manually.
+On windows, [Microsoft Visual C++ 2010 SP1 Redistributable Package (x86)](https://download.microsoft.com/download/1/6/5/165255E7-1014-4D0A-B094-B6A430A6BFFC/vcredist_x86.exe) is also necessary to run yt-dlp. You probably already have this, but if the executable throws an error due to missing `MSVCR100.dll` you need to install it manually.
 -->

 While all the other dependencies are optional, `ffmpeg` and `ffprobe` are highly recommended
@@ -253,19 +254,31 @@ On some systems, you may need to use `py` or `python` instead of `python3`.
 **Important**: Running `pyinstaller` directly **instead of** using `python -m bundle.pyinstaller` is **not** officially supported. This may or may not work correctly.

 ### Platform-independent Binary (UNIX)
-You will need the build tools `python` (3.9+), `zip`, `make` (GNU), `pandoc`\* and `pytest`\*.
+You will need the build tools `python` (3.8+), `zip`, `make` (GNU), `pandoc`\* and `pytest`\*.

 After installing these, simply run `make`.

 You can also run `make yt-dlp` instead to compile only the binary without updating any of the additional files. (The build tools marked with **\*** are not needed for this)

+### Standalone Py2Exe Builds (Windows)
+
+While we provide the option to build with [py2exe](https://www.py2exe.org), it is recommended to build [using PyInstaller](#standalone-pyinstaller-builds) instead since the py2exe builds **cannot contain `pycryptodomex`/`certifi`/`requests` and need VC++14** on the target computer to run.
+
+If you wish to build it anyway, install Python (if it is not already installed) and you can run the following commands:
+
+```
+py devscripts/install_deps.py --include py2exe
+py devscripts/make_lazy_extractors.py
+py -m bundle.py2exe
+```
+
 ### Related scripts

 * **`devscripts/install_deps.py`** - Install dependencies for yt-dlp.
 * **`devscripts/update-version.py`** - Update the version number based on the current date.
 * **`devscripts/set-variant.py`** - Set the build variant of the executable.
 * **`devscripts/make_changelog.py`** - Create a markdown changelog using short commit messages and update `CONTRIBUTORS` file.
-* **`devscripts/make_lazy_extractors.py`** - Create lazy extractors. Running this before building the binaries (any variant) will improve their startup performance. Set the environment variable `YTDLP_NO_LAZY_EXTRACTORS` to something nonempty to forcefully disable lazy extractor loading.
+* **`devscripts/make_lazy_extractors.py`** - Create lazy extractors. Running this before building the binaries (any variant) will improve their startup performance. Set the environment variable `YTDLP_NO_LAZY_EXTRACTORS=1` if you wish to forcefully disable lazy extractor loading.

 Note: See their `--help` for more info.

@@ -335,22 +348,13 @@ If you fork the project on GitHub, you can run your fork's [build workflow](.git
 containing directory ("-" for stdin). Can be
 used multiple times and inside other
 configuration files
---plugin-dirs PATH Path to an additional directory to search
-for plugins. This option can be used
-multiple times to add multiple directories.
-Use "default" to search the default plugin
-directories (default)
---no-plugin-dirs Clear plugin directories to search,
-including defaults and those provided by
-previous --plugin-dirs
---flat-playlist Do not extract a playlist's URL result
-entries; some entry metadata may be missing
-and downloading may be bypassed
+--flat-playlist Do not extract the videos of a playlist,
+only list them
 --no-flat-playlist Fully extract the videos of a playlist
 (default)
 --live-from-start Download livestreams from the start.
-Currently experimental and only supported
-for YouTube and Twitch
+Currently only supported for YouTube
+(Experimental)
 --no-live-from-start Download livestreams from the current time
 (default)
 --wait-for-video MIN[-MAX] Wait for scheduled streams to become
@@ -376,23 +380,17 @@ If you fork the project on GitHub, you can run your fork's [build workflow](.git
 an alias starts with a dash "-", it is
 prefixed with "--". Arguments are parsed
 according to the Python string formatting
-mini-language. E.g. --alias get-audio,-X "-S
-aext:{0},abr -x --audio-format {0}" creates
-options "--get-audio" and "-X" that takes an
-argument (ARG0) and expands to "-S
-aext:ARG0,abr -x --audio-format ARG0". All
-defined aliases are listed in the --help
+mini-language. E.g. --alias get-audio,-X
+"-S=aext:{0},abr -x --audio-format {0}"
+creates options "--get-audio" and "-X" that
+takes an argument (ARG0) and expands to
+"-S=aext:ARG0,abr -x --audio-format ARG0".
+All defined aliases are listed in the --help
 output. Alias options can trigger more
 aliases; so be careful to avoid defining
 recursive options. As a safety measure, each
 alias may be triggered a maximum of 100
 times. This option can be used multiple times
--t, --preset-alias PRESET Applies a predefined set of options. e.g.
---preset-alias mp3. The following presets
-are available: mp3, aac, mp4, mkv, sleep.
-See the "Preset Aliases" section at the end
-for more info. This option can be used
-multiple times

 ## Network Options:
 --proxy URL Use the specified HTTP/HTTPS/SOCKS proxy. To
@@ -446,10 +444,10 @@ If you fork the project on GitHub, you can run your fork's [build workflow](.git
 E.g. "--date today-2weeks" downloads only
 videos uploaded on the same day two weeks ago
 --datebefore DATE Download only videos uploaded on or before
-this date. The date formats accepted are the
+this date. The date formats accepted is the
 same as --date
 --dateafter DATE Download only videos uploaded on or after
-this date. The date formats accepted are the
+this date. The date formats accepted is the
 same as --date
 --match-filters FILTER Generic video filter. Any "OUTPUT TEMPLATE"
 field can be compared with a number or a
--no-download-archive           Do not use archive file (default)
--max-downloads NUMBER          Abort after downloading NUMBER files
--break-on-existing             Stop the download process when encountering a file that is in the archive supplied with the --download-archive option
--no-break-on-existing          Do not stop the download process when encountering a file that is in the archive (default)
--no-restrict-filenames         Allow Unicode characters, "&" and spaces in filenames (default)
--windows-filenames             Force filenames to be Windows-compatible
--no-windows-filenames          Sanitize filenames only minimally
--trim-filenames LENGTH         Limit the filename length (excluding extension) to the specified number of characters
used. This option can be used multiple times
--print-to-file [WHEN:]TEMPLATE FILE
                                Append given template to the file. The values of WHEN and TEMPLATE are the same as that of --print. FILE uses the same syntax as the output template. This option can be used multiple times
-j, --dump-json                 Quiet, but print JSON information for each video. Simulate unless --no-simulate is used. See "OUTPUT TEMPLATE" for a description of available keys
-J, --dump-single-json          Quiet, but print JSON information for each URL or infojson passed. Simulate unless --no-simulate is used. If the URL refers to a playlist, the whole playlist information is dumped in a single line
--no-audio-multistreams         Only one audio stream is downloaded for each output file (default)
--prefer-free-formats           Prefer video formats with free containers over non-free ones of the same quality. Use with "-S ext" to strictly prefer free containers irrespective of quality
--no-prefer-free-formats        Don't give any special preference to free containers (default)
--check-formats                 Make sure formats are selected only from
(default) (Alias: --no-write-automatic-subs)
--list-subs                     List available subtitles of each video. Simulate unless --no-simulate is used
--sub-format FORMAT             Subtitle format; accepts formats preference separated by "/", e.g. "srt" or "ass/srt/best"
--sub-langs LANGS               Languages of the subtitles to download (can be regex) or "all" separated by commas, e.g. --sub-langs "en.*,ja" (where "en.*" is a regex pattern that matches "en" followed by 0 or more of any character). You can prefix the language code with a "-" to exclude it from the requested languages, e.g. --sub-langs all,-live_chat. Use --list-subs for a list of available language tags

## Authentication Options:
-u, --username USERNAME         Login with this account ID
necessary (currently supported: avi, flv, gif, mkv, mov, mp4, webm, aac, aiff, alac, flac, m4a, mka, mp3, ogg, opus, vorbis, wav). If the target container does not support the video/audio codec, remuxing will fail. You can specify multiple rules; e.g. "aac>m4a/mov>mp4/mkv" will remux aac to m4a, mov to mp4 and anything else to mkv
--recode-video FORMAT           Re-encode the video into another format if
are the same as that of --use-postprocessor (default: pre_process)
--xattrs                        Write metadata to the video file's xattrs (using Dublin Core and XDG standards)
--concat-playlist POLICY        Concatenate videos in a playlist. One of "never", "always", or "multi_video" (default; only when the videos form a single show). All the video files must have the same codecs and number of streams to be concatenable. The "pl_video:" prefix can be used with "--paths" and "--output" to set the output filename for the concatenated files. See "OUTPUT TEMPLATE" for details
--fixup POLICY                  Automatically correct known faults of the file. One of never (do nothing), warn (only emit a warning), detect_or_warn (the default; fix the file if we can, warn otherwise), force (try fixing even if the file already exists)
--ffmpeg-location PATH          Location of the ffmpeg binary; either the path to the binary or its containing directory
--exec [WHEN:]CMD               Execute a command, optionally prefixed with when to execute it, separated by a ":". Supported values of "WHEN" are the same as that of --use-postprocessor (default: after_move). The same syntax as the output template can be used to pass any field as arguments to the command. If no fields are passed, %(filepath,_filename|)q is appended
--no-force-keyframes-at-cuts    Do not force keyframes around the chapters when cutting/splitting (default)
--use-postprocessor NAME[:ARGS]
                                The (case-sensitive) name of plugin postprocessors to be enabled, and (optionally) arguments to be passed to it, separated by a colon ":". ARGS are a
                                --print/--output), "before_dl" (before each video download), "post_process" (after each video download; default), "after_move" (after moving the video file to its final location), "after_video" (after downloading and processing all formats of a video), or "playlist" (at end of playlist). This option can be used multiple times to add different
music_offtopic, poi_highlight, chapter, all and default (=all). You can prefix the category with a "-" to exclude it. See [1] for descriptions of the categories. E.g. --sponsorblock-mark all,-preview
[1] https://wiki.sponsor.ajay.app/w/Segment_Categories
--sponsorblock-remove CATS      SponsorBlock categories to be removed from
(Alias: --no-allow-dynamic-mpd)
--hls-split-discontinuity       Split HLS playlists to different formats at discontinuities such as ad breaks
--no-hls-split-discontinuity    Do not split HLS playlists into different formats at discontinuities such as ad breaks (default)
--extractor-args IE_KEY:ARGS    Pass ARGS arguments to the IE_KEY extractor.
can use this option multiple times to give arguments for different extractors

## Preset Aliases:
Predefined aliases for convenience and ease of use. Note that future versions of yt-dlp may add or adjust presets, but the existing preset names will not be changed or removed

-t mp3      -f 'ba[acodec^=mp3]/ba/b' -x --audio-format mp3
-t aac      -f 'ba[acodec^=aac]/ba[acodec^=mp4a.40.]/ba/b' -x --audio-format aac
-t mp4      --merge-output-format mp4 --remux-video mp4 -S vcodec:h264,lang,quality,res,fps,hdr:12,acodec:aac
-t mkv      --merge-output-format mkv --remux-video mkv
-t sleep    --sleep-subtitles 5 --sleep-requests 0.75 --sleep-interval 10 --max-sleep-interval 20
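
For example, the `mp3` preset above (or the `--alias` mechanism described earlier in the options list) can be combined with any other options in a single run. This is only an illustrative sketch; the URL is a placeholder:
```
# use the mp3 preset and save the result under ~/Audio
yt-dlp -t mp3 -P ~/Audio "https://example.com/some-video"

# roughly what the preset expands to, spelled out by hand
yt-dlp -f 'ba[acodec^=mp3]/ba/b' -x --audio-format mp3 -P ~/Audio "https://example.com/some-video"
```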
# CONFIGURATION

You can configure yt-dlp by placing any supported command line option in a configuration file. The configuration is loaded from the following locations:

1. **Main Configuration**:
* The file given to `--config-location`
* `/etc/yt-dlp/config`
* `/etc/yt-dlp/config.txt`

E.g. with the following configuration file, yt-dlp will always extract the audio, copy the mtime, use a proxy and save all videos under `YouTube` directory in your home directory:
```
# Lines starting with # are comments

# Always extract audio
-x

# Copy the mtime
--mtime

# Use this proxy
--proxy 127.0.0.1:3128

# Save all videos under YouTube directory in your home directory
-o ~/YouTube/%(title)s.%(ext)s
```

**Note**: Options in a configuration file are just the same options aka switches used in regular command line calls; thus there **must be no whitespace** after `-` or `--`, e.g. `-o` or `--proxy` but not `- o` or `-- proxy`. They must also be quoted when necessary, as if it were a UNIX shell.

You can use `--ignore-config` if you want to disable all configuration files for a particular yt-dlp run. If `--ignore-config` is found inside any configuration file, no further configuration will be loaded. For example, having the option in the portable configuration file prevents loading of home, user, and system configurations. Additionally, (for backward compatibility) if `--ignore-config` is found inside the system configuration file, the user configuration is not loaded.
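
For instance, the loading behavior described above can be exercised explicitly (a sketch; the file name is arbitrary):
```
# disable every configuration file for this run
yt-dlp --ignore-config "https://example.com/some-video"

# load options from one specific configuration file
yt-dlp --config-location ~/yt-dlp-music.conf "https://example.com/some-video"
```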

E.g. to use an encrypted `.netrc` file stored as `.authinfo.gpg`
```
yt-dlp --netrc-cmd 'gpg --decrypt ~/.authinfo.gpg' 'https://www.youtube.com/watch?v=BaW_jenozKc'
```

### Notes about environment variables
* Environment variables are normally specified as `${VARIABLE}`/`$VARIABLE` on UNIX and `%VARIABLE%` on Windows; but are always shown as `${VARIABLE}` in this documentation
* yt-dlp also allows using UNIX-style variables on Windows for path-like options; e.g. `--output`, `--config-location`
* If unset, `${XDG_CONFIG_HOME}` defaults to `~/.config` and `${XDG_CACHE_HOME}` to `~/.cache`
* On Windows, `~` points to `${HOME}` if present; or, `${USERPROFILE}` or `${HOMEDRIVE}${HOMEPATH}` otherwise
* On Windows, `${USERPROFILE}` generally points to `C:\Users\<user name>` and `${APPDATA}` to `${USERPROFILE}\AppData\Roaming`
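
As a small illustration of the notes above (a sketch; the output path is arbitrary), a UNIX-style variable can be used inside a path-like option:
```
# ${HOME} is expanded for path-like options such as --output, even on Windows
yt-dlp -o "${HOME}/Videos/%(title)s.%(ext)s" "https://example.com/some-video"
```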
- `like_count` (numeric): Number of positive ratings of the video
- `dislike_count` (numeric): Number of negative ratings of the video
- `repost_count` (numeric): Number of reposts of the video
- `average_rating` (numeric): Average rating given by users, the scale used depends on the webpage
- `comment_count` (numeric): Number of comments on the video (For some extractors, comments are only downloaded at the end, and so this field cannot be used)
- `age_limit` (numeric): Age restriction for the video (years)
- `live_status` (string): One of "not_live", "is_live", "is_upcoming", "was_live", "post_live" (was live, but VOD is not yet processed)
- `playlist_uploader_id` (string): Nickname or id of the playlist uploader
- `playlist_channel` (string): Display name of the channel that uploaded the playlist
- `playlist_channel_id` (string): Identifier of the channel that uploaded the playlist
- `playlist_webpage_url` (string): URL of the playlist webpage
- `webpage_url` (string): A URL to the video webpage which, if given to yt-dlp, should yield the same result again
- `webpage_url_basename` (string): The basename of the webpage URL
- `webpage_url_domain` (string): The domain of the webpage URL
- `original_url` (string): The URL given by the user (or the same as `webpage_url` for playlist entries)
- `categories` (list): List of categories the video belongs to
- `tags` (list): List of tags assigned to the video
- `cast` (list): List of cast members

**Tip**: Look at the `-j` output to identify which fields are available for the particular URL

For numeric sequences, you can use [numeric related formatting](https://docs.python.org/3/library/stdtypes.html#printf-style-string-formatting); e.g. `%(view_count)05d` will result in a string with view count padded with zeros up to 5 characters, like in `00042`.
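
For example (a sketch; the URL is a placeholder), such a padded field can be used directly in an output template:
```
# produces file names like "00042 - Some title.mp4"
yt-dlp -o "%(view_count)05d - %(title)s.%(ext)s" "https://example.com/some-video"
```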

Output templates can also contain an arbitrary hierarchical path, e.g. `-o "%(playlist)s/%(playlist_index)s - %(title)s.%(ext)s"` which will result in downloading each video in a directory corresponding to this path template. Any missing directory will be automatically created for you.

# Download video as "C:\MyVideos\uploader\title.ext", subtitles as "C:\MyVideos\subs\uploader\title.ext"
# and put all temporary files in "C:\MyVideos\tmp"
$ yt-dlp -P "C:/MyVideos" -P "temp:tmp" -P "subtitle:subs" -o "%(uploader)s/%(title)s.%(ext)s" BaW_jenozKc --write-subs

# Download video as "C:\MyVideos\uploader\title.ext" and subtitles as "C:\MyVideos\uploader\subs\title.ext"
$ yt-dlp -P "C:/MyVideos" -o "%(uploader)s/%(title)s.%(ext)s" -o "subtitle:%(uploader)s/subs/%(title)s.%(ext)s" BaW_jenozKc --write-subs

The available fields are:

- `hasvid`: Gives priority to formats that have a video stream
- `hasaud`: Gives priority to formats that have an audio stream
- `ie_pref`: The format preference
- `lang`: The language preference as determined by the extractor (e.g. original language preferred over audio description)
- `quality`: The quality of the format
- `source`: The preference of the source
- `proto`: Protocol used for download (`https`/`ftps` > `http`/`ftp` > `m3u8_native`/`m3u8` > `http_dash_segments` > `websocket_frag` > `mms`/`rtsp` > `f4f`/`f4m`)

All fields, unless specified otherwise, are sorted in descending order. To reverse this, prefix the field with a `+`. E.g. `+res` prefers format with the smallest resolution. Additionally, you can suffix a preferred value for the fields, separated by a `:`. E.g. `res:720` prefers larger videos, but no larger than 720p and the smallest video if there are no videos less than 720p. For `codec` and `ext`, you can provide two preferred values, the first for video and the second for audio. E.g. `+codec:avc:m4a` (equivalent to `+vcodec:avc,+acodec:m4a`) sets the video codec preference to `h264` > `h265` > `vp9` > `vp9.2` > `av01` > `vp8` > `h263` > `theora` and audio codec preference to `mp4a` > `aac` > `vorbis` > `opus` > `mp3` > `ac3` > `dts`. You can also make the sorting prefer the nearest values to the provided by using `~` as the delimiter. E.g. `filesize~1G` prefers the format with filesize closest to 1 GiB.

The fields `hasvid` and `ie_pref` are always given highest priority in sorting, irrespective of the user-defined order. This behavior can be changed by using `--format-sort-force`. Apart from these, the default order used is: `lang,quality,res,fps,hdr:12,vcodec,channels,acodec,size,br,asr,proto,ext,hasaud,source,id`. The extractors may override this default order, but they cannot override the user-provided order.

Note that the default for hdr is `hdr:12`; i.e. Dolby Vision is not preferred. This choice was made since DV formats are not yet fully compatible with most devices. This may be changed in the future.

If your format selector is `worst`, the last item is selected after sorting. This means it will select the format that is worst in all respects. Most of the time, what you actually want is the video with the smallest filesize instead. So it is generally better to use `-f best -S +size,+br,+res,+fps`.

# or the worst video (that also has audio) if there is no video under 50 MB
$ yt-dlp -f "b[filesize<50M] / w"

# Download the largest video (that also has audio) but no bigger than 50 MB,
# or the smallest video (that also has audio) if there is no video under 50 MB
$ yt-dlp -f "b" -S "filesize:50M"

# Download the best video (that also has audio) that is closest in size to 50 MB
$ yt-dlp -f "b" -S "filesize~50M"

The general syntax of `--parse-metadata FROM:TO` is to give the name of a field or an [output template](#output-template) to extract data from, and the format to interpret it as, separated by a colon `:`. Either a [Python regular expression](https://docs.python.org/3/library/re.html#regular-expression-syntax) with named capture groups, a single field name, or a similar syntax to the [output template](#output-template) (only `%(field)s` formatting is supported) can be used for `TO`. The option can be used multiple times to parse and modify various fields.
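
As an illustration of the `FROM:TO` syntax (a sketch that assumes the title looks like "Artist - Track"; the URL is a placeholder):
```
# interpret the title as "Artist - Track" and populate those two fields
yt-dlp --parse-metadata "title:%(artist)s - %(track)s" --embed-metadata "https://example.com/some-video"
```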

Note that these options preserve their relative order, allowing replacements to be made in parsed fields and vice versa. Also, any field thus created can be used in the [output template](#output-template) and will also affect the media file's metadata added when using `--embed-metadata`.

This option also has a few special uses:

# EXTRACTOR ARGUMENTS

Some extractors accept additional arguments which can be passed using `--extractor-args KEY:ARGS`. `ARGS` is a `;` (semicolon) separated string of `ARG=VAL1,VAL2`. E.g. `--extractor-args "youtube:player-client=tv,mweb;formats=incomplete" --extractor-args "twitter:api=syndication"`

Note: In CLI, `ARG` can use `-` instead of `_`; e.g. `youtube:player-client` becomes `youtube:player_client`

The following extractors use this feature:

#### youtube
* `lang`: Prefer translated metadata (`title`, `description` etc) of this language code (case-sensitive). By default, the video primary language metadata is preferred, with a fallback to `en` translated. See [youtube/_base.py](https://github.com/yt-dlp/yt-dlp/blob/415b4c9f955b1a0391204bd24a7132590e7b3bdb/yt_dlp/extractor/youtube/_base.py#L402-L409) for the list of supported content language codes
* `skip`: One or more of `hls`, `dash` or `translated_subs` to skip extraction of the m3u8 manifests, dash manifests and [auto-translated subtitles](https://github.com/yt-dlp/yt-dlp/issues/4090#issuecomment-1158102032) respectively
* `player_client`: Clients to extract video data from. The currently available clients are `web`, `web_safari`, `web_embedded`, `web_music`, `web_creator`, `mweb`, `ios`, `android`, `android_vr`, `tv`, `tv_simply` and `tv_embedded`. By default, `tv,ios,web` is used, or `tv,web` is used when authenticating with cookies. The `web_music` client is added for `music.youtube.com` URLs when logged-in cookies are used. The `web_embedded` client is added for age-restricted videos but only works if the video is embeddable. The `tv_embedded` and `web_creator` clients are added for age-restricted videos if account age-verification is required. Some clients, such as `web` and `web_music`, require a `po_token` for their formats to be downloadable. Some clients, such as `web_creator`, will only work with authentication. Not all clients support authentication via cookies. You can use `default` for the default clients, or you can use `all` for all clients (not recommended). You can prefix a client with `-` to exclude it, e.g. `youtube:player_client=default,-ios`
* `player_skip`: Skip some network requests that are generally needed for robust extraction. One or more of `configs` (skip client configs), `webpage` (skip initial webpage), `js` (skip js player), `initial_data` (skip initial data/next ep request). While these options can help reduce the number of requests needed or avoid some rate-limiting, they could cause issues such as missing formats or metadata. See [#860](https://github.com/yt-dlp/yt-dlp/pull/860) and [#12826](https://github.com/yt-dlp/yt-dlp/issues/12826) for more details
* `player_params`: YouTube player parameters to use for player requests. Will overwrite any default ones set by yt-dlp.
* `player_js_variant`: The player javascript variant to use for signature and nsig deciphering. The known variants are: `main`, `tce`, `tv`, `tv_es6`, `phone`, `tablet`. Only `main` is recommended as a possible workaround; the others are for debugging purposes. The default is to use what is prescribed by the site, and can be selected with `actual`
* `comment_sort`: `top` or `new` (default) - choose comment sorting mode (on YouTube's side)
* `max_comments`: Limit the amount of comments to gather. Comma-separated list of integers representing `max-comments,max-parents,max-replies,max-replies-per-thread`. Default is `all,all,all,all`
    * E.g. `all,all,1000,10` will get a maximum of 1000 replies total, with up to 10 replies per thread. `1000,all,100` will get a maximum of 1000 comments, with a maximum of 100 replies total
* `formats`: Change the types of formats to return. `dashy` (convert HTTP to DASH), `duplicate` (identical content but different URLs or protocol; includes `dashy`), `incomplete` (cannot be downloaded completely - live dash and post-live m3u8), `missing_pot` (include formats that require a PO Token but are missing one)
* `innertube_host`: Innertube API host to use for all API requests; e.g. `studio.youtube.com`, `youtubei.googleapis.com`. Note that cookies exported from one subdomain will not work on others
* `innertube_key`: Innertube API key to use for all API requests. By default, no API key is used
* `raise_incomplete_data`: `Incomplete Data Received` raises an error instead of reporting a warning
* `data_sync_id`: Overrides the account Data Sync ID used in Innertube API requests. This may be needed if you are using an account with `youtube:player_skip=webpage,configs` or `youtubetab:skip=webpage`
* `visitor_data`: Overrides the Visitor Data used in Innertube API requests. This should be used with `player_skip=webpage,configs` and without cookies. Note: this may have adverse effects if used improperly. If a session from a browser is wanted, you should pass cookies instead (which contain the Visitor ID)
* `po_token`: Proof of Origin (PO) Token(s) to use. Comma-separated list of PO Tokens in the format `CLIENT.CONTEXT+PO_TOKEN`, e.g. `youtube:po_token=web.gvs+XXX,web.player=XXX,web_safari.gvs+YYY`. Context can be any of `gvs` (Google Video Server URLs), `player` (Innertube player request) or `subs` (Subtitles)
* `pot_trace`: Enable debug logging for PO Token fetching. Either `true` or `false` (default)
* `fetch_pot`: Policy to use for fetching a PO Token from providers. One of `always` (always try to fetch a PO Token regardless of whether the client requires one for the given context), `never` (never fetch a PO Token), or `auto` (default; only fetch a PO Token if the client requires one for the given context)
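
Putting a couple of the arguments above together (a sketch; the URL is a placeholder):
```
# use only the web_safari client and also include formats that are missing a required PO Token
yt-dlp --extractor-args "youtube:player_client=web_safari;formats=missing_pot" "https://example.com/watch?v=XXXX"
```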

#### youtubepot-webpo
* `bind_to_visitor_id`: Whether to use the Visitor ID instead of Visitor Data for caching WebPO tokens. Either `true` (default) or `false`

#### youtubetab (YouTube playlists, channels, feeds, etc.)
* `skip`: One or more of `webpage` (skip initial webpage download), `authcheck` (allow the download of playlists requiring authentication when no initial webpage is downloaded. This may cause unwanted behavior, see [#1122](https://github.com/yt-dlp/yt-dlp/pull/1122) for more details)

#### generic
* `key_query`: Passthrough the master m3u8 URL query to its HLS AES-128 decryption key URI if no value is provided, or else apply the query string given as `key_query=VALUE`. Note that this will have no effect if the key URI is provided via the `hls_key` extractor-arg. Does not apply to ffmpeg
* `hls_key`: An HLS AES-128 key URI *or* key (as hex), and optionally the IV (as hex), in the form of `(URI|KEY)[,IV]`; e.g. `generic:hls_key=ABCDEF1234567980,0xFEDCBA0987654321`. Passing any of these values will force usage of the native HLS downloader and override the corresponding values found in the m3u8 playlist
* `is_live`: Bypass live HLS detection and manually set `live_status` - a value of `false` will set `not_live`, any other value (or no value) will set `is_live`
* `impersonate`: Target(s) to try and impersonate with the initial webpage request; e.g. `generic:impersonate=safari,chrome-110`. Use `generic:impersonate` to impersonate any available target, and use `generic:impersonate=false` to disable impersonation (default)
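
For example, the `impersonate` argument above can be enabled like this (a sketch; the URL is a placeholder):
```
# let the generic extractor impersonate any available browser target for the initial request
yt-dlp --extractor-args "generic:impersonate" "https://example.com/some-page"
```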

#### vikichannel
* `video_types`: Types of videos to download - one or more of `episodes`, `movies`, `clips`, `trailers`

#### youtubewebarchive
* `check_all`: Try to check more at the cost of more requests. One or more of `thumbnails`, `captures`

* `vcodec`: vcodec to ignore - one or more of `h264`, `h265`, `dvh265`
* `dr`: dynamic range to ignore - one or more of `sdr`, `hdr10`, `dv`

#### instagram
* `app_id`: The value of the `X-IG-App-ID` header used for API requests. Default is the web app ID, `936619743392459`

#### niconicochannelplus
* `max_comments`: Maximum number of comments to extract - default is `120`

* `cdn`: One or more CDN IDs to use with the API call for stream URLs, e.g. `gcp_cdn`, `gs_cdn_pc_app`, `gs_cdn_mobile_web`, `gs_cdn_pc_web`

#### soundcloud
* `formats`: Formats to request from the API. Requested values should be in the format of `{protocol}_{codec}`, e.g. `hls_opus,http_aac`. The `*` character functions as a wildcard, e.g. `*_mp3`, and can be passed by itself to request all formats. Known protocols include `http`, `hls` and `hls-aes`; known codecs include `aac`, `opus` and `mp3`. Original `download` formats are always extracted. Default is `http_aac,hls_aac,http_opus,hls_opus,http_mp3,hls_mp3`
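
For instance, the wildcard syntax above can be used to request only mp3 variants (a sketch; the URL is a placeholder):
```
# request every mp3 format variant from the SoundCloud API
yt-dlp --extractor-args "soundcloud:formats=*_mp3" "https://soundcloud.com/artist/track"
```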

#### orfon (orf:on)
* `prefer_segments_playlist`: Prefer a playlist of program segments instead of a single complete video when available. If individual segments are desired, use `--concat-playlist never --extractor-args "orfon:prefer_segments_playlist"`

#### bilibili
* `prefer_multi_flv`: Prefer extracting flv formats over mp4 for older videos that still provide legacy formats

#### sonylivseries
* `sort_order`: Episode sort order for series extraction - one of `asc` (ascending, oldest first) or `desc` (descending, newest first). Default is `asc`

#### tver
* `backend`: Backend API to use for extraction - one of `streaks` (default) or `brightcove` (deprecated)

**Note**: These options may be changed/removed in the future without concern for backward compatibility

In other words, the file structure on the disk looks something like:
myplugin.py

yt-dlp looks for these `yt_dlp_plugins` namespace folders in many locations (see below) and loads in plugins from **all** of them.
Set the environment variable `YTDLP_NO_PLUGINS` to something nonempty to disable loading plugins entirely.
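
A quick way to use this when debugging (a sketch; the URL is a placeholder):
```
# run once with all plugins disabled, to rule them out as the cause of a problem
YTDLP_NO_PLUGINS=1 yt-dlp -v "https://example.com/some-video"
```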

See the [wiki for some known plugins](https://github.com/yt-dlp/yt-dlp/wiki/Plugins)

Plugins can be installed using various methods and locations.

* Plugin packages can be installed and managed using `pip`, as shown in the example below. See [yt-dlp-sample-plugins](https://github.com/yt-dlp/yt-dlp-sample-plugins) for an example package.
* Note: plugin files between plugin packages installed with pip must have unique filenames.
* Any path in `PYTHONPATH` is searched in for the `yt_dlp_plugins` namespace folder.
* Note: This does not apply for Pyinstaller builds.
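
As referenced in the first bullet above, a plugin package can be installed with `pip` like any other Python package (a sketch; installing straight from Git requires `git` to be available):
```
# install the sample plugin package directly from its repository
python3 -m pip install -U "git+https://github.com/yt-dlp/yt-dlp-sample-plugins.git"
```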

`.zip`, `.egg` and `.whl` archives containing a `yt_dlp_plugins` namespace folder in their root are also supported as plugin packages.
@@ -2183,14 +2153,14 @@ with yt_dlp.YoutubeDL(ydl_opts) as ydl:

 * **[Format Sorting](#sorting-formats)**: The default format sorting options have been changed so that higher resolution and better codecs will be now preferred instead of simply using larger bitrate. Furthermore, you can now specify the sort order using `-S`. This allows for much easier format selection than what is possible by simply using `--format` ([examples](#format-selection-examples))

-* **Merged with animelover1984/youtube-dl**: You get most of the features and improvements from [animelover1984/youtube-dl](https://github.com/animelover1984/youtube-dl) including `--write-comments`, `BiliBiliSearch`, `BilibiliChannel`, Embedding thumbnail in mp4/ogg/opus, playlist infojson etc. See [#31](https://github.com/yt-dlp/yt-dlp/pull/31) for details.
+* **Merged with animelover1984/youtube-dl**: You get most of the features and improvements from [animelover1984/youtube-dl](https://github.com/animelover1984/youtube-dl) including `--write-comments`, `BiliBiliSearch`, `BilibiliChannel`, Embedding thumbnail in mp4/ogg/opus, playlist infojson etc. Note that NicoNico livestreams are not available. See [#31](https://github.com/yt-dlp/yt-dlp/pull/31) for details.

 * **YouTube improvements**:
     * Supports Clips, Stories (`ytstories:<channel UCID>`), Search (including filters)**\***, YouTube Music Search, Channel-specific search, Search prefixes (`ytsearch:`, `ytsearchdate:`)**\***, Mixes, and Feeds (`:ytfav`, `:ytwatchlater`, `:ytsubs`, `:ythistory`, `:ytrec`, `:ytnotif`)
     * Fix for [n-sig based throttling](https://github.com/ytdl-org/youtube-dl/issues/29326) **\***
+    * Supports some (but not all) age-gated content without cookies
     * Download livestreams from the start using `--live-from-start` (*experimental*)
     * Channel URLs download all uploads of the channel, including shorts and live
-    * Support for [logging in with OAuth](https://github.com/yt-dlp/yt-dlp/wiki/Extractors#logging-in-with-oauth)

 * **Cookies from browser**: Cookies can be automatically extracted from all major web browsers using `--cookies-from-browser BROWSER[+KEYRING][:PROFILE][::CONTAINER]`
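A hedged sketch of driving the features above from the Python API instead of the CLI: `format_sort` mirrors `-S` and `cookiesfrombrowser` mirrors `--cookies-from-browser`. The concrete values are illustrative, not prescriptive.

```python
# Sketch only: embedding yt-dlp with format sorting and browser-cookie extraction.
import yt_dlp

ydl_opts = {
    # roughly equivalent to `-S "res:1080,fps"`: cap preference at 1080p, then prefer higher fps
    'format_sort': ['res:1080', 'fps'],
    # roughly equivalent to `--cookies-from-browser firefox`
    # (tuple form: browser, profile, keyring, container)
    'cookiesfrombrowser': ('firefox', None, None, None),
}

with yt_dlp.YoutubeDL(ydl_opts) as ydl:
    info = ydl.extract_info('https://www.youtube.com/watch?v=BaW_jenozKc', download=False)
    print(info['title'], info.get('format'))
```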
@@ -2232,12 +2202,12 @@ Features marked with a **\*** have been back-ported to youtube-dl

 Some of yt-dlp's default options are different from that of youtube-dl and youtube-dlc:

-* yt-dlp supports only [Python 3.9+](## "Windows 8"), and will remove support for more versions as they [become EOL](https://devguide.python.org/versions/#python-release-cycle); while [youtube-dl still supports Python 2.6+ and 3.2+](https://github.com/ytdl-org/youtube-dl/issues/30568#issue-1118238743)
+* yt-dlp supports only [Python 3.8+](## "Windows 7"), and *may* remove support for more versions as they [become EOL](https://devguide.python.org/versions/#python-release-cycle); while [youtube-dl still supports Python 2.6+ and 3.2+](https://github.com/ytdl-org/youtube-dl/issues/30568#issue-1118238743)
 * The options `--auto-number` (`-A`), `--title` (`-t`) and `--literal` (`-l`), no longer work. See [removed options](#Removed) for details
 * `avconv` is not supported as an alternative to `ffmpeg`
 * yt-dlp stores config files in slightly different locations to youtube-dl. See [CONFIGURATION](#configuration) for a list of correct locations
 * The default [output template](#output-template) is `%(title)s [%(id)s].%(ext)s`. There is no real reason for this change. This was changed before yt-dlp was ever made public and now there are no plans to change it back to `%(title)s-%(id)s.%(ext)s`. Instead, you may use `--compat-options filename`
-* The default [format sorting](#sorting-formats) is different from youtube-dl and prefers higher resolution and better codecs rather than higher bitrates. You can use the `--format-sort` option to change this to any order you prefer, or use `--compat-options format-sort` to use youtube-dl's sorting order. Older versions of yt-dlp preferred VP9 due to its broader compatibility; you can use `--compat-options prefer-vp9-sort` to revert to that format sorting preference. These two compat options cannot be used together
+* The default [format sorting](#sorting-formats) is different from youtube-dl and prefers higher resolution and better codecs rather than higher bitrates. You can use the `--format-sort` option to change this to any order you prefer, or use `--compat-options format-sort` to use youtube-dl's sorting order
 * The default format selector is `bv*+ba/b`. This means that if a combined video + audio format that is better than the best video-only format is found, the former will be preferred. Use `-f bv+ba/b` or `--compat-options format-spec` to revert this
 * Unlike youtube-dlc, yt-dlp does not allow merging multiple audio/video streams into one file by default (since this conflicts with the use of `-f bv*+ba`). If needed, this feature must be enabled using `--audio-multistreams` and `--video-multistreams`. You can also use `--compat-options multistreams` to enable both
 * `--no-abort-on-error` is enabled by default. Use `--abort-on-error` or `--compat-options abort-on-error` to abort on errors instead
@@ -2249,7 +2219,7 @@ Some of yt-dlp's default options are different from that of youtube-dl and youtu
 * Live chats (if available) are considered as subtitles. Use `--sub-langs all,-live_chat` to download all subtitles except live chat. You can also use `--compat-options no-live-chat` to prevent any live chat/danmaku from downloading
 * YouTube channel URLs download all uploads of the channel. To download only the videos in a specific tab, pass the tab's URL. If the channel does not show the requested tab, an error will be raised. Also, `/live` URLs raise an error if there are no live videos instead of silently downloading the entire channel. You may use `--compat-options no-youtube-channel-redirect` to revert all these redirections
 * Unavailable videos are also listed for YouTube playlists. Use `--compat-options no-youtube-unavailable-videos` to remove this
-* The upload dates extracted from YouTube are in UTC.
+* The upload dates extracted from YouTube are in UTC [when available](https://github.com/yt-dlp/yt-dlp/blob/89e4d86171c7b7c997c77d4714542e0383bf0db0/yt_dlp/extractor/youtube.py#L3898-L3900). Use `--compat-options no-youtube-prefer-utc-upload-date` to prefer the non-UTC upload date.
 * If `ffmpeg` is used as the downloader, the downloading and merging of formats happen in a single step when possible. Use `--compat-options no-direct-merge` to revert this
 * Thumbnail embedding in `mp4` is done with mutagen if possible. Use `--compat-options embed-thumbnail-atomicparsley` to force the use of AtomicParsley instead
 * Some internal metadata such as filenames are removed by default from the infojson. Use `--no-clean-infojson` or `--compat-options no-clean-infojson` to revert this
@@ -2262,17 +2232,15 @@ Some of yt-dlp's default options are different from that of youtube-dl and youtu
 * yt-dlp uses modern http client backends such as `requests`. Use `--compat-options prefer-legacy-http-handler` to prefer the legacy http handler (`urllib`) to be used for standard http requests.
 * The sub-modules `swfinterp`, `casefold` are removed.
 * Passing `--simulate` (or calling `extract_info` with `download=False`) no longer alters the default format selection. See [#9843](https://github.com/yt-dlp/yt-dlp/issues/9843) for details.
-* yt-dlp no longer applies the server modified time to downloaded files by default. Use `--mtime` or `--compat-options mtime-by-default` to revert this.

 For ease of use, a few more compat options are available:

 * `--compat-options all`: Use all compat options (**Do NOT use this!**)
-* `--compat-options youtube-dl`: Same as `--compat-options all,-multistreams,-playlist-match-filter,-manifest-filesize-approx,-allow-unsafe-ext,-prefer-vp9-sort`
+* `--compat-options youtube-dl`: Same as `--compat-options all,-multistreams,-playlist-match-filter,-manifest-filesize-approx,-allow-unsafe-ext`
-* `--compat-options youtube-dlc`: Same as `--compat-options all,-no-live-chat,-no-youtube-channel-redirect,-playlist-match-filter,-manifest-filesize-approx,-allow-unsafe-ext,-prefer-vp9-sort`
+* `--compat-options youtube-dlc`: Same as `--compat-options all,-no-live-chat,-no-youtube-channel-redirect,-playlist-match-filter,-manifest-filesize-approx,-allow-unsafe-ext`
-* `--compat-options 2021`: Same as `--compat-options 2022,no-certifi,filename-sanitization`
+* `--compat-options 2021`: Same as `--compat-options 2022,no-certifi,filename-sanitization,no-youtube-prefer-utc-upload-date`
 * `--compat-options 2022`: Same as `--compat-options 2023,playlist-match-filter,no-external-downloader-progress,prefer-legacy-http-handler,manifest-filesize-approx`
-* `--compat-options 2023`: Same as `--compat-options 2024,prefer-vp9-sort`
+* `--compat-options 2023`: Currently does nothing. Use this to enable all future compat options
-* `--compat-options 2024`: Same as `--compat-options mtime-by-default`. Use this to enable all future compat options

 The following compat options restore vulnerable behavior from before security patches:
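A small, hedged sketch of the two defaults discussed above when embedding yt-dlp: `bv*+ba/b` is the documented default format selector, and `compat_opts` is the programmatic counterpart of `--compat-options`. The values shown are illustrative.

```python
# Sketch only: make the default format selector explicit and opt into one compat option.
import yt_dlp

ydl_opts = {
    'format': 'bv*+ba/b',             # same as the documented default selector
    'compat_opts': ['no-live-chat'],  # e.g. skip live-chat "subtitles", like --compat-options no-live-chat
}

with yt_dlp.YoutubeDL(ydl_opts) as ydl:
    ydl.download(['https://www.youtube.com/watch?v=BaW_jenozKc'])
```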
@@ -2310,8 +2278,8 @@ While these options are redundant, they are still expected to be used due to the
     --min-views COUNT                --match-filters "view_count >=? COUNT"
     --max-views COUNT                --match-filters "view_count <=? COUNT"
     --break-on-reject                Use --break-match-filters
-    --user-agent UA                  --add-headers "User-Agent:UA"
-    --referer URL                    --add-headers "Referer:URL"
+    --user-agent UA                  --add-header "User-Agent:UA"
+    --referer URL                    --add-header "Referer:URL"
     --playlist-start NUMBER          -I NUMBER:
     --playlist-end NUMBER            -I :NUMBER
     --playlist-reverse               -I ::-1
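A hedged aside on the `--match-filters` expressions listed above: when embedding, the same filter syntax can be passed through `yt_dlp.utils.match_filter_func`. The filter string below is illustrative.

```python
# Sketch only: roughly what `--min-views 1000` now maps to, expressed via the Python API.
import yt_dlp
from yt_dlp.utils import match_filter_func

ydl_opts = {
    'match_filter': match_filter_func('view_count >=? 1000'),
}

with yt_dlp.YoutubeDL(ydl_opts) as ydl:
    ydl.download(['https://www.youtube.com/watch?v=BaW_jenozKc'])
```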
@@ -2,7 +2,6 @@
 set -e

 source ~/.local/share/pipx/venvs/pyinstaller/bin/activate
-python -m devscripts.install_deps -o --include build
 python -m devscripts.install_deps --include secretstorage --include curl-cffi
 python -m devscripts.make_lazy_extractors
 python devscripts/update-version.py -c "${channel}" -r "${origin}" "${version}"
bundle/py2exe.py (executable file, 59 lines)
@@ -0,0 +1,59 @@
+#!/usr/bin/env python3
+
+# Allow execution from anywhere
+import os
+import sys
+
+sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
+
+import warnings
+
+from py2exe import freeze
+
+from devscripts.utils import read_version
+
+VERSION = read_version()
+
+
+def main():
+    warnings.warn(
+        'py2exe builds do not support pycryptodomex and needs VC++14 to run. '
+        'It is recommended to run "pyinst.py" to build using pyinstaller instead')
+
+    freeze(
+        console=[{
+            'script': './yt_dlp/__main__.py',
+            'dest_base': 'yt-dlp',
+            'icon_resources': [(1, 'devscripts/logo.ico')],
+        }],
+        version_info={
+            'version': VERSION,
+            'description': 'A feature-rich command-line audio/video downloader',
+            'comments': 'Official repository: <https://github.com/yt-dlp/yt-dlp>',
+            'product_name': 'yt-dlp',
+            'product_version': VERSION,
+        },
+        options={
+            'bundle_files': 0,
+            'compressed': 1,
+            'optimize': 2,
+            'dist_dir': './dist',
+            'excludes': [
+                # py2exe cannot import Crypto
+                'Crypto',
+                'Cryptodome',
+                # requests >=2.32.0 breaks py2exe builds due to certifi dependency
+                'requests',
+                'urllib3',
+            ],
+            'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
+            # Modules that are only imported dynamically must be added here
+            'includes': ['yt_dlp.compat._legacy', 'yt_dlp.compat._deprecated',
+                         'yt_dlp.utils._legacy', 'yt_dlp.utils._deprecated'],
+        },
+        zipfile=None,
+    )
+
+
+if __name__ == '__main__':
+    main()
bundle/pyinstaller.py
@@ -36,9 +36,6 @@ def main():
         f'--name={name}',
         '--icon=devscripts/logo.ico',
         '--upx-exclude=vcruntime140.dll',
-        # Ref: https://github.com/yt-dlp/yt-dlp/issues/13311
-        # https://github.com/pyinstaller/pyinstaller/issues/9149
-        '--exclude-module=pkg_resources',
         '--noconfirm',
         '--additional-hooks-dir=yt_dlp/__pyinstaller',
         *opts,
devscripts/changelog_override.json
@@ -196,71 +196,5 @@
         "when": "b31b81d85f00601710d4fac590c3e4efb4133283",
         "short": "[ci] Rerun failed tests (#11143)",
         "authors": ["Grub4K"]
-    },
-    {
-        "action": "add",
-        "when": "a886cf3e900f4a2ec00af705f883539269545609",
-        "short": "[priority] **py2exe is no longer supported**\nThis release's `yt-dlp_min.exe` will be the last, and it's actually a PyInstaller-bundled executable so that yt-dlp users updating their py2exe build with `-U` will be automatically migrated. [Read more](https://github.com/yt-dlp/yt-dlp/issues/10087)"
-    },
-    {
-        "action": "add",
-        "when": "a886cf3e900f4a2ec00af705f883539269545609",
-        "short": "[priority] **Following this release, yt-dlp's Python dependencies *must* be installed using the `default` group**\nIf you're installing yt-dlp with pip/pipx or requiring yt-dlp in your own Python project, you'll need to specify `yt-dlp[default]` if you want to also install yt-dlp's optional dependencies (which were previously included by default). [Read more](https://github.com/yt-dlp/yt-dlp/pull/11255)"
-    },
-    {
-        "action": "add",
-        "when": "87884f15580910e4e0fe0e1db73508debc657471",
-        "short": "[priority] **Beginning with this release, yt-dlp's Python dependencies *must* be installed using the `default` group**\nIf you're installing yt-dlp with pip/pipx or requiring yt-dlp in your own Python project, you'll need to specify `yt-dlp[default]` if you want to also install yt-dlp's optional dependencies (which were previously included by default). [Read more](https://github.com/yt-dlp/yt-dlp/pull/11255)"
-    },
-    {
-        "action": "add",
-        "when": "d784464399b600ba9516bbcec6286f11d68974dd",
-        "short": "[priority] **The minimum *required* Python version has been raised to 3.9**\nPython 3.8 reached its end-of-life on 2024.10.07, and yt-dlp has now removed support for it. As an unfortunate side effect, the official `yt-dlp.exe` and `yt-dlp_x86.exe` binaries are no longer supported on Windows 7. [Read more](https://github.com/yt-dlp/yt-dlp/issues/10086)"
-    },
-    {
-        "action": "change",
-        "when": "914af9a0cf51c9a3f74aa88d952bee8334c67511",
-        "short": "Expand paths in `--plugin-dirs` (#11334)",
-        "authors": ["bashonly"]
-    },
-    {
-        "action": "change",
-        "when": "c29f5a7fae93a08f3cfbb6127b2faa75145b06a0",
-        "short": "[ie/generic] Do not impersonate by default (#11336)",
-        "authors": ["bashonly"]
-    },
-    {
-        "action": "change",
-        "when": "57212a5f97ce367590aaa5c3e9a135eead8f81f7",
-        "short": "[ie/vimeo] Fix API retries (#11351)",
-        "authors": ["bashonly"]
-    },
-    {
-        "action": "add",
-        "when": "52c0ffe40ad6e8404d93296f575007b05b04c686",
-        "short": "[priority] **Login with OAuth is no longer supported for YouTube**\nDue to a change made by the site, yt-dlp is no longer able to support OAuth login for YouTube. [Read more](https://github.com/yt-dlp/yt-dlp/issues/11462#issuecomment-2471703090)"
-    },
-    {
-        "action": "change",
-        "when": "76ac023ff02f06e8c003d104f02a03deeddebdcd",
-        "short": "[ie/youtube:tab] Improve shorts title extraction (#11997)",
-        "authors": ["bashonly", "d3d9"]
-    },
-    {
-        "action": "add",
-        "when": "88eb1e7a9a2720ac89d653c0d0e40292388823bb",
-        "short": "[priority] **New option `--preset-alias`/`-t` has been added**\nThis provides convenient predefined aliases for common use cases. Available presets include `mp4`, `mp3`, `mkv`, `aac`, and `sleep`. See [the README](https://github.com/yt-dlp/yt-dlp/blob/master/README.md#preset-aliases) for more details."
-    },
-    {
-        "action": "remove",
-        "when": "d596824c2f8428362c072518856065070616e348"
-    },
-    {
-        "action": "remove",
-        "when": "7b81634fb1d15999757e7a9883daa6ef09ea785b"
-    },
-    {
-        "action": "remove",
-        "when": "500761e41acb96953a5064e951d41d190c287e46"
     }
 ]
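Since the override file shown above is plain JSON, its entries (each with `action`, `when` and, optionally, `short`/`authors`) can be inspected with the standard library alone. A hedged sketch, assuming it is run from a yt-dlp checkout:

```python
# Sketch only: list the "[priority]" notices from devscripts/changelog_override.json.
import json
from pathlib import Path

entries = json.loads(Path('devscripts/changelog_override.json').read_text(encoding='utf-8'))
for entry in entries:
    if entry['action'] == 'add' and entry.get('short', '').startswith('[priority]'):
        # print the abbreviated commit hash and the first line of the notice
        print(entry['when'][:9], entry['short'].splitlines()[0])
```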
devscripts/generate_aes_testdata.py
@@ -11,12 +11,13 @@ import codecs
 import subprocess

 from yt_dlp.aes import aes_encrypt, key_expansion
+from yt_dlp.utils import intlist_to_bytes

 secret_msg = b'Secret message goes here'


 def hex_str(int_list):
-    return codecs.encode(bytes(int_list), 'hex')
+    return codecs.encode(intlist_to_bytes(int_list), 'hex')


 def openssl_encode(algo, key, iv):
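A brief aside on why the replacement above is safe: `bytes(iterable_of_ints)` produces exactly what `intlist_to_bytes` used to, which is also why the banned-API list later in this diff says to use `bytes` instead.

```python
# Tiny demonstration of the equivalence relied on above.
import codecs

int_list = [0x53, 0x65, 0x63, 0x72, 0x65, 0x74]
assert bytes(int_list) == b'Secret'
print(codecs.encode(bytes(int_list), 'hex'))  # b'536563726574'
```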
devscripts/make_changelog.py
@@ -71,13 +71,14 @@ class CommitGroup(enum.Enum):
     def get(cls, value: str) -> tuple[CommitGroup | None, str | None]:
         group, _, subgroup = (group.strip().lower() for group in value.partition('/'))

-        if result := cls.group_lookup().get(group):
-            return result, subgroup or None
-
-        if subgroup:
-            return None, value
-
-        return cls.subgroup_lookup().get(group), group or None
+        result = cls.group_lookup().get(group)
+        if not result:
+            if subgroup:
+                return None, value
+            subgroup = group
+            result = cls.subgroup_lookup().get(subgroup)
+
+        return result, subgroup or None


 @dataclass
@@ -135,7 +136,8 @@ class Changelog:
                     first = False
                     yield '\n<details><summary><h3>Changelog</h3></summary>\n'

-            if group := groups[item]:
+            group = groups[item]
+            if group:
                 yield self.format_module(item.value, group)

         if self._collapsible:
@@ -251,7 +253,7 @@ class CommitRange:
     ''', re.VERBOSE | re.DOTALL)
     EXTRACTOR_INDICATOR_RE = re.compile(r'(?:Fix|Add)\s+Extractors?', re.IGNORECASE)
     REVERT_RE = re.compile(r'(?:\[[^\]]+\]\s+)?(?i:Revert)\s+([\da-f]{40})')
-    FIXES_RE = re.compile(r'(?i:(?:bug\s*)?fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Improve)\s+([\da-f]{40})')
+    FIXES_RE = re.compile(r'(?i:Fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Revert|Improve)\s+([\da-f]{40})')
     UPSTREAM_MERGE_RE = re.compile(r'Update to ytdl-commit-([\da-f]+)')

     def __init__(self, start, end, default_author=None):
@@ -285,16 +287,11 @@ class CommitRange:
             short = next(lines)
             skip = short.startswith('Release ') or short == '[version] update'

-            fix_commitish = None
-            if match := self.FIXES_RE.search(short):
-                fix_commitish = match.group(1)
-
             authors = [default_author] if default_author else []
             for line in iter(lambda: next(lines), self.COMMIT_SEPARATOR):
-                if match := self.AUTHOR_INDICATOR_RE.match(line):
+                match = self.AUTHOR_INDICATOR_RE.match(line)
+                if match:
                     authors = sorted(map(str.strip, line[match.end():].split(',')), key=str.casefold)
-                if not fix_commitish and (match := self.FIXES_RE.fullmatch(line)):
-                    fix_commitish = match.group(1)

             commit = Commit(commit_hash, short, authors)
             if skip and (self._start or not i):
@@ -304,17 +301,21 @@ class CommitRange:
                 logger.debug(f'Reached Release commit, breaking: {commit}')
                 break

-            if match := self.REVERT_RE.fullmatch(commit.short):
-                reverts[match.group(1)] = commit
+            revert_match = self.REVERT_RE.fullmatch(commit.short)
+            if revert_match:
+                reverts[revert_match.group(1)] = commit
                 continue

-            if fix_commitish:
-                fixes[fix_commitish].append(commit)
+            fix_match = self.FIXES_RE.search(commit.short)
+            if fix_match:
+                commitish = fix_match.group(1)
+                fixes[commitish].append(commit)

             commits[commit.hash] = commit

         for commitish, revert_commit in reverts.items():
-            if reverted := commits.pop(commitish, None):
+            reverted = commits.pop(commitish, None)
+            if reverted:
                 logger.debug(f'{commitish} fully reverted {reverted}')
             else:
                 commits[revert_commit.hash] = revert_commit
@@ -460,7 +461,8 @@ def create_changelog(args):

     logger.info(f'Loaded {len(commits)} commits')

-    if new_contributors := get_new_contributors(args.contributors_path, commits):
+    new_contributors = get_new_contributors(args.contributors_path, commits)
+    if new_contributors:
         if args.contributors:
             write_file(args.contributors_path, '\n'.join(new_contributors) + '\n', mode='a')
         logger.info(f'New contributors: {", ".join(new_contributors)}')
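A hedged illustration of the fix-detection change above: on the master side, commit *body* lines are scanned for a "Fix/Improve <full hash>" marker (the short subject is checked separately), and the regex no longer matches "Revert" since reverts are handled by `REVERT_RE`. The pattern below is copied verbatim from the master side of the diff; the sample line is made up.

```python
import re

FIXES_RE = re.compile(r'(?i:(?:bug\s*)?fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Improve)\s+([\da-f]{40})')

line = 'Fixes bug in 0123456789abcdef0123456789abcdef01234567'
match = FIXES_RE.fullmatch(line)
assert match and match.group(1) == '0123456789abcdef0123456789abcdef01234567'
```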
devscripts/make_issue_template.py
@@ -11,13 +11,11 @@ import re

 from devscripts.utils import get_filename_args, read_file, write_file

-VERBOSE = '''
+VERBOSE_TMPL = '''
 - type: checkboxes
   id: verbose
   attributes:
     label: Provide verbose output that clearly demonstrates the problem
-    description: |
-      This is mandatory unless absolutely impossible to provide. If you are unable to provide the output, please explain why.
     options:
       - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
         required: true
@@ -34,38 +32,45 @@ VERBOSE = '''
       placeholder: |
         [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
-        [debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
-        [debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
-        [debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
+        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
+        [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
+        [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Request Handlers: urllib, requests, websockets, curl_cffi
-        [debug] Loaded 1838 extractors
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
         yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
         [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
-'''.strip()

-NO_SKIP = '''
 - type: markdown
   attributes:
     value: |
-      > [!IMPORTANT]
-      > Not providing the required (*) information or removing the template will result in your issue being closed and ignored.
+      > [!CAUTION]
+      > ### GitHub is experiencing a high volume of malicious spam comments.
+      > ### If you receive any replies asking you download a file, do NOT follow the download links!
+      >
+      > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
+'''.strip()
+
+NO_SKIP = '''
+- type: checkboxes
+  attributes:
+    label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
+    description: Fill all fields even if you think it is irrelevant for the issue
+    options:
+      - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\\* field
+        required: true
 '''.strip()


 def main():
-    fields = {
-        'no_skip': NO_SKIP,
-        'verbose': VERBOSE,
-        'verbose_optional': re.sub(r'(\n\s+validations:)?\n\s+required: true', '', VERBOSE),
-    }
+    fields = {'no_skip': NO_SKIP}
+    fields['verbose'] = VERBOSE_TMPL % fields
+    fields['verbose_optional'] = re.sub(r'(\n\s+validations:)?\n\s+required: true', '', fields['verbose'])

     infile, outfile = get_filename_args(has_infile=True)
     write_file(outfile, read_file(infile) % fields)
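A hedged sketch of the `verbose_optional` derivation used above: the same `re.sub` call strips the trailing `validations: required: true` block from a template string. The template below is a trimmed-down stand-in, not the real one.

```python
import re

VERBOSE = '''\
- type: checkboxes
  id: verbose
  attributes:
    label: Provide verbose output that clearly demonstrates the problem
  validations:
    required: true
'''.strip()

fields = {
    'verbose': VERBOSE,
    # same transformation as in the script: drop the "required: true" constraint
    'verbose_optional': re.sub(r'(\n\s+validations:)?\n\s+required: true', '', VERBOSE),
}
print(fields['verbose_optional'])
```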
devscripts/make_lazy_extractors.py
@@ -2,6 +2,7 @@

 # Allow direct execution
 import os
+import shutil
 import sys

 sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
@@ -10,9 +11,6 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 from inspect import getsource

 from devscripts.utils import get_filename_args, read_file, write_file
-from yt_dlp.extractor import import_extractors
-from yt_dlp.extractor.common import InfoExtractor, SearchInfoExtractor
-from yt_dlp.globals import extractors

 NO_ATTR = object()
 STATIC_CLASS_PROPERTIES = [
@@ -36,12 +34,17 @@ MODULE_TEMPLATE = read_file('devscripts/lazy_load_template.py')


 def main():
-    os.environ['YTDLP_NO_PLUGINS'] = 'true'
-    os.environ['YTDLP_NO_LAZY_EXTRACTORS'] = 'true'
-
     lazy_extractors_filename = get_filename_args(default_outfile='yt_dlp/extractor/lazy_extractors.py')
+    if os.path.exists(lazy_extractors_filename):
+        os.remove(lazy_extractors_filename)

-    import_extractors()
+    _ALL_CLASSES = get_all_ies()  # Must be before import
+
+    import yt_dlp.plugins
+    from yt_dlp.extractor.common import InfoExtractor, SearchInfoExtractor
+
+    # Filter out plugins
+    _ALL_CLASSES = [cls for cls in _ALL_CLASSES if not cls.__module__.startswith(f'{yt_dlp.plugins.PACKAGE_NAME}.')]

     DummyInfoExtractor = type('InfoExtractor', (InfoExtractor,), {'IE_NAME': NO_ATTR})
     module_src = '\n'.join((
@@ -49,12 +52,26 @@ def main():
         '    _module = None',
         *extra_ie_code(DummyInfoExtractor),
         '\nclass LazyLoadSearchExtractor(LazyLoadExtractor):\n    pass\n',
-        *build_ies(list(extractors.value.values()), (InfoExtractor, SearchInfoExtractor), DummyInfoExtractor),
+        *build_ies(_ALL_CLASSES, (InfoExtractor, SearchInfoExtractor), DummyInfoExtractor),
     ))

     write_file(lazy_extractors_filename, f'{module_src}\n')


+def get_all_ies():
+    PLUGINS_DIRNAME = 'ytdlp_plugins'
+    BLOCKED_DIRNAME = f'{PLUGINS_DIRNAME}_blocked'
+    if os.path.exists(PLUGINS_DIRNAME):
+        # os.rename cannot be used, e.g. in Docker. See https://github.com/yt-dlp/yt-dlp/pull/4958
+        shutil.move(PLUGINS_DIRNAME, BLOCKED_DIRNAME)
+    try:
+        from yt_dlp.extractor.extractors import _ALL_CLASSES
+    finally:
+        if os.path.exists(BLOCKED_DIRNAME):
+            shutil.move(BLOCKED_DIRNAME, PLUGINS_DIRNAME)
+    return _ALL_CLASSES
+
+
 def extra_ie_code(ie, base=None):
     for var in STATIC_CLASS_PROPERTIES:
         val = getattr(ie, var)
@@ -75,7 +92,7 @@ def build_ies(ies, bases, attr_base):
         if ie in ies:
             names.append(ie.__name__)

-    yield '\n_CLASS_LOOKUP = {%s}' % ', '.join(f'{name!r}: {name}' for name in names)
+    yield f'\n_ALL_CLASSES = [{", ".join(names)}]'


 def sort_ies(ies, ignored_bases):
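A hedged illustration of the output-format change in `build_ies` above: for extractor classes named `FooIE` and `BarIE` (stand-in names), master's generator emits a dict-based lookup while the 2024.10.07 generator emitted a plain list. The two expressions are copied from the respective sides of the diff.

```python
names = ['FooIE', 'BarIE']  # stand-ins for real lazy extractor class names

master_line = '\n_CLASS_LOOKUP = {%s}' % ', '.join(f'{name!r}: {name}' for name in names)
old_line = f'\n_ALL_CLASSES = [{", ".join(names)}]'

print(master_line)  # -> _CLASS_LOOKUP = {'FooIE': FooIE, 'BarIE': BarIE}
print(old_line)     # -> _ALL_CLASSES = [FooIE, BarIE]
```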
devscripts/make_supportedsites.py
@@ -10,21 +10,10 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 from devscripts.utils import get_filename_args, write_file
 from yt_dlp.extractor import list_extractor_classes

-TEMPLATE = '''\
-# Supported sites
-
-Below is a list of all extractors that are currently included with yt-dlp.
-If a site is not listed here, it might still be supported by yt-dlp's embed extraction or generic extractor.
-Not all sites listed here are guaranteed to work; websites are constantly changing and sometimes this breaks yt-dlp's support for them.
-The only reliable way to check if a site is supported is to try it.
-
-{ie_list}
-'''
-

 def main():
     out = '\n'.join(ie.description() for ie in list_extractor_classes() if ie.IE_DESC is not False)
-    write_file(get_filename_args(), TEMPLATE.format(ie_list=out))
+    write_file(get_filename_args(), f'# Supported sites\n{out}\n')


 if __name__ == '__main__':
devscripts/run_tests.py
@@ -16,7 +16,7 @@ fix_test_name = functools.partial(re.compile(r'IE(_all|_\d+)?$').sub, r'\1')
 def parse_args():
     parser = argparse.ArgumentParser(description='Run selected yt-dlp tests')
     parser.add_argument(
-        'test', help='an extractor test, test path, or one of "core" or "download"', nargs='*')
+        'test', help='a extractor tests, or one of "core" or "download"', nargs='*')
     parser.add_argument(
         '-k', help='run a test matching EXPRESSION. Same as "pytest -k"', metavar='EXPRESSION')
     parser.add_argument(
@@ -25,9 +25,9 @@ def parse_args():


 def run_tests(*tests, pattern=None, ci=False):
-    # XXX: hatch uses `tests` if no arguments are passed
-    run_core = 'core' in tests or 'tests' in tests or (not pattern and not tests)
+    run_core = 'core' in tests or (not pattern and not tests)
     run_download = 'download' in tests
+    tests = list(map(fix_test_name, tests))

     pytest_args = args.pytest_args or os.getenv('HATCH_TEST_ARGS', '')
     arguments = ['pytest', '-Werror', '--tb=short', *shlex.split(pytest_args)]
@@ -41,9 +41,7 @@ def run_tests(*tests, pattern=None, ci=False):
         arguments.extend(['-m', 'download'])
     else:
         arguments.extend(
-            test if '/' in test
-            else f'test/test_download.py::TestDownload::test_{fix_test_name(test)}'
-            for test in tests)
+            f'test/test_download.py::TestDownload::test_{test}' for test in tests)

     print(f'Running {arguments}', flush=True)
     try:
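A hedged sketch of the selection logic above as it reads on the master side: a bare extractor name is rewritten into a pytest node id, while anything containing "/" is passed through as an explicit test path. `fix_test_name` is copied from the hunk context; the wrapper function name is illustrative.

```python
import functools
import re

fix_test_name = functools.partial(re.compile(r'IE(_all|_\d+)?$').sub, r'\1')  # from the diff context

def to_pytest_arg(test):  # illustrative helper mirroring the master-side expression
    return (
        test if '/' in test
        else f'test/test_download.py::TestDownload::test_{fix_test_name(test)}')

print(to_pytest_arg('YoutubeIE'))                     # test/test_download.py::TestDownload::test_Youtube
print(to_pytest_arg('test/test_utils.py::TestUtil'))  # passed through unchanged
```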
pyproject.toml
@@ -13,7 +13,7 @@ maintainers = [
 ]
 description = "A feature-rich command-line audio/video downloader"
 readme = "README.md"
-requires-python = ">=3.9"
+requires-python = ">=3.8"
 keywords = [
     "youtube-dl",
     "video-downloader",
@@ -29,11 +29,11 @@ classifiers = [
     "Environment :: Console",
     "Programming Language :: Python",
     "Programming Language :: Python :: 3 :: Only",
+    "Programming Language :: Python :: 3.8",
     "Programming Language :: Python :: 3.9",
     "Programming Language :: Python :: 3.10",
     "Programming Language :: Python :: 3.11",
     "Programming Language :: Python :: 3.12",
-    "Programming Language :: Python :: 3.13",
     "Programming Language :: Python :: Implementation",
     "Programming Language :: Python :: Implementation :: CPython",
     "Programming Language :: Python :: Implementation :: PyPy",
@@ -41,10 +41,7 @@ classifiers = [
     "Operating System :: OS Independent",
 ]
 dynamic = ["version"]
-dependencies = []
-
-[project.optional-dependencies]
-default = [
+dependencies = [
     "brotli; implementation_name=='cpython'",
     "brotlicffi; implementation_name!='cpython'",
     "certifi",
@@ -54,8 +51,12 @@ default = [
     "urllib3>=1.26.17,<3",
     "websockets>=13.0",
 ]
+
+[project.optional-dependencies]
+default = []
 curl-cffi = [
-    "curl-cffi>=0.5.10,!=0.6.*,!=0.7.*,!=0.8.*,!=0.9.*,<0.11; implementation_name=='cpython'",
+    "curl-cffi==0.5.10; os_name=='nt' and implementation_name=='cpython'",
+    "curl-cffi>=0.5.10,!=0.6.*,<0.7.2; os_name!='nt' and implementation_name=='cpython'",
 ]
 secretstorage = [
     "cffi",
@@ -65,7 +66,7 @@ build = [
     "build",
     "hatchling",
     "pip",
-    "setuptools>=71.0.2,<81",  # See https://github.com/pyinstaller/pyinstaller/issues/9149
+    "setuptools>=71.0.2",  # 71.0.0 broke pyinstaller
     "wheel",
 ]
 dev = [
@@ -75,14 +76,17 @@ dev = [
 ]
 static-analysis = [
     "autopep8~=2.0",
-    "ruff~=0.11.0",
+    "ruff~=0.6.0",
 ]
 test = [
     "pytest~=8.1",
     "pytest-rerunfailures~=14.0",
 ]
 pyinstaller = [
-    "pyinstaller>=6.13.0",  # Windows temp cleanup fixed in 6.13.0
+    "pyinstaller>=6.10.0",  # Windows temp cleanup fixed in 6.10.0
+]
+py2exe = [
+    "py2exe>=0.12",
 ]

 [project.urls]
@@ -168,11 +172,13 @@ run-cov = "echo Code coverage not implemented && exit 1"

 [[tool.hatch.envs.hatch-test.matrix]]
 python = [
+    "3.8",
     "3.9",
     "3.10",
     "3.11",
     "3.12",
-    "3.13",
+    "pypy3.8",
+    "pypy3.9",
     "pypy3.10",
 ]

@@ -185,7 +191,6 @@ ignore = [
     "E501",  # line-too-long
     "E731",  # lambda-assignment
     "E741",  # ambiguous-variable-name
-    "UP031",  # printf-string-formatting
     "UP036",  # outdated-version-block
     "B006",  # mutable-argument-default
     "B008",  # function-call-in-default-argument
@@ -194,7 +199,6 @@ ignore = [
     "B023",  # function-uses-loop-variable (false positives)
     "B028",  # no-explicit-stacklevel
     "B904",  # raise-without-from-inside-except
-    "A005",  # stdlib-module-shadowing
     "C401",  # unnecessary-generator-set
     "C402",  # unnecessary-generator-dict
     "PIE790",  # unnecessary-placeholder
@@ -259,6 +263,9 @@ select = [
     "A002",  # builtin-argument-shadowing
     "C408",  # unnecessary-collection-call
 ]
+"yt_dlp/jsinterp.py" = [
+    "UP031",  # printf-string-formatting
+]

 [tool.ruff.lint.isort]
 known-first-party = [
@@ -311,16 +318,6 @@ banned-from = [
 "yt_dlp.compat.compat_urllib_parse_urlparse".msg = "Use `urllib.parse.urlparse` instead."
 "yt_dlp.compat.compat_shlex_quote".msg = "Use `yt_dlp.utils.shell_quote` instead."
 "yt_dlp.utils.error_to_compat_str".msg = "Use `str` instead."
-"yt_dlp.utils.bytes_to_intlist".msg = "Use `list` instead."
-"yt_dlp.utils.intlist_to_bytes".msg = "Use `bytes` instead."
-"yt_dlp.utils.decodeArgument".msg = "Do not use"
-"yt_dlp.utils.decodeFilename".msg = "Do not use"
-"yt_dlp.utils.encodeFilename".msg = "Do not use"
-"yt_dlp.compat.compat_os_name".msg = "Use `os.name` instead."
-"yt_dlp.compat.compat_realpath".msg = "Use `os.path.realpath` instead."
-"yt_dlp.compat.functools".msg = "Use `functools` instead."
-"yt_dlp.utils.decodeOption".msg = "Do not use"
-"yt_dlp.utils.compiled_regex_type".msg = "Use `re.Pattern` instead."

 [tool.autopep8]
 max_line_length = 120
@@ -383,14 +380,9 @@ select = [
     "W391",
     "W504",
 ]
-exclude = "*/extractor/lazy_extractors.py,*venv*,*/test/testdata/sigs/player-*.js,.idea,.vscode"

 [tool.pytest.ini_options]
-addopts = [
-    "-ra",  # summary: all except passed
-    "--verbose",
-    "--strict-markers",
-]
+addopts = "-ra -v --strict-markers"
 markers = [
     "download",
 ]
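A hedged aside on the dependency-group change above: with the master layout, the runtime dependencies live under the `default` extra, so an installed copy can be inspected from Python. This assumes yt-dlp is installed in the current environment; the printed values are illustrative.

```python
# Sketch only: list declared extras and a few requirement strings of an installed yt-dlp.
from importlib.metadata import metadata

md = metadata('yt-dlp')
print(md.get_all('Provides-Extra'))      # e.g. ['default', 'curl-cffi', 'secretstorage', ...]
print((md.get_all('Requires-Dist') or [])[:5])  # requirement strings, many guarded by `extra == "..."` markers
```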
setup.cfg
@@ -16,7 +16,7 @@ remove-unused-variables = true

 [tox:tox]
 skipsdist = true
-envlist = py{39,310,311,312,313},pypy310
+envlist = py{38,39,310,311,312},pypy{38,39,310}
 skip_missing_interpreters = true

 [testenv]  # tox
@@ -29,7 +29,7 @@ setenv =


 [isort]
-py_version = 39
+py_version = 38
 multi_line_output = VERTICAL_HANGING_INDENT
 line_length = 80
 reverse_relative = true
supportedsites.md
@@ -1,15 +1,6 @@
 # Supported sites
-
-Below is a list of all extractors that are currently included with yt-dlp.
-If a site is not listed here, it might still be supported by yt-dlp's embed extraction or generic extractor.
-Not all sites listed here are guaranteed to work; websites are constantly changing and sometimes this breaks yt-dlp's support for them.
-The only reliable way to check if a site is supported is to try it.
-
- - **10play**: [*10play*](## "netrc machine")
- - **10play:season**
  - **17live**
  - **17live:clip**
- - **17live:vod**
  - **1News**: 1news.co.nz article videos
  - **1tv**: Первый канал
  - **20min**
@ -54,6 +45,10 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **aenetworks:collection**
|
- **aenetworks:collection**
|
||||||
- **aenetworks:show**
|
- **aenetworks:show**
|
||||||
- **AeonCo**
|
- **AeonCo**
|
||||||
|
- **afreecatv**: [*afreecatv*](## "netrc machine") afreecatv.com
|
||||||
|
- **afreecatv:catchstory**: [*afreecatv*](## "netrc machine") afreecatv.com catch story
|
||||||
|
- **afreecatv:live**: [*afreecatv*](## "netrc machine") afreecatv.com livestreams
|
||||||
|
- **afreecatv:user**
|
||||||
- **AirTV**
|
- **AirTV**
|
||||||
- **AitubeKZVideo**
|
- **AitubeKZVideo**
|
||||||
- **AliExpressLive**
|
- **AliExpressLive**
|
||||||
@ -138,8 +133,6 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **Bandcamp:album**
|
- **Bandcamp:album**
|
||||||
- **Bandcamp:user**
|
- **Bandcamp:user**
|
||||||
- **Bandcamp:weekly**
|
- **Bandcamp:weekly**
|
||||||
- **Bandlab**
|
|
||||||
- **BandlabPlaylist**
|
|
||||||
- **BannedVideo**
|
- **BannedVideo**
|
||||||
- **bbc**: [*bbc*](## "netrc machine") BBC
|
- **bbc**: [*bbc*](## "netrc machine") BBC
|
||||||
- **bbc.co.uk**: [*bbc*](## "netrc machine") BBC iPlayer
|
- **bbc.co.uk**: [*bbc*](## "netrc machine") BBC iPlayer
|
||||||
@ -180,7 +173,6 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **BilibiliCheese**
|
- **BilibiliCheese**
|
||||||
- **BilibiliCheeseSeason**
|
- **BilibiliCheeseSeason**
|
||||||
- **BilibiliCollectionList**
|
- **BilibiliCollectionList**
|
||||||
- **BiliBiliDynamic**
|
|
||||||
- **BilibiliFavoritesList**
|
- **BilibiliFavoritesList**
|
||||||
- **BiliBiliPlayer**
|
- **BiliBiliPlayer**
|
||||||
- **BilibiliPlaylist**
|
- **BilibiliPlaylist**
|
||||||
@ -202,8 +194,7 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **blerp**
|
- **blerp**
|
||||||
- **blogger.com**
|
- **blogger.com**
|
||||||
- **Bloomberg**
|
- **Bloomberg**
|
||||||
- **Bluesky**
|
- **BokeCC**
|
||||||
- **BokeCC**: CC视频
|
|
||||||
- **BongaCams**
|
- **BongaCams**
|
||||||
- **Boosty**
|
- **Boosty**
|
||||||
- **BostonGlobe**
|
- **BostonGlobe**
|
||||||
@ -227,7 +218,6 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **bt:vestlendingen**: Bergens Tidende - Vestlendingen
|
- **bt:vestlendingen**: Bergens Tidende - Vestlendingen
|
||||||
- **Bundesliga**
|
- **Bundesliga**
|
||||||
- **Bundestag**
|
- **Bundestag**
|
||||||
- **BunnyCdn**
|
|
||||||
- **BusinessInsider**
|
- **BusinessInsider**
|
||||||
- **BuzzFeed**
|
- **BuzzFeed**
|
||||||
- **BYUtv**: (**Currently broken**)
|
- **BYUtv**: (**Currently broken**)
|
||||||
@ -246,8 +236,8 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **CanalAlpha**
|
- **CanalAlpha**
|
||||||
- **canalc2.tv**
|
- **canalc2.tv**
|
||||||
- **Canalplus**: mycanal.fr and piwiplus.fr
|
- **Canalplus**: mycanal.fr and piwiplus.fr
|
||||||
- **Canalsurmas**
|
|
||||||
- **CaracolTvPlay**: [*caracoltv-play*](## "netrc machine")
|
- **CaracolTvPlay**: [*caracoltv-play*](## "netrc machine")
|
||||||
|
- **CartoonNetwork**
|
||||||
- **cbc.ca**
|
- **cbc.ca**
|
||||||
- **cbc.ca:player**
|
- **cbc.ca:player**
|
||||||
- **cbc.ca:player:playlist**
|
- **cbc.ca:player:playlist**
|
||||||
@ -261,10 +251,9 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **cbsnews:livevideo**: CBS News Live Videos
|
- **cbsnews:livevideo**: CBS News Live Videos
|
||||||
- **cbssports**: (**Currently broken**)
|
- **cbssports**: (**Currently broken**)
|
||||||
- **cbssports:embed**: (**Currently broken**)
|
- **cbssports:embed**: (**Currently broken**)
|
||||||
- **CCMA**: 3Cat, TV3 and Catalunya Ràdio
|
- **CCMA**
|
||||||
- **CCTV**: 央视网
|
- **CCTV**: 央视网
|
||||||
- **CDA**: [*cdapl*](## "netrc machine")
|
- **CDA**: [*cdapl*](## "netrc machine")
|
||||||
- **CDAFolder**
|
|
||||||
- **Cellebrite**
|
- **Cellebrite**
|
||||||
- **CeskaTelevize**
|
- **CeskaTelevize**
|
||||||
- **CGTN**
|
- **CGTN**
|
||||||
@ -294,10 +283,12 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **cmt.com**: (**Currently broken**)
|
- **cmt.com**: (**Currently broken**)
|
||||||
- **CNBCVideo**
|
- **CNBCVideo**
|
||||||
- **CNN**
|
- **CNN**
|
||||||
|
- **CNNArticle**
|
||||||
|
- **CNNBlogs**
|
||||||
- **CNNIndonesia**
|
- **CNNIndonesia**
|
||||||
- **ComedyCentral**
|
- **ComedyCentral**
|
||||||
- **ComedyCentralTV**
|
- **ComedyCentralTV**
|
||||||
- **ConanClassic**: (**Currently broken**)
|
- **ConanClassic**
|
||||||
- **CondeNast**: Condé Nast media group: Allure, Architectural Digest, Ars Technica, Bon Appétit, Brides, Condé Nast, Condé Nast Traveler, Details, Epicurious, GQ, Glamour, Golf Digest, SELF, Teen Vogue, The New Yorker, Vanity Fair, Vogue, W Magazine, WIRED
- **CONtv**
- **CookingChannel**
@@ -314,18 +305,21 @@ The only reliable way to check if a site is supported is to try it.
- **CrowdBunker**
- **CrowdBunkerChannel**
- **Crtvg**
+- **crunchyroll**: [*crunchyroll*](## "netrc machine")
+- **crunchyroll:artist**: [*crunchyroll*](## "netrc machine")
+- **crunchyroll:music**: [*crunchyroll*](## "netrc machine")
+- **crunchyroll:playlist**: [*crunchyroll*](## "netrc machine")
- **CSpan**: C-SPAN
- **CSpanCongress**
- **CtsNews**: 華視新聞
- **CTV**
- **CTVNews**
-- **cu.ntv.co.jp**: 日テレ無料TADA!
+- **cu.ntv.co.jp**: Nippon Television Network
- **CultureUnplugged**
- **curiositystream**: [*curiositystream*](## "netrc machine")
- **curiositystream:collections**: [*curiositystream*](## "netrc machine")
- **curiositystream:series**: [*curiositystream*](## "netrc machine")
-- **cwtv**
+- **CWTV**
-- **cwtv:movie**
- **Cybrary**: [*cybrary*](## "netrc machine")
- **CybraryCourse**: [*cybrary*](## "netrc machine")
- **DacastPlaylist**
@@ -349,6 +343,8 @@ The only reliable way to check if a site is supported is to try it.
- **daystar:clip**
- **DBTV**
- **DctpTv**
+- **DeezerAlbum**
+- **DeezerPlaylist**
- **democracynow**
- **DestinationAmerica**
- **DetikEmbed**
@@ -358,7 +354,6 @@ The only reliable way to check if a site is supported is to try it.
- **DigitalConcertHall**: [*digitalconcerthall*](## "netrc machine") DigitalConcertHall extractor
- **DigitallySpeaking**
- **Digiteka**
-- **Digiview**
- **DiscogsReleasePlaylist**
- **DiscoveryLife**
- **DiscoveryNetworksDe**
@@ -381,7 +376,6 @@ The only reliable way to check if a site is supported is to try it.
- **Dropbox**
- **Dropout**: [*dropout*](## "netrc machine")
- **DropoutSeason**
-- **DrTalks**
- **DrTuber**
- **drtv**
- **drtv:live**
@@ -395,15 +389,11 @@ The only reliable way to check if a site is supported is to try it.
- **dvtv**: http://video.aktualne.cz/
- **dw**: (**Currently broken**)
- **dw:article**: (**Currently broken**)
-- **dzen.ru**: Дзен (dzen) formerly Яндекс.Дзен (Yandex Zen)
-- **dzen.ru:channel**
- **EaglePlatform**
- **EbaumsWorld**
- **Ebay**
- **egghead:course**: egghead.io course
- **egghead:lesson**: egghead.io lesson
-- **eggs:artist**
-- **eggs:single**
- **EinsUndEinsTV**: [*1und1tv*](## "netrc machine")
- **EinsUndEinsTVLive**: [*1und1tv*](## "netrc machine")
- **EinsUndEinsTVRecordings**: [*1und1tv*](## "netrc machine")
@@ -475,12 +465,11 @@ The only reliable way to check if a site is supported is to try it.
- **FoxNewsVideo**
- **FoxSports**
- **fptplay**: fptplay.vn
-- **FrancaisFacile**
- **FranceCulture**
- **FranceInter**
-- **francetv**
+- **FranceTV**
-- **francetv:site**
- **francetvinfo.fr**
+- **FranceTVSite**
- **Freesound**
- **freespeech.org**
- **freetv:series**
@@ -489,6 +478,9 @@ The only reliable way to check if a site is supported is to try it.
- **FrontendMastersCourse**: [*frontendmasters*](## "netrc machine")
- **FrontendMastersLesson**: [*frontendmasters*](## "netrc machine")
- **FujiTVFODPlus7**
+- **Funimation**: [*funimation*](## "netrc machine")
+- **funimation:page**: [*funimation*](## "netrc machine")
+- **funimation:show**: [*funimation*](## "netrc machine")
- **Funk**
- **Funker530**
- **Fux**
@@ -496,7 +488,6 @@ The only reliable way to check if a site is supported is to try it.
- **Gab**
- **GabTV**
- **Gaia**: [*gaia*](## "netrc machine")
-- **GameDevTVDashboard**: [*gamedevtv*](## "netrc machine")
- **GameJolt**
- **GameJoltCommunity**
- **GameJoltGame**
@@ -512,7 +503,7 @@ The only reliable way to check if a site is supported is to try it.
- **GediDigital**
- **gem.cbc.ca**: [*cbcgem*](## "netrc machine")
- **gem.cbc.ca:live**
-- **gem.cbc.ca:playlist**: [*cbcgem*](## "netrc machine")
+- **gem.cbc.ca:playlist**
- **Genius**
- **GeniusLyrics**
- **Germanupa**: germanupa.de
@@ -575,7 +566,9 @@ The only reliable way to check if a site is supported is to try it.
- **HollywoodReporterPlaylist**
- **Holodex**
- **HotNewHipHop**: (**Currently broken**)
-- **hotstar**: JioHotstar
+- **hotstar**
+- **hotstar:playlist**
+- **hotstar:season**
- **hotstar:series**
- **hrfernsehen**
- **HRTi**: [*hrti*](## "netrc machine")
@@ -588,7 +581,7 @@ The only reliable way to check if a site is supported is to try it.
- **Hungama**
- **HungamaAlbumPlaylist**
- **HungamaSong**
-- **huya:live**: 虎牙直播
+- **huya:live**: huya.com
- **huya:video**: 虎牙视频
- **Hypem**
- **Hytale**
@@ -612,10 +605,10 @@ The only reliable way to check if a site is supported is to try it.
- **Inc**
- **IndavideoEmbed**
- **InfoQ**
-- **Instagram**
+- **Instagram**: [*instagram*](## "netrc machine")
-- **instagram:story**
+- **instagram:story**: [*instagram*](## "netrc machine")
-- **instagram:tag**: Instagram hashtag search URLs
+- **instagram:tag**: [*instagram*](## "netrc machine") Instagram hashtag search URLs
-- **instagram:user**: Instagram user profile (**Currently broken**)
+- **instagram:user**: [*instagram*](## "netrc machine") Instagram user profile (**Currently broken**)
- **InstagramIOS**: IOS instagram:// URL
- **Internazionale**
- **InternetVideoArchive**
@@ -635,7 +628,6 @@ The only reliable way to check if a site is supported is to try it.
- **ivi**: ivi.ru
- **ivi:compilation**: ivi.ru compilations
- **ivideon**: Ivideon TV
-- **Ivoox**
- **IVXPlayer**
- **iwara**: [*iwara*](## "netrc machine")
- **iwara:playlist**: [*iwara*](## "netrc machine")
@@ -645,11 +637,10 @@ The only reliable way to check if a site is supported is to try it.
- **Jamendo**
- **JamendoAlbum**
- **JeuxVideo**: (**Currently broken**)
+- **jiocinema**: [*jiocinema*](## "netrc machine")
+- **jiocinema:series**: [*jiocinema*](## "netrc machine")
- **jiosaavn:album**
-- **jiosaavn:artist**
- **jiosaavn:playlist**
-- **jiosaavn:show**
-- **jiosaavn:show:playlist**
- **jiosaavn:song**
- **Joj**
- **JoqrAg**: 超!A&G+ 文化放送 (f.k.a. AGQR) Nippon Cultural Broadcasting, Inc. (JOQR)
@@ -664,8 +655,7 @@ The only reliable way to check if a site is supported is to try it.
- **Karaoketv**
- **Katsomo**: (**Currently broken**)
- **KelbyOne**: (**Currently broken**)
-- **Kenh14Playlist**
+- **Ketnet**
-- **Kenh14Video**
- **khanacademy**
- **khanacademy:unit**
- **kick:clips**
@@ -674,7 +664,6 @@ The only reliable way to check if a site is supported is to try it.
- **Kicker**
- **KickStarter**
- **Kika**: KiKA.de
-- **KikaPlaylist**
- **kinja:embed**
- **KinoPoisk**
- **Kommunetv**
@@ -699,9 +688,9 @@ The only reliable way to check if a site is supported is to try it.
- **LastFMPlaylist**
- **LastFMUser**
- **LaXarxaMes**: [*laxarxames*](## "netrc machine")
-- **lbry**: odysee.com
+- **lbry**
-- **lbry:channel**: odysee.com channels
+- **lbry:channel**
-- **lbry:playlist**: odysee.com playlists
+- **lbry:playlist**
- **LCI**
- **Lcp**
- **LcpPlay**
@@ -727,7 +716,6 @@ The only reliable way to check if a site is supported is to try it.
- **limelight:channel**
- **limelight:channel_list**
- **LinkedIn**: [*linkedin*](## "netrc machine")
-- **linkedin:events**: [*linkedin*](## "netrc machine")
- **linkedin:learning**: [*linkedin*](## "netrc machine")
- **linkedin:learning:course**: [*linkedin*](## "netrc machine")
- **Liputan6**
@@ -739,11 +727,9 @@ The only reliable way to check if a site is supported is to try it.
- **Livestreamfails**
- **Lnk**
- **loc**: Library of Congress
-- **Loco**
- **loom**
- **loom:folder**
- **LoveHomePorn**
-- **LRTRadio**
- **LRTStream**
- **LRTVOD**
- **LSMLREmbed**
@@ -765,14 +751,13 @@ The only reliable way to check if a site is supported is to try it.
- **ManotoTV**: Manoto TV (Episode)
- **ManotoTVLive**: Manoto TV (Live)
- **ManotoTVShow**: Manoto TV (Show)
-- **ManyVids**
+- **ManyVids**: (**Currently broken**)
- **MaoriTV**
- **Markiza**: (**Currently broken**)
- **MarkizaPage**: (**Currently broken**)
- **massengeschmack.tv**
- **Masters**
- **MatchTV**
-- **Mave**
- **MBN**: mbn.co.kr (매일방송)
- **MDR**: MDR.DE
- **MedalTV**
@@ -803,6 +788,10 @@ The only reliable way to check if a site is supported is to try it.
- **MicrosoftLearnSession**
- **MicrosoftMedius**
- **microsoftstream**: Microsoft Stream
+- **mildom**: Record ongoing live by specific user in Mildom
+- **mildom:clip**: Clip in Mildom
+- **mildom:user:vod**: Download all VODs from specific user in Mildom
+- **mildom:vod**: VOD in Mildom
- **minds**
- **minds:channel**
- **minds:group**
@@ -813,7 +802,6 @@ The only reliable way to check if a site is supported is to try it.
- **MiTele**: mitele.es
- **mixch**
- **mixch:archive**
-- **mixch:movie**
- **mixcloud**
- **mixcloud:playlist**
- **mixcloud:user**
@@ -829,18 +817,18 @@ The only reliable way to check if a site is supported is to try it.
- **Mojevideo**: mojevideo.sk
- **Mojvideo**
- **Monstercat**
-- **monstersiren**: 塞壬唱片
+- **MonsterSirenHypergryphMusic**
- **Motherless**
- **MotherlessGallery**
- **MotherlessGroup**
- **MotherlessUploader**
- **Motorsport**: motorsport.com (**Currently broken**)
- **MovieFap**
-- **moviepilot**: Moviepilot trailer
+- **Moviepilot**
- **MoviewPlay**
- **Moviezine**
- **MovingImage**
-- **MSN**
+- **MSN**: (**Currently broken**)
- **mtg**: MTG services
- **mtv**
- **mtv.de**: (**Currently broken**)
@@ -881,19 +869,19 @@ The only reliable way to check if a site is supported is to try it.
- **Naver**
- **Naver:live**
- **navernow**
-- **nba**: (**Currently broken**)
+- **nba**
-- **nba:channel**: (**Currently broken**)
+- **nba:channel**
-- **nba:embed**: (**Currently broken**)
+- **nba:embed**
-- **nba:watch**: (**Currently broken**)
+- **nba:watch**
-- **nba:watch:collection**: (**Currently broken**)
+- **nba:watch:collection**
-- **nba:watch:embed**: (**Currently broken**)
+- **nba:watch:embed**
- **NBC**
- **NBCNews**
- **nbcolympics**
-- **nbcolympics:stream**: (**Currently broken**)
+- **nbcolympics:stream**
-- **NBCSports**: (**Currently broken**)
+- **NBCSports**
-- **NBCSportsStream**: (**Currently broken**)
+- **NBCSportsStream**
-- **NBCSportsVPlayer**: (**Currently broken**)
+- **NBCSportsVPlayer**
- **NBCStations**
- **ndr**: NDR.de - Norddeutscher Rundfunk
- **ndr:embed**
@@ -905,8 +893,6 @@ The only reliable way to check if a site is supported is to try it.
- **nebula:video**: [*watchnebula*](## "netrc machine")
- **NekoHacker**
- **NerdCubedFeed**
-- **Nest**
-- **NestClip**
- **netease:album**: 网易云音乐 - 专辑
- **netease:djradio**: 网易云音乐 - 电台
- **netease:mv**: 网易云音乐 - MV
@@ -953,7 +939,7 @@ The only reliable way to check if a site is supported is to try it.
- **nickelodeonru**
- **niconico**: [*niconico*](## "netrc machine") ニコニコ動画
- **niconico:history**: NicoNico user history or likes. Requires cookies.
-- **niconico:live**: [*niconico*](## "netrc machine") ニコニコ生放送
+- **niconico:live**: ニコニコ生放送
- **niconico:playlist**
- **niconico:series**
- **niconico:tag**: NicoNico video tag URLs
@@ -969,7 +955,7 @@ The only reliable way to check if a site is supported is to try it.
- **Nitter**
- **njoy**: N-JOY
- **njoy:embed**
-- **NobelPrize**
+- **NobelPrize**: (**Currently broken**)
- **NoicePodcast**
- **NonkTube**
- **NoodleMagazine**
@@ -1060,10 +1046,8 @@ The only reliable way to check if a site is supported is to try it.
- **Parler**: Posts on parler.com
- **parliamentlive.tv**: UK parliament videos
- **Parlview**: (**Currently broken**)
-- **parti:livestream**
+- **Patreon**
-- **parti:video**
+- **PatreonCampaign**
-- **patreon**
-- **patreon:campaign**
- **pbs**: Public Broadcasting Service (PBS) and member stations: PBS: Public Broadcasting Service, APT - Alabama Public Television (WBIQ), GPB/Georgia Public Broadcasting (WGTV), Mississippi Public Broadcasting (WMPN), Nashville Public Television (WNPT), WFSU-TV (WFSU), WSRE (WSRE), WTCI (WTCI), WPBA/Channel 30 (WPBA), Alaska Public Media (KAKM), Arizona PBS (KAET), KNME-TV/Channel 5 (KNME), Vegas PBS (KLVX), AETN/ARKANSAS ETV NETWORK (KETS), KET (WKLE), WKNO/Channel 10 (WKNO), LPB/LOUISIANA PUBLIC BROADCASTING (WLPB), OETA (KETA), Ozarks Public Television (KOZK), WSIU Public Broadcasting (WSIU), KEET TV (KEET), KIXE/Channel 9 (KIXE), KPBS San Diego (KPBS), KQED (KQED), KVIE Public Television (KVIE), PBS SoCal/KOCE (KOCE), ValleyPBS (KVPT), CONNECTICUT PUBLIC TELEVISION (WEDH), KNPB Channel 5 (KNPB), SOPTV (KSYS), Rocky Mountain PBS (KRMA), KENW-TV3 (KENW), KUED Channel 7 (KUED), Wyoming PBS (KCWC), Colorado Public Television / KBDI 12 (KBDI), KBYU-TV (KBYU), Thirteen/WNET New York (WNET), WGBH/Channel 2 (WGBH), WGBY (WGBY), NJTV Public Media NJ (WNJT), WLIW21 (WLIW), mpt/Maryland Public Television (WMPB), WETA Television and Radio (WETA), WHYY (WHYY), PBS 39 (WLVT), WVPT - Your Source for PBS and More! (WVPT), Howard University Television (WHUT), WEDU PBS (WEDU), WGCU Public Media (WGCU), WPBT2 (WPBT), WUCF TV (WUCF), WUFT/Channel 5 (WUFT), WXEL/Channel 42 (WXEL), WLRN/Channel 17 (WLRN), WUSF Public Broadcasting (WUSF), ETV (WRLK), UNC-TV (WUNC), PBS Hawaii - Oceanic Cable Channel 10 (KHET), Idaho Public Television (KAID), KSPS (KSPS), OPB (KOPB), KWSU/Channel 10 & KTNW/Channel 31 (KWSU), WILL-TV (WILL), Network Knowledge - WSEC/Springfield (WSEC), WTTW11 (WTTW), Iowa Public Television/IPTV (KDIN), Nine Network (KETC), PBS39 Fort Wayne (WFWA), WFYI Indianapolis (WFYI), Milwaukee Public Television (WMVS), WNIN (WNIN), WNIT Public Television (WNIT), WPT (WPNE), WVUT/Channel 22 (WVUT), WEIU/Channel 51 (WEIU), WQPT-TV (WQPT), WYCC PBS Chicago (WYCC), WIPB-TV (WIPB), WTIU (WTIU), CET (WCET), ThinkTVNetwork (WPTD), WBGU-TV (WBGU), WGVU TV (WGVU), NET1 (KUON), Pioneer Public Television (KWCM), SDPB Television (KUSD), TPT (KTCA), KSMQ (KSMQ), KPTS/Channel 8 (KPTS), KTWU/Channel 11 (KTWU), East Tennessee PBS (WSJK), WCTE-TV (WCTE), WLJT, Channel 11 (WLJT), WOSU TV (WOSU), WOUB/WOUC (WOUB), WVPB (WVPB), WKYU-PBS (WKYU), KERA 13 (KERA), MPBN (WCBB), Mountain Lake PBS (WCFE), NHPTV (WENH), Vermont PBS (WETK), witf (WITF), WQED Multimedia (WQED), WMHT Educational Telecommunications (WMHT), Q-TV (WDCQ), WTVS Detroit Public TV (WTVS), CMU Public Television (WCMU), WKAR-TV (WKAR), WNMU-TV Public TV 13 (WNMU), WDSE - WRPT (WDSE), WGTE TV (WGTE), Lakeland Public Television (KAWE), KMOS-TV - Channels 6.1, 6.2 and 6.3 (KMOS), MontanaPBS (KUSM), KRWG/Channel 22 (KRWG), KACV (KACV), KCOS/Channel 13 (KCOS), WCNY/Channel 24 (WCNY), WNED (WNED), WPBS (WPBS), WSKG Public TV (WSKG), WXXI (WXXI), WPSU (WPSU), WVIA Public Media Studios (WVIA), WTVI (WTVI), Western Reserve PBS (WNEO), WVIZ/PBS ideastream (WVIZ), KCTS 9 (KCTS), Basin PBS (KPBT), KUHT / Channel 8 (KUHT), KLRN (KLRN), KLRU (KLRU), WTJX Channel 12 (WTJX), WCVE PBS (WCVE), KBTC Public Television (KBTC)
- **PBSKids**
- **PearVideo**
@@ -1080,16 +1064,14 @@ The only reliable way to check if a site is supported is to try it.
- **PhilharmonieDeParis**: Philharmonie de Paris
- **phoenix.de**
- **Photobucket**
-- **PiaLive**
- **Piapro**: [*piapro*](## "netrc machine")
-- **picarto**
+- **PIAULIZAPortal**: ulizaportal.jp - PIA LIVE STREAM
-- **picarto:vod**
+- **Picarto**
+- **PicartoVod**
- **Piksel**
- **Pinkbike**
- **Pinterest**
- **PinterestCollection**
-- **PiramideTV**
-- **PiramideTVChannel**
- **pixiv:sketch**
- **pixiv:sketch:user**
- **Pladform**
@@ -1106,11 +1088,12 @@ The only reliable way to check if a site is supported is to try it.
- **pluralsight**: [*pluralsight*](## "netrc machine")
- **pluralsight:course**
- **PlutoTV**: (**Currently broken**)
-- **PlVideo**: Платформа
- **PodbayFM**
- **PodbayFMChannel**
- **Podchaser**
- **podomatic**: (**Currently broken**)
+- **Pokemon**
+- **PokemonWatch**
- **PokerGo**: [*pokergo*](## "netrc machine")
- **PokerGoCollection**: [*pokergo*](## "netrc machine")
- **PolsatGo**
@@ -1181,7 +1164,6 @@ The only reliable way to check if a site is supported is to try it.
- **RadioJavan**: (**Currently broken**)
- **radiokapital**
- **radiokapital:show**
-- **RadioRadicale**
- **RadioZetPodcast**
- **radlive**
- **radlive:channel**
@@ -1236,7 +1218,6 @@ The only reliable way to check if a site is supported is to try it.
- **RoosterTeeth**: [*roosterteeth*](## "netrc machine")
- **RoosterTeethSeries**: [*roosterteeth*](## "netrc machine")
- **RottenTomatoes**
-- **RoyaLive**
- **Rozhlas**
- **RozhlasVltava**
- **RTBF**: [*rtbf*](## "netrc machine") (**Currently broken**)
@@ -1257,10 +1238,12 @@ The only reliable way to check if a site is supported is to try it.
- **RTVCKaltura**
- **RTVCPlay**
- **RTVCPlayEmbed**
-- **rtve.es:alacarta**: RTVE a la carta and Play
+- **rtve.es:alacarta**: RTVE a la carta
- **rtve.es:audio**: RTVE audio
+- **rtve.es:infantil**: RTVE infantil
- **rtve.es:live**: RTVE.es live streams
- **rtve.es:television**
+- **RTVS**
- **rtvslo.si**
- **rtvslo.si:show**
- **RudoVideo**
@@ -1295,7 +1278,6 @@ The only reliable way to check if a site is supported is to try it.
- **SampleFocus**
- **Sangiin**: 参議院インターネット審議中継 (archive)
- **Sapo**: SAPO Vídeos
-- **SaucePlus**: Sauce+
- **SBS**: sbs.com.au
- **sbs.co.kr**
- **sbs.co.kr:allvod_program**
@@ -1316,8 +1298,8 @@ The only reliable way to check if a site is supported is to try it.
- **sejm**
- **Sen**
- **SenalColombiaLive**: (**Currently broken**)
-- **senate.gov**
+- **SenateGov**
-- **senate.gov:isvp**
+- **SenateISVP**
- **SendtoNews**: (**Currently broken**)
- **Servus**
- **Sexu**: (**Currently broken**)
@@ -1353,15 +1335,10 @@ The only reliable way to check if a site is supported is to try it.
- **Smotrim**
- **SnapchatSpotlight**
- **Snotr**
-- **SoftWhiteUnderbelly**: [*softwhiteunderbelly*](## "netrc machine")
- **Sohu**
- **SohuV**
- **SonyLIV**: [*sonyliv*](## "netrc machine")
- **SonyLIVSeries**
-- **soop**: [*afreecatv*](## "netrc machine") sooplive.co.kr
-- **soop:catchstory**: [*afreecatv*](## "netrc machine") sooplive.co.kr catch story
-- **soop:live**: [*afreecatv*](## "netrc machine") sooplive.co.kr livestreams
-- **soop:user**: [*afreecatv*](## "netrc machine")
- **soundcloud**: [*soundcloud*](## "netrc machine")
- **soundcloud:playlist**: [*soundcloud*](## "netrc machine")
- **soundcloud:related**: [*soundcloud*](## "netrc machine")
@@ -1390,17 +1367,20 @@ The only reliable way to check if a site is supported is to try it.
- **spotify**: Spotify episodes (**Currently broken**)
- **spotify:show**: Spotify shows (**Currently broken**)
- **Spreaker**
+- **SpreakerPage**
- **SpreakerShow**
+- **SpreakerShowPage**
- **SpringboardPlatform**
+- **Sprout**
- **SproutVideo**
-- **sr:mediathek**: Saarländischer Rundfunk
+- **sr:mediathek**: Saarländischer Rundfunk (**Currently broken**)
- **SRGSSR**
- **SRGSSRPlay**: srf.ch, rts.ch, rsi.ch, rtr.ch and swissinfo.ch play sites
- **StacommuLive**: [*stacommu*](## "netrc machine")
- **StacommuVOD**: [*stacommu*](## "netrc machine")
- **StagePlusVODConcert**: [*stageplus*](## "netrc machine")
- **stanfordoc**: Stanford Open ClassRoom
-- **startrek**: STAR TREK
+- **StarTrek**: (**Currently broken**)
- **startv**
- **Steam**
- **SteamCommunityBroadcast**
@@ -1409,25 +1389,22 @@ The only reliable way to check if a site is supported is to try it.
- **StoryFire**
- **StoryFireSeries**
- **StoryFireUser**
-- **Streaks**
- **Streamable**
- **StreamCZ**
- **StreetVoice**
- **StretchInternet**
- **Stripchat**
- **stv:player**
-- **stvr**: Slovak Television and Radio (formerly RTVS)
-- **Subsplash**
-- **subsplash:playlist**
- **Substack**
- **SunPorno**
- **sverigesradio:episode**
- **sverigesradio:publication**
-- **svt:page**
+- **SVT**
-- **svt:play**: SVT Play and Öppet arkiv
+- **SVTPage**
-- **svt:play:series**
+- **SVTPlay**: SVT Play and Öppet arkiv
+- **SVTSeries**
- **SwearnetEpisode**
-- **Syfy**
+- **Syfy**: (**Currently broken**)
- **SYVDK**
- **SztvHu**
- **t-online.de**: (**Currently broken**)
@@ -1468,9 +1445,11 @@ The only reliable way to check if a site is supported is to try it.
- **TeleQuebecSquat**
- **TeleQuebecVideo**
- **TeleTask**: (**Currently broken**)
-- **Telewebion**: (**Currently broken**)
+- **Telewebion**
- **Tempo**
- **TennisTV**: [*tennistv*](## "netrc machine")
+- **TenPlay**: [*10play*](## "netrc machine")
+- **TenPlaySeason**
- **TF1**
- **TFO**
- **theatercomplextown:ppv**: [*theatercomplextown*](## "netrc machine")
@@ -1508,7 +1487,6 @@ The only reliable way to check if a site is supported is to try it.
- **tokfm:podcast**
- **ToonGoggles**
- **tou.tv**: [*toutv*](## "netrc machine")
-- **toutiao**: 今日头条
- **Toypics**: Toypics video (**Currently broken**)
- **ToypicsUser**: Toypics user profile (**Currently broken**)
- **TrailerAddict**: (**Currently broken**)
@@ -1547,8 +1525,6 @@ The only reliable way to check if a site is supported is to try it.
- **tv5unis**
- **tv5unis:video**
- **tv8.it**
-- **tv8.it:live**: TV8 Live
-- **tv8.it:playlist**: TV8 Playlist
- **TVANouvelles**
- **TVANouvellesArticle**
- **tvaplus**: TVA+
@@ -1569,8 +1545,6 @@ The only reliable way to check if a site is supported is to try it.
- **tvp:vod:series**
- **TVPlayer**
- **TVPlayHome**
-- **tvw**
-- **tvw:tvchannels**
- **Tweakers**
- **TwitCasting**
- **TwitCastingLive**
@@ -1596,9 +1570,7 @@ The only reliable way to check if a site is supported is to try it.
- **UFCTV**: [*ufctv*](## "netrc machine")
- **ukcolumn**: (**Currently broken**)
- **UKTVPlay**
-- **UlizaPlayer**
+- **umg:de**: Universal Music Deutschland (**Currently broken**)
-- **UlizaPortal**: ulizaportal.jp
-- **umg:de**: Universal Music Deutschland
- **Unistra**
- **Unity**: (**Currently broken**)
- **uol.com.br**
||||||
@ -1615,15 +1587,17 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **Varzesh3**: (**Currently broken**)
|
- **Varzesh3**: (**Currently broken**)
|
||||||
- **Vbox7**
|
- **Vbox7**
|
||||||
- **Veo**
|
- **Veo**
|
||||||
|
- **Veoh**
|
||||||
|
- **veoh:user**
|
||||||
- **Vesti**: Вести.Ru (**Currently broken**)
|
- **Vesti**: Вести.Ru (**Currently broken**)
|
||||||
- **Vevo**
|
- **Vevo**
|
||||||
- **VevoPlaylist**
|
- **VevoPlaylist**
|
||||||
- **VGTV**: VGTV, BTTV, FTV, Aftenposten and Aftonbladet
|
- **VGTV**: VGTV, BTTV, FTV, Aftenposten and Aftonbladet
|
||||||
- **vh1.com**
|
- **vh1.com**
|
||||||
- **vhx:embed**: [*vimeo*](## "netrc machine")
|
- **vhx:embed**: [*vimeo*](## "netrc machine")
|
||||||
- **vice**: (**Currently broken**)
|
- **vice**
|
||||||
- **vice:article**: (**Currently broken**)
|
- **vice:article**
|
||||||
- **vice:show**: (**Currently broken**)
|
- **vice:show**
|
||||||
- **Viddler**
|
- **Viddler**
|
||||||
- **Videa**
|
- **Videa**
|
||||||
- **video.arnes.si**: Arnes Video
|
- **video.arnes.si**: Arnes Video
|
||||||
@ -1652,10 +1626,11 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **viewlift**
|
- **viewlift**
|
||||||
- **viewlift:embed**
|
- **viewlift:embed**
|
||||||
- **Viidea**
|
- **Viidea**
|
||||||
|
- **viki**: [*viki*](## "netrc machine")
|
||||||
|
- **viki:channel**: [*viki*](## "netrc machine")
|
||||||
- **vimeo**: [*vimeo*](## "netrc machine")
|
- **vimeo**: [*vimeo*](## "netrc machine")
|
||||||
- **vimeo:album**: [*vimeo*](## "netrc machine")
|
- **vimeo:album**: [*vimeo*](## "netrc machine")
|
||||||
- **vimeo:channel**: [*vimeo*](## "netrc machine")
|
- **vimeo:channel**: [*vimeo*](## "netrc machine")
|
||||||
- **vimeo:event**: [*vimeo*](## "netrc machine")
|
|
||||||
- **vimeo:group**: [*vimeo*](## "netrc machine")
|
- **vimeo:group**: [*vimeo*](## "netrc machine")
|
||||||
- **vimeo:likes**: [*vimeo*](## "netrc machine") Vimeo user likes
|
- **vimeo:likes**: [*vimeo*](## "netrc machine") Vimeo user likes
|
||||||
- **vimeo:ondemand**: [*vimeo*](## "netrc machine")
|
- **vimeo:ondemand**: [*vimeo*](## "netrc machine")
|
||||||
@ -1667,6 +1642,8 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **Vimm:stream**
|
- **Vimm:stream**
|
||||||
- **ViMP**
|
- **ViMP**
|
||||||
- **ViMP:Playlist**
|
- **ViMP:Playlist**
|
||||||
|
- **Vine**
|
||||||
|
- **vine:user**
|
||||||
- **Viously**
|
- **Viously**
|
||||||
- **Viqeo**: (**Currently broken**)
|
- **Viqeo**: (**Currently broken**)
|
||||||
- **Viu**
|
- **Viu**
|
||||||
@ -1690,12 +1667,8 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **vpro**: npo.nl, ntr.nl, omroepwnl.nl, zapp.nl and npo3.nl
|
- **vpro**: npo.nl, ntr.nl, omroepwnl.nl, zapp.nl and npo3.nl
|
||||||
- **vqq:series**
|
- **vqq:series**
|
||||||
- **vqq:video**
|
- **vqq:video**
|
||||||
- **vrsquare**: VR SQUARE
|
|
||||||
- **vrsquare:channel**
|
|
||||||
- **vrsquare:search**
|
|
||||||
- **vrsquare:section**
|
|
||||||
- **VRT**: VRT NWS, Flanders News, Flandern Info and Sporza
|
- **VRT**: VRT NWS, Flanders News, Flandern Info and Sporza
|
||||||
- **vrtmax**: [*vrtnu*](## "netrc machine") VRT MAX (formerly VRT NU)
|
- **VrtNU**: [*vrtnu*](## "netrc machine") VRT MAX
|
||||||
- **VTM**: (**Currently broken**)
|
- **VTM**: (**Currently broken**)
|
||||||
- **VTV**
|
- **VTV**
|
||||||
- **VTVGo**
|
- **VTVGo**
|
||||||
@ -1805,24 +1778,24 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **YouPornStar**: YouPorn Pornstar, with description, sorting and pagination
|
- **YouPornStar**: YouPorn Pornstar, with description, sorting and pagination
|
||||||
- **YouPornTag**: YouPorn tag (porntags), with sorting, filtering and pagination
|
- **YouPornTag**: YouPorn tag (porntags), with sorting, filtering and pagination
|
||||||
- **YouPornVideos**: YouPorn video (browse) playlists, with sorting, filtering and pagination
|
- **YouPornVideos**: YouPorn video (browse) playlists, with sorting, filtering and pagination
|
||||||
- **youtube**: [*youtube*](## "netrc machine") YouTube
|
- **youtube**: YouTube
|
||||||
- **youtube:clip**: [*youtube*](## "netrc machine")
|
- **youtube:clip**
|
||||||
- **youtube:favorites**: [*youtube*](## "netrc machine") YouTube liked videos; ":ytfav" keyword (requires cookies)
|
- **youtube:favorites**: YouTube liked videos; ":ytfav" keyword (requires cookies)
|
||||||
- **youtube:history**: [*youtube*](## "netrc machine") Youtube watch history; ":ythis" keyword (requires cookies)
|
- **youtube:history**: Youtube watch history; ":ythis" keyword (requires cookies)
|
||||||
- **youtube:music:search_url**: [*youtube*](## "netrc machine") YouTube music search URLs with selectable sections, e.g. #songs
|
- **youtube:music:search_url**: YouTube music search URLs with selectable sections, e.g. #songs
|
||||||
- **youtube:notif**: [*youtube*](## "netrc machine") YouTube notifications; ":ytnotif" keyword (requires cookies)
|
- **youtube:notif**: YouTube notifications; ":ytnotif" keyword (requires cookies)
|
||||||
- **youtube:playlist**: [*youtube*](## "netrc machine") YouTube playlists
|
- **youtube:playlist**: YouTube playlists
|
||||||
- **youtube:recommended**: [*youtube*](## "netrc machine") YouTube recommended videos; ":ytrec" keyword
|
- **youtube:recommended**: YouTube recommended videos; ":ytrec" keyword
|
||||||
- **youtube:search**: [*youtube*](## "netrc machine") YouTube search; "ytsearch:" prefix
|
- **youtube:search**: YouTube search; "ytsearch:" prefix
|
||||||
- **youtube:search:date**: [*youtube*](## "netrc machine") YouTube search, newest videos first; "ytsearchdate:" prefix
|
- **youtube:search:date**: YouTube search, newest videos first; "ytsearchdate:" prefix
|
||||||
- **youtube:search_url**: [*youtube*](## "netrc machine") YouTube search URLs with sorting and filter support
|
- **youtube:search_url**: YouTube search URLs with sorting and filter support
|
||||||
- **youtube:shorts:pivot:audio**: [*youtube*](## "netrc machine") YouTube Shorts audio pivot (Shorts using audio of a given video)
|
- **youtube:shorts:pivot:audio**: YouTube Shorts audio pivot (Shorts using audio of a given video)
|
||||||
- **youtube:subscriptions**: [*youtube*](## "netrc machine") YouTube subscriptions feed; ":ytsubs" keyword (requires cookies)
|
- **youtube:subscriptions**: YouTube subscriptions feed; ":ytsubs" keyword (requires cookies)
|
||||||
- **youtube:tab**: [*youtube*](## "netrc machine") YouTube Tabs
|
- **youtube:tab**: YouTube Tabs
|
||||||
- **youtube:user**: [*youtube*](## "netrc machine") YouTube user videos; "ytuser:" prefix
|
- **youtube:user**: YouTube user videos; "ytuser:" prefix
|
||||||
- **youtube:watchlater**: [*youtube*](## "netrc machine") Youtube watch later list; ":ytwatchlater" keyword (requires cookies)
|
- **youtube:watchlater**: Youtube watch later list; ":ytwatchlater" keyword (requires cookies)
|
||||||
- **YoutubeLivestreamEmbed**: [*youtube*](## "netrc machine") YouTube livestream embeds
|
- **YoutubeLivestreamEmbed**: YouTube livestream embeds
|
||||||
- **YoutubeYtBe**: [*youtube*](## "netrc machine") youtu.be
|
- **YoutubeYtBe**: youtu.be
|
||||||
- **Zaiko**
|
- **Zaiko**
|
||||||
- **ZaikoETicket**
|
- **ZaikoETicket**
|
||||||
- **Zapiks**
|
- **Zapiks**
|
||||||
@ -1830,12 +1803,14 @@ The only reliable way to check if a site is supported is to try it.
|
|||||||
- **ZattooLive**: [*zattoo*](## "netrc machine")
|
- **ZattooLive**: [*zattoo*](## "netrc machine")
|
||||||
- **ZattooMovies**: [*zattoo*](## "netrc machine")
|
- **ZattooMovies**: [*zattoo*](## "netrc machine")
|
||||||
- **ZattooRecordings**: [*zattoo*](## "netrc machine")
|
- **ZattooRecordings**: [*zattoo*](## "netrc machine")
|
||||||
- **zdf**
|
- **ZDF**
|
||||||
- **zdf:channel**
|
- **ZDFChannel**
|
||||||
- **Zee5**: [*zee5*](## "netrc machine")
|
- **Zee5**: [*zee5*](## "netrc machine")
|
||||||
- **zee5:series**
|
- **zee5:series**
|
||||||
- **ZeeNews**: (**Currently broken**)
|
- **ZeeNews**: (**Currently broken**)
|
||||||
- **ZenPorn**
|
- **ZenPorn**
|
||||||
|
- **ZenYandex**
|
||||||
|
- **ZenYandexChannel**
|
||||||
- **ZetlandDKArticle**
|
- **ZetlandDKArticle**
|
||||||
- **Zhihu**
|
- **Zhihu**
|
||||||
- **zingmp3**: zingmp3.vn
|
- **zingmp3**: zingmp3.vn
|
||||||
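The list above annotates many extractors with a "netrc machine" name and search prefixes such as "ytsearch:"; as the hunk context repeats, the only reliable way to check whether a site is supported is to try it. Below is a minimal sketch (not part of this diff) of doing that check with yt-dlp's Python API; the example URL is the same test video used in the test suite further down, and `gen_extractors()` / `suitable()` / `usenetrc` are the pieces it assumes.

```python
# Sketch: probe which extractors claim a URL, then let yt-dlp actually try it.
import yt_dlp
from yt_dlp.extractor import gen_extractors

URL = 'https://www.youtube.com/watch?v=BaW_jenozKc'  # canonical test video from the test suite

# Which extractors claim this URL? (The generic extractor matches almost everything, so skip it.)
matching = [
    ie.IE_NAME for ie in gen_extractors()
    if ie.suitable(URL) and ie.IE_NAME != 'generic'
]
print('claimed by:', matching)

# Actually extracting is the authoritative check; 'usenetrc' makes yt-dlp read
# credentials for an extractor's netrc machine (e.g. "vimeo") from ~/.netrc.
with yt_dlp.YoutubeDL({'usenetrc': True}) as ydl:
    info = ydl.extract_info(URL, download=False)
    print(info['id'], info['title'])
```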
test/helper.py (201 changes)
@@ -9,6 +9,7 @@ import types

import yt_dlp.extractor
from yt_dlp import YoutubeDL
+from yt_dlp.compat import compat_os_name
from yt_dlp.utils import preferredencoding, try_call, write_string, find_available_port

if 'pytest' in sys.modules:
@@ -48,7 +49,7 @@ def report_warning(message, *args, **kwargs):
    Print the message to stderr, it will be prefixed with 'WARNING:'
    If stderr is a tty file the 'WARNING:' will be colored
    """
-    if sys.stderr.isatty() and os.name != 'nt':
+    if sys.stderr.isatty() and compat_os_name != 'nt':
        _msg_header = '\033[0;33mWARNING:\033[0m'
    else:
        _msg_header = 'WARNING:'
@@ -101,109 +102,87 @@ def getwebpagetestcases():
md5 = lambda s: hashlib.md5(s.encode()).hexdigest()


-def _iter_differences(got, expected, field):
-    if isinstance(expected, str):
-        op, _, val = expected.partition(':')
-        if op in ('mincount', 'maxcount', 'count'):
-            if not isinstance(got, (list, dict)):
-                yield field, f'expected either {list.__name__} or {dict.__name__}, got {type(got).__name__}'
-                return
-
-            expected_num = int(val)
-            got_num = len(got)
-            if op == 'mincount':
-                if got_num < expected_num:
-                    yield field, f'expected at least {val} items, got {got_num}'
-                return
-
-            if op == 'maxcount':
-                if got_num > expected_num:
-                    yield field, f'expected at most {val} items, got {got_num}'
-                return
-
-            assert op == 'count'
-            if got_num != expected_num:
-                yield field, f'expected exactly {val} items, got {got_num}'
-            return
-
-        if not isinstance(got, str):
-            yield field, f'expected {str.__name__}, got {type(got).__name__}'
-            return
-
-        if op == 're':
-            if not re.match(val, got):
-                yield field, f'should match {val!r}, got {got!r}'
-            return
-
-        if op == 'startswith':
-            if not got.startswith(val):
-                yield field, f'should start with {val!r}, got {got!r}'
-            return
-
-        if op == 'contains':
-            if not val.startswith(got):
-                yield field, f'should contain {val!r}, got {got!r}'
-            return
-
-        if op == 'md5':
-            hash_val = md5(got)
-            if hash_val != val:
-                yield field, f'expected hash {val}, got {hash_val}'
-            return
-
-        if got != expected:
-            yield field, f'expected {expected!r}, got {got!r}'
-        return
-
-    if isinstance(expected, dict) and isinstance(got, dict):
-        for key, expected_val in expected.items():
-            if key not in got:
-                yield field, f'missing key: {key!r}'
-                continue
-
-            field_name = key if field is None else f'{field}.{key}'
-            yield from _iter_differences(got[key], expected_val, field_name)
-        return
-
-    if isinstance(expected, type):
-        if not isinstance(got, expected):
-            yield field, f'expected {expected.__name__}, got {type(got).__name__}'
-        return
-
-    if isinstance(expected, list) and isinstance(got, list):
-        # TODO: clever diffing algorithm lmao
-        if len(expected) != len(got):
-            yield field, f'expected length of {len(expected)}, got {len(got)}'
-            return
-
-        for index, (got_val, expected_val) in enumerate(zip(got, expected)):
-            field_name = str(index) if field is None else f'{field}.{index}'
-            yield from _iter_differences(got_val, expected_val, field_name)
-        return
-
-    if got != expected:
-        yield field, f'expected {expected!r}, got {got!r}'
-
-
-def _expect_value(message, got, expected, field):
-    mismatches = list(_iter_differences(got, expected, field))
-    if not mismatches:
-        return
-
-    fields = [field for field, _ in mismatches if field is not None]
-    return ''.join((
-        message, f' ({", ".join(fields)})' if fields else '',
-        *(f'\n\t{field}: {message}' for field, message in mismatches)))
-
-
def expect_value(self, got, expected, field):
-    if message := _expect_value('values differ', got, expected, field):
-        self.fail(message)
+    if isinstance(expected, str) and expected.startswith('re:'):
+        match_str = expected[len('re:'):]
+        match_rex = re.compile(match_str)
+
+        self.assertTrue(
+            isinstance(got, str),
+            f'Expected a {str.__name__} object, but got {type(got).__name__} for field {field}')
+        self.assertTrue(
+            match_rex.match(got),
+            f'field {field} (value: {got!r}) should match {match_str!r}')
+    elif isinstance(expected, str) and expected.startswith('startswith:'):
+        start_str = expected[len('startswith:'):]
+        self.assertTrue(
+            isinstance(got, str),
+            f'Expected a {str.__name__} object, but got {type(got).__name__} for field {field}')
+        self.assertTrue(
+            got.startswith(start_str),
+            f'field {field} (value: {got!r}) should start with {start_str!r}')
+    elif isinstance(expected, str) and expected.startswith('contains:'):
+        contains_str = expected[len('contains:'):]
+        self.assertTrue(
+            isinstance(got, str),
+            f'Expected a {str.__name__} object, but got {type(got).__name__} for field {field}')
+        self.assertTrue(
+            contains_str in got,
+            f'field {field} (value: {got!r}) should contain {contains_str!r}')
+    elif isinstance(expected, type):
+        self.assertTrue(
+            isinstance(got, expected),
+            f'Expected type {expected!r} for field {field}, but got value {got!r} of type {type(got)!r}')
+    elif isinstance(expected, dict) and isinstance(got, dict):
+        expect_dict(self, got, expected)
+    elif isinstance(expected, list) and isinstance(got, list):
+        self.assertEqual(
+            len(expected), len(got),
+            f'Expect a list of length {len(expected)}, but got a list of length {len(got)} for field {field}')
+        for index, (item_got, item_expected) in enumerate(zip(got, expected)):
+            type_got = type(item_got)
+            type_expected = type(item_expected)
+            self.assertEqual(
+                type_expected, type_got,
+                f'Type mismatch for list item at index {index} for field {field}, '
+                f'expected {type_expected!r}, got {type_got!r}')
+            expect_value(self, item_got, item_expected, field)
+    else:
+        if isinstance(expected, str) and expected.startswith('md5:'):
+            self.assertTrue(
+                isinstance(got, str),
+                f'Expected field {field} to be a unicode object, but got value {got!r} of type {type(got)!r}')
+            got = 'md5:' + md5(got)
+        elif isinstance(expected, str) and re.match(r'^(?:min|max)?count:\d+', expected):
+            self.assertTrue(
+                isinstance(got, (list, dict)),
+                f'Expected field {field} to be a list or a dict, but it is of type {type(got).__name__}')
+            op, _, expected_num = expected.partition(':')
+            expected_num = int(expected_num)
+            if op == 'mincount':
+                assert_func = assertGreaterEqual
+                msg_tmpl = 'Expected %d items in field %s, but only got %d'
+            elif op == 'maxcount':
+                assert_func = assertLessEqual
+                msg_tmpl = 'Expected maximum %d items in field %s, but got %d'
+            elif op == 'count':
+                assert_func = assertEqual
+                msg_tmpl = 'Expected exactly %d items in field %s, but got %d'
+            else:
+                assert False
+            assert_func(
+                self, len(got), expected_num,
+                msg_tmpl % (expected_num, field, len(got)))
+            return
+        self.assertEqual(
+            expected, got,
+            f'Invalid value for field {field}, expected {expected!r}, got {got!r}')


def expect_dict(self, got_dict, expected_dict):
-    if message := _expect_value('dictionaries differ', got_dict, expected_dict, None):
-        self.fail(message)
+    for info_field, expected in expected_dict.items():
+        got = got_dict.get(info_field)
+        expect_value(self, got, expected, info_field)


def sanitize_got_info_dict(got_dict):
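Both versions of `expect_value` implement the same expected-value mini-language ('re:', 'startswith:', 'contains:', 'md5:' and 'count:'/'mincount:'/'maxcount:'); the master side routes everything through `_iter_differences` so that all mismatches are reported at once. A minimal sketch of how a test might exercise these operators, assuming it runs from a yt-dlp checkout where this module is importable as `test.helper`:

```python
# Sketch only: exercising the expected-value operators handled above.
import unittest

from test.helper import expect_dict


class TestExpectDict(unittest.TestCase):
    def test_operators(self):
        got = {
            'id': 'abc123',
            'title': 'An example title',
            'tags': ['music', 'live'],
            'thumbnail': 'https://example.com/thumb.jpg',
        }
        expected = {
            'id': 'abc123',                      # exact match
            'title': 're:^An example',           # regex operator
            'tags': 'count:2',                   # count/mincount/maxcount operators
            'thumbnail': 'startswith:https://',  # prefix operator
        }
        # Fails the test with a per-field report if any expectation is not met
        expect_dict(self, got, expected)


if __name__ == '__main__':
    unittest.main()
```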
@@ -259,20 +238,6 @@ def sanitize_got_info_dict(got_dict):


def expect_info_dict(self, got_dict, expected_dict):
-    ALLOWED_KEYS_SORT_ORDER = (
-        # NB: Keep in sync with the docstring of extractor/common.py
-        'id', 'ext', 'direct', 'display_id', 'title', 'alt_title', 'description', 'media_type',
-        'uploader', 'uploader_id', 'uploader_url', 'channel', 'channel_id', 'channel_url', 'channel_is_verified',
-        'channel_follower_count', 'comment_count', 'view_count', 'concurrent_view_count',
-        'like_count', 'dislike_count', 'repost_count', 'average_rating', 'age_limit', 'duration', 'thumbnail', 'heatmap',
-        'chapters', 'chapter', 'chapter_number', 'chapter_id', 'start_time', 'end_time', 'section_start', 'section_end',
-        'categories', 'tags', 'cast', 'composers', 'artists', 'album_artists', 'creators', 'genres',
-        'track', 'track_number', 'track_id', 'album', 'album_type', 'disc_number',
-        'series', 'series_id', 'season', 'season_number', 'season_id', 'episode', 'episode_number', 'episode_id',
-        'timestamp', 'upload_date', 'release_timestamp', 'release_date', 'release_year', 'modified_timestamp', 'modified_date',
-        'playable_in_embed', 'availability', 'live_status', 'location', 'license', '_old_archive_ids',
-    )
-
    expect_dict(self, got_dict, expected_dict)
    # Check for the presence of mandatory fields
    if got_dict.get('_type') not in ('playlist', 'multi_video'):
@@ -288,13 +253,7 @@ def expect_info_dict(self, got_dict, expected_dict):

    test_info_dict = sanitize_got_info_dict(got_dict)

-    # Check for invalid/misspelled field names being returned by the extractor
+    missing_keys = set(test_info_dict.keys()) - set(expected_dict.keys())
-    invalid_keys = sorted(test_info_dict.keys() - ALLOWED_KEYS_SORT_ORDER)
-    self.assertFalse(invalid_keys, f'Invalid fields returned by the extractor: {", ".join(invalid_keys)}')
-
-    missing_keys = sorted(
-        test_info_dict.keys() - expected_dict.keys(),
-        key=lambda x: ALLOWED_KEYS_SORT_ORDER.index(x))
    if missing_keys:
        def _repr(v):
            if isinstance(v, str):
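The master-only `ALLOWED_KEYS_SORT_ORDER` check above rejects misspelled info-dict fields and sorts the missing-key suggestions; the fields themselves come from an extractor test's `info_dict` expectation. An illustrative sketch of the kind of expectation `expect_info_dict` consumes (all concrete values here are invented; the 'md5:', 'startswith:', 're:' and 'mincount:' strings use the operators implemented above):

```python
# Illustrative only: the shape of an extractor test expectation checked by expect_info_dict().
_EXPECTED_INFO_DICT = {
    'id': '12345',
    'ext': 'mp4',
    'title': 'md5:4762c03b9e2b2b47a4b0c41c8a75d9e5',  # hashed instead of stored verbatim
    'description': 'startswith:An example',
    'thumbnail': r're:^https?://.*\.jpg$',
    'tags': 'mincount:1',
    'duration': 123,
}
```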
test/test_InfoExtractor.py
@@ -53,18 +53,6 @@ class TestInfoExtractor(unittest.TestCase):
    def test_ie_key(self):
        self.assertEqual(get_info_extractor(YoutubeIE.ie_key()), YoutubeIE)

-    def test_get_netrc_login_info(self):
-        for params in [
-            {'usenetrc': True, 'netrc_location': './test/testdata/netrc/netrc'},
-            {'netrc_cmd': f'{sys.executable} ./test/testdata/netrc/print_netrc.py'},
-        ]:
-            ie = DummyIE(FakeYDL(params))
-            self.assertEqual(ie._get_netrc_login_info(netrc_machine='normal_use'), ('user', 'pass'))
-            self.assertEqual(ie._get_netrc_login_info(netrc_machine='empty_user'), ('', 'pass'))
-            self.assertEqual(ie._get_netrc_login_info(netrc_machine='empty_pass'), ('user', ''))
-            self.assertEqual(ie._get_netrc_login_info(netrc_machine='both_empty'), ('', ''))
-            self.assertEqual(ie._get_netrc_login_info(netrc_machine='nonexistent'), (None, None))
-
    def test_html_search_regex(self):
        html = '<p id="foo">Watch this <a href="http://www.youtube.com/watch?v=BaW_jenozKc">video</a></p>'
        search = lambda re, *args: self.ie._html_search_regex(re, html, *args)
@@ -314,20 +302,6 @@ class TestInfoExtractor(unittest.TestCase):
                 },
                {},
            ),
-            (
-                # test thumbnail_url key without URL scheme
-                r'''
-                <script type="application/ld+json">
-                {
-                  "@context": "https://schema.org",
-                  "@type": "VideoObject",
-                  "thumbnail_url": "//www.nobelprize.org/images/12693-landscape-medium-gallery.jpg"
-                }</script>''',
-                {
-                    'thumbnails': [{'url': 'https://www.nobelprize.org/images/12693-landscape-medium-gallery.jpg'}],
-                },
-                {},
-            ),
         ]
         for html, expected_dict, search_json_ld_kwargs in _TESTS:
             expect_dict(
@@ -652,7 +626,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
                 'img_bipbop_adv_example_fmp4',
                 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/master.m3u8',
                 [{
-                    # 60kbps (bitrate not provided in m3u8); sorted as worst because it's grouped with lowest bitrate video track
                     'format_id': 'aud1-English',
                     'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/a1/prog_index.m3u8',
                     'manifest_url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/master.m3u8',
@@ -660,19 +633,7 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
                     'ext': 'mp4',
                     'protocol': 'm3u8_native',
                     'audio_ext': 'mp4',
-                    'source_preference': 0,
                 }, {
-                    # 192kbps (bitrate not provided in m3u8)
-                    'format_id': 'aud3-English',
-                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/a3/prog_index.m3u8',
-                    'manifest_url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/master.m3u8',
-                    'language': 'en',
-                    'ext': 'mp4',
-                    'protocol': 'm3u8_native',
-                    'audio_ext': 'mp4',
-                    'source_preference': 1,
-                }, {
-                    # 384kbps (bitrate not provided in m3u8); sorted as best because it's grouped with the highest bitrate video track
                     'format_id': 'aud2-English',
                     'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/a2/prog_index.m3u8',
                     'manifest_url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/master.m3u8',
@@ -680,7 +641,14 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
                     'ext': 'mp4',
                     'protocol': 'm3u8_native',
                     'audio_ext': 'mp4',
-                    'source_preference': 2,
+                }, {
+                    'format_id': 'aud3-English',
+                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/a3/prog_index.m3u8',
+                    'manifest_url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/master.m3u8',
+                    'language': 'en',
+                    'ext': 'mp4',
+                    'protocol': 'm3u8_native',
+                    'audio_ext': 'mp4',
                 }, {
                     'format_id': '530',
                     'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_fmp4/v2/prog_index.m3u8',
@ -1947,137 +1915,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
|
|||||||
with self.assertWarns(DeprecationWarning):
|
with self.assertWarns(DeprecationWarning):
|
||||||
self.assertEqual(self.ie._search_nextjs_data('', None, default='{}'), {})
|
self.assertEqual(self.ie._search_nextjs_data('', None, default='{}'), {})
|
||||||
|
|
||||||
def test_search_nuxt_json(self):
|
|
||||||
HTML_TMPL = '<script data-ssr="true" id="__NUXT_DATA__" type="application/json">[{}]</script>'
|
|
||||||
VALID_DATA = '''
|
|
||||||
["ShallowReactive",1],
|
|
||||||
{"data":2,"state":21,"once":25,"_errors":28,"_server_errors":30},
|
|
||||||
["ShallowReactive",3],
|
|
||||||
{"$abcdef123456":4},
|
|
||||||
{"podcast":5,"activeEpisodeData":7},
|
|
||||||
{"podcast":6,"seasons":14},
|
|
||||||
{"title":10,"id":11},
|
|
||||||
["Reactive",8],
|
|
||||||
{"episode":9,"creators":18,"empty_list":20},
|
|
||||||
{"title":12,"id":13,"refs":34,"empty_refs":35},
|
|
||||||
"Series Title",
|
|
||||||
"podcast-id-01",
|
|
||||||
"Episode Title",
|
|
||||||
"episode-id-99",
|
|
||||||
[15,16,17],
|
|
||||||
1,
|
|
||||||
2,
|
|
||||||
3,
|
|
||||||
[19],
|
|
||||||
"Podcast Creator",
|
|
||||||
[],
|
|
||||||
{"$ssite-config":22},
|
|
||||||
{"env":23,"name":24,"map":26,"numbers":14},
|
|
||||||
"production",
|
|
||||||
"podcast-website",
|
|
||||||
["Set"],
|
|
||||||
["Reactive",27],
|
|
||||||
["Map"],
|
|
||||||
["ShallowReactive",29],
|
|
||||||
{},
|
|
||||||
["NuxtError",31],
|
|
||||||
{"status":32,"message":33},
|
|
||||||
503,
|
|
||||||
"Service Unavailable",
|
|
||||||
[36,37],
|
|
||||||
[38,39],
|
|
||||||
["Ref",40],
|
|
||||||
["ShallowRef",41],
|
|
||||||
["EmptyRef",42],
|
|
||||||
["EmptyShallowRef",43],
|
|
||||||
"ref",
|
|
||||||
"shallow_ref",
|
|
||||||
"{\\"ref\\":1}",
|
|
||||||
"{\\"shallow_ref\\":2}"
|
|
||||||
'''
|
|
||||||
PAYLOAD = {
|
|
||||||
'data': {
|
|
||||||
'$abcdef123456': {
|
|
||||||
'podcast': {
|
|
||||||
'podcast': {
|
|
||||||
'title': 'Series Title',
|
|
||||||
'id': 'podcast-id-01',
|
|
||||||
},
|
|
||||||
'seasons': [1, 2, 3],
|
|
||||||
},
|
|
||||||
'activeEpisodeData': {
|
|
||||||
'episode': {
|
|
||||||
'title': 'Episode Title',
|
|
||||||
'id': 'episode-id-99',
|
|
||||||
'refs': ['ref', 'shallow_ref'],
|
|
||||||
'empty_refs': [{'ref': 1}, {'shallow_ref': 2}],
|
|
||||||
},
|
|
||||||
'creators': ['Podcast Creator'],
|
|
||||||
'empty_list': [],
|
|
||||||
},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
'state': {
|
|
||||||
'$ssite-config': {
|
|
||||||
'env': 'production',
|
|
||||||
'name': 'podcast-website',
|
|
||||||
'map': [],
|
|
||||||
'numbers': [1, 2, 3],
|
|
||||||
},
|
|
||||||
},
|
|
||||||
'once': [],
|
|
||||||
'_errors': {},
|
|
||||||
'_server_errors': {
|
|
||||||
'status': 503,
|
|
||||||
'message': 'Service Unavailable',
|
|
||||||
},
|
|
||||||
}
|
|
||||||
PARTIALLY_INVALID = [(
|
|
||||||
'''
|
|
||||||
{"data":1},
|
|
||||||
{"invalid_raw_list":2},
|
|
||||||
[15,16,17]
|
|
||||||
''',
|
|
||||||
{'data': {'invalid_raw_list': [None, None, None]}},
|
|
||||||
), (
|
|
||||||
'''
|
|
||||||
{"data":1},
|
|
||||||
["EmptyRef",2],
|
|
||||||
"not valid JSON"
|
|
||||||
''',
|
|
||||||
{'data': None},
|
|
||||||
), (
|
|
||||||
'''
|
|
||||||
{"data":1},
|
|
||||||
["EmptyShallowRef",2],
|
|
||||||
"not valid JSON"
|
|
||||||
''',
|
|
||||||
{'data': None},
|
|
||||||
)]
|
|
||||||
INVALID = [
|
|
||||||
'''
|
|
||||||
[]
|
|
||||||
''',
|
|
||||||
'''
|
|
||||||
["unsupported",1],
|
|
||||||
{"data":2},
|
|
||||||
{}
|
|
||||||
''',
|
|
||||||
]
|
|
||||||
DEFAULT = object()
|
|
||||||
|
|
||||||
self.assertEqual(self.ie._search_nuxt_json(HTML_TMPL.format(VALID_DATA), None), PAYLOAD)
|
|
||||||
self.assertEqual(self.ie._search_nuxt_json('', None, fatal=False), {})
|
|
||||||
self.assertIs(self.ie._search_nuxt_json('', None, default=DEFAULT), DEFAULT)
|
|
||||||
|
|
||||||
for data, expected in PARTIALLY_INVALID:
|
|
||||||
self.assertEqual(
|
|
||||||
self.ie._search_nuxt_json(HTML_TMPL.format(data), None, fatal=False), expected)
|
|
||||||
|
|
||||||
for data in INVALID:
|
|
||||||
self.assertIs(
|
|
||||||
self.ie._search_nuxt_json(HTML_TMPL.format(data), None, default=DEFAULT), DEFAULT)
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == '__main__':
|
if __name__ == '__main__':
|
||||||
unittest.main()
|
unittest.main()
|
||||||
@@ -6,8 +6,6 @@ import sys
 import unittest
 from unittest.mock import patch
 
-from yt_dlp.globals import all_plugins_loaded
-
 sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 
 
@@ -17,6 +15,7 @@ import json
 
 from test.helper import FakeYDL, assertRegexpMatches, try_rm
 from yt_dlp import YoutubeDL
+from yt_dlp.compat import compat_os_name
 from yt_dlp.extractor import YoutubeIE
 from yt_dlp.extractor.common import InfoExtractor
 from yt_dlp.postprocessor.common import PostProcessor
@@ -488,11 +487,11 @@ class TestFormatSelection(unittest.TestCase):
 
     def test_format_filtering(self):
         formats = [
-            {'format_id': 'A', 'filesize': 500, 'width': 1000, 'aspect_ratio': 1.0},
-            {'format_id': 'B', 'filesize': 1000, 'width': 500, 'aspect_ratio': 1.33},
-            {'format_id': 'C', 'filesize': 1000, 'width': 400, 'aspect_ratio': 1.5},
-            {'format_id': 'D', 'filesize': 2000, 'width': 600, 'aspect_ratio': 1.78},
-            {'format_id': 'E', 'filesize': 3000, 'aspect_ratio': 0.56},
+            {'format_id': 'A', 'filesize': 500, 'width': 1000},
+            {'format_id': 'B', 'filesize': 1000, 'width': 500},
+            {'format_id': 'C', 'filesize': 1000, 'width': 400},
+            {'format_id': 'D', 'filesize': 2000, 'width': 600},
+            {'format_id': 'E', 'filesize': 3000},
             {'format_id': 'F'},
             {'format_id': 'G', 'filesize': 1000000},
         ]
@@ -551,31 +550,6 @@ class TestFormatSelection(unittest.TestCase):
         ydl.process_ie_result(info_dict)
         self.assertEqual(ydl.downloaded_info_dicts, [])
 
-        ydl = YDL({'format': 'best[aspect_ratio=1]'})
-        ydl.process_ie_result(info_dict)
-        downloaded = ydl.downloaded_info_dicts[0]
-        self.assertEqual(downloaded['format_id'], 'A')
-
-        ydl = YDL({'format': 'all[aspect_ratio > 1.00]'})
-        ydl.process_ie_result(info_dict)
-        downloaded_ids = [info['format_id'] for info in ydl.downloaded_info_dicts]
-        self.assertEqual(downloaded_ids, ['D', 'C', 'B'])
-
-        ydl = YDL({'format': 'all[aspect_ratio < 1.00]'})
-        ydl.process_ie_result(info_dict)
-        downloaded_ids = [info['format_id'] for info in ydl.downloaded_info_dicts]
-        self.assertEqual(downloaded_ids, ['E'])
-
-        ydl = YDL({'format': 'best[aspect_ratio=1.5]'})
-        ydl.process_ie_result(info_dict)
-        downloaded = ydl.downloaded_info_dicts[0]
-        self.assertEqual(downloaded['format_id'], 'C')
-
-        ydl = YDL({'format': 'all[aspect_ratio!=1]'})
-        ydl.process_ie_result(info_dict)
-        downloaded_ids = [info['format_id'] for info in ydl.downloaded_info_dicts]
-        self.assertEqual(downloaded_ids, ['E', 'D', 'C', 'B'])
-
     @patch('yt_dlp.postprocessor.ffmpeg.FFmpegMergerPP.available', False)
     def test_default_format_spec_without_ffmpeg(self):
         ydl = YDL({})
@@ -788,13 +762,6 @@ class TestYoutubeDL(unittest.TestCase):
         test('%(width)06d.%%(ext)s', 'NA.%(ext)s')
         test('%%(width)06d.%(ext)s', '%(width)06d.mp4')
 
-        # Sanitization options
-        test('%(title3)s', (None, 'foo⧸bar⧹test'))
-        test('%(title5)s', (None, 'aei_A'), restrictfilenames=True)
-        test('%(title3)s', (None, 'foo_bar_test'), windowsfilenames=False, restrictfilenames=True)
-        if sys.platform != 'win32':
-            test('%(title3)s', (None, 'foo⧸bar\\test'), windowsfilenames=False)
-
         # ID sanitization
         test('%(id)s', '_abcd', info={'id': '_abcd'})
         test('%(some_id)s', '_abcd', info={'some_id': '_abcd'})
@@ -872,8 +839,8 @@ class TestYoutubeDL(unittest.TestCase):
         test('%(filesize)#D', '1Ki')
         test('%(height)5.2D', ' 1.08k')
         test('%(title4)#S', 'foo_bar_test')
-        test('%(title4).10S', ('foo "bar" ', 'foo "bar"' + ('#' if os.name == 'nt' else ' ')))
-        if os.name == 'nt':
+        test('%(title4).10S', ('foo "bar" ', 'foo "bar"' + ('#' if compat_os_name == 'nt' else ' ')))
+        if compat_os_name == 'nt':
             test('%(title4)q', ('"foo ""bar"" test"', None))
             test('%(formats.:.id)#q', ('"id 1" "id 2" "id 3"', None))
             test('%(formats.0.id)#q', ('"id 1"', None))
@@ -936,9 +903,9 @@ class TestYoutubeDL(unittest.TestCase):
 
         # Environment variable expansion for prepare_filename
         os.environ['__yt_dlp_var'] = 'expanded'
-        envvar = '%__yt_dlp_var%' if os.name == 'nt' else '$__yt_dlp_var'
+        envvar = '%__yt_dlp_var%' if compat_os_name == 'nt' else '$__yt_dlp_var'
         test(envvar, (envvar, 'expanded'))
-        if os.name == 'nt':
+        if compat_os_name == 'nt':
             test('%s%', ('%s%', '%s%'))
             os.environ['s'] = 'expanded'
             test('%s%', ('%s%', 'expanded'))  # %s% should be expanded before escaping %s
@@ -1429,33 +1396,6 @@ class TestYoutubeDL(unittest.TestCase):
             self.assertFalse(result.get('cookies'), msg='Cookies set in cookies field for wrong domain')
             self.assertFalse(ydl.cookiejar.get_cookie_header(fmt['url']), msg='Cookies set in cookiejar for wrong domain')
-
-    def test_load_plugins_compat(self):
-        # Should try to reload plugins if they haven't already been loaded
-        all_plugins_loaded.value = False
-        FakeYDL().close()
-        assert all_plugins_loaded.value
-
-    def test_close_hooks(self):
-        # Should call all registered close hooks on close
-        close_hook_called = False
-        close_hook_two_called = False
-
-        def close_hook():
-            nonlocal close_hook_called
-            close_hook_called = True
-
-        def close_hook_two():
-            nonlocal close_hook_two_called
-            close_hook_two_called = True
-
-        ydl = FakeYDL()
-        ydl.add_close_hook(close_hook)
-        ydl.add_close_hook(close_hook_two)
-
-        ydl.close()
-        self.assertTrue(close_hook_called, 'Close hook was not called')
-        self.assertTrue(close_hook_two_called, 'Close hook two was not called')
 
 
 if __name__ == '__main__':
     unittest.main()
@@ -27,6 +27,7 @@ from yt_dlp.aes import (
     pad_block,
 )
 from yt_dlp.dependencies import Cryptodome
+from yt_dlp.utils import bytes_to_intlist, intlist_to_bytes
 
 # the encrypted data can be generate with 'devscripts/generate_aes_testdata.py'
 
@@ -39,33 +40,33 @@ class TestAES(unittest.TestCase):
     def test_encrypt(self):
         msg = b'message'
         key = list(range(16))
-        encrypted = aes_encrypt(list(msg), key)
-        decrypted = bytes(aes_decrypt(encrypted, key))
+        encrypted = aes_encrypt(bytes_to_intlist(msg), key)
+        decrypted = intlist_to_bytes(aes_decrypt(encrypted, key))
         self.assertEqual(decrypted, msg)
 
     def test_cbc_decrypt(self):
         data = b'\x97\x92+\xe5\x0b\xc3\x18\x91ky9m&\xb3\xb5@\xe6\x27\xc2\x96.\xc8u\x88\xab9-[\x9e|\xf1\xcd'
-        decrypted = bytes(aes_cbc_decrypt(list(data), self.key, self.iv))
+        decrypted = intlist_to_bytes(aes_cbc_decrypt(bytes_to_intlist(data), self.key, self.iv))
         self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
         if Cryptodome.AES:
-            decrypted = aes_cbc_decrypt_bytes(data, bytes(self.key), bytes(self.iv))
+            decrypted = aes_cbc_decrypt_bytes(data, intlist_to_bytes(self.key), intlist_to_bytes(self.iv))
             self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
 
     def test_cbc_encrypt(self):
-        data = list(self.secret_msg)
-        encrypted = bytes(aes_cbc_encrypt(data, self.key, self.iv))
+        data = bytes_to_intlist(self.secret_msg)
+        encrypted = intlist_to_bytes(aes_cbc_encrypt(data, self.key, self.iv))
         self.assertEqual(
             encrypted,
             b'\x97\x92+\xe5\x0b\xc3\x18\x91ky9m&\xb3\xb5@\xe6\'\xc2\x96.\xc8u\x88\xab9-[\x9e|\xf1\xcd')
 
     def test_ctr_decrypt(self):
-        data = list(b'\x03\xc7\xdd\xd4\x8e\xb3\xbc\x1a*O\xdc1\x12+8Aio\xd1z\xb5#\xaf\x08')
-        decrypted = bytes(aes_ctr_decrypt(data, self.key, self.iv))
+        data = bytes_to_intlist(b'\x03\xc7\xdd\xd4\x8e\xb3\xbc\x1a*O\xdc1\x12+8Aio\xd1z\xb5#\xaf\x08')
+        decrypted = intlist_to_bytes(aes_ctr_decrypt(data, self.key, self.iv))
         self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
 
     def test_ctr_encrypt(self):
-        data = list(self.secret_msg)
-        encrypted = bytes(aes_ctr_encrypt(data, self.key, self.iv))
+        data = bytes_to_intlist(self.secret_msg)
+        encrypted = intlist_to_bytes(aes_ctr_encrypt(data, self.key, self.iv))
         self.assertEqual(
             encrypted,
             b'\x03\xc7\xdd\xd4\x8e\xb3\xbc\x1a*O\xdc1\x12+8Aio\xd1z\xb5#\xaf\x08')
@@ -74,59 +75,47 @@ class TestAES(unittest.TestCase):
         data = b'\x159Y\xcf5eud\x90\x9c\x85&]\x14\x1d\x0f.\x08\xb4T\xe4/\x17\xbd'
         authentication_tag = b'\xe8&I\x80rI\x07\x9d}YWuU@:e'
 
-        decrypted = bytes(aes_gcm_decrypt_and_verify(
-            list(data), self.key, list(authentication_tag), self.iv[:12]))
+        decrypted = intlist_to_bytes(aes_gcm_decrypt_and_verify(
+            bytes_to_intlist(data), self.key, bytes_to_intlist(authentication_tag), self.iv[:12]))
         self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
         if Cryptodome.AES:
             decrypted = aes_gcm_decrypt_and_verify_bytes(
-                data, bytes(self.key), authentication_tag, bytes(self.iv[:12]))
+                data, intlist_to_bytes(self.key), authentication_tag, intlist_to_bytes(self.iv[:12]))
             self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
 
-    def test_gcm_aligned_decrypt(self):
-        data = b'\x159Y\xcf5eud\x90\x9c\x85&]\x14\x1d\x0f'
-        authentication_tag = b'\x08\xb1\x9d!&\x98\xd0\xeaRq\x90\xe6;\xb5]\xd8'
-
-        decrypted = bytes(aes_gcm_decrypt_and_verify(
-            list(data), self.key, list(authentication_tag), self.iv[:12]))
-        self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg[:16])
-        if Cryptodome.AES:
-            decrypted = aes_gcm_decrypt_and_verify_bytes(
-                data, bytes(self.key), authentication_tag, bytes(self.iv[:12]))
-            self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg[:16])
-
     def test_decrypt_text(self):
-        password = bytes(self.key).decode()
+        password = intlist_to_bytes(self.key).decode()
         encrypted = base64.b64encode(
-            bytes(self.iv[:8])
+            intlist_to_bytes(self.iv[:8])
             + b'\x17\x15\x93\xab\x8d\x80V\xcdV\xe0\t\xcdo\xc2\xa5\xd8ksM\r\xe27N\xae',
         ).decode()
         decrypted = (aes_decrypt_text(encrypted, password, 16))
         self.assertEqual(decrypted, self.secret_msg)
 
-        password = bytes(self.key).decode()
+        password = intlist_to_bytes(self.key).decode()
         encrypted = base64.b64encode(
-            bytes(self.iv[:8])
+            intlist_to_bytes(self.iv[:8])
             + b'\x0b\xe6\xa4\xd9z\x0e\xb8\xb9\xd0\xd4i_\x85\x1d\x99\x98_\xe5\x80\xe7.\xbf\xa5\x83',
         ).decode()
         decrypted = (aes_decrypt_text(encrypted, password, 32))
         self.assertEqual(decrypted, self.secret_msg)
 
     def test_ecb_encrypt(self):
-        data = list(self.secret_msg)
-        encrypted = bytes(aes_ecb_encrypt(data, self.key))
+        data = bytes_to_intlist(self.secret_msg)
+        encrypted = intlist_to_bytes(aes_ecb_encrypt(data, self.key))
         self.assertEqual(
             encrypted,
             b'\xaa\x86]\x81\x97>\x02\x92\x9d\x1bR[[L/u\xd3&\xd1(h\xde{\x81\x94\xba\x02\xae\xbd\xa6\xd0:')
 
     def test_ecb_decrypt(self):
-        data = list(b'\xaa\x86]\x81\x97>\x02\x92\x9d\x1bR[[L/u\xd3&\xd1(h\xde{\x81\x94\xba\x02\xae\xbd\xa6\xd0:')
-        decrypted = bytes(aes_ecb_decrypt(data, self.key, self.iv))
+        data = bytes_to_intlist(b'\xaa\x86]\x81\x97>\x02\x92\x9d\x1bR[[L/u\xd3&\xd1(h\xde{\x81\x94\xba\x02\xae\xbd\xa6\xd0:')
+        decrypted = intlist_to_bytes(aes_ecb_decrypt(data, self.key, self.iv))
         self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
 
     def test_key_expansion(self):
         key = '4f6bdaa39e2f8cb07f5e722d9edef314'
 
-        self.assertEqual(key_expansion(list(bytearray.fromhex(key))), [
+        self.assertEqual(key_expansion(bytes_to_intlist(bytearray.fromhex(key))), [
             0x4F, 0x6B, 0xDA, 0xA3, 0x9E, 0x2F, 0x8C, 0xB0, 0x7F, 0x5E, 0x72, 0x2D, 0x9E, 0xDE, 0xF3, 0x14,
             0x53, 0x66, 0x20, 0xA8, 0xCD, 0x49, 0xAC, 0x18, 0xB2, 0x17, 0xDE, 0x35, 0x2C, 0xC9, 0x2D, 0x21,
             0x8C, 0xBE, 0xDD, 0xD9, 0x41, 0xF7, 0x71, 0xC1, 0xF3, 0xE0, 0xAF, 0xF4, 0xDF, 0x29, 0x82, 0xD5,
@@ -12,7 +12,12 @@ import struct
 
 from yt_dlp import compat
 from yt_dlp.compat import urllib  # isort: split
-from yt_dlp.compat import compat_etree_fromstring, compat_expanduser
+from yt_dlp.compat import (
+    compat_etree_fromstring,
+    compat_expanduser,
+    compat_urllib_parse_unquote,  # noqa: TID251
+    compat_urllib_parse_urlencode,  # noqa: TID251
+)
 from yt_dlp.compat.urllib.request import getproxies
 
 
@@ -38,6 +43,39 @@ class TestCompat(unittest.TestCase):
         finally:
             os.environ['HOME'] = old_home or ''
 
+    def test_compat_urllib_parse_unquote(self):
+        self.assertEqual(compat_urllib_parse_unquote('abc%20def'), 'abc def')
+        self.assertEqual(compat_urllib_parse_unquote('%7e/abc+def'), '~/abc+def')
+        self.assertEqual(compat_urllib_parse_unquote(''), '')
+        self.assertEqual(compat_urllib_parse_unquote('%'), '%')
+        self.assertEqual(compat_urllib_parse_unquote('%%'), '%%')
+        self.assertEqual(compat_urllib_parse_unquote('%%%'), '%%%')
+        self.assertEqual(compat_urllib_parse_unquote('%2F'), '/')
+        self.assertEqual(compat_urllib_parse_unquote('%2f'), '/')
+        self.assertEqual(compat_urllib_parse_unquote('%E6%B4%A5%E6%B3%A2'), '津波')
+        self.assertEqual(
+            compat_urllib_parse_unquote('''<meta property="og:description" content="%E2%96%81%E2%96%82%E2%96%83%E2%96%84%25%E2%96%85%E2%96%86%E2%96%87%E2%96%88" />
+%<a href="https://ar.wikipedia.org/wiki/%D8%AA%D8%B3%D9%88%D9%86%D8%A7%D9%85%D9%8A">%a'''),
+            '''<meta property="og:description" content="▁▂▃▄%▅▆▇█" />
+%<a href="https://ar.wikipedia.org/wiki/تسونامي">%a''')
+        self.assertEqual(
+            compat_urllib_parse_unquote('''%28%5E%E2%97%A3_%E2%97%A2%5E%29%E3%81%A3%EF%B8%BB%E3%83%87%E2%95%90%E4%B8%80 %E2%87%80 %E2%87%80 %E2%87%80 %E2%87%80 %E2%87%80 %E2%86%B6%I%Break%25Things%'''),
+            '''(^◣_◢^)っ︻デ═一 ⇀ ⇀ ⇀ ⇀ ⇀ ↶%I%Break%Things%''')
+
+    def test_compat_urllib_parse_unquote_plus(self):
+        self.assertEqual(urllib.parse.unquote_plus('abc%20def'), 'abc def')
+        self.assertEqual(urllib.parse.unquote_plus('%7e/abc+def'), '~/abc def')
+
+    def test_compat_urllib_parse_urlencode(self):
+        self.assertEqual(compat_urllib_parse_urlencode({'abc': 'def'}), 'abc=def')
+        self.assertEqual(compat_urllib_parse_urlencode({'abc': b'def'}), 'abc=def')
+        self.assertEqual(compat_urllib_parse_urlencode({b'abc': 'def'}), 'abc=def')
+        self.assertEqual(compat_urllib_parse_urlencode({b'abc': b'def'}), 'abc=def')
+        self.assertEqual(compat_urllib_parse_urlencode([('abc', 'def')]), 'abc=def')
+        self.assertEqual(compat_urllib_parse_urlencode([('abc', b'def')]), 'abc=def')
+        self.assertEqual(compat_urllib_parse_urlencode([(b'abc', 'def')]), 'abc=def')
+        self.assertEqual(compat_urllib_parse_urlencode([(b'abc', b'def')]), 'abc=def')
+
     def test_compat_etree_fromstring(self):
         xml = '''
             <root foo="bar" spam="中文">
|
@ -58,14 +58,6 @@ class TestCookies(unittest.TestCase):
|
|||||||
({'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE3),
|
({'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE3),
|
||||||
({'DESKTOP_SESSION': 'xfce'}, _LinuxDesktopEnvironment.XFCE),
|
({'DESKTOP_SESSION': 'xfce'}, _LinuxDesktopEnvironment.XFCE),
|
||||||
|
|
||||||
({'XDG_CURRENT_DESKTOP': 'my_custom_de', 'DESKTOP_SESSION': 'gnome'}, _LinuxDesktopEnvironment.GNOME),
|
|
||||||
({'XDG_CURRENT_DESKTOP': 'my_custom_de', 'DESKTOP_SESSION': 'mate'}, _LinuxDesktopEnvironment.GNOME),
|
|
||||||
({'XDG_CURRENT_DESKTOP': 'my_custom_de', 'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),
|
|
||||||
({'XDG_CURRENT_DESKTOP': 'my_custom_de', 'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE3),
|
|
||||||
({'XDG_CURRENT_DESKTOP': 'my_custom_de', 'DESKTOP_SESSION': 'xfce'}, _LinuxDesktopEnvironment.XFCE),
|
|
||||||
|
|
||||||
({'XDG_CURRENT_DESKTOP': 'my_custom_de', 'DESKTOP_SESSION': 'my_custom_de', 'GNOME_DESKTOP_SESSION_ID': 1}, _LinuxDesktopEnvironment.GNOME),
|
|
||||||
|
|
||||||
({'GNOME_DESKTOP_SESSION_ID': 1}, _LinuxDesktopEnvironment.GNOME),
|
({'GNOME_DESKTOP_SESSION_ID': 1}, _LinuxDesktopEnvironment.GNOME),
|
||||||
({'KDE_FULL_SESSION': 1}, _LinuxDesktopEnvironment.KDE3),
|
({'KDE_FULL_SESSION': 1}, _LinuxDesktopEnvironment.KDE3),
|
||||||
({'KDE_FULL_SESSION': 1, 'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),
|
({'KDE_FULL_SESSION': 1, 'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),
|
||||||
@ -113,13 +105,6 @@ class TestCookies(unittest.TestCase):
|
|||||||
decryptor = LinuxChromeCookieDecryptor('Chrome', Logger())
|
decryptor = LinuxChromeCookieDecryptor('Chrome', Logger())
|
||||||
self.assertEqual(decryptor.decrypt(encrypted_value), value)
|
self.assertEqual(decryptor.decrypt(encrypted_value), value)
|
||||||
|
|
||||||
def test_chrome_cookie_decryptor_linux_v10_meta24(self):
|
|
||||||
with MonkeyPatch(cookies, {'_get_linux_keyring_password': lambda *args, **kwargs: b''}):
|
|
||||||
encrypted_value = b'v10\x1f\xe4\x0e[\x83\x0c\xcc*kPi \xce\x8d\x1d\xbb\x80\r\x11\t\xbb\x9e^Hy\x94\xf4\x963\x9f\x82\xba\xfe\xa1\xed\xb9\xf1)\x00710\x92\xc8/<\x96B'
|
|
||||||
value = 'DE'
|
|
||||||
decryptor = LinuxChromeCookieDecryptor('Chrome', Logger(), meta_version=24)
|
|
||||||
self.assertEqual(decryptor.decrypt(encrypted_value), value)
|
|
||||||
|
|
||||||
def test_chrome_cookie_decryptor_windows_v10(self):
|
def test_chrome_cookie_decryptor_windows_v10(self):
|
||||||
with MonkeyPatch(cookies, {
|
with MonkeyPatch(cookies, {
|
||||||
'_get_windows_v10_key': lambda *args, **kwargs: b'Y\xef\xad\xad\xeerp\xf0Y\xe6\x9b\x12\xc2<z\x16]\n\xbb\xb8\xcb\xd7\x9bA\xc3\x14e\x99{\xd6\xf4&',
|
'_get_windows_v10_key': lambda *args, **kwargs: b'Y\xef\xad\xad\xeerp\xf0Y\xe6\x9b\x12\xc2<z\x16]\n\xbb\xb8\xcb\xd7\x9bA\xc3\x14e\x99{\xd6\xf4&',
|
||||||
@ -129,15 +114,6 @@ class TestCookies(unittest.TestCase):
|
|||||||
decryptor = WindowsChromeCookieDecryptor('', Logger())
|
decryptor = WindowsChromeCookieDecryptor('', Logger())
|
||||||
self.assertEqual(decryptor.decrypt(encrypted_value), value)
|
self.assertEqual(decryptor.decrypt(encrypted_value), value)
|
||||||
|
|
||||||
def test_chrome_cookie_decryptor_windows_v10_meta24(self):
|
|
||||||
with MonkeyPatch(cookies, {
|
|
||||||
'_get_windows_v10_key': lambda *args, **kwargs: b'\xea\x8b\x02\xc3\xc6\xc5\x99\xc3\xa3[ j\xfa\xf6\xfcU\xac\x13u\xdc\x0c\x0e\xf1\x03\x90\xb6\xdf\xbb\x8fL\xb1\xb2',
|
|
||||||
}):
|
|
||||||
encrypted_value = b'v10dN\xe1\xacy\x84^\xe1I\xact\x03r\xfb\xe2\xce{^\x0e<(\xb0y\xeb\x01\xfb@"\x9e\x8c\xa53~\xdb*\x8f\xac\x8b\xe3\xfd3\x06\xe5\x93\x19OyOG\xb2\xfb\x1d$\xc0\xda\x13j\x9e\xfe\xc5\xa3\xa8\xfe\xd9'
|
|
||||||
value = '1234'
|
|
||||||
decryptor = WindowsChromeCookieDecryptor('', Logger(), meta_version=24)
|
|
||||||
self.assertEqual(decryptor.decrypt(encrypted_value), value)
|
|
||||||
|
|
||||||
def test_chrome_cookie_decryptor_mac_v10(self):
|
def test_chrome_cookie_decryptor_mac_v10(self):
|
||||||
with MonkeyPatch(cookies, {'_get_mac_keyring_password': lambda *args, **kwargs: b'6eIDUdtKAacvlHwBVwvg/Q=='}):
|
with MonkeyPatch(cookies, {'_get_mac_keyring_password': lambda *args, **kwargs: b'6eIDUdtKAacvlHwBVwvg/Q=='}):
|
||||||
encrypted_value = b'v10\xb3\xbe\xad\xa1[\x9fC\xa1\x98\xe0\x9a\x01\xd9\xcf\xbfc'
|
encrypted_value = b'v10\xb3\xbe\xad\xa1[\x9fC\xa1\x98\xe0\x9a\x01\xd9\xcf\xbfc'
|
||||||
|
@ -1,235 +0,0 @@
|
|||||||
#!/usr/bin/env python3
|
|
||||||
|
|
||||||
# Allow direct execution
|
|
||||||
import os
|
|
||||||
import sys
|
|
||||||
|
|
||||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
|
||||||
|
|
||||||
|
|
||||||
import datetime as dt
|
|
||||||
import json
|
|
||||||
import math
|
|
||||||
import re
|
|
||||||
import unittest
|
|
||||||
|
|
||||||
from yt_dlp.utils.jslib import devalue
|
|
||||||
|
|
||||||
|
|
||||||
TEST_CASES_EQUALS = [{
|
|
||||||
'name': 'int',
|
|
||||||
'unparsed': [-42],
|
|
||||||
'parsed': -42,
|
|
||||||
}, {
|
|
||||||
'name': 'str',
|
|
||||||
'unparsed': ['woo!!!'],
|
|
||||||
'parsed': 'woo!!!',
|
|
||||||
}, {
|
|
||||||
'name': 'Number',
|
|
||||||
'unparsed': [['Object', 42]],
|
|
||||||
'parsed': 42,
|
|
||||||
}, {
|
|
||||||
'name': 'String',
|
|
||||||
'unparsed': [['Object', 'yar']],
|
|
||||||
'parsed': 'yar',
|
|
||||||
}, {
|
|
||||||
'name': 'Infinity',
|
|
||||||
'unparsed': -4,
|
|
||||||
'parsed': math.inf,
|
|
||||||
}, {
|
|
||||||
'name': 'negative Infinity',
|
|
||||||
'unparsed': -5,
|
|
||||||
'parsed': -math.inf,
|
|
||||||
}, {
|
|
||||||
'name': 'negative zero',
|
|
||||||
'unparsed': -6,
|
|
||||||
'parsed': -0.0,
|
|
||||||
}, {
|
|
||||||
'name': 'RegExp',
|
|
||||||
'unparsed': [['RegExp', 'regexp', 'gim']], # XXX: flags are ignored
|
|
||||||
'parsed': re.compile('regexp'),
|
|
||||||
}, {
|
|
||||||
'name': 'Date',
|
|
||||||
'unparsed': [['Date', '2001-09-09T01:46:40.000Z']],
|
|
||||||
'parsed': dt.datetime.fromtimestamp(1e9, tz=dt.timezone.utc),
|
|
||||||
}, {
|
|
||||||
'name': 'Array',
|
|
||||||
'unparsed': [[1, 2, 3], 'a', 'b', 'c'],
|
|
||||||
'parsed': ['a', 'b', 'c'],
|
|
||||||
}, {
|
|
||||||
'name': 'Array (empty)',
|
|
||||||
'unparsed': [[]],
|
|
||||||
'parsed': [],
|
|
||||||
}, {
|
|
||||||
'name': 'Array (sparse)',
|
|
||||||
'unparsed': [[-2, 1, -2], 'b'],
|
|
||||||
'parsed': [None, 'b', None],
|
|
||||||
}, {
|
|
||||||
'name': 'Object',
|
|
||||||
'unparsed': [{'foo': 1, 'x-y': 2}, 'bar', 'z'],
|
|
||||||
'parsed': {'foo': 'bar', 'x-y': 'z'},
|
|
||||||
}, {
|
|
||||||
'name': 'Set',
|
|
||||||
'unparsed': [['Set', 1, 2, 3], 1, 2, 3],
|
|
||||||
'parsed': [1, 2, 3],
|
|
||||||
}, {
|
|
||||||
'name': 'Map',
|
|
||||||
'unparsed': [['Map', 1, 2], 'a', 'b'],
|
|
||||||
'parsed': [['a', 'b']],
|
|
||||||
}, {
|
|
||||||
'name': 'BigInt',
|
|
||||||
'unparsed': [['BigInt', '1']],
|
|
||||||
'parsed': 1,
|
|
||||||
}, {
|
|
||||||
'name': 'Uint8Array',
|
|
||||||
'unparsed': [['Uint8Array', 'AQID']],
|
|
||||||
'parsed': [1, 2, 3],
|
|
||||||
}, {
|
|
||||||
'name': 'ArrayBuffer',
|
|
||||||
'unparsed': [['ArrayBuffer', 'AQID']],
|
|
||||||
'parsed': [1, 2, 3],
|
|
||||||
}, {
|
|
||||||
'name': 'str (repetition)',
|
|
||||||
'unparsed': [[1, 1], 'a string'],
|
|
||||||
'parsed': ['a string', 'a string'],
|
|
||||||
}, {
|
|
||||||
'name': 'None (repetition)',
|
|
||||||
'unparsed': [[1, 1], None],
|
|
||||||
'parsed': [None, None],
|
|
||||||
}, {
|
|
||||||
'name': 'dict (repetition)',
|
|
||||||
'unparsed': [[1, 1], {}],
|
|
||||||
'parsed': [{}, {}],
|
|
||||||
}, {
|
|
||||||
'name': 'Object without prototype',
|
|
||||||
'unparsed': [['null']],
|
|
||||||
'parsed': {},
|
|
||||||
}, {
|
|
||||||
'name': 'cross-realm POJO',
|
|
||||||
'unparsed': [{}],
|
|
||||||
'parsed': {},
|
|
||||||
}]
|
|
||||||
|
|
||||||
TEST_CASES_IS = [{
|
|
||||||
'name': 'bool',
|
|
||||||
'unparsed': [True],
|
|
||||||
'parsed': True,
|
|
||||||
}, {
|
|
||||||
'name': 'Boolean',
|
|
||||||
'unparsed': [['Object', False]],
|
|
||||||
'parsed': False,
|
|
||||||
}, {
|
|
||||||
'name': 'undefined',
|
|
||||||
'unparsed': -1,
|
|
||||||
'parsed': None,
|
|
||||||
}, {
|
|
||||||
'name': 'null',
|
|
||||||
'unparsed': [None],
|
|
||||||
'parsed': None,
|
|
||||||
}, {
|
|
||||||
'name': 'NaN',
|
|
||||||
'unparsed': -3,
|
|
||||||
'parsed': math.nan,
|
|
||||||
}]
|
|
||||||
|
|
||||||
TEST_CASES_INVALID = [{
|
|
||||||
'name': 'empty string',
|
|
||||||
'unparsed': '',
|
|
||||||
'error': ValueError,
|
|
||||||
'pattern': r'expected int or list as input',
|
|
||||||
}, {
|
|
||||||
'name': 'hole',
|
|
||||||
'unparsed': -2,
|
|
||||||
'error': ValueError,
|
|
||||||
'pattern': r'invalid integer input',
|
|
||||||
}, {
|
|
||||||
'name': 'string',
|
|
||||||
'unparsed': 'hello',
|
|
||||||
'error': ValueError,
|
|
||||||
'pattern': r'expected int or list as input',
|
|
||||||
}, {
|
|
||||||
'name': 'number',
|
|
||||||
'unparsed': 42,
|
|
||||||
'error': ValueError,
|
|
||||||
'pattern': r'invalid integer input',
|
|
||||||
}, {
|
|
||||||
'name': 'boolean',
|
|
||||||
'unparsed': True,
|
|
||||||
'error': ValueError,
|
|
||||||
'pattern': r'expected int or list as input',
|
|
||||||
}, {
|
|
||||||
'name': 'null',
|
|
||||||
'unparsed': None,
|
|
||||||
'error': ValueError,
|
|
||||||
'pattern': r'expected int or list as input',
|
|
||||||
}, {
|
|
||||||
'name': 'object',
|
|
||||||
'unparsed': {},
|
|
||||||
'error': ValueError,
|
|
||||||
'pattern': r'expected int or list as input',
|
|
||||||
}, {
|
|
||||||
'name': 'empty array',
|
|
||||||
'unparsed': [],
|
|
||||||
'error': ValueError,
|
|
||||||
'pattern': r'expected a non-empty list as input',
|
|
||||||
}, {
|
|
||||||
'name': 'Python negative indexing',
|
|
||||||
'unparsed': [[1, 2, 3, 4, 5, 6, 7, -7], 1, 2, 3, 4, 5, 6, 7],
|
|
||||||
'error': IndexError,
|
|
||||||
'pattern': r'invalid index: -7',
|
|
||||||
}]
|
|
||||||
|
|
||||||
|
|
||||||
class TestDevalue(unittest.TestCase):
|
|
||||||
def test_devalue_parse_equals(self):
|
|
||||||
for tc in TEST_CASES_EQUALS:
|
|
||||||
self.assertEqual(devalue.parse(tc['unparsed']), tc['parsed'], tc['name'])
|
|
||||||
|
|
||||||
def test_devalue_parse_is(self):
|
|
||||||
for tc in TEST_CASES_IS:
|
|
||||||
self.assertIs(devalue.parse(tc['unparsed']), tc['parsed'], tc['name'])
|
|
||||||
|
|
||||||
def test_devalue_parse_invalid(self):
|
|
||||||
for tc in TEST_CASES_INVALID:
|
|
||||||
with self.assertRaisesRegex(tc['error'], tc['pattern'], msg=tc['name']):
|
|
||||||
devalue.parse(tc['unparsed'])
|
|
||||||
|
|
||||||
def test_devalue_parse_cyclical(self):
|
|
||||||
name = 'Map (cyclical)'
|
|
||||||
result = devalue.parse([['Map', 1, 0], 'self'])
|
|
||||||
self.assertEqual(result[0][0], 'self', name)
|
|
||||||
self.assertIs(result, result[0][1], name)
|
|
||||||
|
|
||||||
name = 'Set (cyclical)'
|
|
||||||
result = devalue.parse([['Set', 0, 1], 42])
|
|
||||||
self.assertEqual(result[1], 42, name)
|
|
||||||
self.assertIs(result, result[0], name)
|
|
||||||
|
|
||||||
result = devalue.parse([[0]])
|
|
||||||
self.assertIs(result, result[0], 'Array (cyclical)')
|
|
||||||
|
|
||||||
name = 'Object (cyclical)'
|
|
||||||
result = devalue.parse([{'self': 0}])
|
|
||||||
self.assertIs(result, result['self'], name)
|
|
||||||
|
|
||||||
name = 'Object with null prototype (cyclical)'
|
|
||||||
result = devalue.parse([['null', 'self', 0]])
|
|
||||||
self.assertIs(result, result['self'], name)
|
|
||||||
|
|
||||||
name = 'Objects (cyclical)'
|
|
||||||
result = devalue.parse([[1, 2], {'second': 2}, {'first': 1}])
|
|
||||||
self.assertIs(result[0], result[1]['first'], name)
|
|
||||||
self.assertIs(result[1], result[0]['second'], name)
|
|
||||||
|
|
||||||
def test_devalue_parse_revivers(self):
|
|
||||||
self.assertEqual(
|
|
||||||
devalue.parse([['indirect', 1], {'a': 2}, 'b'], revivers={'indirect': lambda x: x}),
|
|
||||||
{'a': 'b'}, 'revivers (indirect)')
|
|
||||||
|
|
||||||
self.assertEqual(
|
|
||||||
devalue.parse([['parse', 1], '{"a":0}'], revivers={'parse': lambda x: json.loads(x)}),
|
|
||||||
{'a': 0}, 'revivers (parse)')
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == '__main__':
|
|
||||||
unittest.main()
|
|
@ -15,6 +15,7 @@ import threading
|
|||||||
from test.helper import http_server_port, try_rm
|
from test.helper import http_server_port, try_rm
|
||||||
from yt_dlp import YoutubeDL
|
from yt_dlp import YoutubeDL
|
||||||
from yt_dlp.downloader.http import HttpFD
|
from yt_dlp.downloader.http import HttpFD
|
||||||
|
from yt_dlp.utils import encodeFilename
|
||||||
from yt_dlp.utils._utils import _YDLLogger as FakeLogger
|
from yt_dlp.utils._utils import _YDLLogger as FakeLogger
|
||||||
|
|
||||||
TEST_DIR = os.path.dirname(os.path.abspath(__file__))
|
TEST_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||||
@ -81,12 +82,12 @@ class TestHttpFD(unittest.TestCase):
|
|||||||
ydl = YoutubeDL(params)
|
ydl = YoutubeDL(params)
|
||||||
downloader = HttpFD(ydl, params)
|
downloader = HttpFD(ydl, params)
|
||||||
filename = 'testfile.mp4'
|
filename = 'testfile.mp4'
|
||||||
try_rm(filename)
|
try_rm(encodeFilename(filename))
|
||||||
self.assertTrue(downloader.real_download(filename, {
|
self.assertTrue(downloader.real_download(filename, {
|
||||||
'url': f'http://127.0.0.1:{self.port}/{ep}',
|
'url': f'http://127.0.0.1:{self.port}/{ep}',
|
||||||
}), ep)
|
}), ep)
|
||||||
self.assertEqual(os.path.getsize(filename), TEST_SIZE, ep)
|
self.assertEqual(os.path.getsize(encodeFilename(filename)), TEST_SIZE, ep)
|
||||||
try_rm(filename)
|
try_rm(encodeFilename(filename))
|
||||||
|
|
||||||
def download_all(self, params):
|
def download_all(self, params):
|
||||||
for ep in ('regular', 'no-content-length', 'no-range', 'no-range-no-content-length'):
|
for ep in ('regular', 'no-content-length', 'no-range', 'no-range-no-content-length'):
|
||||||
|
@ -331,6 +331,10 @@ class TestHTTPConnectProxy:
|
|||||||
assert proxy_info['proxy'] == server_address
|
assert proxy_info['proxy'] == server_address
|
||||||
assert 'Proxy-Authorization' in proxy_info['headers']
|
assert 'Proxy-Authorization' in proxy_info['headers']
|
||||||
|
|
||||||
|
@pytest.mark.skip_handler(
|
||||||
|
'Requests',
|
||||||
|
'bug in urllib3 causes unclosed socket: https://github.com/urllib3/urllib3/issues/3374',
|
||||||
|
)
|
||||||
def test_http_connect_bad_auth(self, handler, ctx):
|
def test_http_connect_bad_auth(self, handler, ctx):
|
||||||
with ctx.http_server(HTTPConnectProxyHandler, username='test', password='test') as server_address:
|
with ctx.http_server(HTTPConnectProxyHandler, username='test', password='test') as server_address:
|
||||||
with handler(verify=False, proxies={ctx.REQUEST_PROTO: f'http://test:bad@{server_address}'}) as rh:
|
with handler(verify=False, proxies={ctx.REQUEST_PROTO: f'http://test:bad@{server_address}'}) as rh:
|
||||||
|
@ -9,7 +9,7 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
|||||||
|
|
||||||
import math
|
import math
|
||||||
|
|
||||||
from yt_dlp.jsinterp import JS_Undefined, JSInterpreter, js_number_to_string
|
from yt_dlp.jsinterp import JS_Undefined, JSInterpreter
|
||||||
|
|
||||||
|
|
||||||
class NaN:
|
class NaN:
|
||||||
@ -93,16 +93,6 @@ class TestJSInterpreter(unittest.TestCase):
|
|||||||
self._test('function f(){return 0 ?? 42;}', 0)
|
self._test('function f(){return 0 ?? 42;}', 0)
|
||||||
self._test('function f(){return "life, the universe and everything" < 42;}', False)
|
self._test('function f(){return "life, the universe and everything" < 42;}', False)
|
||||||
self._test('function f(){return 0 - 7 * - 6;}', 42)
|
self._test('function f(){return 0 - 7 * - 6;}', 42)
|
||||||
self._test('function f(){return true << "5";}', 32)
|
|
||||||
self._test('function f(){return true << true;}', 2)
|
|
||||||
self._test('function f(){return "19" & "21.9";}', 17)
|
|
||||||
self._test('function f(){return "19" & false;}', 0)
|
|
||||||
self._test('function f(){return "11.0" >> "2.1";}', 2)
|
|
||||||
self._test('function f(){return 5 ^ 9;}', 12)
|
|
||||||
self._test('function f(){return 0.0 << NaN}', 0)
|
|
||||||
self._test('function f(){return null << undefined}', 0)
|
|
||||||
# TODO: Does not work due to number too large
|
|
||||||
# self._test('function f(){return 21 << 4294967297}', 42)
|
|
||||||
|
|
||||||
def test_array_access(self):
|
def test_array_access(self):
|
||||||
self._test('function f(){var x = [1,2,3]; x[0] = 4; x[0] = 5; x[2.0] = 7; return x;}', [5, 2, 7])
|
self._test('function f(){var x = [1,2,3]; x[0] = 4; x[0] = 5; x[2.0] = 7; return x;}', [5, 2, 7])
|
||||||
@ -118,7 +108,6 @@ class TestJSInterpreter(unittest.TestCase):
|
|||||||
self._test('function f(){var x = 20; x = 30 + 1; return x;}', 31)
|
self._test('function f(){var x = 20; x = 30 + 1; return x;}', 31)
|
||||||
self._test('function f(){var x = 20; x += 30 + 1; return x;}', 51)
|
self._test('function f(){var x = 20; x += 30 + 1; return x;}', 51)
|
||||||
self._test('function f(){var x = 20; x -= 30 + 1; return x;}', -11)
|
self._test('function f(){var x = 20; x -= 30 + 1; return x;}', -11)
|
||||||
self._test('function f(){var x = 2; var y = ["a", "b"]; y[x%y["length"]]="z"; return y}', ['z', 'b'])
|
|
||||||
|
|
||||||
@unittest.skip('Not implemented')
|
@unittest.skip('Not implemented')
|
||||||
def test_comments(self):
|
def test_comments(self):
|
||||||
@ -385,7 +374,7 @@ class TestJSInterpreter(unittest.TestCase):
|
|||||||
@unittest.skip('Not implemented')
|
@unittest.skip('Not implemented')
|
||||||
def test_packed(self):
|
def test_packed(self):
|
||||||
jsi = JSInterpreter('''function f(p,a,c,k,e,d){while(c--)if(k[c])p=p.replace(new RegExp('\\b'+c.toString(a)+'\\b','g'),k[c]);return p}''')
|
jsi = JSInterpreter('''function f(p,a,c,k,e,d){while(c--)if(k[c])p=p.replace(new RegExp('\\b'+c.toString(a)+'\\b','g'),k[c]);return p}''')
|
||||||
self.assertEqual(jsi.call_function('f', '''h 7=g("1j");7.7h({7g:[{33:"w://7f-7e-7d-7c.v.7b/7a/79/78/77/76.74?t=73&s=2s&e=72&f=2t&71=70.0.0.1&6z=6y&6x=6w"}],6v:"w://32.v.u/6u.31",16:"r%",15:"r%",6t:"6s",6r:"",6q:"l",6p:"l",6o:"6n",6m:\'6l\',6k:"6j",9:[{33:"/2u?b=6i&n=50&6h=w://32.v.u/6g.31",6f:"6e"}],1y:{6d:1,6c:\'#6b\',6a:\'#69\',68:"67",66:30,65:r,},"64":{63:"%62 2m%m%61%5z%5y%5x.u%5w%5v%5u.2y%22 2k%m%1o%22 5t%m%1o%22 5s%m%1o%22 2j%m%5r%22 16%m%5q%22 15%m%5p%22 5o%2z%5n%5m%2z",5l:"w://v.u/d/1k/5k.2y",5j:[]},\'5i\':{"5h":"5g"},5f:"5e",5d:"w://v.u",5c:{},5b:l,1x:[0.25,0.50,0.75,1,1.25,1.5,2]});h 1m,1n,5a;h 59=0,58=0;h 7=g("1j");h 2x=0,57=0,56=0;$.55({54:{\'53-52\':\'2i-51\'}});7.j(\'4z\',6(x){c(5>0&&x.1l>=5&&1n!=1){1n=1;$(\'q.4y\').4x(\'4w\')}});7.j(\'13\',6(x){2x=x.1l});7.j(\'2g\',6(x){2w(x)});7.j(\'4v\',6(){$(\'q.2v\').4u()});6 2w(x){$(\'q.2v\').4t();c(1m)19;1m=1;17=0;c(4s.4r===l){17=1}$.4q(\'/2u?b=4p&2l=1k&4o=2t-4n-4m-2s-4l&4k=&4j=&4i=&17=\'+17,6(2r){$(\'#4h\').4g(2r)});$(\'.3-8-4f-4e:4d("4c")\').2h(6(e){2q();g().4b(0);g().4a(l)});6 2q(){h $14=$("<q />").2p({1l:"49",16:"r%",15:"r%",48:0,2n:0,2o:47,46:"45(10%, 10%, 10%, 0.4)","44-43":"42"});$("<41 />").2p({16:"60%",15:"60%",2o:40,"3z-2n":"3y"}).3x({\'2m\':\'/?b=3w&2l=1k\',\'2k\':\'0\',\'2j\':\'2i\'}).2f($14);$14.2h(6(){$(3v).3u();g().2g()});$14.2f($(\'#1j\'))}g().13(0);}6 3t(){h 9=7.1b(2e);2d.2c(9);c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==2e){2d.2c(\'!!=\'+i);7.1p(i)}}}}7.j(\'3s\',6(){g().1h("/2a/3r.29","3q 10 28",6(){g().13(g().27()+10)},"2b");$("q[26=2b]").23().21(\'.3-20-1z\');g().1h("/2a/3p.29","3o 10 28",6(){h 12=g().27()-10;c(12<0)12=0;g().13(12)},"24");$("q[26=24]").23().21(\'.3-20-1z\');});6 1i(){}7.j(\'3n\',6(){1i()});7.j(\'3m\',6(){1i()});7.j("k",6(y){h 9=7.1b();c(9.n<2)19;$(\'.3-8-3l-3k\').3j(6(){$(\'#3-8-a-k\').1e(\'3-8-a-z\');$(\'.3-a-k\').p(\'o-1f\',\'11\')});7.1h("/3i/3h.3g","3f 3e",6(){$(\'.3-1w\').3d(\'3-8-1v\');$(\'.3-8-1y, .3-8-1x\').p(\'o-1g\',\'11\');c($(\'.3-1w\').3c(\'3-8-1v\')){$(\'.3-a-k\').p(\'o-1g\',\'l\');$(\'.3-a-k\').p(\'o-1f\',\'l\');$(\'.3-8-a\').1e(\'3-8-a-z\');$(\'.3-8-a:1u\').3b(\'3-8-a-z\')}3a{$(\'.3-a-k\').p(\'o-1g\',\'11\');$(\'.3-a-k\').p(\'o-1f\',\'11\');$(\'.3-8-a:1u\').1e(\'3-8-a-z\')}},"39");7.j("38",6(y){1d.37(\'1c\',y.9[y.36].1a)});c(1d.1t(\'1c\')){35("1s(1d.1t(\'1c\'));",34)}});h 18;6 1s(1q){h 
9=7.1b();c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==1q){c(i==18){19}18=i;7.1p(i)}}}}',36,270,'|||jw|||function|player|settings|tracks|submenu||if||||jwplayer|var||on|audioTracks|true|3D|length|aria|attr|div|100|||sx|filemoon|https||event|active||false|tt|seek|dd|height|width|adb|current_audio|return|name|getAudioTracks|default_audio|localStorage|removeClass|expanded|checked|addButton|callMeMaybe|vplayer|0fxcyc2ajhp1|position|vvplay|vvad|220|setCurrentAudioTrack|audio_name|for|audio_set|getItem|last|open|controls|playbackRates|captions|rewind|icon|insertAfter||detach|ff00||button|getPosition|sec|png|player8|ff11|log|console|track_name|appendTo|play|click|no|scrolling|frameborder|file_code|src|top|zIndex|css|showCCform|data|1662367683|383371|dl|video_ad|doPlay|prevt|mp4|3E||jpg|thumbs|file|300|setTimeout|currentTrack|setItem|audioTrackChanged|dualSound|else|addClass|hasClass|toggleClass|Track|Audio|svg|dualy|images|mousedown|buttons|topbar|playAttemptFailed|beforePlay|Rewind|fr|Forward|ff|ready|set_audio_track|remove|this|upload_srt|prop|50px|margin|1000001|iframe|center|align|text|rgba|background|1000000|left|absolute|pause|setCurrentCaptions|Upload|contains|item|content|html|fviews|referer|prem|embed|3e57249ef633e0d03bf76ceb8d8a4b65|216|83|hash|view|get|TokenZir|window|hide|show|complete|slow|fadeIn|video_ad_fadein|time||cache|Cache|Content|headers|ajaxSetup|v2done|tott|vastdone2|vastdone1|vvbefore|playbackRateControls|cast|aboutlink|FileMoon|abouttext|UHD|1870|qualityLabels|sites|GNOME_POWER|link|2Fiframe|3C|allowfullscreen|22360|22640|22no|marginheight|marginwidth|2FGNOME_POWER|2F0fxcyc2ajhp1|2Fe|2Ffilemoon|2F|3A||22https|3Ciframe|code|sharing|fontOpacity|backgroundOpacity|Tahoma|fontFamily|303030|backgroundColor|FFFFFF|color|userFontScale|thumbnails|kind|0fxcyc2ajhp10000|url|get_slides|start|startparam|none|preload|html5|primary|hlshtml|androidhls|duration|uniform|stretching|0fxcyc2ajhp1_xt|image|2048|sp|6871|asn|127|srv|43200|_g3XlBcu2lmD9oDexD2NLWSmah2Nu3XcDrl93m9PwXY|m3u8||master|0fxcyc2ajhp1_x|00076|01|hls2|to|s01|delivery|storage|moon|sources|setup'''.split('|'))) # noqa: SIM905
|
self.assertEqual(jsi.call_function('f', '''h 7=g("1j");7.7h({7g:[{33:"w://7f-7e-7d-7c.v.7b/7a/79/78/77/76.74?t=73&s=2s&e=72&f=2t&71=70.0.0.1&6z=6y&6x=6w"}],6v:"w://32.v.u/6u.31",16:"r%",15:"r%",6t:"6s",6r:"",6q:"l",6p:"l",6o:"6n",6m:\'6l\',6k:"6j",9:[{33:"/2u?b=6i&n=50&6h=w://32.v.u/6g.31",6f:"6e"}],1y:{6d:1,6c:\'#6b\',6a:\'#69\',68:"67",66:30,65:r,},"64":{63:"%62 2m%m%61%5z%5y%5x.u%5w%5v%5u.2y%22 2k%m%1o%22 5t%m%1o%22 5s%m%1o%22 2j%m%5r%22 16%m%5q%22 15%m%5p%22 5o%2z%5n%5m%2z",5l:"w://v.u/d/1k/5k.2y",5j:[]},\'5i\':{"5h":"5g"},5f:"5e",5d:"w://v.u",5c:{},5b:l,1x:[0.25,0.50,0.75,1,1.25,1.5,2]});h 1m,1n,5a;h 59=0,58=0;h 7=g("1j");h 2x=0,57=0,56=0;$.55({54:{\'53-52\':\'2i-51\'}});7.j(\'4z\',6(x){c(5>0&&x.1l>=5&&1n!=1){1n=1;$(\'q.4y\').4x(\'4w\')}});7.j(\'13\',6(x){2x=x.1l});7.j(\'2g\',6(x){2w(x)});7.j(\'4v\',6(){$(\'q.2v\').4u()});6 2w(x){$(\'q.2v\').4t();c(1m)19;1m=1;17=0;c(4s.4r===l){17=1}$.4q(\'/2u?b=4p&2l=1k&4o=2t-4n-4m-2s-4l&4k=&4j=&4i=&17=\'+17,6(2r){$(\'#4h\').4g(2r)});$(\'.3-8-4f-4e:4d("4c")\').2h(6(e){2q();g().4b(0);g().4a(l)});6 2q(){h $14=$("<q />").2p({1l:"49",16:"r%",15:"r%",48:0,2n:0,2o:47,46:"45(10%, 10%, 10%, 0.4)","44-43":"42"});$("<41 />").2p({16:"60%",15:"60%",2o:40,"3z-2n":"3y"}).3x({\'2m\':\'/?b=3w&2l=1k\',\'2k\':\'0\',\'2j\':\'2i\'}).2f($14);$14.2h(6(){$(3v).3u();g().2g()});$14.2f($(\'#1j\'))}g().13(0);}6 3t(){h 9=7.1b(2e);2d.2c(9);c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==2e){2d.2c(\'!!=\'+i);7.1p(i)}}}}7.j(\'3s\',6(){g().1h("/2a/3r.29","3q 10 28",6(){g().13(g().27()+10)},"2b");$("q[26=2b]").23().21(\'.3-20-1z\');g().1h("/2a/3p.29","3o 10 28",6(){h 12=g().27()-10;c(12<0)12=0;g().13(12)},"24");$("q[26=24]").23().21(\'.3-20-1z\');});6 1i(){}7.j(\'3n\',6(){1i()});7.j(\'3m\',6(){1i()});7.j("k",6(y){h 9=7.1b();c(9.n<2)19;$(\'.3-8-3l-3k\').3j(6(){$(\'#3-8-a-k\').1e(\'3-8-a-z\');$(\'.3-a-k\').p(\'o-1f\',\'11\')});7.1h("/3i/3h.3g","3f 3e",6(){$(\'.3-1w\').3d(\'3-8-1v\');$(\'.3-8-1y, .3-8-1x\').p(\'o-1g\',\'11\');c($(\'.3-1w\').3c(\'3-8-1v\')){$(\'.3-a-k\').p(\'o-1g\',\'l\');$(\'.3-a-k\').p(\'o-1f\',\'l\');$(\'.3-8-a\').1e(\'3-8-a-z\');$(\'.3-8-a:1u\').3b(\'3-8-a-z\')}3a{$(\'.3-a-k\').p(\'o-1g\',\'11\');$(\'.3-a-k\').p(\'o-1f\',\'11\');$(\'.3-8-a:1u\').1e(\'3-8-a-z\')}},"39");7.j("38",6(y){1d.37(\'1c\',y.9[y.36].1a)});c(1d.1t(\'1c\')){35("1s(1d.1t(\'1c\'));",34)}});h 18;6 1s(1q){h 
9=7.1b();c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==1q){c(i==18){19}18=i;7.1p(i)}}}}',36,270,'|||jw|||function|player|settings|tracks|submenu||if||||jwplayer|var||on|audioTracks|true|3D|length|aria|attr|div|100|||sx|filemoon|https||event|active||false|tt|seek|dd|height|width|adb|current_audio|return|name|getAudioTracks|default_audio|localStorage|removeClass|expanded|checked|addButton|callMeMaybe|vplayer|0fxcyc2ajhp1|position|vvplay|vvad|220|setCurrentAudioTrack|audio_name|for|audio_set|getItem|last|open|controls|playbackRates|captions|rewind|icon|insertAfter||detach|ff00||button|getPosition|sec|png|player8|ff11|log|console|track_name|appendTo|play|click|no|scrolling|frameborder|file_code|src|top|zIndex|css|showCCform|data|1662367683|383371|dl|video_ad|doPlay|prevt|mp4|3E||jpg|thumbs|file|300|setTimeout|currentTrack|setItem|audioTrackChanged|dualSound|else|addClass|hasClass|toggleClass|Track|Audio|svg|dualy|images|mousedown|buttons|topbar|playAttemptFailed|beforePlay|Rewind|fr|Forward|ff|ready|set_audio_track|remove|this|upload_srt|prop|50px|margin|1000001|iframe|center|align|text|rgba|background|1000000|left|absolute|pause|setCurrentCaptions|Upload|contains|item|content|html|fviews|referer|prem|embed|3e57249ef633e0d03bf76ceb8d8a4b65|216|83|hash|view|get|TokenZir|window|hide|show|complete|slow|fadeIn|video_ad_fadein|time||cache|Cache|Content|headers|ajaxSetup|v2done|tott|vastdone2|vastdone1|vvbefore|playbackRateControls|cast|aboutlink|FileMoon|abouttext|UHD|1870|qualityLabels|sites|GNOME_POWER|link|2Fiframe|3C|allowfullscreen|22360|22640|22no|marginheight|marginwidth|2FGNOME_POWER|2F0fxcyc2ajhp1|2Fe|2Ffilemoon|2F|3A||22https|3Ciframe|code|sharing|fontOpacity|backgroundOpacity|Tahoma|fontFamily|303030|backgroundColor|FFFFFF|color|userFontScale|thumbnails|kind|0fxcyc2ajhp10000|url|get_slides|start|startparam|none|preload|html5|primary|hlshtml|androidhls|duration|uniform|stretching|0fxcyc2ajhp1_xt|image|2048|sp|6871|asn|127|srv|43200|_g3XlBcu2lmD9oDexD2NLWSmah2Nu3XcDrl93m9PwXY|m3u8||master|0fxcyc2ajhp1_x|00076|01|hls2|to|s01|delivery|storage|moon|sources|setup'''.split('|')))

    def test_join(self):
        test_input = list('test')
@ -404,8 +393,6 @@ class TestJSInterpreter(unittest.TestCase):
        test_result = list('test')
        tests = [
            'function f(a, b){return a.split(b)}',
-            'function f(a, b){return a["split"](b)}',
-            'function f(a, b){let x = ["split"]; return a[x[0]](b)}',
            'function f(a, b){return String.prototype.split.call(a, b)}',
            'function f(a, b){return String.prototype.split.apply(a, [b])}',
        ]
@ -444,52 +431,6 @@ class TestJSInterpreter(unittest.TestCase):
        self._test('function f(){return "012345678".slice(-1, 1)}', '')
        self._test('function f(){return "012345678".slice(-3, -1)}', '67')

-    def test_splice(self):
-        self._test('function f(){var T = ["0", "1", "2"]; T["splice"](2, 1, "0")[0]; return T }', ['0', '1', '0'])
-
-    def test_js_number_to_string(self):
-        for test, radix, expected in [
-            (0, None, '0'),
-            (-0, None, '0'),
-            (0.0, None, '0'),
-            (-0.0, None, '0'),
-            (math.nan, None, 'NaN'),
-            (-math.nan, None, 'NaN'),
-            (math.inf, None, 'Infinity'),
-            (-math.inf, None, '-Infinity'),
-            (10 ** 21.5, 8, '526665530627250154000000'),
-            (6, 2, '110'),
-            (254, 16, 'fe'),
-            (-10, 2, '-1010'),
-            (-0xff, 2, '-11111111'),
-            (0.1 + 0.2, 16, '0.4cccccccccccd'),
-            (1234.1234, 10, '1234.1234'),
-            # (1000000000000000128, 10, '1000000000000000100')
-        ]:
-            assert js_number_to_string(test, radix) == expected
-
-    def test_extract_function(self):
-        jsi = JSInterpreter('function a(b) { return b + 1; }')
-        func = jsi.extract_function('a')
-        self.assertEqual(func([2]), 3)
-
-    def test_extract_function_with_global_stack(self):
-        jsi = JSInterpreter('function c(d) { return d + e + f + g; }')
-        func = jsi.extract_function('c', {'e': 10}, {'f': 100, 'g': 1000})
-        self.assertEqual(func([1]), 1111)
-
-    def test_extract_object(self):
-        jsi = JSInterpreter('var a={};a.xy={};var xy;var zxy={};xy={z:function(){return "abc"}};')
-        self.assertTrue('z' in jsi.extract_object('xy', None))
-
-    def test_increment_decrement(self):
-        self._test('function f() { var x = 1; return ++x; }', 2)
-        self._test('function f() { var x = 1; return x++; }', 1)
-        self._test('function f() { var x = 1; x--; return x }', 0)
-        self._test('function f() { var y; var x = 1; x++, --x, x--, x--, y="z", "abc", x++; return --x }', -1)
-        self._test('function f() { var a = "test--"; return a; }', 'test--')
-        self._test('function f() { var b = 1; var a = "b--"; return a; }', 'b--')
-

if __name__ == '__main__':
    unittest.main()
@ -39,7 +39,6 @@ from yt_dlp.cookies import YoutubeDLCookieJar
from yt_dlp.dependencies import brotli, curl_cffi, requests, urllib3
from yt_dlp.networking import (
    HEADRequest,
-    PATCHRequest,
    PUTRequest,
    Request,
    RequestDirector,
@ -615,6 +614,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
                    rh, Request(f'http://127.0.0.1:{self.http_port}/source_address')).read().decode()
                assert source_address == data

+    # Not supported by CurlCFFI
    @pytest.mark.skip_handler('CurlCFFI', 'not supported by curl-cffi')
    def test_gzip_trailing_garbage(self, handler):
        with handler() as rh:
@ -720,15 +720,6 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
                    rh, Request(
                        f'http://127.0.0.1:{self.http_port}/headers', proxies={'all': 'http://10.255.255.255'})).close()

-    @pytest.mark.skip_handlers_if(lambda _, handler: handler not in ['Urllib', 'CurlCFFI'], 'handler does not support keep_header_casing')
-    def test_keep_header_casing(self, handler):
-        with handler() as rh:
-            res = validate_and_send(
-                rh, Request(
-                    f'http://127.0.0.1:{self.http_port}/headers', headers={'X-test-heaDer': 'test'}, extensions={'keep_header_casing': True})).read().decode()
-
-            assert 'X-test-heaDer: test' in res
-

@pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
class TestClientCertificate:
@ -1298,7 +1289,6 @@ class TestRequestHandlerValidation:
            ({'legacy_ssl': False}, False),
            ({'legacy_ssl': True}, False),
            ({'legacy_ssl': 'notabool'}, AssertionError),
-            ({'keep_header_casing': True}, UnsupportedRequest),
        ]),
        ('Requests', 'http', [
            ({'cookiejar': 'notacookiejar'}, AssertionError),
@ -1309,9 +1299,6 @@ class TestRequestHandlerValidation:
            ({'legacy_ssl': False}, False),
            ({'legacy_ssl': True}, False),
            ({'legacy_ssl': 'notabool'}, AssertionError),
-            ({'keep_header_casing': False}, False),
-            ({'keep_header_casing': True}, False),
-            ({'keep_header_casing': 'notabool'}, AssertionError),
        ]),
        ('CurlCFFI', 'http', [
            ({'cookiejar': 'notacookiejar'}, AssertionError),
@ -1857,7 +1844,6 @@ class TestRequest:

    def test_request_helpers(self):
        assert HEADRequest('http://example.com').method == 'HEAD'
-        assert PATCHRequest('http://example.com').method == 'PATCH'
        assert PUTRequest('http://example.com').method == 'PUT'

    def test_headers(self):
@ -20,6 +20,7 @@ from yt_dlp.networking._helper import (
    add_accept_encoding_header,
    get_redirect_method,
    make_socks_proxy_opts,
+    select_proxy,
    ssl_load_certs,
)
from yt_dlp.networking.exceptions import (
@ -27,7 +28,7 @@ from yt_dlp.networking.exceptions import (
    IncompleteRead,
)
from yt_dlp.socks import ProxyType
-from yt_dlp.utils.networking import HTTPHeaderDict, select_proxy
+from yt_dlp.utils.networking import HTTPHeaderDict

TEST_DIR = os.path.dirname(os.path.abspath(__file__))

@ -10,71 +10,21 @@ TEST_DATA_DIR = Path(os.path.dirname(os.path.abspath(__file__)), 'testdata')
sys.path.append(str(TEST_DATA_DIR))
importlib.invalidate_caches()

-from yt_dlp.plugins import (
-    PACKAGE_NAME,
-    PluginSpec,
-    directories,
-    load_plugins,
-    load_all_plugins,
-    register_plugin_spec,
-)
+from yt_dlp.plugins import PACKAGE_NAME, directories, load_plugins
-
-from yt_dlp.globals import (
-    extractors,
-    postprocessors,
-    plugin_dirs,
-    plugin_ies,
-    plugin_pps,
-    all_plugins_loaded,
-    plugin_specs,
-)
-
-
-EXTRACTOR_PLUGIN_SPEC = PluginSpec(
-    module_name='extractor',
-    suffix='IE',
-    destination=extractors,
-    plugin_destination=plugin_ies,
-)
-
-POSTPROCESSOR_PLUGIN_SPEC = PluginSpec(
-    module_name='postprocessor',
-    suffix='PP',
-    destination=postprocessors,
-    plugin_destination=plugin_pps,
-)
-
-
-def reset_plugins():
-    plugin_ies.value = {}
-    plugin_pps.value = {}
-    plugin_dirs.value = ['default']
-    plugin_specs.value = {}
-    all_plugins_loaded.value = False
-    # Clearing override plugins is probably difficult
-    for module_name in tuple(sys.modules):
-        for plugin_type in ('extractor', 'postprocessor'):
-            if module_name.startswith(f'{PACKAGE_NAME}.{plugin_type}.'):
-                del sys.modules[module_name]
-
-    importlib.invalidate_caches()


class TestPlugins(unittest.TestCase):

    TEST_PLUGIN_DIR = TEST_DATA_DIR / PACKAGE_NAME

-    def setUp(self):
-        reset_plugins()
-
-    def tearDown(self):
-        reset_plugins()
-
    def test_directories_containing_plugins(self):
        self.assertIn(self.TEST_PLUGIN_DIR, map(Path, directories()))

    def test_extractor_classes(self):
-        plugins_ie = load_plugins(EXTRACTOR_PLUGIN_SPEC)
+        for module_name in tuple(sys.modules):
+            if module_name.startswith(f'{PACKAGE_NAME}.extractor'):
+                del sys.modules[module_name]
+        plugins_ie = load_plugins('extractor', 'IE')

        self.assertIn(f'{PACKAGE_NAME}.extractor.normal', sys.modules.keys())
        self.assertIn('NormalPluginIE', plugins_ie.keys())
@ -84,29 +34,17 @@ class TestPlugins(unittest.TestCase):
            f'{PACKAGE_NAME}.extractor._ignore' in sys.modules,
            'loaded module beginning with underscore')
        self.assertNotIn('IgnorePluginIE', plugins_ie.keys())
-        self.assertNotIn('IgnorePluginIE', plugin_ies.value)

        # Don't load extractors with underscore prefix
        self.assertNotIn('_IgnoreUnderscorePluginIE', plugins_ie.keys())
-        self.assertNotIn('_IgnoreUnderscorePluginIE', plugin_ies.value)

        # Don't load extractors not specified in __all__ (if supplied)
        self.assertNotIn('IgnoreNotInAllPluginIE', plugins_ie.keys())
-        self.assertNotIn('IgnoreNotInAllPluginIE', plugin_ies.value)
        self.assertIn('InAllPluginIE', plugins_ie.keys())
-        self.assertIn('InAllPluginIE', plugin_ies.value)

-        # Don't load override extractors
-        self.assertNotIn('OverrideGenericIE', plugins_ie.keys())
-        self.assertNotIn('OverrideGenericIE', plugin_ies.value)
-        self.assertNotIn('_UnderscoreOverrideGenericIE', plugins_ie.keys())
-        self.assertNotIn('_UnderscoreOverrideGenericIE', plugin_ies.value)

    def test_postprocessor_classes(self):
-        plugins_pp = load_plugins(POSTPROCESSOR_PLUGIN_SPEC)
+        plugins_pp = load_plugins('postprocessor', 'PP')
        self.assertIn('NormalPluginPP', plugins_pp.keys())
-        self.assertIn(f'{PACKAGE_NAME}.postprocessor.normal', sys.modules.keys())
-        self.assertIn('NormalPluginPP', plugin_pps.value)

    def test_importing_zipped_module(self):
        zip_path = TEST_DATA_DIR / 'zipped_plugins.zip'
@ -119,10 +57,10 @@ class TestPlugins(unittest.TestCase):
            package = importlib.import_module(f'{PACKAGE_NAME}.{plugin_type}')
            self.assertIn(zip_path / PACKAGE_NAME / plugin_type, map(Path, package.__path__))

-            plugins_ie = load_plugins(EXTRACTOR_PLUGIN_SPEC)
+            plugins_ie = load_plugins('extractor', 'IE')
            self.assertIn('ZippedPluginIE', plugins_ie.keys())

-            plugins_pp = load_plugins(POSTPROCESSOR_PLUGIN_SPEC)
+            plugins_pp = load_plugins('postprocessor', 'PP')
            self.assertIn('ZippedPluginPP', plugins_pp.keys())

        finally:
@ -130,117 +68,6 @@ class TestPlugins(unittest.TestCase):
            os.remove(zip_path)
            importlib.invalidate_caches()  # reset the import caches

-    def test_reloading_plugins(self):
-        reload_plugins_path = TEST_DATA_DIR / 'reload_plugins'
-        load_plugins(EXTRACTOR_PLUGIN_SPEC)
-        load_plugins(POSTPROCESSOR_PLUGIN_SPEC)
-
-        # Remove default folder and add reload_plugin path
-        sys.path.remove(str(TEST_DATA_DIR))
-        sys.path.append(str(reload_plugins_path))
-        importlib.invalidate_caches()
-        try:
-            for plugin_type in ('extractor', 'postprocessor'):
-                package = importlib.import_module(f'{PACKAGE_NAME}.{plugin_type}')
-                self.assertIn(reload_plugins_path / PACKAGE_NAME / plugin_type, map(Path, package.__path__))
-
-            plugins_ie = load_plugins(EXTRACTOR_PLUGIN_SPEC)
-            self.assertIn('NormalPluginIE', plugins_ie.keys())
-            self.assertTrue(
-                plugins_ie['NormalPluginIE'].REPLACED,
-                msg='Reloading has not replaced original extractor plugin')
-            self.assertTrue(
-                extractors.value['NormalPluginIE'].REPLACED,
-                msg='Reloading has not replaced original extractor plugin globally')
-
-            plugins_pp = load_plugins(POSTPROCESSOR_PLUGIN_SPEC)
-            self.assertIn('NormalPluginPP', plugins_pp.keys())
-            self.assertTrue(plugins_pp['NormalPluginPP'].REPLACED,
-                            msg='Reloading has not replaced original postprocessor plugin')
-            self.assertTrue(
-                postprocessors.value['NormalPluginPP'].REPLACED,
-                msg='Reloading has not replaced original postprocessor plugin globally')
-
-        finally:
-            sys.path.remove(str(reload_plugins_path))
-            sys.path.append(str(TEST_DATA_DIR))
-            importlib.invalidate_caches()
-
-    def test_extractor_override_plugin(self):
-        load_plugins(EXTRACTOR_PLUGIN_SPEC)
-
-        from yt_dlp.extractor.generic import GenericIE
-
-        self.assertEqual(GenericIE.TEST_FIELD, 'override')
-        self.assertEqual(GenericIE.SECONDARY_TEST_FIELD, 'underscore-override')
-
-        self.assertEqual(GenericIE.IE_NAME, 'generic+override+underscore-override')
-        importlib.invalidate_caches()
-        # test that loading a second time doesn't wrap a second time
-        load_plugins(EXTRACTOR_PLUGIN_SPEC)
-        from yt_dlp.extractor.generic import GenericIE
-        self.assertEqual(GenericIE.IE_NAME, 'generic+override+underscore-override')
-
-    def test_load_all_plugin_types(self):
-
-        # no plugin specs registered
-        load_all_plugins()
-
-        self.assertNotIn(f'{PACKAGE_NAME}.extractor.normal', sys.modules.keys())
-        self.assertNotIn(f'{PACKAGE_NAME}.postprocessor.normal', sys.modules.keys())
-
-        register_plugin_spec(EXTRACTOR_PLUGIN_SPEC)
-        register_plugin_spec(POSTPROCESSOR_PLUGIN_SPEC)
-        load_all_plugins()
-        self.assertTrue(all_plugins_loaded.value)
-
-        self.assertIn(f'{PACKAGE_NAME}.extractor.normal', sys.modules.keys())
-        self.assertIn(f'{PACKAGE_NAME}.postprocessor.normal', sys.modules.keys())
-
-    def test_no_plugin_dirs(self):
-        register_plugin_spec(EXTRACTOR_PLUGIN_SPEC)
-        register_plugin_spec(POSTPROCESSOR_PLUGIN_SPEC)
-
-        plugin_dirs.value = []
-        load_all_plugins()
-
-        self.assertNotIn(f'{PACKAGE_NAME}.extractor.normal', sys.modules.keys())
-        self.assertNotIn(f'{PACKAGE_NAME}.postprocessor.normal', sys.modules.keys())
-
-    def test_set_plugin_dirs(self):
-        custom_plugin_dir = str(TEST_DATA_DIR / 'plugin_packages')
-        plugin_dirs.value = [custom_plugin_dir]
-
-        load_plugins(EXTRACTOR_PLUGIN_SPEC)
-
-        self.assertIn(f'{PACKAGE_NAME}.extractor.package', sys.modules.keys())
-        self.assertIn('PackagePluginIE', plugin_ies.value)
-
-    def test_invalid_plugin_dir(self):
-        plugin_dirs.value = ['invalid_dir']
-        with self.assertRaises(ValueError):
-            load_plugins(EXTRACTOR_PLUGIN_SPEC)
-
-    def test_append_plugin_dirs(self):
-        custom_plugin_dir = str(TEST_DATA_DIR / 'plugin_packages')
-
-        self.assertEqual(plugin_dirs.value, ['default'])
-        plugin_dirs.value.append(custom_plugin_dir)
-        self.assertEqual(plugin_dirs.value, ['default', custom_plugin_dir])
-
-        load_plugins(EXTRACTOR_PLUGIN_SPEC)
-
-        self.assertIn(f'{PACKAGE_NAME}.extractor.package', sys.modules.keys())
-        self.assertIn('PackagePluginIE', plugin_ies.value)
-
-    def test_get_plugin_spec(self):
-        register_plugin_spec(EXTRACTOR_PLUGIN_SPEC)
-        register_plugin_spec(POSTPROCESSOR_PLUGIN_SPEC)
-
-        self.assertEqual(plugin_specs.value.get('extractor'), EXTRACTOR_PLUGIN_SPEC)
-        self.assertEqual(plugin_specs.value.get('postprocessor'), POSTPROCESSOR_PLUGIN_SPEC)
-        self.assertIsNone(plugin_specs.value.get('invalid'))
-

if __name__ == '__main__':
    unittest.main()
@ -8,8 +8,6 @@ import unittest
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


-import subprocess
-
from yt_dlp import YoutubeDL
from yt_dlp.utils import shell_quote
from yt_dlp.postprocessor import (
@ -49,18 +47,7 @@ class TestConvertThumbnail(unittest.TestCase):
            print('Skipping: ffmpeg not found')
            return

-        test_data_dir = 'test/testdata/thumbnails'
-        generated_file = f'{test_data_dir}/empty.webp'
-
-        subprocess.check_call([
-            pp.executable, '-y', '-f', 'lavfi', '-i', 'color=c=black:s=320x320',
-            '-c:v', 'libwebp', '-pix_fmt', 'yuv420p', '-vframes', '1', generated_file,
-        ], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
-
-        file = test_data_dir + '/foo %d bar/foo_%d.{}'
-        initial_file = file.format('webp')
-        os.replace(generated_file, initial_file)
+        file = 'test/testdata/thumbnails/foo %d bar/foo_%d.{}'

        tests = (('webp', 'png'), ('png', 'jpg'))

        for inp, out in tests:
@ -68,13 +55,11 @@ class TestConvertThumbnail(unittest.TestCase):
            if os.path.exists(out_file):
                os.remove(out_file)
            pp.convert_thumbnail(file.format(inp), out)
-            self.assertTrue(os.path.exists(out_file))
+            assert os.path.exists(out_file)

        for _, out in tests:
            os.remove(file.format(out))
-
-        os.remove(initial_file)


class TestExec(unittest.TestCase):
    def test_parse_cmd(self):
@ -625,7 +610,3 @@ outpoint 10.000000
        self.assertEqual(
            r"'special '\'' characters '\'' galore'\'\'\'",
            self._pp._quote_for_ffmpeg("special ' characters ' galore'''"))
-
-
-if __name__ == '__main__':
-    unittest.main()
@ -1,71 +0,0 @@
-import collections
-
-import pytest
-
-from yt_dlp import YoutubeDL
-from yt_dlp.cookies import YoutubeDLCookieJar
-from yt_dlp.extractor.common import InfoExtractor
-from yt_dlp.extractor.youtube.pot._provider import IEContentProviderLogger
-from yt_dlp.extractor.youtube.pot.provider import PoTokenRequest, PoTokenContext
-from yt_dlp.utils.networking import HTTPHeaderDict
-
-
-class MockLogger(IEContentProviderLogger):
-
-    log_level = IEContentProviderLogger.LogLevel.TRACE
-
-    def __init__(self, *args, **kwargs):
-        super().__init__(*args, **kwargs)
-        self.messages = collections.defaultdict(list)
-
-    def trace(self, message: str):
-        self.messages['trace'].append(message)
-
-    def debug(self, message: str):
-        self.messages['debug'].append(message)
-
-    def info(self, message: str):
-        self.messages['info'].append(message)
-
-    def warning(self, message: str, *, once=False):
-        self.messages['warning'].append(message)
-
-    def error(self, message: str):
-        self.messages['error'].append(message)
-
-
-@pytest.fixture
-def ie() -> InfoExtractor:
-    ydl = YoutubeDL()
-    return ydl.get_info_extractor('Youtube')
-
-
-@pytest.fixture
-def logger() -> MockLogger:
-    return MockLogger()
-
-
-@pytest.fixture()
-def pot_request() -> PoTokenRequest:
-    return PoTokenRequest(
-        context=PoTokenContext.GVS,
-        innertube_context={'client': {'clientName': 'WEB'}},
-        innertube_host='youtube.com',
-        session_index=None,
-        player_url=None,
-        is_authenticated=False,
-        video_webpage=None,
-
-        visitor_data='example-visitor-data',
-        data_sync_id='example-data-sync-id',
-        video_id='example-video-id',
-
-        request_cookiejar=YoutubeDLCookieJar(),
-        request_proxy=None,
-        request_headers=HTTPHeaderDict(),
-        request_timeout=None,
-        request_source_address=None,
-        request_verify_tls=True,
-
-        bypass_cache=False,
-    )
@ -1,117 +0,0 @@
|
|||||||
import threading
|
|
||||||
import time
|
|
||||||
from collections import OrderedDict
|
|
||||||
import pytest
|
|
||||||
from yt_dlp.extractor.youtube.pot._provider import IEContentProvider, BuiltinIEContentProvider
|
|
||||||
from yt_dlp.utils import bug_reports_message
|
|
||||||
from yt_dlp.extractor.youtube.pot._builtin.memory_cache import MemoryLRUPCP, memorylru_preference, initialize_global_cache
|
|
||||||
from yt_dlp.version import __version__
|
|
||||||
from yt_dlp.extractor.youtube.pot._registry import _pot_cache_providers, _pot_memory_cache
|
|
||||||
|
|
||||||
|
|
||||||
class TestMemoryLRUPCS:
|
|
||||||
|
|
||||||
def test_base_type(self):
|
|
||||||
assert issubclass(MemoryLRUPCP, IEContentProvider)
|
|
||||||
assert issubclass(MemoryLRUPCP, BuiltinIEContentProvider)
|
|
||||||
|
|
||||||
@pytest.fixture
|
|
||||||
def pcp(self, ie, logger) -> MemoryLRUPCP:
|
|
||||||
return MemoryLRUPCP(ie, logger, {}, initialize_cache=lambda max_size: (OrderedDict(), threading.Lock(), max_size))
|
|
||||||
|
|
||||||
def test_is_registered(self):
|
|
||||||
assert _pot_cache_providers.value.get('MemoryLRU') == MemoryLRUPCP
|
|
||||||
|
|
||||||
def test_initialization(self, pcp):
|
|
||||||
assert pcp.PROVIDER_NAME == 'memory'
|
|
||||||
assert pcp.PROVIDER_VERSION == __version__
|
|
||||||
assert pcp.BUG_REPORT_MESSAGE == bug_reports_message(before='')
|
|
||||||
assert pcp.is_available()
|
|
||||||
|
|
||||||
def test_store_and_get(self, pcp):
|
|
||||||
pcp.store('key1', 'value1', int(time.time()) + 60)
|
|
||||||
assert pcp.get('key1') == 'value1'
|
|
||||||
assert len(pcp.cache) == 1
|
|
||||||
|
|
||||||
def test_store_ignore_expired(self, pcp):
|
|
||||||
pcp.store('key1', 'value1', int(time.time()) - 1)
|
|
||||||
assert len(pcp.cache) == 0
|
|
||||||
assert pcp.get('key1') is None
|
|
||||||
assert len(pcp.cache) == 0
|
|
||||||
|
|
||||||
def test_store_override_existing_key(self, ie, logger):
|
|
||||||
MAX_SIZE = 2
|
|
||||||
pcp = MemoryLRUPCP(ie, logger, {}, initialize_cache=lambda max_size: (OrderedDict(), threading.Lock(), MAX_SIZE))
|
|
||||||
pcp.store('key1', 'value1', int(time.time()) + 60)
|
|
||||||
pcp.store('key2', 'value2', int(time.time()) + 60)
|
|
||||||
assert len(pcp.cache) == 2
|
|
||||||
pcp.store('key1', 'value2', int(time.time()) + 60)
|
|
||||||
# Ensure that the override key gets added to the end of the cache instead of in the same position
|
|
||||||
pcp.store('key3', 'value3', int(time.time()) + 60)
|
|
||||||
assert pcp.get('key1') == 'value2'
|
|
||||||
|
|
||||||
def test_store_ignore_expired_existing_key(self, pcp):
|
|
||||||
pcp.store('key1', 'value2', int(time.time()) + 60)
|
|
||||||
pcp.store('key1', 'value1', int(time.time()) - 1)
|
|
||||||
assert len(pcp.cache) == 1
|
|
||||||
assert pcp.get('key1') == 'value2'
|
|
||||||
assert len(pcp.cache) == 1
|
|
||||||
|
|
||||||
def test_get_key_expired(self, pcp):
|
|
||||||
pcp.store('key1', 'value1', int(time.time()) + 60)
|
|
||||||
assert pcp.get('key1') == 'value1'
|
|
||||||
assert len(pcp.cache) == 1
|
|
||||||
pcp.cache['key1'] = ('value1', int(time.time()) - 1)
|
|
||||||
assert pcp.get('key1') is None
|
|
||||||
assert len(pcp.cache) == 0
|
|
||||||
|
|
||||||
def test_lru_eviction(self, ie, logger):
|
|
||||||
MAX_SIZE = 2
|
|
||||||
provider = MemoryLRUPCP(ie, logger, {}, initialize_cache=lambda max_size: (OrderedDict(), threading.Lock(), MAX_SIZE))
|
|
||||||
provider.store('key1', 'value1', int(time.time()) + 5)
|
|
||||||
provider.store('key2', 'value2', int(time.time()) + 5)
|
|
||||||
assert len(provider.cache) == 2
|
|
||||||
|
|
||||||
assert provider.get('key1') == 'value1'
|
|
||||||
|
|
||||||
provider.store('key3', 'value3', int(time.time()) + 5)
|
|
||||||
assert len(provider.cache) == 2
|
|
||||||
|
|
||||||
assert provider.get('key2') is None
|
|
||||||
|
|
||||||
provider.store('key4', 'value4', int(time.time()) + 5)
|
|
||||||
assert len(provider.cache) == 2
|
|
||||||
|
|
||||||
assert provider.get('key1') is None
|
|
||||||
assert provider.get('key3') == 'value3'
|
|
||||||
assert provider.get('key4') == 'value4'
|
|
||||||
|
|
||||||
def test_delete(self, pcp):
|
|
||||||
pcp.store('key1', 'value1', int(time.time()) + 5)
|
|
||||||
assert len(pcp.cache) == 1
|
|
||||||
assert pcp.get('key1') == 'value1'
|
|
||||||
pcp.delete('key1')
|
|
||||||
assert len(pcp.cache) == 0
|
|
||||||
assert pcp.get('key1') is None
|
|
||||||
|
|
||||||
def test_use_global_cache_default(self, ie, logger):
|
|
||||||
pcp = MemoryLRUPCP(ie, logger, {})
|
|
||||||
assert pcp.max_size == _pot_memory_cache.value['max_size'] == 25
|
|
||||||
assert pcp.cache is _pot_memory_cache.value['cache']
|
|
||||||
assert pcp.lock is _pot_memory_cache.value['lock']
|
|
||||||
|
|
||||||
pcp2 = MemoryLRUPCP(ie, logger, {})
|
|
||||||
assert pcp.max_size == pcp2.max_size == _pot_memory_cache.value['max_size'] == 25
|
|
||||||
assert pcp.cache is pcp2.cache is _pot_memory_cache.value['cache']
|
|
||||||
assert pcp.lock is pcp2.lock is _pot_memory_cache.value['lock']
|
|
||||||
|
|
||||||
def test_fail_max_size_change_global(self, ie, logger):
|
|
||||||
pcp = MemoryLRUPCP(ie, logger, {})
|
|
||||||
assert pcp.max_size == _pot_memory_cache.value['max_size'] == 25
|
|
||||||
with pytest.raises(ValueError, match='Cannot change max_size of initialized global memory cache'):
|
|
||||||
initialize_global_cache(50)
|
|
||||||
|
|
||||||
assert pcp.max_size == _pot_memory_cache.value['max_size'] == 25
|
|
||||||
|
|
||||||
def test_memory_lru_preference(self, pcp, ie, pot_request):
|
|
||||||
assert memorylru_preference(pcp, pot_request) == 10000
|
|
@ -1,47 +0,0 @@
-import pytest
-
-from yt_dlp.extractor.youtube.pot.provider import (
-    PoTokenContext,
-
-)
-
-from yt_dlp.extractor.youtube.pot.utils import get_webpo_content_binding, ContentBindingType
-
-
-class TestGetWebPoContentBinding:
-
-    @pytest.mark.parametrize('client_name, context, is_authenticated, expected', [
-        *[(client, context, is_authenticated, expected) for client in [
-            'WEB', 'MWEB', 'TVHTML5', 'WEB_EMBEDDED_PLAYER', 'WEB_CREATOR', 'TVHTML5_SIMPLY_EMBEDDED_PLAYER', 'TVHTML5_SIMPLY']
-            for context, is_authenticated, expected in [
-                (PoTokenContext.GVS, False, ('example-visitor-data', ContentBindingType.VISITOR_DATA)),
-                (PoTokenContext.PLAYER, False, ('example-video-id', ContentBindingType.VIDEO_ID)),
-                (PoTokenContext.SUBS, False, ('example-video-id', ContentBindingType.VIDEO_ID)),
-                (PoTokenContext.GVS, True, ('example-data-sync-id', ContentBindingType.DATASYNC_ID)),
-            ]],
-        ('WEB_REMIX', PoTokenContext.GVS, False, ('example-visitor-data', ContentBindingType.VISITOR_DATA)),
-        ('WEB_REMIX', PoTokenContext.PLAYER, False, ('example-visitor-data', ContentBindingType.VISITOR_DATA)),
-        ('ANDROID', PoTokenContext.GVS, False, (None, None)),
-        ('IOS', PoTokenContext.GVS, False, (None, None)),
-    ])
-    def test_get_webpo_content_binding(self, pot_request, client_name, context, is_authenticated, expected):
-        pot_request.innertube_context['client']['clientName'] = client_name
-        pot_request.context = context
-        pot_request.is_authenticated = is_authenticated
-        assert get_webpo_content_binding(pot_request) == expected
-
-    def test_extract_visitor_id(self, pot_request):
-        pot_request.visitor_data = 'CgsxMjNhYmNYWVpfLSiA4s%2DqBg%3D%3D'
-        assert get_webpo_content_binding(pot_request, bind_to_visitor_id=True) == ('123abcXYZ_-', ContentBindingType.VISITOR_ID)
-
-    def test_invalid_visitor_id(self, pot_request):
-        # visitor id not alphanumeric (i.e. protobuf extraction failed)
-        pot_request.visitor_data = 'CggxMjM0NTY3OCiA4s-qBg%3D%3D'
-        assert get_webpo_content_binding(pot_request, bind_to_visitor_id=True) == (pot_request.visitor_data, ContentBindingType.VISITOR_DATA)
-
-    def test_no_visitor_id(self, pot_request):
-        pot_request.visitor_data = 'KIDiz6oG'
-        assert get_webpo_content_binding(pot_request, bind_to_visitor_id=True) == (pot_request.visitor_data, ContentBindingType.VISITOR_DATA)
-
-    def test_invalid_base64(self, pot_request):
-        pot_request.visitor_data = 'invalid-base64'
-        assert get_webpo_content_binding(pot_request, bind_to_visitor_id=True) == (pot_request.visitor_data, ContentBindingType.VISITOR_DATA)
@ -1,92 +0,0 @@
|
|||||||
import pytest
|
|
||||||
|
|
||||||
from yt_dlp.extractor.youtube.pot._provider import IEContentProvider, BuiltinIEContentProvider
|
|
||||||
from yt_dlp.extractor.youtube.pot.cache import CacheProviderWritePolicy
|
|
||||||
from yt_dlp.utils import bug_reports_message
|
|
||||||
from yt_dlp.extractor.youtube.pot.provider import (
|
|
||||||
PoTokenRequest,
|
|
||||||
PoTokenContext,
|
|
||||||
|
|
||||||
)
|
|
||||||
from yt_dlp.version import __version__
|
|
||||||
|
|
||||||
from yt_dlp.extractor.youtube.pot._builtin.webpo_cachespec import WebPoPCSP
|
|
||||||
from yt_dlp.extractor.youtube.pot._registry import _pot_pcs_providers
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.fixture()
|
|
||||||
def pot_request(pot_request) -> PoTokenRequest:
|
|
||||||
pot_request.visitor_data = 'CgsxMjNhYmNYWVpfLSiA4s%2DqBg%3D%3D' # visitor_id=123abcXYZ_-
|
|
||||||
return pot_request
|
|
||||||
|
|
||||||
|
|
||||||
class TestWebPoPCSP:
|
|
||||||
def test_base_type(self):
|
|
||||||
assert issubclass(WebPoPCSP, IEContentProvider)
|
|
||||||
assert issubclass(WebPoPCSP, BuiltinIEContentProvider)
|
|
||||||
|
|
||||||
def test_init(self, ie, logger):
|
|
||||||
pcs = WebPoPCSP(ie=ie, logger=logger, settings={})
|
|
||||||
assert pcs.PROVIDER_NAME == 'webpo'
|
|
||||||
assert pcs.PROVIDER_VERSION == __version__
|
|
||||||
assert pcs.BUG_REPORT_MESSAGE == bug_reports_message(before='')
|
|
||||||
assert pcs.is_available()
|
|
||||||
|
|
||||||
def test_is_registered(self):
|
|
||||||
assert _pot_pcs_providers.value.get('WebPo') == WebPoPCSP
|
|
||||||
|
|
||||||
@pytest.mark.parametrize('client_name, context, is_authenticated', [
|
|
||||||
('ANDROID', PoTokenContext.GVS, False),
|
|
||||||
('IOS', PoTokenContext.GVS, False),
|
|
||||||
('IOS', PoTokenContext.PLAYER, False),
|
|
||||||
])
|
|
||||||
def test_not_supports(self, ie, logger, pot_request, client_name, context, is_authenticated):
|
|
||||||
pcs = WebPoPCSP(ie=ie, logger=logger, settings={})
|
|
||||||
pot_request.innertube_context['client']['clientName'] = client_name
|
|
||||||
pot_request.context = context
|
|
||||||
pot_request.is_authenticated = is_authenticated
|
|
||||||
assert pcs.generate_cache_spec(pot_request) is None
|
|
||||||
|
|
||||||
@pytest.mark.parametrize('client_name, context, is_authenticated, remote_host, source_address, request_proxy, expected', [
|
|
||||||
*[(client, context, is_authenticated, remote_host, source_address, request_proxy, expected) for client in [
|
|
||||||
'WEB', 'MWEB', 'TVHTML5', 'WEB_EMBEDDED_PLAYER', 'WEB_CREATOR', 'TVHTML5_SIMPLY_EMBEDDED_PLAYER', 'TVHTML5_SIMPLY']
|
|
||||||
for context, is_authenticated, remote_host, source_address, request_proxy, expected in [
|
|
||||||
(PoTokenContext.GVS, False, 'example-remote-host', 'example-source-address', 'example-request-proxy', {'t': 'webpo', 'ip': 'example-remote-host', 'sa': 'example-source-address', 'px': 'example-request-proxy', 'cb': '123abcXYZ_-', 'cbt': 'visitor_id'}),
|
|
||||||
(PoTokenContext.PLAYER, False, 'example-remote-host', 'example-source-address', 'example-request-proxy', {'t': 'webpo', 'ip': 'example-remote-host', 'sa': 'example-source-address', 'px': 'example-request-proxy', 'cb': '123abcXYZ_-', 'cbt': 'video_id'}),
|
|
||||||
(PoTokenContext.GVS, True, 'example-remote-host', 'example-source-address', 'example-request-proxy', {'t': 'webpo', 'ip': 'example-remote-host', 'sa': 'example-source-address', 'px': 'example-request-proxy', 'cb': 'example-data-sync-id', 'cbt': 'datasync_id'}),
|
|
||||||
]],
|
|
||||||
('WEB_REMIX', PoTokenContext.PLAYER, False, 'example-remote-host', 'example-source-address', 'example-request-proxy', {'t': 'webpo', 'ip': 'example-remote-host', 'sa': 'example-source-address', 'px': 'example-request-proxy', 'cb': '123abcXYZ_-', 'cbt': 'visitor_id'}),
|
|
||||||
('WEB', PoTokenContext.GVS, False, None, None, None, {'t': 'webpo', 'cb': '123abcXYZ_-', 'cbt': 'visitor_id', 'ip': None, 'sa': None, 'px': None}),
|
|
||||||
('TVHTML5', PoTokenContext.PLAYER, False, None, None, 'http://example.com', {'t': 'webpo', 'cb': '123abcXYZ_-', 'cbt': 'video_id', 'ip': None, 'sa': None, 'px': 'http://example.com'}),
|
|
||||||
|
|
||||||
])
|
|
||||||
def test_generate_key_bindings(self, ie, logger, pot_request, client_name, context, is_authenticated, remote_host, source_address, request_proxy, expected):
|
|
||||||
pcs = WebPoPCSP(ie=ie, logger=logger, settings={})
|
|
||||||
pot_request.innertube_context['client']['clientName'] = client_name
|
|
||||||
pot_request.context = context
|
|
||||||
pot_request.is_authenticated = is_authenticated
|
|
||||||
pot_request.innertube_context['client']['remoteHost'] = remote_host
|
|
||||||
pot_request.request_source_address = source_address
|
|
||||||
pot_request.request_proxy = request_proxy
|
|
||||||
pot_request.video_id = '123abcXYZ_-' # same as visitor id to test type
|
|
||||||
|
|
||||||
assert pcs.generate_cache_spec(pot_request).key_bindings == expected
|
|
||||||
|
|
||||||
def test_no_bind_visitor_id(self, ie, logger, pot_request):
|
|
||||||
# Should not bind to visitor id if setting is set to False
|
|
||||||
pcs = WebPoPCSP(ie=ie, logger=logger, settings={'bind_to_visitor_id': ['false']})
|
|
||||||
pot_request.innertube_context['client']['clientName'] = 'WEB'
|
|
||||||
pot_request.context = PoTokenContext.GVS
|
|
||||||
pot_request.is_authenticated = False
|
|
||||||
assert pcs.generate_cache_spec(pot_request).key_bindings == {'t': 'webpo', 'ip': None, 'sa': None, 'px': None, 'cb': 'CgsxMjNhYmNYWVpfLSiA4s%2DqBg%3D%3D', 'cbt': 'visitor_data'}
|
|
||||||
|
|
||||||
def test_default_ttl(self, ie, logger, pot_request):
|
|
||||||
pcs = WebPoPCSP(ie=ie, logger=logger, settings={})
|
|
||||||
assert pcs.generate_cache_spec(pot_request).default_ttl == 6 * 60 * 60 # should default to 6 hours
|
|
||||||
|
|
||||||
def test_write_policy(self, ie, logger, pot_request):
|
|
||||||
pcs = WebPoPCSP(ie=ie, logger=logger, settings={})
|
|
||||||
pot_request.context = PoTokenContext.GVS
|
|
||||||
assert pcs.generate_cache_spec(pot_request).write_policy == CacheProviderWritePolicy.WRITE_ALL
|
|
||||||
pot_request.context = PoTokenContext.PLAYER
|
|
||||||
assert pcs.generate_cache_spec(pot_request).write_policy == CacheProviderWritePolicy.WRITE_FIRST
|
|
File diff suppressed because it is too large
@ -1,629 +0,0 @@
|
|||||||
import pytest
|
|
||||||
|
|
||||||
from yt_dlp.extractor.youtube.pot._provider import IEContentProvider
|
|
||||||
from yt_dlp.cookies import YoutubeDLCookieJar
|
|
||||||
from yt_dlp.utils.networking import HTTPHeaderDict
|
|
||||||
from yt_dlp.extractor.youtube.pot.provider import (
|
|
||||||
PoTokenRequest,
|
|
||||||
PoTokenContext,
|
|
||||||
ExternalRequestFeature,
|
|
||||||
|
|
||||||
)
|
|
||||||
|
|
||||||
from yt_dlp.extractor.youtube.pot.cache import (
|
|
||||||
PoTokenCacheProvider,
|
|
||||||
PoTokenCacheSpec,
|
|
||||||
PoTokenCacheSpecProvider,
|
|
||||||
CacheProviderWritePolicy,
|
|
||||||
)
|
|
||||||
|
|
||||||
import yt_dlp.extractor.youtube.pot.cache as cache
|
|
||||||
|
|
||||||
from yt_dlp.networking import Request
|
|
||||||
from yt_dlp.extractor.youtube.pot.provider import (
|
|
||||||
PoTokenResponse,
|
|
||||||
PoTokenProvider,
|
|
||||||
PoTokenProviderRejectedRequest,
|
|
||||||
provider_bug_report_message,
|
|
||||||
register_provider,
|
|
||||||
register_preference,
|
|
||||||
)
|
|
||||||
|
|
||||||
from yt_dlp.extractor.youtube.pot._registry import _pot_providers, _ptp_preferences, _pot_pcs_providers, _pot_cache_providers, _pot_cache_provider_preferences
|
|
||||||
|
|
||||||
|
|
||||||
class ExamplePTP(PoTokenProvider):
|
|
||||||
PROVIDER_NAME = 'example'
|
|
||||||
PROVIDER_VERSION = '0.0.1'
|
|
||||||
BUG_REPORT_LOCATION = 'https://example.com/issues'
|
|
||||||
|
|
||||||
_SUPPORTED_CLIENTS = ('WEB',)
|
|
||||||
_SUPPORTED_CONTEXTS = (PoTokenContext.GVS, )
|
|
||||||
|
|
||||||
_SUPPORTED_EXTERNAL_REQUEST_FEATURES = (
|
|
||||||
ExternalRequestFeature.PROXY_SCHEME_HTTP,
|
|
||||||
ExternalRequestFeature.PROXY_SCHEME_SOCKS5H,
|
|
||||||
)
|
|
||||||
|
|
||||||
def is_available(self) -> bool:
|
|
||||||
return True
|
|
||||||
|
|
||||||
def _real_request_pot(self, request: PoTokenRequest) -> PoTokenResponse:
|
|
||||||
return PoTokenResponse('example-token', expires_at=123)
|
|
||||||
|
|
||||||
|
|
||||||
class ExampleCacheProviderPCP(PoTokenCacheProvider):
|
|
||||||
|
|
||||||
PROVIDER_NAME = 'example'
|
|
||||||
PROVIDER_VERSION = '0.0.1'
|
|
||||||
BUG_REPORT_LOCATION = 'https://example.com/issues'
|
|
||||||
|
|
||||||
def is_available(self) -> bool:
|
|
||||||
return True
|
|
||||||
|
|
||||||
def get(self, key: str):
|
|
||||||
return 'example-cache'
|
|
||||||
|
|
||||||
def store(self, key: str, value: str, expires_at: int):
|
|
||||||
pass
|
|
||||||
|
|
||||||
def delete(self, key: str):
|
|
||||||
pass
|
|
||||||
|
|
||||||
|
|
||||||
class ExampleCacheSpecProviderPCSP(PoTokenCacheSpecProvider):
|
|
||||||
|
|
||||||
PROVIDER_NAME = 'example'
|
|
||||||
PROVIDER_VERSION = '0.0.1'
|
|
||||||
BUG_REPORT_LOCATION = 'https://example.com/issues'
|
|
||||||
|
|
||||||
def generate_cache_spec(self, request: PoTokenRequest):
|
|
||||||
return PoTokenCacheSpec(
|
|
||||||
key_bindings={'field': 'example-key'},
|
|
||||||
default_ttl=60,
|
|
||||||
write_policy=CacheProviderWritePolicy.WRITE_FIRST,
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
class TestPoTokenProvider:
|
|
||||||
|
|
||||||
def test_base_type(self):
|
|
||||||
assert issubclass(PoTokenProvider, IEContentProvider)
|
|
||||||
|
|
||||||
def test_create_provider_missing_fetch_method(self, ie, logger):
|
|
||||||
class MissingMethodsPTP(PoTokenProvider):
|
|
||||||
def is_available(self) -> bool:
|
|
||||||
return True
|
|
||||||
|
|
||||||
with pytest.raises(TypeError):
|
|
||||||
MissingMethodsPTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
def test_create_provider_missing_available_method(self, ie, logger):
|
|
||||||
class MissingMethodsPTP(PoTokenProvider):
|
|
||||||
def _real_request_pot(self, request: PoTokenRequest) -> PoTokenResponse:
|
|
||||||
raise PoTokenProviderRejectedRequest('Not implemented')
|
|
||||||
|
|
||||||
with pytest.raises(TypeError):
|
|
||||||
MissingMethodsPTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
def test_barebones_provider(self, ie, logger):
|
|
||||||
class BarebonesProviderPTP(PoTokenProvider):
|
|
||||||
def is_available(self) -> bool:
|
|
||||||
return True
|
|
||||||
|
|
||||||
def _real_request_pot(self, request: PoTokenRequest) -> PoTokenResponse:
|
|
||||||
raise PoTokenProviderRejectedRequest('Not implemented')
|
|
||||||
|
|
||||||
provider = BarebonesProviderPTP(ie=ie, logger=logger, settings={})
|
|
||||||
assert provider.PROVIDER_NAME == 'BarebonesProvider'
|
|
||||||
assert provider.PROVIDER_KEY == 'BarebonesProvider'
|
|
||||||
assert provider.PROVIDER_VERSION == '0.0.0'
|
|
||||||
assert provider.BUG_REPORT_MESSAGE == 'please report this issue to the provider developer at (developer has not provided a bug report location) .'
|
|
||||||
|
|
||||||
def test_example_provider_success(self, ie, logger, pot_request):
|
|
||||||
provider = ExamplePTP(ie=ie, logger=logger, settings={})
|
|
||||||
assert provider.PROVIDER_NAME == 'example'
|
|
||||||
assert provider.PROVIDER_KEY == 'Example'
|
|
||||||
assert provider.PROVIDER_VERSION == '0.0.1'
|
|
||||||
assert provider.BUG_REPORT_MESSAGE == 'please report this issue to the provider developer at https://example.com/issues .'
|
|
||||||
assert provider.is_available()
|
|
||||||
|
|
||||||
response = provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
assert response.po_token == 'example-token'
|
|
||||||
assert response.expires_at == 123
|
|
||||||
|
|
||||||
def test_provider_unsupported_context(self, ie, logger, pot_request):
|
|
||||||
provider = ExamplePTP(ie=ie, logger=logger, settings={})
|
|
||||||
pot_request.context = PoTokenContext.PLAYER
|
|
||||||
|
|
||||||
with pytest.raises(PoTokenProviderRejectedRequest):
|
|
||||||
provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
def test_provider_unsupported_client(self, ie, logger, pot_request):
|
|
||||||
provider = ExamplePTP(ie=ie, logger=logger, settings={})
|
|
||||||
pot_request.innertube_context['client']['clientName'] = 'ANDROID'
|
|
||||||
|
|
||||||
with pytest.raises(PoTokenProviderRejectedRequest):
|
|
||||||
provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
def test_provider_unsupported_proxy_scheme(self, ie, logger, pot_request):
|
|
||||||
provider = ExamplePTP(ie=ie, logger=logger, settings={})
|
|
||||||
pot_request.request_proxy = 'socks4://example.com'
|
|
||||||
|
|
||||||
with pytest.raises(
|
|
||||||
PoTokenProviderRejectedRequest,
|
|
||||||
match='External requests by "example" provider do not support proxy scheme "socks4". Supported proxy '
|
|
||||||
'schemes: http, socks5h',
|
|
||||||
):
|
|
||||||
provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
pot_request.request_proxy = 'http://example.com'
|
|
||||||
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
def test_provider_ignore_external_request_features(self, ie, logger, pot_request):
|
|
||||||
class InternalPTP(ExamplePTP):
|
|
||||||
_SUPPORTED_EXTERNAL_REQUEST_FEATURES = None
|
|
||||||
|
|
||||||
provider = InternalPTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
pot_request.request_proxy = 'socks5://example.com'
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
pot_request.request_source_address = '0.0.0.0'
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
def test_provider_unsupported_external_request_source_address(self, ie, logger, pot_request):
|
|
||||||
class InternalPTP(ExamplePTP):
|
|
||||||
_SUPPORTED_EXTERNAL_REQUEST_FEATURES = tuple()
|
|
||||||
|
|
||||||
provider = InternalPTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
pot_request.request_source_address = None
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
pot_request.request_source_address = '0.0.0.0'
|
|
||||||
with pytest.raises(
|
|
||||||
PoTokenProviderRejectedRequest,
|
|
||||||
match='External requests by "example" provider do not support setting source address',
|
|
||||||
):
|
|
||||||
provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
def test_provider_supported_external_request_source_address(self, ie, logger, pot_request):
|
|
||||||
class InternalPTP(ExamplePTP):
|
|
||||||
_SUPPORTED_EXTERNAL_REQUEST_FEATURES = (
|
|
||||||
ExternalRequestFeature.SOURCE_ADDRESS,
|
|
||||||
)
|
|
||||||
|
|
||||||
provider = InternalPTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
pot_request.request_source_address = None
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
pot_request.request_source_address = '0.0.0.0'
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
def test_provider_unsupported_external_request_tls_verification(self, ie, logger, pot_request):
|
|
||||||
class InternalPTP(ExamplePTP):
|
|
||||||
_SUPPORTED_EXTERNAL_REQUEST_FEATURES = tuple()
|
|
||||||
|
|
||||||
provider = InternalPTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
pot_request.request_verify_tls = True
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
pot_request.request_verify_tls = False
|
|
||||||
with pytest.raises(
|
|
||||||
PoTokenProviderRejectedRequest,
|
|
||||||
match='External requests by "example" provider do not support ignoring TLS certificate failures',
|
|
||||||
):
|
|
||||||
provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
def test_provider_supported_external_request_tls_verification(self, ie, logger, pot_request):
|
|
||||||
class InternalPTP(ExamplePTP):
|
|
||||||
_SUPPORTED_EXTERNAL_REQUEST_FEATURES = (
|
|
||||||
ExternalRequestFeature.DISABLE_TLS_VERIFICATION,
|
|
||||||
)
|
|
||||||
|
|
||||||
provider = InternalPTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
pot_request.request_verify_tls = True
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
pot_request.request_verify_tls = False
|
|
||||||
assert provider.request_pot(pot_request)
|
|
||||||
|
|
||||||
def test_provider_request_webpage(self, ie, logger, pot_request):
|
|
||||||
provider = ExamplePTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
cookiejar = YoutubeDLCookieJar()
|
|
||||||
pot_request.request_headers = HTTPHeaderDict({'User-Agent': 'example-user-agent'})
|
|
||||||
pot_request.request_proxy = 'socks5://example-proxy.com'
|
|
||||||
pot_request.request_cookiejar = cookiejar
|
|
||||||
|
|
||||||
def mock_urlopen(request):
|
|
||||||
return request
|
|
||||||
|
|
||||||
ie._downloader.urlopen = mock_urlopen
|
|
||||||
|
|
||||||
sent_request = provider._request_webpage(Request(
|
|
||||||
'https://example.com',
|
|
||||||
), pot_request=pot_request)
|
|
||||||
|
|
||||||
assert sent_request.url == 'https://example.com'
|
|
||||||
assert sent_request.headers['User-Agent'] == 'example-user-agent'
|
|
||||||
assert sent_request.proxies == {'all': 'socks5://example-proxy.com'}
|
|
||||||
assert sent_request.extensions['cookiejar'] is cookiejar
|
|
||||||
assert 'Requesting webpage' in logger.messages['info']
|
|
||||||
|
|
||||||
def test_provider_request_webpage_override(self, ie, logger, pot_request):
|
|
||||||
provider = ExamplePTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
cookiejar_request = YoutubeDLCookieJar()
|
|
||||||
pot_request.request_headers = HTTPHeaderDict({'User-Agent': 'example-user-agent'})
|
|
||||||
pot_request.request_proxy = 'socks5://example-proxy.com'
|
|
||||||
pot_request.request_cookiejar = cookiejar_request
|
|
||||||
|
|
||||||
def mock_urlopen(request):
|
|
||||||
return request
|
|
||||||
|
|
||||||
ie._downloader.urlopen = mock_urlopen
|
|
||||||
|
|
||||||
sent_request = provider._request_webpage(Request(
|
|
||||||
'https://example.com',
|
|
||||||
headers={'User-Agent': 'override-user-agent-override'},
|
|
||||||
proxies={'http': 'http://example-proxy-override.com'},
|
|
||||||
extensions={'cookiejar': YoutubeDLCookieJar()},
|
|
||||||
), pot_request=pot_request, note='Custom requesting webpage')
|
|
||||||
|
|
||||||
assert sent_request.url == 'https://example.com'
|
|
||||||
assert sent_request.headers['User-Agent'] == 'override-user-agent-override'
|
|
||||||
assert sent_request.proxies == {'http': 'http://example-proxy-override.com'}
|
|
||||||
assert sent_request.extensions['cookiejar'] is not cookiejar_request
|
|
||||||
assert 'Custom requesting webpage' in logger.messages['info']
|
|
||||||
|
|
||||||
def test_provider_request_webpage_no_log(self, ie, logger, pot_request):
|
|
||||||
provider = ExamplePTP(ie=ie, logger=logger, settings={})
|
|
||||||
|
|
||||||
def mock_urlopen(request):
|
|
||||||
return request
|
|
||||||
|
|
||||||
ie._downloader.urlopen = mock_urlopen
|
|
||||||
|
|
||||||
sent_request = provider._request_webpage(Request(
|
|
||||||
'https://example.com',
|
|
||||||
), note=False)
|
|
||||||
|
|
||||||
assert sent_request.url == 'https://example.com'
|
|
||||||
assert 'info' not in logger.messages
|
|
||||||
|
|
||||||
def test_provider_request_webpage_no_pot_request(self, ie, logger):
|
|
||||||
        provider = ExamplePTP(ie=ie, logger=logger, settings={})

        def mock_urlopen(request):
            return request

        ie._downloader.urlopen = mock_urlopen

        sent_request = provider._request_webpage(Request(
            'https://example.com',
        ), pot_request=None)

        assert sent_request.url == 'https://example.com'

    def test_get_config_arg(self, ie, logger):
        provider = ExamplePTP(ie=ie, logger=logger, settings={'abc': ['123D'], 'xyz': ['456a', '789B']})

        assert provider._configuration_arg('abc') == ['123d']
        assert provider._configuration_arg('abc', default=['default']) == ['123d']
        assert provider._configuration_arg('ABC', default=['default']) == ['default']
        assert provider._configuration_arg('abc', casesense=True) == ['123D']
        assert provider._configuration_arg('xyz', casesense=False) == ['456a', '789b']

    def test_require_class_end_with_suffix(self, ie, logger):
        class InvalidSuffix(PoTokenProvider):
            PROVIDER_NAME = 'invalid-suffix'

            def _real_request_pot(self, request: PoTokenRequest) -> PoTokenResponse:
                raise PoTokenProviderRejectedRequest('Not implemented')

            def is_available(self) -> bool:
                return True

        provider = InvalidSuffix(ie=ie, logger=logger, settings={})

        with pytest.raises(AssertionError):
            provider.PROVIDER_KEY  # noqa: B018


class TestPoTokenCacheProvider:

    def test_base_type(self):
        assert issubclass(PoTokenCacheProvider, IEContentProvider)

    def test_create_provider_missing_get_method(self, ie, logger):
        class MissingMethodsPCP(PoTokenCacheProvider):
            def store(self, key: str, value: str, expires_at: int):
                pass

            def delete(self, key: str):
                pass

            def is_available(self) -> bool:
                return True

        with pytest.raises(TypeError):
            MissingMethodsPCP(ie=ie, logger=logger, settings={})

    def test_create_provider_missing_store_method(self, ie, logger):
        class MissingMethodsPCP(PoTokenCacheProvider):
            def get(self, key: str):
                pass

            def delete(self, key: str):
                pass

            def is_available(self) -> bool:
                return True

        with pytest.raises(TypeError):
            MissingMethodsPCP(ie=ie, logger=logger, settings={})

    def test_create_provider_missing_delete_method(self, ie, logger):
        class MissingMethodsPCP(PoTokenCacheProvider):
            def get(self, key: str):
                pass

            def store(self, key: str, value: str, expires_at: int):
                pass

            def is_available(self) -> bool:
                return True

        with pytest.raises(TypeError):
            MissingMethodsPCP(ie=ie, logger=logger, settings={})

    def test_create_provider_missing_is_available_method(self, ie, logger):
        class MissingMethodsPCP(PoTokenCacheProvider):
            def get(self, key: str):
                pass

            def store(self, key: str, value: str, expires_at: int):
                pass

            def delete(self, key: str):
                pass

        with pytest.raises(TypeError):
            MissingMethodsPCP(ie=ie, logger=logger, settings={})

    def test_barebones_provider(self, ie, logger):
        class BarebonesProviderPCP(PoTokenCacheProvider):

            def is_available(self) -> bool:
                return True

            def get(self, key: str):
                return 'example-cache'

            def store(self, key: str, value: str, expires_at: int):
                pass

            def delete(self, key: str):
                pass

        provider = BarebonesProviderPCP(ie=ie, logger=logger, settings={})
        assert provider.PROVIDER_NAME == 'BarebonesProvider'
        assert provider.PROVIDER_KEY == 'BarebonesProvider'
        assert provider.PROVIDER_VERSION == '0.0.0'
        assert provider.BUG_REPORT_MESSAGE == 'please report this issue to the provider developer at (developer has not provided a bug report location) .'

    def test_create_provider_example(self, ie, logger):
        provider = ExampleCacheProviderPCP(ie=ie, logger=logger, settings={})
        assert provider.PROVIDER_NAME == 'example'
        assert provider.PROVIDER_KEY == 'ExampleCacheProvider'
        assert provider.PROVIDER_VERSION == '0.0.1'
        assert provider.BUG_REPORT_MESSAGE == 'please report this issue to the provider developer at https://example.com/issues .'
        assert provider.is_available()

    def test_get_config_arg(self, ie, logger):
        provider = ExampleCacheProviderPCP(ie=ie, logger=logger, settings={'abc': ['123D'], 'xyz': ['456a', '789B']})
        assert provider._configuration_arg('abc') == ['123d']
        assert provider._configuration_arg('abc', default=['default']) == ['123d']
        assert provider._configuration_arg('ABC', default=['default']) == ['default']
        assert provider._configuration_arg('abc', casesense=True) == ['123D']
        assert provider._configuration_arg('xyz', casesense=False) == ['456a', '789b']

    def test_require_class_end_with_suffix(self, ie, logger):
        class InvalidSuffix(PoTokenCacheProvider):
            def get(self, key: str):
                return 'example-cache'

            def store(self, key: str, value: str, expires_at: int):
                pass

            def delete(self, key: str):
                pass

            def is_available(self) -> bool:
                return True

        provider = InvalidSuffix(ie=ie, logger=logger, settings={})

        with pytest.raises(AssertionError):
            provider.PROVIDER_KEY  # noqa: B018


class TestPoTokenCacheSpecProvider:

    def test_base_type(self):
        assert issubclass(PoTokenCacheSpecProvider, IEContentProvider)

    def test_create_provider_missing_supports_method(self, ie, logger):
        class MissingMethodsPCS(PoTokenCacheSpecProvider):
            pass

        with pytest.raises(TypeError):
            MissingMethodsPCS(ie=ie, logger=logger, settings={})

    def test_create_provider_barebones(self, ie, pot_request, logger):
        class BarebonesProviderPCSP(PoTokenCacheSpecProvider):
            def generate_cache_spec(self, request: PoTokenRequest):
                return PoTokenCacheSpec(
                    default_ttl=100,
                    key_bindings={},
                )

        provider = BarebonesProviderPCSP(ie=ie, logger=logger, settings={})
        assert provider.PROVIDER_NAME == 'BarebonesProvider'
        assert provider.PROVIDER_KEY == 'BarebonesProvider'
        assert provider.PROVIDER_VERSION == '0.0.0'
        assert provider.BUG_REPORT_MESSAGE == 'please report this issue to the provider developer at (developer has not provided a bug report location) .'
        assert provider.is_available()
        assert provider.generate_cache_spec(request=pot_request).default_ttl == 100
        assert provider.generate_cache_spec(request=pot_request).key_bindings == {}
        assert provider.generate_cache_spec(request=pot_request).write_policy == CacheProviderWritePolicy.WRITE_ALL

    def test_create_provider_example(self, ie, pot_request, logger):
        provider = ExampleCacheSpecProviderPCSP(ie=ie, logger=logger, settings={})
        assert provider.PROVIDER_NAME == 'example'
        assert provider.PROVIDER_KEY == 'ExampleCacheSpecProvider'
        assert provider.PROVIDER_VERSION == '0.0.1'
        assert provider.BUG_REPORT_MESSAGE == 'please report this issue to the provider developer at https://example.com/issues .'
        assert provider.is_available()
        assert provider.generate_cache_spec(pot_request)
        assert provider.generate_cache_spec(pot_request).key_bindings == {'field': 'example-key'}
        assert provider.generate_cache_spec(pot_request).default_ttl == 60
        assert provider.generate_cache_spec(pot_request).write_policy == CacheProviderWritePolicy.WRITE_FIRST

    def test_get_config_arg(self, ie, logger):
        provider = ExampleCacheSpecProviderPCSP(ie=ie, logger=logger, settings={'abc': ['123D'], 'xyz': ['456a', '789B']})

        assert provider._configuration_arg('abc') == ['123d']
        assert provider._configuration_arg('abc', default=['default']) == ['123d']
        assert provider._configuration_arg('ABC', default=['default']) == ['default']
        assert provider._configuration_arg('abc', casesense=True) == ['123D']
        assert provider._configuration_arg('xyz', casesense=False) == ['456a', '789b']

    def test_require_class_end_with_suffix(self, ie, logger):
        class InvalidSuffix(PoTokenCacheSpecProvider):
            def generate_cache_spec(self, request: PoTokenRequest):
                return None

        provider = InvalidSuffix(ie=ie, logger=logger, settings={})

        with pytest.raises(AssertionError):
            provider.PROVIDER_KEY  # noqa: B018


class TestPoTokenRequest:
    def test_copy_request(self, pot_request):
        copied_request = pot_request.copy()

        assert copied_request is not pot_request
        assert copied_request.context == pot_request.context
        assert copied_request.innertube_context == pot_request.innertube_context
        assert copied_request.innertube_context is not pot_request.innertube_context
        copied_request.innertube_context['client']['clientName'] = 'ANDROID'
        assert pot_request.innertube_context['client']['clientName'] != 'ANDROID'
        assert copied_request.innertube_host == pot_request.innertube_host
        assert copied_request.session_index == pot_request.session_index
        assert copied_request.player_url == pot_request.player_url
        assert copied_request.is_authenticated == pot_request.is_authenticated
        assert copied_request.visitor_data == pot_request.visitor_data
        assert copied_request.data_sync_id == pot_request.data_sync_id
        assert copied_request.video_id == pot_request.video_id
        assert copied_request.request_cookiejar is pot_request.request_cookiejar
        assert copied_request.request_proxy == pot_request.request_proxy
        assert copied_request.request_headers == pot_request.request_headers
        assert copied_request.request_headers is not pot_request.request_headers
        assert copied_request.request_timeout == pot_request.request_timeout
        assert copied_request.request_source_address == pot_request.request_source_address
        assert copied_request.request_verify_tls == pot_request.request_verify_tls
        assert copied_request.bypass_cache == pot_request.bypass_cache


def test_provider_bug_report_message(ie, logger):
    provider = ExamplePTP(ie=ie, logger=logger, settings={})
    assert provider.BUG_REPORT_MESSAGE == 'please report this issue to the provider developer at https://example.com/issues .'

    message = provider_bug_report_message(provider)
    assert message == '; please report this issue to the provider developer at https://example.com/issues .'

    message_before = provider_bug_report_message(provider, before='custom message!')
    assert message_before == 'custom message! Please report this issue to the provider developer at https://example.com/issues .'


def test_register_provider(ie):

    @register_provider
    class UnavailableProviderPTP(PoTokenProvider):
        def is_available(self) -> bool:
            return False

        def _real_request_pot(self, request: PoTokenRequest) -> PoTokenResponse:
            raise PoTokenProviderRejectedRequest('Not implemented')

    assert _pot_providers.value.get('UnavailableProvider') == UnavailableProviderPTP
    _pot_providers.value.pop('UnavailableProvider')


def test_register_pot_preference(ie):
    before = len(_ptp_preferences.value)

    @register_preference(ExamplePTP)
    def unavailable_preference(provider: PoTokenProvider, request: PoTokenRequest):
        return 1

    assert len(_ptp_preferences.value) == before + 1


def test_register_cache_provider(ie):

    @cache.register_provider
    class UnavailableCacheProviderPCP(PoTokenCacheProvider):
        def is_available(self) -> bool:
            return False

        def get(self, key: str):
            return 'example-cache'

        def store(self, key: str, value: str, expires_at: int):
            pass

        def delete(self, key: str):
            pass

    assert _pot_cache_providers.value.get('UnavailableCacheProvider') == UnavailableCacheProviderPCP
    _pot_cache_providers.value.pop('UnavailableCacheProvider')


def test_register_cache_provider_spec(ie):

    @cache.register_spec
    class UnavailableCacheProviderPCSP(PoTokenCacheSpecProvider):
        def is_available(self) -> bool:
            return False

        def generate_cache_spec(self, request: PoTokenRequest):
            return None

    assert _pot_pcs_providers.value.get('UnavailableCacheProvider') == UnavailableCacheProviderPCSP
    _pot_pcs_providers.value.pop('UnavailableCacheProvider')


def test_register_cache_provider_preference(ie):
    before = len(_pot_cache_provider_preferences.value)

    @cache.register_preference(ExampleCacheProviderPCP)
    def unavailable_preference(provider: PoTokenCacheProvider, request: PoTokenRequest):
        return 1

    assert len(_pot_cache_provider_preferences.value) == before + 1


def test_logger_log_level(logger):
    assert logger.LogLevel('INFO') == logger.LogLevel.INFO
    assert logger.LogLevel('debuG') == logger.LogLevel.DEBUG
    assert logger.LogLevel(10) == logger.LogLevel.DEBUG
    assert logger.LogLevel('UNKNOWN') == logger.LogLevel.INFO
@@ -216,9 +216,7 @@ class SocksWebSocketTestRequestHandler(SocksTestRequestHandler):
         protocol = websockets.ServerProtocol()
         connection = websockets.sync.server.ServerConnection(socket=self.request, protocol=protocol, close_timeout=0)
         connection.handshake()
-        for message in connection:
-            if message == 'socks_info':
-                connection.send(json.dumps(self.socks_info))
+        connection.send(json.dumps(self.socks_info))
         connection.close()
 
 
@@ -23,6 +23,7 @@ from yt_dlp.extractor import (
     TedTalkIE,
     ThePlatformFeedIE,
     ThePlatformIE,
+    VikiIE,
     VimeoIE,
     WallaIE,
     YoutubeIE,

@@ -330,6 +331,20 @@ class TestRaiPlaySubtitles(BaseTestSubtitles):
         self.assertEqual(md5(subtitles['it']), '4b3264186fbb103508abe5311cfcb9cd')
 
 
+@is_download_test
+@unittest.skip('IE broken - DRM only')
+class TestVikiSubtitles(BaseTestSubtitles):
+    url = 'http://www.viki.com/videos/1060846v-punch-episode-18'
+    IE = VikiIE
+
+    def test_allsubtitles(self):
+        self.DL.params['writesubtitles'] = True
+        self.DL.params['allsubtitles'] = True
+        subtitles = self.getSubtitles()
+        self.assertEqual(set(subtitles.keys()), {'en'})
+        self.assertEqual(md5(subtitles['en']), '53cb083a5914b2d84ef1ab67b880d18a')
+
+
 @is_download_test
 class TestThePlatformSubtitles(BaseTestSubtitles):
     # from http://www.3playmedia.com/services-features/tools/integrations/theplatform/
@@ -4,23 +4,8 @@ import xml.etree.ElementTree
 
 import pytest
 
-from yt_dlp.utils import (
-    ExtractorError,
-    determine_ext,
-    dict_get,
-    int_or_none,
-    join_nonempty,
-    str_or_none,
-)
-from yt_dlp.utils.traversal import (
-    find_element,
-    find_elements,
-    require,
-    subs_list_to_dict,
-    traverse_obj,
-    trim_str,
-    unpack,
-)
+from yt_dlp.utils import dict_get, int_or_none, str_or_none
+from yt_dlp.utils.traversal import traverse_obj
 
 _TEST_DATA = {
     100: 100,

@@ -39,14 +24,6 @@ _TEST_DATA = {
     'dict': {},
 }
 
-_TEST_HTML = '''<html><body>
-    <div class="a">1</div>
-    <div class="a" id="x" custom="z">2</div>
-    <div class="b" data-id="y" custom="z">3</div>
-    <p class="a">4</p>
-    <p id="d" custom="e">5</p>
-</body></html>'''
-
 
 class TestTraversal:
     def test_traversal_base(self):

@@ -416,8 +393,18 @@ class TestTraversal:
             '`any` should allow further branching'
 
     def test_traversal_morsel(self):
+        values = {
+            'expires': 'a',
+            'path': 'b',
+            'comment': 'c',
+            'domain': 'd',
+            'max-age': 'e',
+            'secure': 'f',
+            'httponly': 'g',
+            'version': 'h',
+            'samesite': 'i',
+        }
         morsel = http.cookies.Morsel()
-        values = dict(zip(morsel, 'abcdefghijklmnop'))
         morsel.set('item_key', 'item_value', 'coded_value')
         morsel.update(values)
         values['key'] = 'item_key'

@@ -433,186 +420,6 @@ class TestTraversal:
         assert traverse_obj(morsel, [(None,), any]) == morsel, \
             'Morsel should not be implicitly changed to dict on usage'
 
-    def test_traversal_filter(self):
-        data = [None, False, True, 0, 1, 0.0, 1.1, '', 'str', {}, {0: 0}, [], [1]]
-
-        assert traverse_obj(data, [..., filter]) == [True, 1, 1.1, 'str', {0: 0}, [1]], \
-            '`filter` should filter falsy values'
-
-
-class TestTraversalHelpers:
-    def test_traversal_require(self):
-        with pytest.raises(ExtractorError):
-            traverse_obj(_TEST_DATA, ['None', {require('value')}])
-        assert traverse_obj(_TEST_DATA, ['str', {require('value')}]) == 'str', \
-            '`require` should pass through non `None` values'
-
-    def test_subs_list_to_dict(self):
-        assert traverse_obj([
-            {'name': 'de', 'url': 'https://example.com/subs/de.vtt'},
-            {'name': 'en', 'url': 'https://example.com/subs/en1.ass'},
-            {'name': 'en', 'url': 'https://example.com/subs/en2.ass'},
-        ], [..., {
-            'id': 'name',
-            'url': 'url',
-        }, all, {subs_list_to_dict}]) == {
-            'de': [{'url': 'https://example.com/subs/de.vtt'}],
-            'en': [
-                {'url': 'https://example.com/subs/en1.ass'},
-                {'url': 'https://example.com/subs/en2.ass'},
-            ],
-        }, 'function should build subtitle dict from list of subtitles'
-        assert traverse_obj([
-            {'name': 'de', 'url': 'https://example.com/subs/de.ass'},
-            {'name': 'de'},
-            {'name': 'en', 'content': 'content'},
-            {'url': 'https://example.com/subs/en'},
-        ], [..., {
-            'id': 'name',
-            'data': 'content',
-            'url': 'url',
-        }, all, {subs_list_to_dict(lang=None)}]) == {
-            'de': [{'url': 'https://example.com/subs/de.ass'}],
-            'en': [{'data': 'content'}],
-        }, 'subs with mandatory items missing should be filtered'
-        assert traverse_obj([
-            {'url': 'https://example.com/subs/de.ass', 'name': 'de'},
-            {'url': 'https://example.com/subs/en', 'name': 'en'},
-        ], [..., {
-            'id': 'name',
-            'ext': ['url', {determine_ext(default_ext=None)}],
-            'url': 'url',
-        }, all, {subs_list_to_dict(ext='ext')}]) == {
-            'de': [{'url': 'https://example.com/subs/de.ass', 'ext': 'ass'}],
-            'en': [{'url': 'https://example.com/subs/en', 'ext': 'ext'}],
-        }, '`ext` should set default ext but leave existing value untouched'
-        assert traverse_obj([
-            {'name': 'en', 'url': 'https://example.com/subs/en2', 'prio': True},
-            {'name': 'en', 'url': 'https://example.com/subs/en1', 'prio': False},
-        ], [..., {
-            'id': 'name',
-            'quality': ['prio', {int}],
-            'url': 'url',
-        }, all, {subs_list_to_dict(ext='ext')}]) == {'en': [
-            {'url': 'https://example.com/subs/en1', 'ext': 'ext'},
-            {'url': 'https://example.com/subs/en2', 'ext': 'ext'},
-        ]}, '`quality` key should sort subtitle list accordingly'
-        assert traverse_obj([
-            {'name': 'de', 'url': 'https://example.com/subs/de.ass'},
-            {'name': 'de'},
-            {'name': 'en', 'content': 'content'},
-            {'url': 'https://example.com/subs/en'},
-        ], [..., {
-            'id': 'name',
-            'url': 'url',
-            'data': 'content',
-        }, all, {subs_list_to_dict(lang='en')}]) == {
-            'de': [{'url': 'https://example.com/subs/de.ass'}],
-            'en': [
-                {'data': 'content'},
-                {'url': 'https://example.com/subs/en'},
-            ],
-        }, 'optionally provided lang should be used if no id available'
-        assert traverse_obj([
-            {'name': 1, 'url': 'https://example.com/subs/de1'},
-            {'name': {}, 'url': 'https://example.com/subs/de2'},
-            {'name': 'de', 'ext': 1, 'url': 'https://example.com/subs/de3'},
-            {'name': 'de', 'ext': {}, 'url': 'https://example.com/subs/de4'},
-        ], [..., {
-            'id': 'name',
-            'url': 'url',
-            'ext': 'ext',
-        }, all, {subs_list_to_dict(lang=None)}]) == {
-            'de': [
-                {'url': 'https://example.com/subs/de3'},
-                {'url': 'https://example.com/subs/de4'},
-            ],
-        }, 'non str types should be ignored for id and ext'
-        assert traverse_obj([
-            {'name': 1, 'url': 'https://example.com/subs/de1'},
-            {'name': {}, 'url': 'https://example.com/subs/de2'},
-            {'name': 'de', 'ext': 1, 'url': 'https://example.com/subs/de3'},
-            {'name': 'de', 'ext': {}, 'url': 'https://example.com/subs/de4'},
-        ], [..., {
-            'id': 'name',
-            'url': 'url',
-            'ext': 'ext',
-        }, all, {subs_list_to_dict(lang='de')}]) == {
-            'de': [
-                {'url': 'https://example.com/subs/de1'},
-                {'url': 'https://example.com/subs/de2'},
-                {'url': 'https://example.com/subs/de3'},
-                {'url': 'https://example.com/subs/de4'},
-            ],
-        }, 'non str types should be replaced by default id'
-
-    def test_trim_str(self):
-        with pytest.raises(TypeError):
-            trim_str('positional')
-
-        assert callable(trim_str(start='a'))
-        assert trim_str(start='ab')('abc') == 'c'
-        assert trim_str(end='bc')('abc') == 'a'
-        assert trim_str(start='a', end='c')('abc') == 'b'
-        assert trim_str(start='ab', end='c')('abc') == ''
-        assert trim_str(start='a', end='bc')('abc') == ''
-        assert trim_str(start='ab', end='bc')('abc') == ''
-        assert trim_str(start='abc', end='abc')('abc') == ''
-        assert trim_str(start='', end='')('abc') == 'abc'
-
-    def test_unpack(self):
-        assert unpack(lambda *x: ''.join(map(str, x)))([1, 2, 3]) == '123'
-        assert unpack(join_nonempty)([1, 2, 3]) == '1-2-3'
-        assert unpack(join_nonempty, delim=' ')([1, 2, 3]) == '1 2 3'
-        with pytest.raises(TypeError):
-            unpack(join_nonempty)()
-        with pytest.raises(TypeError):
-            unpack()
-
-    def test_find_element(self):
-        for improper_kwargs in [
-            dict(attr='data-id'),
-            dict(value='y'),
-            dict(attr='data-id', value='y', cls='a'),
-            dict(attr='data-id', value='y', id='x'),
-            dict(cls='a', id='x'),
-            dict(cls='a', tag='p'),
-            dict(cls='[ab]', regex=True),
-        ]:
-            with pytest.raises(AssertionError):
-                find_element(**improper_kwargs)(_TEST_HTML)
-
-        assert find_element(cls='a')(_TEST_HTML) == '1'
-        assert find_element(cls='a', html=True)(_TEST_HTML) == '<div class="a">1</div>'
-        assert find_element(id='x')(_TEST_HTML) == '2'
-        assert find_element(id='[ex]')(_TEST_HTML) is None
-        assert find_element(id='[ex]', regex=True)(_TEST_HTML) == '2'
-        assert find_element(id='x', html=True)(_TEST_HTML) == '<div class="a" id="x" custom="z">2</div>'
-        assert find_element(attr='data-id', value='y')(_TEST_HTML) == '3'
-        assert find_element(attr='data-id', value='y(?:es)?')(_TEST_HTML) is None
-        assert find_element(attr='data-id', value='y(?:es)?', regex=True)(_TEST_HTML) == '3'
-        assert find_element(
-            attr='data-id', value='y', html=True)(_TEST_HTML) == '<div class="b" data-id="y" custom="z">3</div>'
-
-    def test_find_elements(self):
-        for improper_kwargs in [
-            dict(tag='p'),
-            dict(attr='data-id'),
-            dict(value='y'),
-            dict(attr='data-id', value='y', cls='a'),
-            dict(cls='a', tag='div'),
-            dict(cls='[ab]', regex=True),
-        ]:
-            with pytest.raises(AssertionError):
-                find_elements(**improper_kwargs)(_TEST_HTML)
-
-        assert find_elements(cls='a')(_TEST_HTML) == ['1', '2', '4']
-        assert find_elements(cls='a', html=True)(_TEST_HTML) == [
-            '<div class="a">1</div>', '<div class="a" id="x" custom="z">2</div>', '<p class="a">4</p>']
-        assert find_elements(attr='custom', value='z')(_TEST_HTML) == ['2', '3']
-        assert find_elements(attr='custom', value='[ez]')(_TEST_HTML) == []
-        assert find_elements(attr='custom', value='[ez]', regex=True)(_TEST_HTML) == ['2', '3', '5']
-
 
 class TestDictGet:
     def test_dict_get(self):
@@ -82,32 +82,16 @@ TEST_LOCKFILE_V1 = rf'''{TEST_LOCKFILE_COMMENT}
 lock 2022.08.18.36 .+ Python 3\.6
 lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
 lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
-lock 2024.10.22 py2exe .+
-lock 2024.10.22 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
-lock 2024.10.22 (?!\w+_exe).+ Python 3\.8
-lock 2024.10.22 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
 '''
 
 TEST_LOCKFILE_V2_TMPL = r'''%s
 lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
 lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
 lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
-lockV2 yt-dlp/yt-dlp 2024.10.22 py2exe .+
-lockV2 yt-dlp/yt-dlp 2024.10.22 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
-lockV2 yt-dlp/yt-dlp 2024.10.22 (?!\w+_exe).+ Python 3\.8
-lockV2 yt-dlp/yt-dlp 2024.10.22 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
 lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
 lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
-lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 py2exe .+
-lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
-lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 (?!\w+_exe).+ Python 3\.8
-lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
 lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
 lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
-lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.045052 py2exe .+
-lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
-lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 (?!\w+_exe).+ Python 3\.8
-lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
 '''
 
 TEST_LOCKFILE_V2 = TEST_LOCKFILE_V2_TMPL % TEST_LOCKFILE_COMMENT

@@ -161,76 +145,43 @@ class TestUpdate(unittest.TestCase):
         for lockfile in (TEST_LOCKFILE_V1, TEST_LOCKFILE_V2, TEST_LOCKFILE_ACTUAL, TEST_LOCKFILE_FORK):
             # Normal operation
             test(lockfile, 'zip Python 3.12.0', '2023.12.31', '2023.12.31')
-            test(lockfile, 'zip Python 3.12.0', '2023.12.31', '2023.12.31', exact=True)
-            # py2exe should never update beyond 2024.10.22
-            test(lockfile, 'py2exe Python 3.8', '2025.01.01', '2024.10.22')
-            test(lockfile, 'py2exe Python 3.8', '2025.01.01', None, exact=True)
-            # Python 3.6 --update should update only to the py3.6 lock
+            test(lockfile, 'zip stable Python 3.12.0', '2023.12.31', '2023.12.31', exact=True)
+            # Python 3.6 --update should update only to its lock
             test(lockfile, 'zip Python 3.6.0', '2023.11.16', '2022.08.18.36')
-            # Python 3.6 --update-to an exact version later than the py3.6 lock should return None
-            test(lockfile, 'zip Python 3.6.0', '2023.11.16', None, exact=True)
-            # Python 3.7 should be able to update to the py3.7 lock
+            # --update-to an exact version later than the lock should return None
+            test(lockfile, 'zip stable Python 3.6.0', '2023.11.16', None, exact=True)
+            # Python 3.7 should be able to update to its lock
             test(lockfile, 'zip Python 3.7.0', '2023.11.16', '2023.11.16')
-            test(lockfile, 'zip Python 3.7.1', '2023.11.16', '2023.11.16', exact=True)
-            # Non-win_x86_exe builds on py3.7 must be locked at py3.7 lock
+            test(lockfile, 'zip stable Python 3.7.1', '2023.11.16', '2023.11.16', exact=True)
+            # Non-win_x86_exe builds on py3.7 must be locked
             test(lockfile, 'zip Python 3.7.1', '2023.12.31', '2023.11.16')
-            test(lockfile, 'zip Python 3.7.1', '2023.12.31', None, exact=True)
-            # Python 3.8 should only update to the py3.8 lock
-            test(lockfile, 'zip Python 3.8.10', '2025.01.01', '2024.10.22')
-            test(lockfile, 'zip Python 3.8.110', '2025.01.01', None, exact=True)
-            test(  # Windows Vista w/ win_x86_exe must be locked at Vista lock
-                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
+            test(lockfile, 'zip stable Python 3.7.1', '2023.12.31', None, exact=True)
+            test(  # Windows Vista w/ win_x86_exe must be locked
+                lockfile, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
                 '2023.12.31', '2023.11.16')
-            test(  # Windows 2008Server w/ win_x86_exe must be locked at Vista lock
+            test(  # Windows 2008Server w/ win_x86_exe must be locked
                 lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-2008Server',
                 '2023.12.31', None, exact=True)
-            test(  # Windows 7 w/ win_x86_exe py3.7 build should be able to update beyond py3.7 lock
-                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
-                '2023.12.31', '2023.12.31', exact=True)
-            test(  # Windows 7 win_x86_exe should only update to Win7 lock
-                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
-                '2025.01.01', '2024.10.22')
-            test(  # Windows 2008ServerR2 win_exe should only update to Win7 lock
-                lockfile, 'win_exe Python 3.8.10 (CPython x86 32bit) - Windows-2008ServerR2',
-                '2025.12.31', '2024.10.22')
-            test(  # Windows 8.1 w/ '2008Server' in platform string should be able to update beyond py3.7 lock
+            test(  # Windows 7 w/ win_x86_exe py3.7 build should be able to update beyond lock
+                lockfile, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
+                '2023.12.31', '2023.12.31')
+            test(  # Windows 8.1 w/ '2008Server' in platform string should be able to update beyond lock
                 lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-post2008Server-6.2.9200',
                 '2023.12.31', '2023.12.31', exact=True)
-            test(  # win_exe built w/Python 3.8 on Windows>=8 should be able to update beyond py3.8 lock
-                lockfile, 'win_exe Python 3.8.10 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0',
-                '2025.01.01', '2025.01.01', exact=True)
-            test(  # linux_armv7l_exe w/glibc2.7 should only update to glibc<2.31 lock
-                lockfile, 'linux_armv7l_exe Python 3.8.0 (CPython armv7l 32bit) - Linux-6.5.0-1025-azure-armv7l-with-glibc2.7',
-                '2025.01.01', '2024.10.22')
-            test(  # linux_armv7l_exe w/Python 3.8 and glibc>=2.31 should be able to update beyond py3.8 and glibc<2.31 locks
-                lockfile, 'linux_armv7l_exe Python 3.8.0 (CPython armv7l 32bit) - Linux-6.5.0-1025-azure-armv7l-with-glibc2.31',
-                '2025.01.01', '2025.01.01')
-            test(  # linux_armv7l_exe w/glibc2.30 should only update to glibc<2.31 lock
-                lockfile, 'linux_armv7l_exe Python 3.8.0 (CPython armv7l 64bit) - Linux-6.5.0-1025-azure-aarch64-with-glibc2.30 (OpenSSL',
-                '2025.01.01', '2024.10.22')
-            test(  # linux_aarch64_exe w/glibc2.17 should only update to glibc<2.31 lock
-                lockfile, 'linux_aarch64_exe Python 3.8.0 (CPython aarch64 64bit) - Linux-6.5.0-1025-azure-aarch64-with-glibc2.17',
-                '2025.01.01', '2024.10.22')
-            test(  # linux_aarch64_exe w/glibc2.40 and glibc>=2.31 should be able to update beyond py3.8 and glibc<2.31 locks
-                lockfile, 'linux_aarch64_exe Python 3.8.0 (CPython aarch64 64bit) - Linux-6.5.0-1025-azure-aarch64-with-glibc2.40',
-                '2025.01.01', '2025.01.01')
-            test(  # linux_aarch64_exe w/glibc2.3 should only update to glibc<2.31 lock
-                lockfile, 'linux_aarch64_exe Python 3.8.0 (CPython aarch64 64bit) - Linux-6.5.0-1025-azure-aarch64-with-glibc2.3 (OpenSSL',
-                '2025.01.01', '2024.10.22')
 
         # Forks can block updates to non-numeric tags rather than lock
         test(TEST_LOCKFILE_FORK, 'zip Python 3.6.3', 'pr0000', None, repo='fork/yt-dlp')
-        test(TEST_LOCKFILE_FORK, 'zip Python 3.7.4', 'pr0000', 'pr0000', repo='fork/yt-dlp')
-        test(TEST_LOCKFILE_FORK, 'zip Python 3.7.4', 'pr1234', None, repo='fork/yt-dlp')
+        test(TEST_LOCKFILE_FORK, 'zip stable Python 3.7.4', 'pr0000', 'pr0000', repo='fork/yt-dlp')
+        test(TEST_LOCKFILE_FORK, 'zip stable Python 3.7.4', 'pr1234', None, repo='fork/yt-dlp')
         test(TEST_LOCKFILE_FORK, 'zip Python 3.8.1', 'pr1234', 'pr1234', repo='fork/yt-dlp', exact=True)
         test(
-            TEST_LOCKFILE_FORK, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
+            TEST_LOCKFILE_FORK, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
             'pr1234', None, repo='fork/yt-dlp')
         test(
-            TEST_LOCKFILE_FORK, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
+            TEST_LOCKFILE_FORK, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
             '2023.12.31', '2023.12.31', repo='fork/yt-dlp')
         test(TEST_LOCKFILE_FORK, 'zip Python 3.11.2', 'pr9999', None, repo='fork/yt-dlp', exact=True)
-        test(TEST_LOCKFILE_FORK, 'zip Python 3.12.0', 'pr9999', 'pr9999', repo='fork/yt-dlp')
+        test(TEST_LOCKFILE_FORK, 'zip stable Python 3.12.0', 'pr9999', 'pr9999', repo='fork/yt-dlp')
 
     def test_query_update(self):
         ydl = FakeYDL()
@@ -3,25 +3,24 @@
 # Allow direct execution
 import os
 import sys
+import unittest
+import warnings
+import datetime as dt
 
 sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 
 
 import contextlib
-import datetime as dt
 import io
 import itertools
 import json
-import pickle
 import subprocess
-import unittest
-import unittest.mock
-import warnings
 import xml.etree.ElementTree
 
 from yt_dlp.compat import (
     compat_etree_fromstring,
     compat_HTMLParseError,
+    compat_os_name,
 )
 from yt_dlp.utils import (
     Config,

@@ -49,6 +48,7 @@ from yt_dlp.utils import (
     dfxp2srt,
     encode_base_n,
     encode_compat_str,
+    encodeFilename,
     expand_path,
     extract_attributes,
     extract_basic_auth,

@@ -68,6 +68,7 @@ from yt_dlp.utils import (
     get_elements_html_by_class,
     get_elements_text_and_html_by_attribute,
     int_or_none,
+    intlist_to_bytes,
     iri_to_uri,
     is_html,
     js_to_json,

@@ -219,8 +220,10 @@ class TestUtil(unittest.TestCase):
         self.assertEqual(sanitize_filename('_BD_eEpuzXw', is_id=True), '_BD_eEpuzXw')
         self.assertEqual(sanitize_filename('N0Y__7-UOdI', is_id=True), 'N0Y__7-UOdI')
 
-    @unittest.mock.patch('sys.platform', 'win32')
     def test_sanitize_path(self):
+        if sys.platform != 'win32':
+            return
+
         self.assertEqual(sanitize_path('abc'), 'abc')
         self.assertEqual(sanitize_path('abc/def'), 'abc\\def')
         self.assertEqual(sanitize_path('abc\\def'), 'abc\\def')

@@ -247,33 +250,11 @@ class TestUtil(unittest.TestCase):
         self.assertEqual(sanitize_path('abc/def...'), 'abc\\def..#')
         self.assertEqual(sanitize_path('abc.../def'), 'abc..#\\def')
         self.assertEqual(sanitize_path('abc.../def...'), 'abc..#\\def..#')
-        self.assertEqual(sanitize_path('C:\\abc:%(title)s.%(ext)s'), 'C:\\abc#%(title)s.%(ext)s')
 
-        # Check with nt._path_normpath if available
-        try:
-            from nt import _path_normpath as nt_path_normpath
-        except ImportError:
-            nt_path_normpath = None
-
-        for test, expected in [
-            ('C:\\', 'C:\\'),
-            ('../abc', '..\\abc'),
-            ('../../abc', '..\\..\\abc'),
-            ('./abc', 'abc'),
-            ('./../abc', '..\\abc'),
-            ('\\abc', '\\abc'),
-            ('C:abc', 'C:abc'),
-            ('C:abc\\..\\', 'C:'),
-            ('C:abc\\..\\def\\..\\..\\', 'C:..'),
-            ('C:\\abc\\xyz///..\\def\\', 'C:\\abc\\def'),
-            ('abc/../', '.'),
-            ('./abc/../', '.'),
-        ]:
-            result = sanitize_path(test)
-            assert result == expected, f'{test} was incorrectly resolved'
-            assert result == sanitize_path(result), f'{test} changed after sanitizing again'
-            if nt_path_normpath:
-                assert result == nt_path_normpath(test), f'{test} does not match nt._path_normpath'
+        self.assertEqual(sanitize_path('../abc'), '..\\abc')
+        self.assertEqual(sanitize_path('../../abc'), '..\\..\\abc')
+        self.assertEqual(sanitize_path('./abc'), 'abc')
+        self.assertEqual(sanitize_path('./../abc'), '..\\abc')
 
     def test_sanitize_url(self):
         self.assertEqual(sanitize_url('//foo.bar'), 'http://foo.bar')

@@ -356,13 +337,11 @@ class TestUtil(unittest.TestCase):
         self.assertEqual(remove_start(None, 'A - '), None)
         self.assertEqual(remove_start('A - B', 'A - '), 'B')
         self.assertEqual(remove_start('B - A', 'A - '), 'B - A')
-        self.assertEqual(remove_start('non-empty', ''), 'non-empty')
 
     def test_remove_end(self):
         self.assertEqual(remove_end(None, ' - B'), None)
         self.assertEqual(remove_end('A - B', ' - B'), 'A')
         self.assertEqual(remove_end('B - A', ' - B'), 'B - A')
-        self.assertEqual(remove_end('non-empty', ''), 'non-empty')
 
     def test_remove_quotes(self):
         self.assertEqual(remove_quotes(None), None)

@@ -578,10 +557,10 @@ class TestUtil(unittest.TestCase):
         self.assertEqual(res_data, {'a': 'b', 'c': 'd'})
 
     def test_shell_quote(self):
-        args = ['ffmpeg', '-i', 'ñ€ß\'.mp4']
+        args = ['ffmpeg', '-i', encodeFilename('ñ€ß\'.mp4')]
         self.assertEqual(
             shell_quote(args),
-            """ffmpeg -i 'ñ€ß'"'"'.mp4'""" if os.name != 'nt' else '''ffmpeg -i "ñ€ß'.mp4"''')
+            """ffmpeg -i 'ñ€ß'"'"'.mp4'""" if compat_os_name != 'nt' else '''ffmpeg -i "ñ€ß'.mp4"''')
 
     def test_float_or_none(self):
         self.assertEqual(float_or_none('42.42'), 42.42)

@@ -659,8 +638,6 @@ class TestUtil(unittest.TestCase):
         self.assertEqual(url_or_none('mms://foo.de'), 'mms://foo.de')
         self.assertEqual(url_or_none('rtspu://foo.de'), 'rtspu://foo.de')
         self.assertEqual(url_or_none('ftps://foo.de'), 'ftps://foo.de')
-        self.assertEqual(url_or_none('ws://foo.de'), 'ws://foo.de')
-        self.assertEqual(url_or_none('wss://foo.de'), 'wss://foo.de')
 
     def test_parse_age_limit(self):
         self.assertEqual(parse_age_limit(None), None)

@@ -1262,7 +1239,6 @@ class TestUtil(unittest.TestCase):
     def test_js_to_json_malformed(self):
         self.assertEqual(js_to_json('42a1'), '42"a1"')
         self.assertEqual(js_to_json('42a-1'), '42"a"-1')
-        self.assertEqual(js_to_json('{a: `${e("")}`}'), '{"a": "\\"e\\"(\\"\\")"}')
 
     def test_js_to_json_template_literal(self):
         self.assertEqual(js_to_json('`Hello ${name}`', {'name': '"world"'}), '"Hello world"')

@@ -1324,10 +1300,15 @@ class TestUtil(unittest.TestCase):
         self.assertEqual(clean_html('a:\n "b"'), 'a: "b"')
         self.assertEqual(clean_html('a<br>\xa0b'), 'a\nb')
 
+    def test_intlist_to_bytes(self):
+        self.assertEqual(
+            intlist_to_bytes([0, 1, 127, 128, 255]),
+            b'\x00\x01\x7f\x80\xff')
+
     def test_args_to_str(self):
         self.assertEqual(
             args_to_str(['foo', 'ba/r', '-baz', '2 be', '']),
-            'foo ba/r -baz \'2 be\' \'\'' if os.name != 'nt' else 'foo ba/r -baz "2 be" ""',
+            'foo ba/r -baz \'2 be\' \'\'' if compat_os_name != 'nt' else 'foo ba/r -baz "2 be" ""',
         )
 
     def test_parse_filesize(self):

@@ -2086,26 +2067,21 @@ Line 1
         headers = HTTPHeaderDict()
         headers['ytdl-test'] = b'0'
         self.assertEqual(list(headers.items()), [('Ytdl-Test', '0')])
-        self.assertEqual(list(headers.sensitive().items()), [('ytdl-test', '0')])
         headers['ytdl-test'] = 1
         self.assertEqual(list(headers.items()), [('Ytdl-Test', '1')])
-        self.assertEqual(list(headers.sensitive().items()), [('ytdl-test', '1')])
         headers['Ytdl-test'] = '2'
         self.assertEqual(list(headers.items()), [('Ytdl-Test', '2')])
-        self.assertEqual(list(headers.sensitive().items()), [('Ytdl-test', '2')])
         self.assertTrue('ytDl-Test' in headers)
         self.assertEqual(str(headers), str(dict(headers)))
         self.assertEqual(repr(headers), str(dict(headers)))
 
         headers.update({'X-dlp': 'data'})
         self.assertEqual(set(headers.items()), {('Ytdl-Test', '2'), ('X-Dlp', 'data')})
-        self.assertEqual(set(headers.sensitive().items()), {('Ytdl-test', '2'), ('X-dlp', 'data')})
         self.assertEqual(dict(headers), {'Ytdl-Test': '2', 'X-Dlp': 'data'})
         self.assertEqual(len(headers), 2)
         self.assertEqual(headers.copy(), headers)
-        headers2 = HTTPHeaderDict({'X-dlp': 'data3'}, headers, **{'X-dlP': 'data2'})
+        headers2 = HTTPHeaderDict({'X-dlp': 'data3'}, **headers, **{'X-dlp': 'data2'})
         self.assertEqual(set(headers2.items()), {('Ytdl-Test', '2'), ('X-Dlp', 'data2')})
-        self.assertEqual(set(headers2.sensitive().items()), {('Ytdl-test', '2'), ('X-dlP', 'data2')})
         self.assertEqual(len(headers2), 2)
         headers2.clear()
         self.assertEqual(len(headers2), 0)

@@ -2113,23 +2089,16 @@ Line 1
         # ensure we prefer latter headers
         headers3 = HTTPHeaderDict({'Ytdl-TeSt': 1}, {'Ytdl-test': 2})
         self.assertEqual(set(headers3.items()), {('Ytdl-Test', '2')})
-        self.assertEqual(set(headers3.sensitive().items()), {('Ytdl-test', '2')})
         del headers3['ytdl-tesT']
         self.assertEqual(dict(headers3), {})
 
         headers4 = HTTPHeaderDict({'ytdl-test': 'data;'})
         self.assertEqual(set(headers4.items()), {('Ytdl-Test', 'data;')})
-        self.assertEqual(set(headers4.sensitive().items()), {('ytdl-test', 'data;')})
 
         # common mistake: strip whitespace from values
         # https://github.com/yt-dlp/yt-dlp/issues/8729
         headers5 = HTTPHeaderDict({'ytdl-test': ' data; '})
         self.assertEqual(set(headers5.items()), {('Ytdl-Test', 'data;')})
-        self.assertEqual(set(headers5.sensitive().items()), {('ytdl-test', 'data;')})
-
-        # test if picklable
-        headers6 = HTTPHeaderDict(a=1, b=2)
-        self.assertEqual(pickle.loads(pickle.dumps(headers6)), headers6)
 
     def test_extract_basic_auth(self):
         assert extract_basic_auth('http://:foo.bar') == ('http://:foo.bar', None)

@@ -2139,7 +2108,7 @@ Line 1
         assert extract_basic_auth('http://user:@foo.bar') == ('http://foo.bar', 'Basic dXNlcjo=')
         assert extract_basic_auth('http://user:pass@foo.bar') == ('http://foo.bar', 'Basic dXNlcjpwYXNz')
 
-    @unittest.skipUnless(os.name == 'nt', 'Only relevant on Windows')
+    @unittest.skipUnless(compat_os_name == 'nt', 'Only relevant on Windows')
     def test_windows_escaping(self):
         tests = [
             'test"&',

@@ -2173,12 +2142,6 @@ Line 1
         assert run_shell(args) == expected
         assert run_shell(shell_quote(args, shell=True)) == expected
 
-    def test_partial_application(self):
-        assert callable(int_or_none(scale=10)), 'missing positional parameter should apply partially'
-        assert int_or_none(10, scale=0.1) == 100, 'positionally passed argument should call function'
-        assert int_or_none(v=10) == 10, 'keyword passed positional should call function'
-        assert int_or_none(scale=0.1)(10) == 100, 'call after partial application should call the function'
-
 
 if __name__ == '__main__':
     unittest.main()
@ -44,7 +44,7 @@ def websocket_handler(websocket):
|
|||||||
return websocket.send('2')
|
return websocket.send('2')
|
||||||
elif isinstance(message, str):
|
elif isinstance(message, str):
|
||||||
if message == 'headers':
|
if message == 'headers':
|
||||||
return websocket.send(json.dumps(dict(websocket.request.headers.raw_items())))
|
return websocket.send(json.dumps(dict(websocket.request.headers)))
|
||||||
elif message == 'path':
|
elif message == 'path':
|
||||||
return websocket.send(websocket.request.path)
|
return websocket.send(websocket.request.path)
|
||||||
elif message == 'source_address':
|
elif message == 'source_address':
|
||||||
@ -266,18 +266,18 @@ class TestWebsSocketRequestHandlerConformance:
|
|||||||
with handler(cookiejar=cookiejar) as rh:
|
with handler(cookiejar=cookiejar) as rh:
|
||||||
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
|
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
|
||||||
ws.send('headers')
|
ws.send('headers')
|
||||||
assert HTTPHeaderDict(json.loads(ws.recv()))['cookie'] == 'test=ytdlp'
|
assert json.loads(ws.recv())['cookie'] == 'test=ytdlp'
ws.close()

with handler() as rh:
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
ws.send('headers')
-assert 'cookie' not in HTTPHeaderDict(json.loads(ws.recv()))
+assert 'cookie' not in json.loads(ws.recv())
ws.close()

ws = ws_validate_and_send(rh, Request(self.ws_base_url, extensions={'cookiejar': cookiejar}))
ws.send('headers')
-assert HTTPHeaderDict(json.loads(ws.recv()))['cookie'] == 'test=ytdlp'
+assert json.loads(ws.recv())['cookie'] == 'test=ytdlp'
ws.close()

@pytest.mark.skip_handler('Websockets', 'Set-Cookie not supported by websockets')
@@ -287,7 +287,7 @@ class TestWebsSocketRequestHandlerConformance:
ws_validate_and_send(rh, Request(f'{self.ws_base_url}/get_cookie', extensions={'cookiejar': YoutubeDLCookieJar()}))
ws = ws_validate_and_send(rh, Request(self.ws_base_url, extensions={'cookiejar': YoutubeDLCookieJar()}))
ws.send('headers')
-assert 'cookie' not in HTTPHeaderDict(json.loads(ws.recv()))
+assert 'cookie' not in json.loads(ws.recv())
ws.close()

@pytest.mark.skip_handler('Websockets', 'Set-Cookie not supported by websockets')
@@ -298,12 +298,12 @@ class TestWebsSocketRequestHandlerConformance:
ws_validate_and_send(rh, Request(f'{self.ws_base_url}/get_cookie'))
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
ws.send('headers')
-assert HTTPHeaderDict(json.loads(ws.recv()))['cookie'] == 'test=ytdlp'
+assert json.loads(ws.recv())['cookie'] == 'test=ytdlp'
ws.close()
cookiejar.clear_session_cookies()
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
ws.send('headers')
-assert 'cookie' not in HTTPHeaderDict(json.loads(ws.recv()))
+assert 'cookie' not in json.loads(ws.recv())
ws.close()

def test_source_address(self, handler):
@@ -341,14 +341,6 @@ class TestWebsSocketRequestHandlerConformance:
assert headers['test3'] == 'test3'
ws.close()

-def test_keep_header_casing(self, handler):
-with handler(headers=HTTPHeaderDict({'x-TeSt1': 'test'})) as rh:
-ws = ws_validate_and_send(rh, Request(self.ws_base_url, headers={'x-TeSt2': 'test'}, extensions={'keep_header_casing': True}))
-ws.send('headers')
-headers = json.loads(ws.recv())
-assert 'x-TeSt1' in headers
-assert 'x-TeSt2' in headers

@pytest.mark.parametrize('client_cert', (
{'client_certificate': os.path.join(MTLS_CERT_DIR, 'clientwithkey.crt')},
{
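The change running through these assertions is that the master-side tests wrap the headers echoed back by the test server in HTTPHeaderDict before looking anything up. A rough, illustrative sketch of the effect is below; it assumes a yt-dlp checkout is importable and is not an excerpt of the test suite — lookups simply stop depending on the exact capitalisation the server used.

import json

from yt_dlp.utils.networking import HTTPHeaderDict

# Headers as a websocket echo server might report them, capitalisation included
echoed = json.loads('{"Cookie": "test=ytdlp", "User-Agent": "example-agent"}')

headers = HTTPHeaderDict(echoed)
assert headers['cookie'] == 'test=ytdlp'        # lookup is case-insensitive
assert headers['user-agent'] == 'example-agent'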
@@ -68,76 +68,6 @@ _SIG_TESTS = [
'2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA',
'AOq0QJ8wRAIgXmPlOPSBkkUs1bYFYlJCfe29xx8j7v1pDL2QwbdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0',
),
-('https://www.youtube.com/s/player/3bb1f723/player_ias.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', 'MyOSJXtKI3m-uME_jv7-pT12gOFC02RFkGoqWpzE0Cs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA'),
-('https://www.youtube.com/s/player/2f1832d2/player_ias.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', '0QJ8wRAIgXmPlOPSBkkUs1bYFYlJCfe29xxAj7v1pDL0QwbdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJ2OySqa0q'),
-('https://www.youtube.com/s/player/643afba4/tv-player-ias.vflset/tv-player-ias.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', 'AAOAOq0QJ8wRAIgXmPlOPSBkkUs1bYFYlJCfe29xx8j7vgpDL0QwbdV06sCIEzpWqMGkFR20CFOS21Tp-7vj_EMu-m37KtXJoOy1'),
-('https://www.youtube.com/s/player/363db69b/player_ias.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', '0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpz2ICs6EVdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA'),
-('https://www.youtube.com/s/player/363db69b/player_ias_tce.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', '0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpz2ICs6EVdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA'),
-('https://www.youtube.com/s/player/4fcd6e4a/player_ias.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', 'wAOAOq0QJ8ARAIgXmPlOPSBkkUs1bYFYlJCfe29xx8q7v1pDL0QwbdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0'),
-('https://www.youtube.com/s/player/4fcd6e4a/player_ias_tce.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', 'wAOAOq0QJ8ARAIgXmPlOPSBkkUs1bYFYlJCfe29xx8q7v1pDL0QwbdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0'),
-('https://www.youtube.com/s/player/20830619/player_ias.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', '7AOq0QJ8wRAIgXmPlOPSBkkAs1bYFYlJCfe29xx8jOv1pDL0Q2bdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0qaw'),
-('https://www.youtube.com/s/player/20830619/player_ias_tce.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', '7AOq0QJ8wRAIgXmPlOPSBkkAs1bYFYlJCfe29xx8jOv1pDL0Q2bdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0qaw'),
-('https://www.youtube.com/s/player/20830619/player-plasma-ias-phone-en_US.vflset/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', '7AOq0QJ8wRAIgXmPlOPSBkkAs1bYFYlJCfe29xx8jOv1pDL0Q2bdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0qaw'),
-('https://www.youtube.com/s/player/20830619/player-plasma-ias-tablet-en_US.vflset/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', '7AOq0QJ8wRAIgXmPlOPSBkkAs1bYFYlJCfe29xx8jOv1pDL0Q2bdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0qaw'),
-('https://www.youtube.com/s/player/8a8ac953/player_ias_tce.vflset/en_US/base.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', 'IAOAOq0QJ8wRAAgXmPlOPSBkkUs1bYFYlJCfe29xx8j7v1pDL0QwbdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_E2u-m37KtXJoOySqa0'),
-('https://www.youtube.com/s/player/8a8ac953/tv-player-es6.vflset/tv-player-es6.js', '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA', 'IAOAOq0QJ8wRAAgXmPlOPSBkkUs1bYFYlJCfe29xx8j7v1pDL0QwbdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_E2u-m37KtXJoOySqa0'),
-('https://www.youtube.com/s/player/e12fbea4/player_ias.vflset/en_US/base.js', 'gN7a-hudCuAuPH6fByOk1_GNXN0yNMHShjZXS2VOgsEItAJz0tipeavEOmNdYN-wUtcEqD3bCXjc0iyKfAyZxCBGgIARwsSdQfJ2CJtt', 'JC2JfQdSswRAIgGBCxZyAfKyi0cjXCb3DqEctUw-NYdNmOEvaepit0zJAtIEsgOV2SXZjhSHMNy0NXNG_1kOyBf6HPuAuCduh-a'),
]

_NSIG_TESTS = [
@@ -253,86 +183,6 @@ _NSIG_TESTS = [
'https://www.youtube.com/s/player/b12cc44b/player_ias.vflset/en_US/base.js',
'keLa5R2U00sR9SQK', 'N1OGyujjEwMnLw',
),
-('https://www.youtube.com/s/player/3bb1f723/player_ias.vflset/en_US/base.js', 'gK15nzVyaXE9RsMP3z', 'ZFFWFLPWx9DEgQ'),
-('https://www.youtube.com/s/player/2f1832d2/player_ias.vflset/en_US/base.js', 'YWt1qdbe8SAfkoPHW5d', 'RrRjWQOJmBiP'),
-('https://www.youtube.com/s/player/9c6dfc4a/player_ias.vflset/en_US/base.js', 'jbu7ylIosQHyJyJV', 'uwI0ESiynAmhNg'),
-('https://www.youtube.com/s/player/e7567ecf/player_ias_tce.vflset/en_US/base.js', 'Sy4aDGc0VpYRR9ew_', '5UPOT1VhoZxNLQ'),
-('https://www.youtube.com/s/player/d50f54ef/player_ias_tce.vflset/en_US/base.js', 'Ha7507LzRmH3Utygtj', 'XFTb2HoeOE5MHg'),
-('https://www.youtube.com/s/player/074a8365/player_ias_tce.vflset/en_US/base.js', 'Ha7507LzRmH3Utygtj', 'ufTsrE0IVYrkl8v'),
-('https://www.youtube.com/s/player/643afba4/player_ias.vflset/en_US/base.js', 'N5uAlLqm0eg1GyHO', 'dCBQOejdq5s-ww'),
-('https://www.youtube.com/s/player/69f581a5/tv-player-ias.vflset/tv-player-ias.js', '-qIP447rVlTTwaZjY', 'KNcGOksBAvwqQg'),
-('https://www.youtube.com/s/player/643afba4/tv-player-ias.vflset/tv-player-ias.js', 'ir9-V6cdbCiyKxhr', '2PL7ZDYAALMfmA'),
-('https://www.youtube.com/s/player/363db69b/player_ias.vflset/en_US/base.js', 'eWYu5d5YeY_4LyEDc', 'XJQqf-N7Xra3gg'),
-('https://www.youtube.com/s/player/4fcd6e4a/player_ias.vflset/en_US/base.js', 'o_L251jm8yhZkWtBW', 'lXoxI3XvToqn6A'),
-('https://www.youtube.com/s/player/4fcd6e4a/player_ias_tce.vflset/en_US/base.js', 'o_L251jm8yhZkWtBW', 'lXoxI3XvToqn6A'),
-('https://www.youtube.com/s/player/20830619/tv-player-ias.vflset/tv-player-ias.js', 'ir9-V6cdbCiyKxhr', '9YE85kNjZiS4'),
-('https://www.youtube.com/s/player/20830619/player-plasma-ias-phone-en_US.vflset/base.js', 'ir9-V6cdbCiyKxhr', '9YE85kNjZiS4'),
-('https://www.youtube.com/s/player/20830619/player-plasma-ias-tablet-en_US.vflset/base.js', 'ir9-V6cdbCiyKxhr', '9YE85kNjZiS4'),
-('https://www.youtube.com/s/player/8a8ac953/player_ias_tce.vflset/en_US/base.js', 'MiBYeXx_vRREbiCCmh', 'RtZYMVvmkE0JE'),
-('https://www.youtube.com/s/player/8a8ac953/tv-player-es6.vflset/tv-player-es6.js', 'MiBYeXx_vRREbiCCmh', 'RtZYMVvmkE0JE'),
-('https://www.youtube.com/s/player/59b252b9/player_ias.vflset/en_US/base.js', 'D3XWVpYgwhLLKNK4AGX', 'aZrQ1qWJ5yv5h'),
-('https://www.youtube.com/s/player/fc2a56a5/player_ias.vflset/en_US/base.js', 'qTKWg_Il804jd2kAC', 'OtUAm2W6gyzJjB9u'),
-('https://www.youtube.com/s/player/fc2a56a5/tv-player-ias.vflset/tv-player-ias.js', 'qTKWg_Il804jd2kAC', 'OtUAm2W6gyzJjB9u'),
]


@@ -346,8 +196,6 @@ class TestPlayerInfo(unittest.TestCase):
('https://www.youtube.com/s/player/64dddad9/player-plasma-ias-phone-en_US.vflset/base.js', '64dddad9'),
('https://www.youtube.com/s/player/64dddad9/player-plasma-ias-phone-de_DE.vflset/base.js', '64dddad9'),
('https://www.youtube.com/s/player/64dddad9/player-plasma-ias-tablet-en_US.vflset/base.js', '64dddad9'),
-('https://www.youtube.com/s/player/e7567ecf/player_ias_tce.vflset/en_US/base.js', 'e7567ecf'),
-('https://www.youtube.com/s/player/643afba4/tv-player-ias.vflset/tv-player-ias.js', '643afba4'),
# obsolete
('https://www.youtube.com/yts/jsbin/player_ias-vfle4-e03/en_US/base.js', 'vfle4-e03'),
('https://www.youtube.com/yts/jsbin/player_ias-vfl49f_g4/en_US/base.js', 'vfl49f_g4'),
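Each removed tuple above pairs a player URL with a challenge string and the value the corresponding player-JS function is expected to produce. The toy sketch below shows the kind of call the pre-change n_sig() helper makes through JSInterpreter.call_function(); the JS snippet and its output are invented for illustration and are not a real player transform.

from yt_dlp.jsinterp import JSInterpreter

# An invented stand-in for a player's n-parameter function
jscode = 'function descramble(s) { return s.split("").reverse().join(""); }'

jsi = JSInterpreter(jscode)
print(jsi.call_function('descramble', 'keLa5R2U00sR9SQK'))  # prints the reversed challenge string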
@@ -380,51 +228,43 @@ def t_factory(name, sig_func, url_pattern):
def make_tfunc(url, sig_input, expected_sig):
m = url_pattern.match(url)
assert m, f'{url!r} should follow URL format'
-test_id = re.sub(r'[/.-]', '_', m.group('id') or m.group('compat_id'))
+test_id = m.group('id')

def test_func(self):
-basename = f'player-{test_id}.js'
+basename = f'player-{name}-{test_id}.js'
fn = os.path.join(self.TESTDATA_DIR, basename)

if not os.path.exists(fn):
urllib.request.urlretrieve(url, fn)
with open(fn, encoding='utf-8') as testf:
jscode = testf.read()
-self.assertEqual(sig_func(jscode, sig_input, url), expected_sig)
+self.assertEqual(sig_func(jscode, sig_input), expected_sig)

test_func.__name__ = f'test_{name}_js_{test_id}'
setattr(TestSignature, test_func.__name__, test_func)
return make_tfunc


-def signature(jscode, sig_input, player_url):
+def signature(jscode, sig_input):
-func = YoutubeIE(FakeYDL())._parse_sig_js(jscode, player_url)
+func = YoutubeIE(FakeYDL())._parse_sig_js(jscode)
src_sig = (
str(string.printable[:sig_input])
if isinstance(sig_input, int) else sig_input)
return func(src_sig)


-def n_sig(jscode, sig_input, player_url):
+def n_sig(jscode, sig_input):
-ie = YoutubeIE(FakeYDL())
+funcname = YoutubeIE(FakeYDL())._extract_n_function_name(jscode)
-funcname = ie._extract_n_function_name(jscode, player_url=player_url)
+return JSInterpreter(jscode).call_function(funcname, sig_input)
-jsi = JSInterpreter(jscode)
-func = jsi.extract_function_from_code(*ie._fixup_n_function_code(*jsi.extract_function_code(funcname), jscode, player_url))
-return func([sig_input])


make_sig_test = t_factory(
-'signature', signature,
+'signature', signature, re.compile(r'.*(?:-|/player/)(?P<id>[a-zA-Z0-9_-]+)(?:/.+\.js|(?:/watch_as3|/html5player)?\.[a-z]+)$'))
-re.compile(r'''(?x)
-.+(?:
-/player/(?P<id>[a-zA-Z0-9_/.-]+)|
-/html5player-(?:en_US-)?(?P<compat_id>[a-zA-Z0-9_-]+)(?:/watch_as3|/html5player)?
-)\.js$'''))
for test_spec in _SIG_TESTS:
make_sig_test(*test_spec)

make_nsig_test = t_factory(
-'nsig', n_sig, re.compile(r'.+/player/(?P<id>[a-zA-Z0-9_/.-]+)\.js$'))
+'nsig', n_sig, re.compile(r'.+/player/(?P<id>[a-zA-Z0-9_-]+)/.+.js$'))
for test_spec in _NSIG_TESTS:
make_nsig_test(*test_spec)
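t_factory() above turns every (player URL, input, expected) tuple into its own named test method and attaches it to the test class with setattr(). A self-contained sketch of that pattern, with made-up data, looks like this:

import unittest


class TestReverse(unittest.TestCase):
    pass


def make_test(test_id, value, expected):
    def test_func(self):
        self.assertEqual(value[::-1], expected)
    test_func.__name__ = f'test_reverse_{test_id}'
    setattr(TestReverse, test_func.__name__, test_func)


for test_id, value, expected in [('short', 'abc', 'cba'), ('longer', 'ytdlp', 'pldty')]:
    make_test(test_id, value, expected)

if __name__ == '__main__':
    unittest.main()  # runs test_reverse_short and test_reverse_longer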
4  test/testdata/netrc/netrc  vendored
@@ -1,4 +0,0 @@
-machine normal_use login user password pass
-machine empty_user login "" password pass
-machine empty_pass login user password ""
-machine both_empty login "" password ""

2  test/testdata/netrc/print_netrc.py  vendored
@@ -1,2 +0,0 @@
-with open('./test/testdata/netrc/netrc', encoding='utf-8') as fp:
-print(fp.read())
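The deleted fixture exercises .netrc entries with empty logins and passwords. The same format can be read with the standard library's netrc module; a quick sketch (the path is the fixture's former location):

import netrc

auth = netrc.netrc('./test/testdata/netrc/netrc')
login, _account, password = auth.authenticators('normal_use')
print(login, password)  # user pass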
@@ -1,6 +0,0 @@
-from yt_dlp.extractor.common import InfoExtractor
-
-
-class PackagePluginIE(InfoExtractor):
-_VALID_URL = 'package'
-pass

@@ -1,10 +0,0 @@
-from yt_dlp.extractor.common import InfoExtractor
-
-
-class NormalPluginIE(InfoExtractor):
-_VALID_URL = 'normal'
-REPLACED = True
-
-
-class _IgnoreUnderscorePluginIE(InfoExtractor):
-pass

@@ -1,5 +0,0 @@
-from yt_dlp.postprocessor.common import PostProcessor
-
-
-class NormalPluginPP(PostProcessor):
-REPLACED = True
BIN  test/testdata/thumbnails/foo %d bar/foo_%d.webp  vendored
Binary file not shown. Size: 3.8 KiB
@@ -6,7 +6,6 @@ class IgnoreNotInAllPluginIE(InfoExtractor):


class InAllPluginIE(InfoExtractor):
-_VALID_URL = 'inallpluginie'
pass


@@ -2,10 +2,8 @@ from yt_dlp.extractor.common import InfoExtractor


class NormalPluginIE(InfoExtractor):
-_VALID_URL = 'normalpluginie'
+pass
-REPLACED = False


class _IgnoreUnderscorePluginIE(InfoExtractor):
-_VALID_URL = 'ignoreunderscorepluginie'
pass

@@ -1,5 +0,0 @@
-from yt_dlp.extractor.generic import GenericIE
-
-
-class OverrideGenericIE(GenericIE, plugin_name='override'):
-TEST_FIELD = 'override'

@@ -1,5 +0,0 @@
-from yt_dlp.extractor.generic import GenericIE
-
-
-class _UnderscoreOverrideGenericIE(GenericIE, plugin_name='underscore-override'):
-SECONDARY_TEST_FIELD = 'underscore-override'

@@ -2,4 +2,4 @@ from yt_dlp.postprocessor.common import PostProcessor


class NormalPluginPP(PostProcessor):
-REPLACED = False
+pass

@@ -2,5 +2,4 @@ from yt_dlp.extractor.common import InfoExtractor


class ZippedPluginIE(InfoExtractor):
-_VALID_URL = 'zippedpluginie'
pass
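These fixtures are ordinary extractor/postprocessor plugins: classes discovered from a yt_dlp_plugins package. A minimal, hypothetical extractor plugin module in that layout might look like the sketch below (the class name and URL scheme are invented).

from yt_dlp.extractor.common import InfoExtractor


class ExamplePluginIE(InfoExtractor):
    _VALID_URL = r'exampleplugin:(?P<id>\w+)'

    def _real_extract(self, url):
        video_id = self._match_id(url)
        # Return a minimal info dict; a real plugin would resolve actual media
        return {'id': video_id, 'title': video_id, 'url': f'https://example.invalid/{video_id}.mp4'}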
@@ -26,22 +26,13 @@ import unicodedata

from .cache import Cache
from .compat import urllib # isort: split
-from .compat import urllib_req_to_req
+from .compat import compat_os_name, urllib_req_to_req
from .cookies import CookieLoadError, LenientSimpleCookie, load_cookies
from .downloader import FFmpegFD, get_suitable_downloader, shorten_protocol_name
from .downloader.rtmp import rtmpdump_version
-from .extractor import gen_extractor_classes, get_info_extractor, import_extractors
+from .extractor import gen_extractor_classes, get_info_extractor
from .extractor.common import UnsupportedURLIE
from .extractor.openload import PhantomJSwrapper
-from .globals import (
-IN_CLI,
-LAZY_EXTRACTORS,
-plugin_ies,
-plugin_ies_overrides,
-plugin_pps,
-all_plugins_loaded,
-plugin_dirs,
-)
from .minicurses import format_text
from .networking import HEADRequest, Request, RequestDirector
from .networking.common import _REQUEST_HANDLERS, _RH_PREFERENCES
@@ -53,7 +44,8 @@ from .networking.exceptions import (
network_exceptions,
)
from .networking.impersonate import ImpersonateRequestHandler
-from .plugins import directories as plugin_directories, load_all_plugins
+from .plugins import directories as plugin_directories
+from .postprocessor import _PLUGIN_CLASSES as plugin_pps
from .postprocessor import (
EmbedThumbnailPP,
FFmpegFixupDuplicateMoovPP,
@@ -117,6 +109,7 @@ from .utils import (
determine_ext,
determine_protocol,
encode_compat_str,
+encodeFilename,
escapeHTML,
expand_path,
extract_basic_auth,
@@ -161,11 +154,12 @@ from .utils import (
try_get,
url_basename,
variadic,
+version_tuple,
windows_enable_vt_mode,
write_json_file,
write_string,
)
-from .utils._utils import _UnsafeExtensionError, _YDLLogger, _ProgressState
+from .utils._utils import _UnsafeExtensionError, _YDLLogger
from .utils.networking import (
HTTPHeaderDict,
clean_headers,
@@ -174,7 +168,7 @@ from .utils.networking import (
)
from .version import CHANNEL, ORIGIN, RELEASE_GIT_HEAD, VARIANT, __version__

-if os.name == 'nt':
+if compat_os_name == 'nt':
import ctypes

@@ -257,7 +251,7 @@ class YoutubeDL:
format_sort_force: Force the given format_sort. see "Sorting Formats"
for more details.
prefer_free_formats: Whether to prefer video formats with free containers
-over non-free ones of the same quality.
+over non-free ones of same quality.
allow_multiple_video_streams: Allow multiple video streams to be merged
into a single file
allow_multiple_audio_streams: Allow multiple audio streams to be merged
@@ -274,9 +268,7 @@ class YoutubeDL:
outtmpl_na_placeholder: Placeholder for unavailable meta fields.
restrictfilenames: Do not allow "&" and spaces in file names
trim_file_name: Limit length of filename (extension excluded)
-windowsfilenames: True: Force filenames to be Windows compatible
+windowsfilenames: Force the filenames to be windows compatible
-False: Sanitize filenames only minimally
-This option has no effect when running on Windows
ignoreerrors: Do not stop on download/postprocessing errors.
Can be 'only_download' to ignore only download errors.
Default is 'only_download' for CLI, but False for API
@@ -291,12 +283,9 @@ class YoutubeDL:
lazy_playlist: Process playlist entries as they are received.
matchtitle: Download only matching titles.
rejecttitle: Reject downloads for matching titles.
-logger: A class having a `debug`, `warning` and `error` function where
+logger: Log messages to a logging.Logger instance.
-each has a single string parameter, the message to be logged.
-For compatibility reasons, both debug and info messages are passed to `debug`.
-A debug message will have a prefix of `[debug] ` to discern it from info messages.
logtostderr: Print everything to stderr instead of stdout.
-consoletitle: Display progress in the console window's titlebar.
+consoletitle: Display progress in console window's titlebar.
writedescription: Write the video description to a .description file
writeinfojson: Write the video description to a .info.json file
clean_infojson: Remove internal metadata from the infojson
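The master-side docstring spells out the `logger` contract: any object with `debug`, `warning` and `error` methods works, info messages are also routed to `debug`, and true debug lines carry a `[debug] ` prefix. A short sketch of wiring one up (the URL is only an example):

from yt_dlp import YoutubeDL


class MyLogger:
    def debug(self, msg):
        # Both debug and info messages arrive here; debug ones carry a '[debug] ' prefix
        if not msg.startswith('[debug] '):
            print(msg)

    def warning(self, msg):
        print('WARNING:', msg)

    def error(self, msg):
        print('ERROR:', msg)


with YoutubeDL({'logger': MyLogger()}) as ydl:
    ydl.download(['https://www.youtube.com/watch?v=BaW_jenozKc'])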
@@ -482,8 +471,7 @@ class YoutubeDL:
The following options do not work when used through the API:
filename, abort-on-error, multistreams, no-live-chat,
format-sort, no-clean-infojson, no-playlist-metafiles,
-no-keep-subs, no-attach-info-json, allow-unsafe-ext, prefer-vp9-sort,
+no-keep-subs, no-attach-info-json, allow-unsafe-ext.
-mtime-by-default.
Refer __init__.py for their implementation
progress_template: Dictionary of templates for progress outputs.
Allowed keys are 'download', 'postprocess',
@@ -491,7 +479,7 @@ class YoutubeDL:
The template is mapped on a dictionary with keys 'progress' and 'info'
retry_sleep_functions: Dictionary of functions that takes the number of attempts
as argument and returns the time to sleep in seconds.
-Allowed keys are 'http', 'fragment', 'file_access', 'extractor'
+Allowed keys are 'http', 'fragment', 'file_access'
download_ranges: A callback function that gets called for every video with
the signature (info_dict, ydl) -> Iterable[Section].
Only the returned sections will be downloaded.
@@ -525,7 +513,7 @@ class YoutubeDL:
The following options are used by the extractors:
extractor_retries: Number of times to retry for known errors (default: 3)
dynamic_mpd: Whether to process dynamic DASH manifests (default: True)
-hls_split_discontinuity: Split HLS playlists into different formats at
+hls_split_discontinuity: Split HLS playlists to different formats at
discontinuities such as ad breaks (default: False)
extractor_args: A dictionary of arguments to be passed to the extractors.
See "EXTRACTOR ARGUMENTS" for details.
@@ -565,7 +553,7 @@ class YoutubeDL:
include_ads: - Doesn't work
Download ads as well
call_home: - Not implemented
-Boolean, true if we are allowed to contact the
+Boolean, true iff we are allowed to contact the
yt-dlp servers for debugging.
post_hooks: - Register a custom postprocessor
A list of functions that get called as the final step
@@ -607,7 +595,7 @@ class YoutubeDL:
# NB: Keep in sync with the docstring of extractor/common.py
'url', 'manifest_url', 'manifest_stream_number', 'ext', 'format', 'format_id', 'format_note',
'width', 'height', 'aspect_ratio', 'resolution', 'dynamic_range', 'tbr', 'abr', 'acodec', 'asr', 'audio_channels',
-'vbr', 'fps', 'vcodec', 'container', 'filesize', 'filesize_approx', 'rows', 'columns', 'hls_media_playlist_data',
+'vbr', 'fps', 'vcodec', 'container', 'filesize', 'filesize_approx', 'rows', 'columns',
'player_url', 'protocol', 'fragment_base_url', 'fragments', 'is_from_start', 'is_dash_periods', 'request_data',
'preference', 'language', 'language_preference', 'quality', 'source_preference', 'cookies',
'http_headers', 'stretched_ratio', 'no_resume', 'has_drm', 'extra_param_to_segment_url', 'extra_param_to_key_url',
@@ -641,7 +629,6 @@ class YoutubeDL:
self._printed_messages = set()
self._first_webpage_request = True
self._post_hooks = []
-self._close_hooks = []
self._progress_hooks = []
self._postprocessor_hooks = []
self._download_retcode = 0
@@ -652,15 +639,13 @@ class YoutubeDL:
self.cache = Cache(self)
self.__header_cookies = []

-# compat for API: load plugins if they have not already
-if not all_plugins_loaded.value:
-load_all_plugins()

stdout = sys.stderr if self.params.get('logtostderr') else sys.stdout
self._out_files = Namespace(
out=stdout,
error=sys.stderr,
screen=sys.stderr if self.params.get('quiet') else stdout,
+console=None if compat_os_name == 'nt' else next(
+filter(supports_terminal_sequences, (sys.stderr, sys.stdout)), None),
)

try:
@@ -668,9 +653,6 @@ class YoutubeDL:
except Exception as e:
self.write_debug(f'Failed to enable VT mode: {e}')

-# hehe "immutable" namespace
-self._out_files.console = next(filter(supports_terminal_sequences, (sys.stderr, sys.stdout)), None)

if self.params.get('no_color'):
if self.params.get('color') is not None:
self.params.setdefault('_warnings', []).append(
@@ -910,11 +892,6 @@ class YoutubeDL:
"""Add the post hook"""
self._post_hooks.append(ph)

-def add_close_hook(self, ch):
-"""Add a close hook, called when YoutubeDL.close() is called"""
-assert callable(ch), 'Close hook must be callable'
-self._close_hooks.append(ch)

def add_progress_hook(self, ph):
"""Add the download progress hook"""
self._progress_hooks.append(ph)
@@ -976,22 +953,21 @@ class YoutubeDL:
self._write_string(f'{self._bidi_workaround(message)}\n', self._out_files.error, only_once=only_once)

def _send_console_code(self, code):
-if not supports_terminal_sequences(self._out_files.console):
+if compat_os_name == 'nt' or not self._out_files.console:
-return False
-self._write_string(code, self._out_files.console)
-return True

-def to_console_title(self, message=None, progress_state=None, percent=None):
-if not self.params.get('consoletitle'):
return
+self._write_string(code, self._out_files.console)

-if message:
+def to_console_title(self, message):
-success = self._send_console_code(f'\033]0;{remove_terminal_sequences(message)}\007')
+if not self.params.get('consoletitle', False):
-if not success and os.name == 'nt' and ctypes.windll.kernel32.GetConsoleWindow():
+return
-ctypes.windll.kernel32.SetConsoleTitleW(message)
+message = remove_terminal_sequences(message)
+if compat_os_name == 'nt':
-if isinstance(progress_state, _ProgressState):
+if ctypes.windll.kernel32.GetConsoleWindow():
-self._send_console_code(progress_state.get_ansi_escape(percent))
+# c_wchar_p() might not be necessary if `message` is
+# already of type unicode()
+ctypes.windll.kernel32.SetConsoleTitleW(ctypes.c_wchar_p(message))
+else:
+self._send_console_code(f'\033]0;{message}\007')

def save_console_title(self):
if not self.params.get('consoletitle') or self.params.get('simulate'):
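On terminals that understand ANSI sequences, both versions of to_console_title() ultimately emit the OSC 0 escape shown below; only the surrounding gating (capability check vs. Windows API fallback) differs. A minimal standalone sketch:

import sys


def set_terminal_title(title):
    # OSC 0 sets the window/tab title on ANSI-capable terminals
    sys.stderr.write(f'\033]0;{title}\007')
    sys.stderr.flush()


set_terminal_title('yt-dlp: downloading 3 items')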
@@ -1005,7 +981,6 @@ class YoutubeDL:

def __enter__(self):
self.save_console_title()
-self.to_console_title(progress_state=_ProgressState.INDETERMINATE)
return self

def save_cookies(self):
@@ -1014,7 +989,6 @@ class YoutubeDL:

def __exit__(self, *args):
self.restore_console_title()
-self.to_console_title(progress_state=_ProgressState.HIDDEN)
self.close()

def close(self):
@@ -1023,9 +997,6 @@ class YoutubeDL:
self._request_director.close()
del self._request_director

-for close_hook in self._close_hooks:
-close_hook()

def trouble(self, message=None, tb=None, is_error=True):
"""Determine action to take when a download problem appears.

@@ -1147,7 +1118,7 @@ class YoutubeDL:
def raise_no_formats(self, info, forced=False, *, msg=None):
has_drm = info.get('_has_drm')
ignored, expected = self.params.get('ignore_no_formats_error'), bool(msg)
-msg = msg or (has_drm and 'This video is DRM protected') or 'No video formats found!'
+msg = msg or has_drm and 'This video is DRM protected' or 'No video formats found!'
if forced or not ignored:
raise ExtractorError(msg, video_id=info['id'], ie=info['extractor'],
expected=has_drm or ignored or expected)
@@ -1223,7 +1194,8 @@ class YoutubeDL:

def prepare_outtmpl(self, outtmpl, info_dict, sanitize=False):
""" Make the outtmpl and info_dict suitable for substitution: ydl.escape_outtmpl(outtmpl) % info_dict
-@param sanitize Whether to sanitize the output as a filename
+@param sanitize Whether to sanitize the output as a filename.
+For backward compatibility, a function can also be passed
"""

info_dict.setdefault('epoch', int(time.time())) # keep epoch consistent once set
@@ -1339,23 +1311,14 @@ class YoutubeDL:

na = self.params.get('outtmpl_na_placeholder', 'NA')

-def filename_sanitizer(key, value, restricted):
+def filename_sanitizer(key, value, restricted=self.params.get('restrictfilenames')):
return sanitize_filename(str(value), restricted=restricted, is_id=(
bool(re.search(r'(^|[_.])id(\.|$)', key))
if 'filename-sanitization' in self.params['compat_opts']
else NO_DEFAULT))

-if callable(sanitize):
+sanitizer = sanitize if callable(sanitize) else filename_sanitizer
-self.deprecation_warning('Passing a callable "sanitize" to YoutubeDL.prepare_outtmpl is deprecated')
+sanitize = bool(sanitize)
-elif not sanitize:
-pass
-elif (sys.platform != 'win32' and not self.params.get('restrictfilenames')
-and self.params.get('windowsfilenames') is False):
-def sanitize(key, value):
-return str(value).replace('/', '\u29F8').replace('\0', '')
-else:
-def sanitize(key, value):
-return filename_sanitizer(key, value, restricted=self.params.get('restrictfilenames'))

def _dumpjson_default(obj):
if isinstance(obj, (set, LazyList)):
@@ -1438,13 +1401,13 @@ class YoutubeDL:

if sanitize:
# If value is an object, sanitize might convert it to a string
-# So we manually convert it before sanitizing
+# So we convert it to repr first
if fmt[-1] == 'r':
value, fmt = repr(value), str_fmt
elif fmt[-1] == 'a':
value, fmt = ascii(value), str_fmt
if fmt[-1] in 'csra':
-value = sanitize(last_field, value)
+value = sanitizer(last_field, value)

key = '{}\0{}'.format(key.replace('%', '%\0'), outer_mobj.group('format'))
TMPL_DICT[key] = value
@@ -1986,7 +1949,6 @@ class YoutubeDL:
'playlist_uploader_id': ie_result.get('uploader_id'),
'playlist_channel': ie_result.get('channel'),
'playlist_channel_id': ie_result.get('channel_id'),
-'playlist_webpage_url': ie_result.get('webpage_url'),
**kwargs,
}
if strict:
@@ -2147,7 +2109,7 @@ class YoutubeDL:
m = operator_rex.fullmatch(filter_spec)
if m:
try:
-comparison_value = float(m.group('value'))
+comparison_value = int(m.group('value'))
except ValueError:
comparison_value = parse_filesize(m.group('value'))
if comparison_value is None:
@@ -2220,7 +2182,6 @@ class YoutubeDL:
self.report_warning(f'Unable to delete temporary file "{temp_file.name}"')
f['__working'] = success
if success:
-f.pop('__needs_testing', None)
yield f
else:
self.to_screen('[info] Unable to download format {}. Skipping...'.format(f['format_id']))
@@ -2236,7 +2197,7 @@ class YoutubeDL:
def _default_format_spec(self, info_dict):
prefer_best = (
self.params['outtmpl']['default'] == '-'
-or (info_dict.get('is_live') and not self.params.get('live_from_start')))
+or info_dict.get('is_live') and not self.params.get('live_from_start'))

def can_merge():
merger = FFmpegMergerPP(self)
@@ -2405,7 +2366,7 @@ class YoutubeDL:
vexts=[f['ext'] for f in video_fmts],
aexts=[f['ext'] for f in audio_fmts],
preferences=(try_call(lambda: self.params['merge_output_format'].split('/'))
-or (self.params.get('prefer_free_formats') and ('webm', 'mkv'))))
+or self.params.get('prefer_free_formats') and ('webm', 'mkv')))

filtered = lambda *keys: filter(None, (traverse_obj(fmt, *keys) for fmt in formats_info))

@@ -2889,10 +2850,13 @@ class YoutubeDL:
sanitize_string_field(fmt, 'format_id')
sanitize_numeric_fields(fmt)
fmt['url'] = sanitize_url(fmt['url'])
-FormatSorter._fill_sorting_fields(fmt)
+if fmt.get('ext') is None:
+fmt['ext'] = determine_ext(fmt['url']).lower()
if fmt['ext'] in ('aac', 'opus', 'mp3', 'flac', 'vorbis'):
if fmt.get('acodec') is None:
fmt['acodec'] = fmt['ext']
+if fmt.get('protocol') is None:
+fmt['protocol'] = determine_protocol(fmt)
if fmt.get('resolution') is None:
fmt['resolution'] = self.format_resolution(fmt, default=None)
if fmt.get('dynamic_range') is None and fmt.get('vcodec') != 'none':
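In the filter hunk above, master parses the comparison value with float() (falling back to parse_filesize()), where the older code used int(), presumably so non-integer numeric values such as 29.97 are handled directly. A standalone sketch of that parsing order — the tiny size parser here is illustrative, not yt-dlp's parse_filesize():

def parse_comparison_value(value):
    try:
        return float(value)  # plain numbers, including non-integers like 29.97
    except ValueError:
        units = {'K': 1024, 'M': 1024 ** 2, 'G': 1024 ** 3}
        number, unit = value[:-1], value[-1:].upper()
        if unit in units:
            return float(number) * units[unit]
        raise ValueError(f'Invalid comparison value {value!r}')


print(parse_comparison_value('29.97'))  # 29.97
print(parse_comparison_value('100M'))   # 104857600.0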
@@ -3295,9 +3259,9 @@ class YoutubeDL:

if full_filename is None:
return
-if not self._ensure_dir_exists(full_filename):
+if not self._ensure_dir_exists(encodeFilename(full_filename)):
return
-if not self._ensure_dir_exists(temp_filename):
+if not self._ensure_dir_exists(encodeFilename(temp_filename)):
return

if self._write_description('video', info_dict,
@@ -3329,16 +3293,16 @@ class YoutubeDL:
if self.params.get('writeannotations', False):
annofn = self.prepare_filename(info_dict, 'annotation')
if annofn:
-if not self._ensure_dir_exists(annofn):
+if not self._ensure_dir_exists(encodeFilename(annofn)):
return
-if not self.params.get('overwrites', True) and os.path.exists(annofn):
+if not self.params.get('overwrites', True) and os.path.exists(encodeFilename(annofn)):
self.to_screen('[info] Video annotations are already present')
elif not info_dict.get('annotations'):
self.report_warning('There are no annotations to write.')
else:
try:
self.to_screen('[info] Writing video annotations to: ' + annofn)
-with open(annofn, 'w', encoding='utf-8') as annofile:
+with open(encodeFilename(annofn), 'w', encoding='utf-8') as annofile:
annofile.write(info_dict['annotations'])
except (KeyError, TypeError):
self.report_warning('There are no annotations to write.')
@@ -3354,14 +3318,14 @@ class YoutubeDL:
f'Cannot write internet shortcut file because the actual URL of "{info_dict["webpage_url"]}" is unknown')
return True
linkfn = replace_extension(self.prepare_filename(info_dict, 'link'), link_type, info_dict.get('ext'))
-if not self._ensure_dir_exists(linkfn):
+if not self._ensure_dir_exists(encodeFilename(linkfn)):
return False
-if self.params.get('overwrites', True) and os.path.exists(linkfn):
+if self.params.get('overwrites', True) and os.path.exists(encodeFilename(linkfn)):
self.to_screen(f'[info] Internet shortcut (.{link_type}) is already present')
return True
try:
self.to_screen(f'[info] Writing internet shortcut (.{link_type}) to: {linkfn}')
-with open(to_high_limit_path(linkfn), 'w', encoding='utf-8',
+with open(encodeFilename(to_high_limit_path(linkfn)), 'w', encoding='utf-8',
newline='\r\n' if link_type == 'url' else '\n') as linkfile:
template_vars = {'url': url}
if link_type == 'desktop':
@@ -3392,7 +3356,7 @@ class YoutubeDL:

if self.params.get('skip_download'):
info_dict['filepath'] = temp_filename
-info_dict['__finaldir'] = os.path.dirname(os.path.abspath(full_filename))
+info_dict['__finaldir'] = os.path.dirname(os.path.abspath(encodeFilename(full_filename)))
info_dict['__files_to_move'] = files_to_move
replace_info_dict(self.run_pp(MoveFilesAfterDownloadPP(self, False), info_dict))
info_dict['__write_download_archive'] = self.params.get('force_write_download_archive')
@@ -3522,7 +3486,7 @@ class YoutubeDL:
self.report_file_already_downloaded(dl_filename)

dl_filename = dl_filename or temp_filename
-info_dict['__finaldir'] = os.path.dirname(os.path.abspath(full_filename))
+info_dict['__finaldir'] = os.path.dirname(os.path.abspath(encodeFilename(full_filename)))

except network_exceptions as err:
self.report_error(f'unable to download video data: {err}')
@@ -3581,8 +3545,8 @@ class YoutubeDL:
and info_dict.get('container') == 'm4a_dash',
'writing DASH m4a. Only some players support this container',
FFmpegFixupM4aPP)
-ffmpeg_fixup((downloader == 'hlsnative' and not self.params.get('hls_use_mpegts'))
+ffmpeg_fixup(downloader == 'hlsnative' and not self.params.get('hls_use_mpegts')
-or (info_dict.get('is_live') and self.params.get('hls_use_mpegts') is None),
+or info_dict.get('is_live') and self.params.get('hls_use_mpegts') is None,
'Possible MPEG-TS in MP4 container or malformed AAC timestamps',
FFmpegFixupM3u8PP)
ffmpeg_fixup(downloader == 'dashsegments'
@@ -3965,7 +3929,6 @@ class YoutubeDL:
self._format_out('UNSUPPORTED', self.Styles.BAD_FORMAT) if f.get('ext') in ('f4f', 'f4m') else None,
(self._format_out('Maybe DRM', self.Styles.WARNING) if f.get('has_drm') == 'maybe'
else self._format_out('DRM', self.Styles.BAD_FORMAT) if f.get('has_drm') else None),
-self._format_out('Untested', self.Styles.WARNING) if f.get('__needs_testing') else None,
format_field(f, 'format_note'),
format_field(f, 'container', ignore=(None, f.get('ext'))),
delim=', '), delim=' '),
@@ -4021,6 +3984,15 @@ class YoutubeDL:
if not self.params.get('verbose'):
return

+from . import _IN_CLI # Must be delayed import

+# These imports can be slow. So import them only as needed
+from .extractor.extractors import _LAZY_LOADER
+from .extractor.extractors import (
+_PLUGIN_CLASSES as plugin_ies,
+_PLUGIN_OVERRIDES as plugin_ie_overrides,
+)

def get_encoding(stream):
ret = str(getattr(stream, 'encoding', f'missing ({type(stream).__name__})'))
additional_info = []
@@ -4059,18 +4031,17 @@ class YoutubeDL:
_make_label(ORIGIN, CHANNEL.partition('@')[2] or __version__, __version__),
f'[{RELEASE_GIT_HEAD[:9]}]' if RELEASE_GIT_HEAD else '',
'' if source == 'unknown' else f'({source})',
-'' if IN_CLI.value else 'API' if klass == YoutubeDL else f'API:{self.__module__}.{klass.__qualname__}',
+'' if _IN_CLI else 'API' if klass == YoutubeDL else f'API:{self.__module__}.{klass.__qualname__}',
delim=' '))

-if not IN_CLI.value:
+if not _IN_CLI:
write_debug(f'params: {self.params}')

-import_extractors()
+if not _LAZY_LOADER:
-lazy_extractors = LAZY_EXTRACTORS.value
+if os.environ.get('YTDLP_NO_LAZY_EXTRACTORS'):
-if lazy_extractors is None:
+write_debug('Lazy loading extractors is forcibly disabled')
-write_debug('Lazy loading extractors is disabled')
+else:
-elif not lazy_extractors:
+write_debug('Lazy loading extractors is disabled')
-write_debug('Lazy loading extractors is forcibly disabled')
if self.params['compat_opts']:
write_debug('Compatibility options: {}'.format(', '.join(self.params['compat_opts'])))

@@ -4099,27 +4070,31 @@ class YoutubeDL:

write_debug(f'Proxy map: {self.proxies}')
write_debug(f'Request Handlers: {", ".join(rh.RH_NAME for rh in self._request_director.handlers.values())}')
+for plugin_type, plugins in {'Extractor': plugin_ies, 'Post-Processor': plugin_pps}.items():
-for plugin_type, plugins in (('Extractor', plugin_ies), ('Post-Processor', plugin_pps)):
+display_list = ['{}{}'.format(
-display_list = [
+klass.__name__, '' if klass.__name__ == name else f' as {name}')
-klass.__name__ if klass.__name__ == name else f'{klass.__name__} as {name}'
+for name, klass in plugins.items()]
-for name, klass in plugins.value.items()]
if plugin_type == 'Extractor':
display_list.extend(f'{plugins[-1].IE_NAME.partition("+")[2]} ({parent.__name__})'
-for parent, plugins in plugin_ies_overrides.value.items())
+for parent, plugins in plugin_ie_overrides.items())
if not display_list:
continue
write_debug(f'{plugin_type} Plugins: {", ".join(sorted(display_list))}')

-plugin_dirs_msg = 'none'
+plugin_dirs = plugin_directories()
-if not plugin_dirs.value:
+if plugin_dirs:
-plugin_dirs_msg = 'none (disabled)'
+write_debug(f'Plugin directories: {plugin_dirs}')
-else:
-found_plugin_directories = plugin_directories()
-if found_plugin_directories:
-plugin_dirs_msg = ', '.join(found_plugin_directories)

-write_debug(f'Plugin directories: {plugin_dirs_msg}')
+# Not implemented
+if False and self.params.get('call_home'):
+ipaddr = self.urlopen('https://yt-dl.org/ip').read().decode()
+write_debug(f'Public IP address: {ipaddr}')
+latest_version = self.urlopen(
+'https://yt-dl.org/latest/version').read().decode()
+if version_tuple(latest_version) > version_tuple(__version__):
+self.report_warning(
+f'You are using an outdated version (newest version: {latest_version})! '
+'See https://yt-dl.org/update if you need help updating.')

@functools.cached_property
def proxies(self):
@@ -4145,8 +4120,7 @@ class YoutubeDL:
self.params.get('cookiefile'), self.params.get('cookiesfrombrowser'), self)
except CookieLoadError as error:
cause = error.__context__
-# compat: <=py3.9: `traceback.format_exception` has a different signature
-self.report_error(str(cause), tb=''.join(traceback.format_exception(None, cause, cause.__traceback__)))
+self.report_error(str(cause), tb=''.join(traceback.format_exception(cause)))
raise

@property
@@ -4164,7 +4138,7 @@ class YoutubeDL:
(target, rh.RH_NAME)
for rh in self._request_director.handlers.values()
if isinstance(rh, ImpersonateRequestHandler)
-for target in reversed(rh.supported_targets)
+for target in rh.supported_targets
]

def _impersonate_target_available(self, target):
@@ -4333,7 +4307,7 @@ class YoutubeDL:
else:
try:
self.to_screen(f'[info] Writing {label} description to: {descfn}')
-with open(descfn, 'w', encoding='utf-8') as descfile:
+with open(encodeFilename(descfn), 'w', encoding='utf-8') as descfile:
descfile.write(ie_result['description'])
except OSError:
self.report_error(f'Cannot write {label} description file {descfn}')
@@ -4417,9 +4391,7 @@ class YoutubeDL:
return None

for idx, t in list(enumerate(thumbnails))[::-1]:
-thumb_ext = t.get('ext') or determine_ext(t['url'], 'jpg')
-if multiple:
-thumb_ext = f'{t["id"]}.{thumb_ext}'
+thumb_ext = (f'{t["id"]}.' if multiple else '') + determine_ext(t['url'], 'jpg')
thumb_display_id = f'{label} thumbnail {t["id"]}'
thumb_filename = replace_extension(filename, thumb_ext, info_dict.get('ext'))
thumb_filename_final = replace_extension(thumb_filename_base, thumb_ext, info_dict.get('ext'))
@@ -4435,7 +4407,7 @@ class YoutubeDL:
try:
uf = self.urlopen(Request(t['url'], headers=t.get('http_headers', {})))
self.to_screen(f'[info] Writing {thumb_display_id} to: {thumb_filename}')
-with open(thumb_filename, 'wb') as thumbf:
+with open(encodeFilename(thumb_filename), 'wb') as thumbf:
shutil.copyfileobj(uf, thumbf)
ret.append((thumb_filename, thumb_filename_final))
t['filepath'] = thumb_filename
@@ -1,8 +1,8 @@
import sys

-if sys.version_info < (3, 9):
+if sys.version_info < (3, 8):
raise ImportError(
-f'You are using an unsupported version of Python. Only Python versions 3.9 and above are supported by yt-dlp') # noqa: F541
+f'You are using an unsupported version of Python. Only Python versions 3.8 and above are supported by yt-dlp') # noqa: F541

__license__ = 'The Unlicense'

@@ -14,14 +14,13 @@ import os
import re
import traceback

+from .compat import compat_os_name
from .cookies import SUPPORTED_BROWSERS, SUPPORTED_KEYRINGS, CookieLoadError
from .downloader.external import get_external_downloader
from .extractor import list_extractor_classes
from .extractor.adobepass import MSO_INFO
from .networking.impersonate import ImpersonateTarget
-from .globals import IN_CLI, plugin_dirs
from .options import parseOpts
-from .plugins import load_all_plugins as _load_all_plugins
from .postprocessor import (
FFmpegExtractAudioPP,
FFmpegMergerPP,
@@ -44,6 +43,7 @@ from .utils import (
GeoUtils,
PlaylistEntries,
SameFileError,
+decodeOption,
download_range_func,
expand_path,
float_or_none,
@@ -67,6 +67,8 @@ from .utils.networking import std_headers
from .utils._utils import _UnsafeExtensionError
from .YoutubeDL import YoutubeDL

+_IN_CLI = False


def _exit(status=0, *args):
for msg in args:
@@ -156,15 +158,6 @@ def set_compat_opts(opts):
opts.embed_infojson = False
if 'format-sort' in opts.compat_opts:
opts.format_sort.extend(FormatSorter.ytdl_default)
-elif 'prefer-vp9-sort' in opts.compat_opts:
-opts.format_sort.extend(FormatSorter._prefer_vp9_sort)

-if 'mtime-by-default' in opts.compat_opts:
-if opts.updatetime is None:
-opts.updatetime = True
-else:
-_unused_compat_opt('mtime-by-default')

_video_multistreams_set = set_default_compat('multistreams', 'allow_multiple_video_streams', False, remove_compat=False)
_audio_multistreams_set = set_default_compat('multistreams', 'allow_multiple_audio_streams', False, remove_compat=False)
if _video_multistreams_set is False and _audio_multistreams_set is False:
@@ -266,11 +259,9 @@ def validate_options(opts):
elif value in ('inf', 'infinite'):
return float('inf')
try:
-int_value = int(value)
+return int(value)
except (TypeError, ValueError):
validate(False, f'{name} retry count', value)
-validate_positive(f'{name} retry count', int_value)
-return int_value

opts.retries = parse_retries('download', opts.retries)
opts.fragment_retries = parse_retries('fragment', opts.fragment_retries)
@@ -300,20 +291,18 @@ def validate_options(opts):
raise ValueError(f'invalid {key} retry sleep expression {expr!r}')

# Bytes
-def validate_bytes(name, value, strict_positive=False):
+def validate_bytes(name, value):
if value is None:
return None
numeric_limit = parse_bytes(value)
-validate(numeric_limit is not None, name, value)
-if strict_positive:
-validate_positive(name, numeric_limit, True)
+validate(numeric_limit is not None, 'rate limit', value)
return numeric_limit

-opts.ratelimit = validate_bytes('rate limit', opts.ratelimit, True)
+opts.ratelimit = validate_bytes('rate limit', opts.ratelimit)
opts.throttledratelimit = validate_bytes('throttled rate limit', opts.throttledratelimit)
opts.min_filesize = validate_bytes('min filesize', opts.min_filesize)
opts.max_filesize = validate_bytes('max filesize', opts.max_filesize)
-opts.buffersize = validate_bytes('buffer size', opts.buffersize, True)
+opts.buffersize = validate_bytes('buffer size', opts.buffersize)
opts.http_chunk_size = validate_bytes('http chunk size', opts.http_chunk_size)

# Output templates
@@ -438,10 +427,6 @@ def validate_options(opts):
}

# Other options
-opts.plugin_dirs = opts.plugin_dirs
-if opts.plugin_dirs is None:
-opts.plugin_dirs = ['default']

if opts.playlist_items is not None:
try:
tuple(PlaylistEntries.parse_playlist_items(opts.playlist_items))
@@ -894,8 +879,8 @@ def parse_options(argv=None):
'listsubtitles': opts.listsubtitles,
'subtitlesformat': opts.subtitlesformat,
'subtitleslangs': opts.subtitleslangs,
-'matchtitle': opts.matchtitle,
-'rejecttitle': opts.rejecttitle,
+'matchtitle': decodeOption(opts.matchtitle),
+'rejecttitle': decodeOption(opts.rejecttitle),
'max_downloads': opts.max_downloads,
'prefer_free_formats': opts.prefer_free_formats,
'trim_file_name': opts.trim_file_name,
@@ -996,11 +981,6 @@ def _real_main(argv=None):
if opts.ffmpeg_location:
FFmpegPostProcessor._ffmpeg_location.set(opts.ffmpeg_location)

-# load all plugins into the global lookup
-plugin_dirs.value = opts.plugin_dirs
-if plugin_dirs.value:
-_load_all_plugins()

with YoutubeDL(ydl_opts) as ydl:
pre_process = opts.update_self or opts.rm_cachedir
actual_use = all_urls or opts.load_info_filename
@@ -1027,9 +1007,8 @@ def _real_main(argv=None):
# List of simplified targets we know are supported,
# to help users know what dependencies may be required.
(ImpersonateTarget('chrome'), 'curl_cffi'),
-(ImpersonateTarget('safari'), 'curl_cffi'),
-(ImpersonateTarget('firefox'), 'curl_cffi>=0.10'),
(ImpersonateTarget('edge'), 'curl_cffi'),
+(ImpersonateTarget('safari'), 'curl_cffi'),
]

available_targets = ydl._get_available_impersonate_targets()
@@ -1045,12 +1024,12 @@ def _real_main(argv=None):

for known_target, known_handler in known_targets:
if not any(
-known_target in target and known_handler.startswith(handler)
+known_target in target and handler == known_handler
for target, handler in available_targets
):
-rows.insert(0, [
+rows.append([
ydl._format_out(text, ydl.Styles.SUPPRESS)
-for text in make_row(known_target, f'{known_handler} (unavailable)')
+for text in make_row(known_target, f'{known_handler} (not available)')
])

ydl.to_screen('[info] Available impersonate targets')
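In the availability check above, the master side matches with known_handler.startswith(handler) so that a version-qualified requirement such as 'curl_cffi>=0.10' is still considered satisfied by a handler simply named 'curl_cffi', while the older code required exact equality. A small illustration with made-up data (the real lists come from yt-dlp's request director):

    known_targets = [('firefox', 'curl_cffi>=0.10'), ('chrome', 'curl_cffi')]  # hypothetical
    available_targets = [('firefox', 'curl_cffi'), ('chrome', 'curl_cffi')]    # hypothetical

    for target, requirement in known_targets:
        # Exact equality would report firefox as unavailable, because
        # 'curl_cffi>=0.10' != 'curl_cffi'; startswith() accepts it.
        ok = any(t == target and requirement.startswith(handler)
                 for t, handler in available_targets)
        print(target, 'available' if ok else 'unavailable')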
@@ -1065,7 +1044,7 @@ def _real_main(argv=None):
ydl.warn_if_short_id(args)

# Show a useful error message and wait for keypress if not launched from shell on Windows
-if not args and os.name == 'nt' and getattr(sys, 'frozen', False):
+if not args and compat_os_name == 'nt' and getattr(sys, 'frozen', False):
import ctypes.wintypes
import msvcrt

@@ -1076,7 +1055,7 @@ def _real_main(argv=None):
# If we only have a single process attached, then the executable was double clicked
# When using `pyinstaller` with `--onefile`, two processes get attached
is_onefile = hasattr(sys, '_MEIPASS') and os.path.basename(sys._MEIPASS).startswith('_MEI')
-if attached_processes == 1 or (is_onefile and attached_processes == 2):
+if attached_processes == 1 or is_onefile and attached_processes == 2:
print(parser._generate_error_message(
'Do not double-click the executable, instead call it from a command line.\n'
'Please read the README for further information on how to use yt-dlp: '
@@ -1101,7 +1080,8 @@ def _real_main(argv=None):


def main(argv=None):
-IN_CLI.value = True
+global _IN_CLI
+_IN_CLI = True
try:
_exit(*variadic(_real_main(argv)))
except (CookieLoadError, DownloadError):
@@ -1122,9 +1102,9 @@ def main(argv=None):
from .extractor import gen_extractors, list_extractors

__all__ = [
+'main',
'YoutubeDL',
+'parse_options',
'gen_extractors',
'list_extractors',
-'main',
-'parse_options',
]
@@ -3,6 +3,7 @@ from math import ceil

from .compat import compat_ord
from .dependencies import Cryptodome
+from .utils import bytes_to_intlist, intlist_to_bytes

if Cryptodome.AES:
def aes_cbc_decrypt_bytes(data, key, iv):
@@ -16,15 +17,15 @@ if Cryptodome.AES:
else:
def aes_cbc_decrypt_bytes(data, key, iv):
""" Decrypt bytes with AES-CBC using native implementation since pycryptodome is unavailable """
-return bytes(aes_cbc_decrypt(*map(list, (data, key, iv))))
+return intlist_to_bytes(aes_cbc_decrypt(*map(bytes_to_intlist, (data, key, iv))))

def aes_gcm_decrypt_and_verify_bytes(data, key, tag, nonce):
""" Decrypt bytes with AES-GCM using native implementation since pycryptodome is unavailable """
-return bytes(aes_gcm_decrypt_and_verify(*map(list, (data, key, tag, nonce))))
+return intlist_to_bytes(aes_gcm_decrypt_and_verify(*map(bytes_to_intlist, (data, key, tag, nonce))))


def aes_cbc_encrypt_bytes(data, key, iv, **kwargs):
-return bytes(aes_cbc_encrypt(*map(list, (data, key, iv)), **kwargs))
+return intlist_to_bytes(aes_cbc_encrypt(*map(bytes_to_intlist, (data, key, iv)), **kwargs))


BLOCK_SIZE_BYTES = 16
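The wrappers above swap yt-dlp's bytes_to_intlist/intlist_to_bytes helpers for the built-ins list() and bytes(): iterating a bytes object already yields integers, and bytes() accepts any iterable of integers in range(256). A quick round-trip check, independent of yt-dlp:

    data = b'yt-dlp'
    as_ints = list(data)       # [121, 116, 45, 100, 108, 112], same result as bytes_to_intlist(data)
    restored = bytes(as_ints)  # b'yt-dlp', same result as intlist_to_bytes(as_ints)
    assert restored == data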
@@ -83,7 +84,7 @@ def aes_ecb_encrypt(data, key, iv=None):
@returns {int[]} encrypted data
"""
expanded_key = key_expansion(key)
-block_count = ceil(len(data) / BLOCK_SIZE_BYTES)
+block_count = int(ceil(float(len(data)) / BLOCK_SIZE_BYTES))

encrypted_data = []
for i in range(block_count):
@@ -103,7 +104,7 @@ def aes_ecb_decrypt(data, key, iv=None):
@returns {int[]} decrypted data
"""
expanded_key = key_expansion(key)
-block_count = ceil(len(data) / BLOCK_SIZE_BYTES)
+block_count = int(ceil(float(len(data)) / BLOCK_SIZE_BYTES))

encrypted_data = []
for i in range(block_count):
@@ -134,7 +135,7 @@ def aes_ctr_encrypt(data, key, iv):
@returns {int[]} encrypted data
"""
expanded_key = key_expansion(key)
-block_count = ceil(len(data) / BLOCK_SIZE_BYTES)
+block_count = int(ceil(float(len(data)) / BLOCK_SIZE_BYTES))
counter = iter_vector(iv)

encrypted_data = []
@@ -158,7 +159,7 @@ def aes_cbc_decrypt(data, key, iv):
@returns {int[]} decrypted data
"""
expanded_key = key_expansion(key)
-block_count = ceil(len(data) / BLOCK_SIZE_BYTES)
+block_count = int(ceil(float(len(data)) / BLOCK_SIZE_BYTES))

decrypted_data = []
previous_cipher_block = iv
@@ -183,7 +184,7 @@ def aes_cbc_encrypt(data, key, iv, *, padding_mode='pkcs7'):
@returns {int[]} encrypted data
"""
expanded_key = key_expansion(key)
-block_count = ceil(len(data) / BLOCK_SIZE_BYTES)
+block_count = int(ceil(float(len(data)) / BLOCK_SIZE_BYTES))

encrypted_data = []
previous_cipher_block = iv
@@ -220,7 +221,7 @@ def aes_gcm_decrypt_and_verify(data, key, tag, nonce):
j0 = [*nonce, 0, 0, 0, 1]
else:
fill = (BLOCK_SIZE_BYTES - (len(nonce) % BLOCK_SIZE_BYTES)) % BLOCK_SIZE_BYTES + 8
-ghash_in = nonce + [0] * fill + list((8 * len(nonce)).to_bytes(8, 'big'))
+ghash_in = nonce + [0] * fill + bytes_to_intlist((8 * len(nonce)).to_bytes(8, 'big'))
j0 = ghash(hash_subkey, ghash_in)

# TODO: add nonce support to aes_ctr_decrypt
@@ -229,13 +230,13 @@ def aes_gcm_decrypt_and_verify(data, key, tag, nonce):
iv_ctr = inc(j0)

decrypted_data = aes_ctr_decrypt(data, key, iv_ctr + [0] * (BLOCK_SIZE_BYTES - len(iv_ctr)))
-pad_len = (BLOCK_SIZE_BYTES - (len(data) % BLOCK_SIZE_BYTES)) % BLOCK_SIZE_BYTES
+pad_len = len(data) // 16 * 16
s_tag = ghash(
hash_subkey,
data
-+ [0] * pad_len # pad
-+ list((0 * 8).to_bytes(8, 'big') # length of associated data
++ [0] * (BLOCK_SIZE_BYTES - len(data) + pad_len) # pad
++ bytes_to_intlist((0 * 8).to_bytes(8, 'big') # length of associated data
+ ((len(data) * 8).to_bytes(8, 'big'))), # length of data
)

if tag != aes_ctr_encrypt(s_tag, key, j0):
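Both sides of the pad computation above zero-pad the decrypted data to a 16-byte boundary before running GHASH; they only express the amount of padding differently. A quick arithmetic check of the two formulas (plain Python, not yt-dlp code):

    BLOCK_SIZE_BYTES = 16
    for n in range(1, 64):  # n stands for len(data)
        new_pad = (BLOCK_SIZE_BYTES - (n % BLOCK_SIZE_BYTES)) % BLOCK_SIZE_BYTES
        old_rounded = n // 16 * 16                    # the old pad_len value
        old_pad = BLOCK_SIZE_BYTES - n + old_rounded  # zeros actually appended by the old code
        # They agree except when n is already a multiple of 16,
        # where the old expression appends a full extra block of zeros.
        assert new_pad == old_pad or (n % 16 == 0 and new_pad == 0 and old_pad == 16)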
@@ -299,8 +300,8 @@ def aes_decrypt_text(data, password, key_size_bytes):
"""
NONCE_LENGTH_BYTES = 8

-data = list(base64.b64decode(data))
-password = list(password.encode())
+data = bytes_to_intlist(base64.b64decode(data))
+password = bytes_to_intlist(password.encode())

key = password[:key_size_bytes] + [0] * (key_size_bytes - len(password))
key = aes_encrypt(key[:BLOCK_SIZE_BYTES], key_expansion(key)) * (key_size_bytes // BLOCK_SIZE_BYTES)
@@ -309,7 +310,7 @@ def aes_decrypt_text(data, password, key_size_bytes):
cipher = data[NONCE_LENGTH_BYTES:]

decrypted_data = aes_ctr_decrypt(cipher, key, nonce + [0] * (BLOCK_SIZE_BYTES - NONCE_LENGTH_BYTES))
-return bytes(decrypted_data)
+return intlist_to_bytes(decrypted_data)


RCON = (0x8d, 0x01, 0x02, 0x04, 0x08, 0x10, 0x20, 0x40, 0x80, 0x1b, 0x36)
@@ -534,17 +535,19 @@ def ghash(subkey, data):
__all__ = [
'aes_cbc_decrypt',
'aes_cbc_decrypt_bytes',
-'aes_cbc_encrypt',
-'aes_cbc_encrypt_bytes',
'aes_ctr_decrypt',
-'aes_ctr_encrypt',
-'aes_decrypt',
'aes_decrypt_text',
+'aes_decrypt',
'aes_ecb_decrypt',
-'aes_ecb_encrypt',
-'aes_encrypt',
'aes_gcm_decrypt_and_verify',
'aes_gcm_decrypt_and_verify_bytes',

+'aes_cbc_encrypt',
+'aes_cbc_encrypt_bytes',
+'aes_ctr_encrypt',
+'aes_ecb_encrypt',
+'aes_encrypt',

'key_expansion',
'pad_block',
'pkcs7_padding',
@@ -1,4 +1,5 @@
import os
+import sys
import xml.etree.ElementTree as etree

from .compat_utils import passthrough_module
@@ -23,14 +24,33 @@ def compat_etree_fromstring(text):
return etree.XML(text, parser=etree.XMLParser(target=_TreeBuilder()))


+compat_os_name = os._name if os.name == 'java' else os.name


+def compat_shlex_quote(s):
+from ..utils import shell_quote
+return shell_quote(s)


def compat_ord(c):
return c if isinstance(c, int) else ord(c)


+if compat_os_name == 'nt' and sys.version_info < (3, 8):
+# os.path.realpath on Windows does not follow symbolic links
+# prior to Python 3.8 (see https://bugs.python.org/issue9949)
+def compat_realpath(path):
+while os.path.islink(path):
+path = os.path.abspath(os.readlink(path))
+return os.path.realpath(path)
+else:
+compat_realpath = os.path.realpath


# Python 3.8+ does not honor %HOME% on windows, but this breaks compatibility with youtube-dl
# See https://github.com/yt-dlp/yt-dlp/issues/792
# https://docs.python.org/3/library/os.path.html#os.path.expanduser
-if os.name in ('nt', 'ce'):
+if compat_os_name in ('nt', 'ce'):
def compat_expanduser(path):
HOME = os.environ.get('HOME')
if not HOME:
@@ -8,14 +8,16 @@ passthrough_module(__name__, '.._legacy', callback=lambda attr: warnings.warn(
DeprecationWarning(f'{__name__}.{attr} is deprecated'), stacklevel=6))
del passthrough_module

-import functools # noqa: F401
-import os
+import base64
+import urllib.error
+import urllib.parse

+compat_str = str

-compat_os_name = os.name
-compat_realpath = os.path.realpath
+compat_b64decode = base64.b64decode

-def compat_shlex_quote(s):
-from ..utils import shell_quote
-return shell_quote(s)
+compat_urlparse = urllib.parse
+compat_parse_qs = urllib.parse.parse_qs
+compat_urllib_parse_unquote = urllib.parse.unquote
+compat_urllib_parse_urlencode = urllib.parse.urlencode
+compat_urllib_parse_urlparse = urllib.parse.urlparse
@@ -30,7 +30,7 @@ from asyncio import run as compat_asyncio_run # noqa: F401
from re import Pattern as compat_Pattern # noqa: F401
from re import match as compat_Match # noqa: F401

-from . import compat_expanduser, compat_HTMLParseError
+from . import compat_expanduser, compat_HTMLParseError, compat_realpath
from .compat_utils import passthrough_module
from ..dependencies import brotli as compat_brotli # noqa: F401
from ..dependencies import websockets as compat_websockets # noqa: F401
@@ -78,7 +78,7 @@ compat_kwargs = lambda kwargs: kwargs
compat_map = map
compat_numeric_types = (int, float, complex)
compat_os_path_expanduser = compat_expanduser
-compat_os_path_realpath = os.path.realpath
+compat_os_path_realpath = compat_realpath
compat_print = print
compat_shlex_split = shlex.split
compat_socket_create_connection = socket.create_connection
@@ -104,12 +104,5 @@ compat_xml_parse_error = compat_xml_etree_ElementTree_ParseError = etree.ParseEr
compat_xpath = lambda xpath: xpath
compat_zip = zip
workaround_optparse_bug9161 = lambda: None
-compat_str = str
-compat_b64decode = base64.b64decode
-compat_urlparse = urllib.parse
-compat_parse_qs = urllib.parse.parse_qs
-compat_urllib_parse_unquote = urllib.parse.unquote
-compat_urllib_parse_urlencode = urllib.parse.urlencode
-compat_urllib_parse_urlparse = urllib.parse.urlparse

legacy = []
@@ -57,7 +57,7 @@ def passthrough_module(parent, child, allowed_attributes=(..., ), *, callback=la
callback(attr)
return ret

-@functools.cache
+@functools.lru_cache(maxsize=None)
def from_child(attr):
nonlocal child
if attr not in allowed_attributes:

12  yt_dlp/compat/functools.py  Normal file
@@ -0,0 +1,12 @@
+# flake8: noqa: F405
+from functools import * # noqa: F403
+
+from .compat_utils import passthrough_module
+
+passthrough_module(__name__, 'functools')
+del passthrough_module
+
+try:
+_ = cache # >= 3.9
+except NameError:
+cache = lru_cache(maxsize=None)
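The compat module above, present only on the 2024.10.07 side, re-exports the standard functools and defines cache itself on Python 3.8, where only lru_cache exists. The same fallback can be expressed without the passthrough machinery, roughly like this:

    import functools

    try:
        cache = functools.cache  # available on Python 3.9+
    except AttributeError:
        cache = functools.lru_cache(maxsize=None)  # 3.8 fallback with equivalent behaviour

    @cache
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    print(fib(30))  # 832040, each distinct argument is computed only once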
@@ -7,9 +7,9 @@ passthrough_module(__name__, 'urllib.request')
del passthrough_module


-import os
+from .. import compat_os_name

-if os.name == 'nt':
+if compat_os_name == 'nt':
# On older Python versions, proxies are extracted from Windows registry erroneously. [1]
# If the https proxy in the registry does not have a scheme, urllib will incorrectly add https:// to it. [2]
# It is unlikely that the user has actually set it to be https, so we should be fine to safely downgrade
@@ -37,4 +37,4 @@ if os.name == 'nt':
def getproxies():
return getproxies_environment() or getproxies_registry_patched()

-del os
+del compat_os_name
@@ -25,6 +25,7 @@ from .aes import (
aes_gcm_decrypt_and_verify_bytes,
unpad_pkcs7,
)
+from .compat import compat_os_name
from .dependencies import (
_SECRETSTORAGE_UNAVAILABLE_REASON,
secretstorage,
@@ -195,10 +196,7 @@ def _extract_firefox_cookies(profile, container, logger):

def _firefox_browser_dirs():
if sys.platform in ('cygwin', 'win32'):
-yield from map(os.path.expandvars, (
-R'%APPDATA%\Mozilla\Firefox\Profiles',
-R'%LOCALAPPDATA%\Packages\Mozilla.Firefox_n80bbvh6b1yt2\LocalCache\Roaming\Mozilla\Firefox\Profiles',
-))
+yield os.path.expandvars(R'%APPDATA%\Mozilla\Firefox\Profiles')

elif sys.platform == 'darwin':
yield os.path.expanduser('~/Library/Application Support/Firefox/Profiles')
@@ -304,18 +302,12 @@ def _extract_chrome_cookies(browser_name, profile, keyring, logger):
raise FileNotFoundError(f'could not find {browser_name} cookies database in "{search_root}"')
logger.debug(f'Extracting cookies from: "{cookie_database_path}"')

+decryptor = get_cookie_decryptor(config['browser_dir'], config['keyring_name'], logger, keyring=keyring)

with tempfile.TemporaryDirectory(prefix='yt_dlp') as tmpdir:
cursor = None
try:
cursor = _open_database_copy(cookie_database_path, tmpdir)

-# meta_version is necessary to determine if we need to trim the hash prefix from the cookies
-# Ref: https://chromium.googlesource.com/chromium/src/+/b02dcebd7cafab92770734dc2bc317bd07f1d891/net/extras/sqlite/sqlite_persistent_cookie_store.cc#223
-meta_version = int(cursor.execute('SELECT value FROM meta WHERE key = "version"').fetchone()[0])
-decryptor = get_cookie_decryptor(
-config['browser_dir'], config['keyring_name'], logger,
-keyring=keyring, meta_version=meta_version)

cursor.connection.text_factory = bytes
column_names = _get_column_names(cursor, 'cookies')
secure_column = 'is_secure' if 'is_secure' in column_names else 'secure'
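On the master side, the schema version of the copied Chromium cookie database is read from its meta table before the decryptor is built, because schema version 24 and later prepend a hash prefix to each encrypted value. A rough standalone illustration of that lookup (plain sqlite3; only the table and key names come from the hunk above, the rest is an assumption for the example):

    import sqlite3

    def read_chromium_meta_version(cookie_db_path):
        # Mirrors the query in the hunk above: SELECT value FROM meta WHERE key = "version"
        with sqlite3.connect(cookie_db_path) as conn:
            row = conn.execute('SELECT value FROM meta WHERE key = "version"').fetchone()
        return int(row[0]) if row else 0

    # A meta version of 24 or higher is the threshold the decryptors use
    # (hash_prefix=meta_version >= 24) to decide whether to strip the prefix.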
@@ -345,7 +337,7 @@ def _extract_chrome_cookies(browser_name, profile, keyring, logger):
logger.debug(f'cookie version breakdown: {counts}')
return jar
except PermissionError as error:
-if os.name == 'nt' and error.errno == 13:
+if compat_os_name == 'nt' and error.errno == 13:
message = 'Could not copy Chrome cookie database. See https://github.com/yt-dlp/yt-dlp/issues/7271 for more info'
logger.error(message)
raise DownloadError(message) # force exit
@@ -413,23 +405,22 @@ class ChromeCookieDecryptor:
raise NotImplementedError('Must be implemented by sub classes')


-def get_cookie_decryptor(browser_root, browser_keyring_name, logger, *, keyring=None, meta_version=None):
+def get_cookie_decryptor(browser_root, browser_keyring_name, logger, *, keyring=None):
if sys.platform == 'darwin':
-return MacChromeCookieDecryptor(browser_keyring_name, logger, meta_version=meta_version)
+return MacChromeCookieDecryptor(browser_keyring_name, logger)
elif sys.platform in ('win32', 'cygwin'):
-return WindowsChromeCookieDecryptor(browser_root, logger, meta_version=meta_version)
-return LinuxChromeCookieDecryptor(browser_keyring_name, logger, keyring=keyring, meta_version=meta_version)
+return WindowsChromeCookieDecryptor(browser_root, logger)
+return LinuxChromeCookieDecryptor(browser_keyring_name, logger, keyring=keyring)


class LinuxChromeCookieDecryptor(ChromeCookieDecryptor):
-def __init__(self, browser_keyring_name, logger, *, keyring=None, meta_version=None):
+def __init__(self, browser_keyring_name, logger, *, keyring=None):
self._logger = logger
self._v10_key = self.derive_key(b'peanuts')
self._empty_key = self.derive_key(b'')
self._cookie_counts = {'v10': 0, 'v11': 0, 'other': 0}
self._browser_keyring_name = browser_keyring_name
self._keyring = keyring
-self._meta_version = meta_version or 0

@functools.cached_property
def _v11_key(self):
@@ -458,18 +449,14 @@ class LinuxChromeCookieDecryptor(ChromeCookieDecryptor):

if version == b'v10':
self._cookie_counts['v10'] += 1
-return _decrypt_aes_cbc_multi(
-ciphertext, (self._v10_key, self._empty_key), self._logger,
-hash_prefix=self._meta_version >= 24)
+return _decrypt_aes_cbc_multi(ciphertext, (self._v10_key, self._empty_key), self._logger)

elif version == b'v11':
self._cookie_counts['v11'] += 1
if self._v11_key is None:
self._logger.warning('cannot decrypt v11 cookies: no key found', only_once=True)
return None
-return _decrypt_aes_cbc_multi(
-ciphertext, (self._v11_key, self._empty_key), self._logger,
-hash_prefix=self._meta_version >= 24)
+return _decrypt_aes_cbc_multi(ciphertext, (self._v11_key, self._empty_key), self._logger)

else:
self._logger.warning(f'unknown cookie version: "{version}"', only_once=True)
@@ -478,12 +465,11 @@ class LinuxChromeCookieDecryptor(ChromeCookieDecryptor):


class MacChromeCookieDecryptor(ChromeCookieDecryptor):
-def __init__(self, browser_keyring_name, logger, meta_version=None):
+def __init__(self, browser_keyring_name, logger):
self._logger = logger
password = _get_mac_keyring_password(browser_keyring_name, logger)
self._v10_key = None if password is None else self.derive_key(password)
self._cookie_counts = {'v10': 0, 'other': 0}
-self._meta_version = meta_version or 0

@staticmethod
def derive_key(password):
@@ -501,8 +487,7 @@ class MacChromeCookieDecryptor(ChromeCookieDecryptor):
self._logger.warning('cannot decrypt v10 cookies: no key found', only_once=True)
return None

-return _decrypt_aes_cbc_multi(
-ciphertext, (self._v10_key,), self._logger, hash_prefix=self._meta_version >= 24)
+return _decrypt_aes_cbc_multi(ciphertext, (self._v10_key,), self._logger)

else:
self._cookie_counts['other'] += 1
@@ -512,11 +497,10 @@ class MacChromeCookieDecryptor(ChromeCookieDecryptor):


class WindowsChromeCookieDecryptor(ChromeCookieDecryptor):
-def __init__(self, browser_root, logger, meta_version=None):
+def __init__(self, browser_root, logger):
self._logger = logger
self._v10_key = _get_windows_v10_key(browser_root, logger)
self._cookie_counts = {'v10': 0, 'other': 0}
-self._meta_version = meta_version or 0

def decrypt(self, encrypted_value):
version = encrypted_value[:3]
@@ -540,9 +524,7 @@ class WindowsChromeCookieDecryptor(ChromeCookieDecryptor):
ciphertext = raw_ciphertext[nonce_length:-authentication_tag_length]
authentication_tag = raw_ciphertext[-authentication_tag_length:]

-return _decrypt_aes_gcm(
-ciphertext, self._v10_key, nonce, authentication_tag, self._logger,
-hash_prefix=self._meta_version >= 24)
+return _decrypt_aes_gcm(ciphertext, self._v10_key, nonce, authentication_tag, self._logger)

else:
self._cookie_counts['other'] += 1
@@ -764,11 +746,11 @@ def _get_linux_desktop_environment(env, logger):
GetDesktopEnvironment
"""
xdg_current_desktop = env.get('XDG_CURRENT_DESKTOP', None)
-desktop_session = env.get('DESKTOP_SESSION', '')
+desktop_session = env.get('DESKTOP_SESSION', None)
if xdg_current_desktop is not None:
for part in map(str.strip, xdg_current_desktop.split(':')):
if part == 'Unity':
-if 'gnome-fallback' in desktop_session:
+if desktop_session is not None and 'gnome-fallback' in desktop_session:
return _LinuxDesktopEnvironment.GNOME
else:
return _LinuxDesktopEnvironment.UNITY
@@ -797,34 +779,35 @@ def _get_linux_desktop_environment(env, logger):
return _LinuxDesktopEnvironment.UKUI
elif part == 'LXQt':
return _LinuxDesktopEnvironment.LXQT
-logger.debug(f'XDG_CURRENT_DESKTOP is set to an unknown value: "{xdg_current_desktop}"')
+logger.info(f'XDG_CURRENT_DESKTOP is set to an unknown value: "{xdg_current_desktop}"')

-if desktop_session == 'deepin':
-return _LinuxDesktopEnvironment.DEEPIN
-elif desktop_session in ('mate', 'gnome'):
-return _LinuxDesktopEnvironment.GNOME
-elif desktop_session in ('kde4', 'kde-plasma'):
-return _LinuxDesktopEnvironment.KDE4
-elif desktop_session == 'kde':
-if 'KDE_SESSION_VERSION' in env:
+elif desktop_session is not None:
+if desktop_session == 'deepin':
+return _LinuxDesktopEnvironment.DEEPIN
+elif desktop_session in ('mate', 'gnome'):
+return _LinuxDesktopEnvironment.GNOME
+elif desktop_session in ('kde4', 'kde-plasma'):
return _LinuxDesktopEnvironment.KDE4
+elif desktop_session == 'kde':
+if 'KDE_SESSION_VERSION' in env:
+return _LinuxDesktopEnvironment.KDE4
+else:
+return _LinuxDesktopEnvironment.KDE3
+elif 'xfce' in desktop_session or desktop_session == 'xubuntu':
+return _LinuxDesktopEnvironment.XFCE
+elif desktop_session == 'ukui':
+return _LinuxDesktopEnvironment.UKUI
else:
-return _LinuxDesktopEnvironment.KDE3
-elif 'xfce' in desktop_session or desktop_session == 'xubuntu':
-return _LinuxDesktopEnvironment.XFCE
-elif desktop_session == 'ukui':
-return _LinuxDesktopEnvironment.UKUI
+logger.info(f'DESKTOP_SESSION is set to an unknown value: "{desktop_session}"')
else:
-logger.debug(f'DESKTOP_SESSION is set to an unknown value: "{desktop_session}"')
-if 'GNOME_DESKTOP_SESSION_ID' in env:
-return _LinuxDesktopEnvironment.GNOME
-elif 'KDE_FULL_SESSION' in env:
-if 'KDE_SESSION_VERSION' in env:
-return _LinuxDesktopEnvironment.KDE4
-else:
-return _LinuxDesktopEnvironment.KDE3
+if 'GNOME_DESKTOP_SESSION_ID' in env:
+return _LinuxDesktopEnvironment.GNOME
+elif 'KDE_FULL_SESSION' in env:
+if 'KDE_SESSION_VERSION' in env:
+return _LinuxDesktopEnvironment.KDE4
+else:
+return _LinuxDesktopEnvironment.KDE3

return _LinuxDesktopEnvironment.OTHER

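Both variants above implement the same precedence, modelled on Chromium's GetDesktopEnvironment: XDG_CURRENT_DESKTOP is consulted first, then DESKTOP_SESSION, then the legacy GNOME/KDE environment variables; the refactor mostly changes whether an unset DESKTOP_SESSION is treated as '' or None and the log level used for unknown values. A heavily compressed sketch of that precedence (reduced enum, not the full mapping used by yt-dlp):

    import enum
    import os

    class DE(enum.Enum):
        GNOME = 'gnome'
        KDE = 'kde'
        XFCE = 'xfce'
        OTHER = 'other'

    def detect_desktop_environment(env=os.environ):
        xdg = env.get('XDG_CURRENT_DESKTOP')
        session = env.get('DESKTOP_SESSION', '')  # '' instead of None avoids the extra None checks
        if xdg:
            for part in map(str.strip, xdg.split(':')):
                if part in ('GNOME', 'Unity'):
                    return DE.GNOME
                if part == 'KDE':
                    return DE.KDE
                if part == 'XFCE':
                    return DE.XFCE
        if session in ('mate', 'gnome'):
            return DE.GNOME
        if session.startswith('kde'):
            return DE.KDE
        if 'xfce' in session or session == 'xubuntu':
            return DE.XFCE
        if 'GNOME_DESKTOP_SESSION_ID' in env:
            return DE.GNOME
        if 'KDE_FULL_SESSION' in env:
            return DE.KDE
        return DE.OTHER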
@@ -1027,12 +1010,10 @@ def pbkdf2_sha1(password, salt, iterations, key_length):
return hashlib.pbkdf2_hmac('sha1', password, salt, iterations, key_length)


-def _decrypt_aes_cbc_multi(ciphertext, keys, logger, initialization_vector=b' ' * 16, hash_prefix=False):
+def _decrypt_aes_cbc_multi(ciphertext, keys, logger, initialization_vector=b' ' * 16):
for key in keys:
plaintext = unpad_pkcs7(aes_cbc_decrypt_bytes(ciphertext, key, initialization_vector))
try:
-if hash_prefix:
-return plaintext[32:].decode()
return plaintext.decode()
except UnicodeDecodeError:
pass
@@ -1040,7 +1021,7 @@ def _decrypt_aes_cbc_multi(ciphertext, keys, logger, initialization_vector=b' '
return None


-def _decrypt_aes_gcm(ciphertext, key, nonce, authentication_tag, logger, hash_prefix=False):
+def _decrypt_aes_gcm(ciphertext, key, nonce, authentication_tag, logger):
try:
plaintext = aes_gcm_decrypt_and_verify_bytes(ciphertext, key, authentication_tag, nonce)
except ValueError:
@@ -1048,8 +1029,6 @@ def _decrypt_aes_gcm(ciphertext, key, nonce, authentication_tag, logger, hash_pr
return None

try:
-if hash_prefix:
-return plaintext[32:].decode()
return plaintext.decode()
except UnicodeDecodeError:
logger.warning('failed to decrypt cookie (AES-GCM) because UTF-8 decoding failed. Possibly the key is wrong?', only_once=True)
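The hash_prefix flag added to both helpers on the master side only changes the post-processing: when the cookie database's meta version is 24 or higher, the first 32 bytes of the decrypted plaintext are the hash prefix and are dropped before UTF-8 decoding. A minimal sketch of that step in isolation:

    def decode_cookie_plaintext(plaintext: bytes, hash_prefix: bool) -> str:
        # With newer Chromium cookie schemas (meta version >= 24) the decrypted value
        # starts with a 32-byte hash prefix; only the remainder is the cookie text.
        value = plaintext[32:] if hash_prefix else plaintext
        return value.decode()

    print(decode_cookie_plaintext(b'\x00' * 32 + b'session=abc', hash_prefix=True))   # session=abc
    print(decode_cookie_plaintext(b'session=abc', hash_prefix=False))                 # session=abc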
@@ -1278,8 +1257,8 @@ class YoutubeDLCookieJar(http.cookiejar.MozillaCookieJar):
def _really_save(self, f, ignore_discard, ignore_expires):
now = time.time()
for cookie in self:
-if ((not ignore_discard and cookie.discard)
-or (not ignore_expires and cookie.is_expired(now))):
+if (not ignore_discard and cookie.discard
+or not ignore_expires and cookie.is_expired(now)):
continue
name, value = cookie.name, cookie.value
if value is None:

@@ -24,7 +24,7 @@ try:
from Crypto.Cipher import AES, PKCS1_OAEP, Blowfish, PKCS1_v1_5 # noqa: F401
from Crypto.Hash import CMAC, SHA1 # noqa: F401
from Crypto.PublicKey import RSA # noqa: F401
-except (ImportError, OSError):
+except ImportError:
__version__ = f'broken {__version__}'.strip()

@@ -30,12 +30,11 @@ from .hls import HlsFD
from .http import HttpFD
from .ism import IsmFD
from .mhtml import MhtmlFD
-from .niconico import NiconicoLiveFD
+from .niconico import NiconicoDmcFD, NiconicoLiveFD
from .rtmp import RtmpFD
from .rtsp import RtspFD
from .websocket import WebSocketFragmentFD
from .youtube_live_chat import YoutubeLiveChatFD
-from .bunnycdn import BunnyCdnFD

PROTOCOL_MAP = {
'rtmp': RtmpFD,
@@ -50,12 +49,12 @@ PROTOCOL_MAP = {
'http_dash_segments_generator': DashSegmentsFD,
'ism': IsmFD,
'mhtml': MhtmlFD,
+'niconico_dmc': NiconicoDmcFD,
'niconico_live': NiconicoLiveFD,
'fc2_live': FC2LiveFD,
'websocket_frag': WebSocketFragmentFD,
'youtube_live_chat': YoutubeLiveChatFD,
'youtube_live_chat_replay': YoutubeLiveChatFD,
-'bunnycdn': BunnyCdnFD,
}


@@ -66,6 +65,7 @@ def shorten_protocol_name(proto, simplify=False):
'rtmp_ffmpeg': 'rtmpF',
'http_dash_segments': 'dash',
'http_dash_segments_generator': 'dashG',
+'niconico_dmc': 'dmc',
'websocket_frag': 'WSfrag',
}
if simplify:
@@ -1,50 +0,0 @@
-import hashlib
-import random
-import threading
-
-from .common import FileDownloader
-from . import HlsFD
-from ..networking import Request
-from ..networking.exceptions import network_exceptions
-
-
-class BunnyCdnFD(FileDownloader):
-"""
-Downloads from BunnyCDN with required pings
-Note, this is not a part of public API, and will be removed without notice.
-DO NOT USE
-"""
-
-def real_download(self, filename, info_dict):
-self.to_screen(f'[{self.FD_NAME}] Downloading from BunnyCDN')
-
-fd = HlsFD(self.ydl, self.params)
-
-stop_event = threading.Event()
-ping_thread = threading.Thread(target=self.ping_thread, args=(stop_event,), kwargs=info_dict['_bunnycdn_ping_data'])
-ping_thread.start()
-
-try:
-return fd.real_download(filename, info_dict)
-finally:
-stop_event.set()
-
-def ping_thread(self, stop_event, url, headers, secret, context_id):
-# Site sends ping every 4 seconds, but this throttles the download. Pinging every 2 seconds seems to work.
-ping_interval = 2
-# Hard coded resolution as it doesn't seem to matter
-res = 1080
-paused = 'false'
-current_time = 0
-
-while not stop_event.wait(ping_interval):
-current_time += ping_interval
-
-time = current_time + round(random.random(), 6)
-md5_hash = hashlib.md5(f'{secret}_{context_id}_{time}_{paused}_{res}'.encode()).hexdigest()
-ping_url = f'{url}?hash={md5_hash}&time={time}&paused={paused}&resolution={res}'
-
-try:
-self.ydl.urlopen(Request(ping_url, headers=headers)).read()
-except network_exceptions as e:
-self.to_screen(f'[{self.FD_NAME}] Ping failed: {e}')
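The deleted downloader above keeps a ping loop running on a background thread for the lifetime of the HLS download; stop_event.wait(interval) serves both as the sleep between pings and as the cancellation check. The same pattern in isolation:

    import threading
    import time

    def ping_loop(stop_event, interval=2):
        # wait() returns False on timeout (keep pinging) and True once the event is set (stop).
        while not stop_event.wait(interval):
            print('ping at', round(time.monotonic(), 1))

    stop = threading.Event()
    worker = threading.Thread(target=ping_loop, args=(stop,))
    worker.start()
    try:
        time.sleep(5)  # stand-in for the actual download work
    finally:
        stop.set()     # ends the loop promptly, even in the middle of a wait
        worker.join()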
@@ -20,7 +20,9 @@ from ..utils import (
Namespace,
RetryManager,
classproperty,
+decodeArgument,
deprecation_warning,
+encodeFilename,
format_bytes,
join_nonempty,
parse_bytes,
@@ -31,7 +33,6 @@ from ..utils import (
timetuple_from_msec,
try_call,
)
-from ..utils._utils import _ProgressState


class FileDownloader:
@@ -218,7 +219,7 @@ class FileDownloader:
def temp_name(self, filename):
"""Returns a temporary filename for the given filename."""
if self.params.get('nopart', False) or filename == '-' or \
-(os.path.exists(filename) and not os.path.isfile(filename)):
+(os.path.exists(encodeFilename(filename)) and not os.path.isfile(encodeFilename(filename))):
return filename
return filename + '.part'

@@ -272,7 +273,7 @@ class FileDownloader:
"""Try to set the last-modified time of the given file."""
if last_modified_hdr is None:
return
-if not os.path.isfile(filename):
+if not os.path.isfile(encodeFilename(filename)):
return
timestr = last_modified_hdr
if timestr is None:
@@ -334,7 +335,7 @@ class FileDownloader:
progress_dict), s.get('progress_idx') or 0)
self.to_console_title(self.ydl.evaluate_outtmpl(
progress_template.get('download-title') or 'yt-dlp %(progress._default_template)s',
-progress_dict), _ProgressState.from_dict(s), s.get('_percent'))
+progress_dict))

def _format_progress(self, *args, **kwargs):
return self.ydl._format_text(
@@ -358,7 +359,6 @@ class FileDownloader:
'_speed_str': self.format_speed(speed).strip(),
'_total_bytes_str': _format_bytes('total_bytes'),
'_elapsed_str': self.format_seconds(s.get('elapsed')),
-'_percent': 100.0,
'_percent_str': self.format_percent(100),
})
self._report_progress_status(s, join_nonempty(
@@ -377,15 +377,13 @@ class FileDownloader:
                     return
                 self._progress_delta_time += update_delta

-        progress = try_call(
-            lambda: 100 * s['downloaded_bytes'] / s['total_bytes'],
-            lambda: 100 * s['downloaded_bytes'] / s['total_bytes_estimate'],
-            lambda: s['downloaded_bytes'] == 0 and 0)
         s.update({
             '_eta_str': self.format_eta(s.get('eta')).strip(),
             '_speed_str': self.format_speed(s.get('speed')),
-            '_percent': progress,
-            '_percent_str': self.format_percent(progress),
+            '_percent_str': self.format_percent(try_call(
+                lambda: 100 * s['downloaded_bytes'] / s['total_bytes'],
+                lambda: 100 * s['downloaded_bytes'] / s['total_bytes_estimate'],
+                lambda: s['downloaded_bytes'] == 0 and 0)),
             '_total_bytes_str': _format_bytes('total_bytes'),
             '_total_bytes_estimate_str': _format_bytes('total_bytes_estimate'),
             '_downloaded_bytes_str': _format_bytes('downloaded_bytes'),
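The percentage computation moves between releases but the fallback chain is the same: try total_bytes, then total_bytes_estimate, then treat a zero-byte download as 0%. A self-contained sketch of that pattern (`first_successful` below is an invented stand-in for yt-dlp's try_call helper, not its actual implementation):

def first_successful(*thunks):
    """Return the first thunk() result that neither raises nor is None."""
    for thunk in thunks:
        try:
            result = thunk()
        except (KeyError, TypeError, ZeroDivisionError):
            continue
        if result is not None:
            return result
    return None

s = {'downloaded_bytes': 512, 'total_bytes_estimate': 2048}  # no 'total_bytes'
percent = first_successful(
    lambda: 100 * s['downloaded_bytes'] / s['total_bytes'],           # KeyError, skipped
    lambda: 100 * s['downloaded_bytes'] / s['total_bytes_estimate'],  # 25.0
    lambda: s['downloaded_bytes'] == 0 and 0)
print(percent)  # 25.0
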
@@ -434,13 +432,13 @@ class FileDownloader:
         """
         nooverwrites_and_exists = (
             not self.params.get('overwrites', True)
-            and os.path.exists(filename)
+            and os.path.exists(encodeFilename(filename))
         )

         if not hasattr(filename, 'write'):
             continuedl_and_exists = (
                 self.params.get('continuedl', True)
-                and os.path.isfile(filename)
+                and os.path.isfile(encodeFilename(filename))
                 and not self.params.get('nopart', False)
             )

@@ -450,7 +448,7 @@ class FileDownloader:
                 self._hook_progress({
                     'filename': filename,
                     'status': 'finished',
-                    'total_bytes': os.path.getsize(filename),
+                    'total_bytes': os.path.getsize(encodeFilename(filename)),
                 }, info_dict)
                 self._finish_multiline_status()
                 return True, False
@@ -491,7 +489,9 @@ class FileDownloader:
         if not self.params.get('verbose', False):
             return

+        str_args = [decodeArgument(a) for a in args]
+
         if exe is None:
-            exe = os.path.basename(args[0])
+            exe = os.path.basename(str_args[0])

-        self.write_debug(f'{exe} command line: {shell_quote(args)}')
+        self.write_debug(f'{exe} command line: {shell_quote(str_args)}')
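Both sides of this hunk log the external command before running it; master simply drops the decodeArgument() round-trip. A rough stdlib-only equivalent of that verbose dump (shlex.join stands in for yt-dlp's shell_quote, and `debug_cmd` is an invented name):

import os
import shlex

def debug_cmd(args, exe=None, verbose=True):
    if not verbose:
        return
    if exe is None:
        exe = os.path.basename(args[0])
    print(f'[debug] {exe} command line: {shlex.join(args)}')

debug_cmd(['ffmpeg', '-i', 'in put.mp4', '-c', 'copy', 'out.mkv'])
# [debug] ffmpeg command line: ffmpeg -i 'in put.mp4' -c copy out.mkv
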
@@ -23,6 +23,7 @@ from ..utils import (
     cli_valueless_option,
     determine_ext,
     encodeArgument,
+    encodeFilename,
     find_available_port,
     remove_end,
     traverse_obj,
@@ -66,7 +67,7 @@ class ExternalFD(FragmentFD):
                 'elapsed': time.time() - started,
             }
             if filename != '-':
-                fsize = os.path.getsize(tmpfilename)
+                fsize = os.path.getsize(encodeFilename(tmpfilename))
                 self.try_rename(tmpfilename, filename)
                 status.update({
                     'downloaded_bytes': fsize,
@@ -183,9 +184,9 @@ class ExternalFD(FragmentFD):
                 dest.write(decrypt_fragment(fragment, src.read()))
                 src.close()
                 if not self.params.get('keep_fragments', False):
-                    self.try_remove(fragment_filename)
+                    self.try_remove(encodeFilename(fragment_filename))
             dest.close()
-            self.try_remove(f'{tmpfilename}.frag.urls')
+            self.try_remove(encodeFilename(f'{tmpfilename}.frag.urls'))
         return 0

     def _call_process(self, cmd, info_dict):
@@ -457,6 +458,8 @@ class FFmpegFD(ExternalFD):

     @classmethod
     def available(cls, path=None):
+        # TODO: Fix path for ffmpeg
+        # Fixme: This may be wrong when --ffmpeg-location is used
         return FFmpegPostProcessor().available

     def on_process_started(self, proc, stdin):
@@ -617,7 +620,7 @@ class FFmpegFD(ExternalFD):
         args += self._configuration_args(('_o1', '_o', ''))

         args = [encodeArgument(opt) for opt in args]
-        args.append(ffpp._ffmpeg_filename_argument(tmpfilename))
+        args.append(encodeFilename(ffpp._ffmpeg_filename_argument(tmpfilename), True))
         self._debug_cmd(args)

         piped = any(fmt['url'] in ('-', 'pipe:') for fmt in selected_formats)
@@ -9,9 +9,10 @@ import time
 from .common import FileDownloader
 from .http import HttpFD
 from ..aes import aes_cbc_decrypt_bytes, unpad_pkcs7
+from ..compat import compat_os_name
 from ..networking import Request
 from ..networking.exceptions import HTTPError, IncompleteRead
-from ..utils import DownloadError, RetryManager, traverse_obj
+from ..utils import DownloadError, RetryManager, encodeFilename, traverse_obj
 from ..utils.networking import HTTPHeaderDict
 from ..utils.progress import ProgressCalculator

@@ -151,7 +152,7 @@ class FragmentFD(FileDownloader):
             if self.__do_ytdl_file(ctx):
                 self._write_ytdl_file(ctx)
             if not self.params.get('keep_fragments', False):
-                self.try_remove(ctx['fragment_filename_sanitized'])
+                self.try_remove(encodeFilename(ctx['fragment_filename_sanitized']))
                 del ctx['fragment_filename_sanitized']

     def _prepare_frag_download(self, ctx):
@@ -187,7 +188,7 @@ class FragmentFD(FileDownloader):
         })

         if self.__do_ytdl_file(ctx):
-            ytdl_file_exists = os.path.isfile(self.ytdl_filename(ctx['filename']))
+            ytdl_file_exists = os.path.isfile(encodeFilename(self.ytdl_filename(ctx['filename'])))
             continuedl = self.params.get('continuedl', True)
             if continuedl and ytdl_file_exists:
                 self._read_ytdl_file(ctx)
@@ -302,7 +303,7 @@ class FragmentFD(FileDownloader):
         elif to_file:
             self.try_rename(ctx['tmpfilename'], ctx['filename'])
             filetime = ctx.get('fragment_filetime')
-            if self.params.get('updatetime') and filetime:
+            if self.params.get('updatetime', True) and filetime:
                 with contextlib.suppress(Exception):
                     os.utime(ctx['filename'], (time.time(), filetime))

@@ -389,7 +390,7 @@ class FragmentFD(FileDownloader):
             def __exit__(self, exc_type, exc_val, exc_tb):
                 pass

-        if os.name == 'nt':
+        if compat_os_name == 'nt':
             def future_result(future):
                 while True:
                     try:
@@ -16,7 +16,6 @@ from ..utils import (
     update_url_query,
     urljoin,
 )
-from ..utils._utils import _request_dump_filename


 class HlsFD(FragmentFD):
@@ -73,23 +72,11 @@ class HlsFD(FragmentFD):

     def real_download(self, filename, info_dict):
         man_url = info_dict['url']
-
-        s = info_dict.get('hls_media_playlist_data')
-        if s:
-            self.to_screen(f'[{self.FD_NAME}] Using m3u8 manifest from extracted info')
-        else:
-            self.to_screen(f'[{self.FD_NAME}] Downloading m3u8 manifest')
-            urlh = self.ydl.urlopen(self._prepare_url(info_dict, man_url))
-            man_url = urlh.url
-            s_bytes = urlh.read()
-            if self.params.get('write_pages'):
-                dump_filename = _request_dump_filename(
-                    man_url, info_dict['id'], None,
-                    trim_length=self.params.get('trim_file_name'))
-                self.to_screen(f'[{self.FD_NAME}] Saving request to {dump_filename}')
-                with open(dump_filename, 'wb') as outf:
-                    outf.write(s_bytes)
-            s = s_bytes.decode('utf-8', 'ignore')
+        self.to_screen(f'[{self.FD_NAME}] Downloading m3u8 manifest')
+
+        urlh = self.ydl.urlopen(self._prepare_url(info_dict, man_url))
+        man_url = urlh.url
+        s = urlh.read().decode('utf-8', 'ignore')

         can_download, message = self.can_download(s, info_dict, self.params.get('allow_unplayable_formats')), None
         if can_download:
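The master side of this hunk prefers playlist text already carried in the info dict and only falls back to a network fetch (optionally dumping the response when write_pages is set). A minimal sketch of that flow under stated assumptions: it uses urllib instead of yt-dlp's networking stack, and `fetch_m3u8` is an invented helper name.

import urllib.request

def fetch_m3u8(info_dict):
    cached = info_dict.get('hls_media_playlist_data')
    if cached:
        # Playlist text was already extracted; skip the extra request
        return info_dict['url'], cached
    with urllib.request.urlopen(info_dict['url']) as resp:
        return resp.geturl(), resp.read().decode('utf-8', 'ignore')

# Runs offline because the playlist text is supplied inline
man_url, playlist = fetch_m3u8({
    'url': 'https://example.com/media.m3u8',
    'hls_media_playlist_data': '#EXTM3U\n#EXT-X-VERSION:3\n',
})
print(man_url, playlist.splitlines()[0])
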
@@ -132,12 +119,12 @@ class HlsFD(FragmentFD):
             self.to_screen(f'[{self.FD_NAME}] Fragment downloads will be delegated to {real_downloader.get_basename()}')

         def is_ad_fragment_start(s):
-            return ((s.startswith('#ANVATO-SEGMENT-INFO') and 'type=ad' in s)
-                    or (s.startswith('#UPLYNK-SEGMENT') and s.endswith(',ad')))
+            return (s.startswith('#ANVATO-SEGMENT-INFO') and 'type=ad' in s
+                    or s.startswith('#UPLYNK-SEGMENT') and s.endswith(',ad'))

         def is_ad_fragment_end(s):
-            return ((s.startswith('#ANVATO-SEGMENT-INFO') and 'type=master' in s)
-                    or (s.startswith('#UPLYNK-SEGMENT') and s.endswith(',segment')))
+            return (s.startswith('#ANVATO-SEGMENT-INFO') and 'type=master' in s
+                    or s.startswith('#UPLYNK-SEGMENT') and s.endswith(',segment'))

         fragments = []

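The four rewritten return statements above are behaviourally identical on both sides: `and` binds tighter than `or`, so master's extra parentheses only make the grouping explicit. A quick check with the predicate as written on the master side:

def is_ad_fragment_start(s):
    return ((s.startswith('#ANVATO-SEGMENT-INFO') and 'type=ad' in s)
            or (s.startswith('#UPLYNK-SEGMENT') and s.endswith(',ad')))

print(is_ad_fragment_start('#ANVATO-SEGMENT-INFO: type=ad,dur=30'))  # True
print(is_ad_fragment_start('#UPLYNK-SEGMENT: 123,ad'))               # True
print(is_ad_fragment_start('#UPLYNK-SEGMENT: 123,segment'))          # False
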
@@ -190,7 +177,6 @@ class HlsFD(FragmentFD):
         if external_aes_iv:
             external_aes_iv = binascii.unhexlify(remove_start(external_aes_iv, '0x').zfill(32))
         byte_range = {}
-        byte_range_offset = 0
         discontinuity_count = 0
         frag_index = 0
         ad_frag_next = False
@@ -218,11 +204,6 @@ class HlsFD(FragmentFD):
                     })
                     media_sequence += 1

-                    # If the byte_range is truthy, reset it after appending a fragment that uses it
-                    if byte_range:
-                        byte_range_offset = byte_range['end']
-                        byte_range = {}
-
                 elif line.startswith('#EXT-X-MAP'):
                     if format_index and discontinuity_count != format_index:
                         continue
@@ -236,12 +217,10 @@ class HlsFD(FragmentFD):
                     if extra_segment_query:
                         frag_url = update_url_query(frag_url, extra_segment_query)

-                    map_byte_range = {}
-
                     if map_info.get('BYTERANGE'):
                         splitted_byte_range = map_info.get('BYTERANGE').split('@')
-                        sub_range_start = int(splitted_byte_range[1]) if len(splitted_byte_range) == 2 else 0
-                        map_byte_range = {
+                        sub_range_start = int(splitted_byte_range[1]) if len(splitted_byte_range) == 2 else byte_range['end']
+                        byte_range = {
                             'start': sub_range_start,
                             'end': sub_range_start + int(splitted_byte_range[0]),
                         }
@@ -250,7 +229,7 @@ class HlsFD(FragmentFD):
                         'frag_index': frag_index,
                         'url': frag_url,
                         'decrypt_info': decrypt_info,
-                        'byte_range': map_byte_range,
+                        'byte_range': byte_range,
                         'media_sequence': media_sequence,
                     })
                     media_sequence += 1
@@ -278,7 +257,7 @@ class HlsFD(FragmentFD):
                     media_sequence = int(line[22:])
                 elif line.startswith('#EXT-X-BYTERANGE'):
                     splitted_byte_range = line[17:].split('@')
-                    sub_range_start = int(splitted_byte_range[1]) if len(splitted_byte_range) == 2 else byte_range_offset
+                    sub_range_start = int(splitted_byte_range[1]) if len(splitted_byte_range) == 2 else byte_range['end']
                     byte_range = {
                         'start': sub_range_start,
                         'end': sub_range_start + int(splitted_byte_range[0]),
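Both sides implement the `#EXT-X-BYTERANGE:<length>[@<offset>]` rule that a sub-range without an explicit offset continues where the previous one ended; they only differ in whether that running offset lives in `byte_range_offset` or in `byte_range['end']`. A standalone sketch of the parsing rule (`parse_byterange` is an invented name for the example):

def parse_byterange(value, previous_end=0):
    """Parse '<length>[@<offset>]' into absolute start/end byte positions."""
    length, _, offset = value.partition('@')
    start = int(offset) if offset else previous_end
    return {'start': start, 'end': start + int(length)}

first = parse_byterange('75232@0')               # {'start': 0, 'end': 75232}
second = parse_byterange('82112', first['end'])  # continues at byte 75232
print(first, second)
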
@@ -15,6 +15,7 @@ from ..utils import (
     ThrottledDownload,
     XAttrMetadataError,
     XAttrUnavailableError,
+    encodeFilename,
     int_or_none,
     parse_http_range,
     try_call,
@@ -57,8 +58,9 @@ class HttpFD(FileDownloader):

         if self.params.get('continuedl', True):
             # Establish possible resume length
-            if os.path.isfile(ctx.tmpfilename):
-                ctx.resume_len = os.path.getsize(ctx.tmpfilename)
+            if os.path.isfile(encodeFilename(ctx.tmpfilename)):
+                ctx.resume_len = os.path.getsize(
+                    encodeFilename(ctx.tmpfilename))

         ctx.is_resume = ctx.resume_len > 0

|
|||||||
ctx.resume_len = byte_counter
|
ctx.resume_len = byte_counter
|
||||||
else:
|
else:
|
||||||
try:
|
try:
|
||||||
ctx.resume_len = os.path.getsize(ctx.tmpfilename)
|
ctx.resume_len = os.path.getsize(encodeFilename(ctx.tmpfilename))
|
||||||
except FileNotFoundError:
|
except FileNotFoundError:
|
||||||
ctx.resume_len = 0
|
ctx.resume_len = 0
|
||||||
raise RetryDownload(e)
|
raise RetryDownload(e)
|
||||||
@ -348,7 +350,7 @@ class HttpFD(FileDownloader):
|
|||||||
self.try_rename(ctx.tmpfilename, ctx.filename)
|
self.try_rename(ctx.tmpfilename, ctx.filename)
|
||||||
|
|
||||||
# Update file modification time
|
# Update file modification time
|
||||||
if self.params.get('updatetime'):
|
if self.params.get('updatetime', True):
|
||||||
info_dict['filetime'] = self.try_utime(ctx.filename, ctx.data.headers.get('last-modified', None))
|
info_dict['filetime'] = self.try_utime(ctx.filename, ctx.data.headers.get('last-modified', None))
|
||||||
|
|
||||||
self._hook_progress({
|
self._hook_progress({
|
||||||
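All three HttpFD hunks above revolve around the same resume bookkeeping: the size of an existing partial file becomes the resume offset, which is later sent as a Range header. A hedged standalone sketch of that idea (the helper name and the `.part` path are assumptions for the example, not yt-dlp's API):

import os

def resume_state(tmpfilename, continuedl=True):
    """Return (resume_len, extra_headers) for resuming a partial download."""
    resume_len = 0
    if continuedl and os.path.isfile(tmpfilename):
        resume_len = os.path.getsize(tmpfilename)
    headers = {'Range': f'bytes={resume_len}-'} if resume_len else {}
    return resume_len, headers

print(resume_state('video.mp4.part'))  # (0, {}) unless such a partial file exists
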
Some files were not shown because too many files have changed in this diff