youtube-dl: [YouTube] Randomly slow youtube download speed

Checklist

  • [*] I’m reporting a broken site support
  • [*] I’ve verified that I’m running youtube-dl version 2021.06.06
  • [*] I’ve checked that all provided URLs are alive and playable in a browser
  • [*] I’ve checked that all URLs and arguments with special characters are properly quoted or escaped
  • [*] I’ve searched the bugtracker for similar issues including closed ones

Verbose log

root@server:~# youtube-dl https://youtu.be/8PecfdkEM2Y --source-address 64.31.22.34 --verbose
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: [u'https://youtu.be/8PecfdkEM2Y', u'--source-address', u'64.31.22.34', u'--verbose']
WARNING: Assuming --restrict-filenames since file system encoding cannot encode all characters. Set the LC_ALL environment variable to fix this.
[debug] Encodings: locale ANSI_X3.4-1968, fs ANSI_X3.4-1968, out ANSI_X3.4-1968, pref ANSI_X3.4-1968
[debug] youtube-dl version 2021.06.06
[debug] Python version 2.7.13 (CPython) - Linux-4.9.0-15-amd64-x86_64-with-debian-9.13
[debug] exe versions: ffmpeg 4.1.2, ffprobe 4.1.2, phantomjs 2.1.1
[debug] Proxy map: {}
[youtube] 8PecfdkEM2Y: Downloading webpage
[debug] Default format spec: bestvideo+bestaudio/best
[debug] Invoking downloader on u'https://r2---sn-ab5l6n67.googlevideo.com/videoplayback?expire=1623953531&ei=GzzLYMyiKojn8wSQ-42YBA&ip=64.31.22.34&id=o-AMDSXCc14P0ndQaRCkihVxb1SwdClDAhUbq8xP8nD2ss&itag=313&aitags=133%2C134%2C135%2C136%2C137%2C160%2C242%2C243%2C244%2C247%2C248%2C271%2C278%2C313%2C394%2C395%2C396%2C397%2C398%2C399%2C400%2C401&source=youtube&requiressl=yes&mh=e_&mm=31%2C26&mn=sn-ab5l6n67%2Csn-vgqsrnee&ms=au%2Conr&mv=m&mvi=2&pl=24&initcwndbps=14958750&vprv=1&mime=video%2Fwebm&ns=gA0RPx_S5d5eK1Qi3zH6xdcF&gir=yes&clen=202533160&dur=299.160&lmt=1617981974189982&mt=1623931501&fvip=2&keepalive=yes&fexp=24001373%2C24007246&c=WEB&txp=5532432&n=0B2xjorVelV0Xu6Hq&sparams=expire%2Cei%2Cip%2Cid%2Caitags%2Csource%2Crequiressl%2Cvprv%2Cmime%2Cns%2Cgir%2Cclen%2Cdur%2Clmt&lsparams=mh%2Cmm%2Cmn%2Cms%2Cmv%2Cmvi%2Cpl%2Cinitcwndbps&lsig=AG3C_xAwRQIhAI7W7dT0pYaOQgxn1mHYX3js6NByrqiykD9fsPJs3kAXAiApHiHXVdDMO1k6OKyg2sAb1PMyMO1jfgtZV5R-7frcpw%3D%3D&sig=AOq0QJ8wRgIhAPdD-seXFXT-yOEqoIqCQfPnRqMLASvU8SbymG5TPpNhAiEA1pVFpS3hDYqTVe1ia5sDOi9RaPf3BCuT94XB-vICq_E='
[download] Destination: Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f313.webm
[download]  18.4% of 193.15MiB at 77.46KiB/s ETA 34:44Terminated

While the download was in progress, I ran the exact same command in another terminal, another folder, and the download was completed in a few seconds:

root@server:~/test# youtube-dl https://youtu.be/8PecfdkEM2Y --source-address 64.31.22.34 --verbose
[debug] System config: []
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: [u'https://youtu.be/8PecfdkEM2Y', u'--source-address', u'64.31.22.34', u'--verbose']
WARNING: Assuming --restrict-filenames since file system encoding cannot encode all characters. Set the LC_ALL environment variable to fix this.
[debug] Encodings: locale ANSI_X3.4-1968, fs ANSI_X3.4-1968, out ANSI_X3.4-1968, pref ANSI_X3.4-1968
[debug] youtube-dl version 2021.06.06
[debug] Python version 2.7.13 (CPython) - Linux-4.9.0-15-amd64-x86_64-with-debian-9.13
[debug] exe versions: ffmpeg 4.1.2, ffprobe 4.1.2, phantomjs 2.1.1
[debug] Proxy map: {}
[youtube] 8PecfdkEM2Y: Downloading webpage
[debug] Default format spec: bestvideo+bestaudio/best
[debug] Invoking downloader on u'https://r2---sn-ab5l6n67.googlevideo.com/videoplayback?expire=1623953574&ei=RjzLYLyWBIHGhwak2IzwBg&ip=64.31.22.34&id=o-AFgzn9Zdn1KSGbuO09ZkpRma9GzWqWUApkavXXN93_F6&itag=313&aitags=133%2C134%2C135%2C136%2C137%2C160%2C242%2C243%2C244%2C247%2C248%2C271%2C278%2C313%2C394%2C395%2C396%2C397%2C398%2C399%2C400%2C401&source=youtube&requiressl=yes&mh=e_&mm=31%2C29&mn=sn-ab5l6n67%2Csn-ab5szne7&ms=au%2Crdu&mv=m&mvi=2&pl=24&initcwndbps=14958750&vprv=1&mime=video%2Fwebm&ns=hoSqVT_3ust7ILej5iYoT40F&gir=yes&clen=202533160&dur=299.160&lmt=1617981974189982&mt=1623931501&fvip=2&keepalive=yes&fexp=24001373%2C24007246&c=WEB&txp=5532432&n=l6PFqwM4uREk1JKwP&sparams=expire%2Cei%2Cip%2Cid%2Caitags%2Csource%2Crequiressl%2Cvprv%2Cmime%2Cns%2Cgir%2Cclen%2Cdur%2Clmt&lsparams=mh%2Cmm%2Cmn%2Cms%2Cmv%2Cmvi%2Cpl%2Cinitcwndbps&lsig=AG3C_xAwRQIhAJU5426qtqf6BwLiB48OKkcK_ATe_S9jDPYAVbttM7T1AiBoVGwb1ZBagaiUyKeVGLv562cloZeh5xBT2lFZx61gyQ%3D%3D&sig=AOq0QJ8wRAIgWSAyj1JyqoTHFWMdJ04gjcIDJ8tFryw5sNsf5soVaPQCIE81_29FoDoAOeRv2_hdcnBi2-4XxvlGvbX9YNajLi6y'
[download] Destination: Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f313.webm
[download] 100% of 193.15MiB in 00:04
[debug] Invoking downloader on u'https://r2---sn-ab5l6n67.googlevideo.com/videoplayback?expire=1623953574&ei=RjzLYLyWBIHGhwak2IzwBg&ip=64.31.22.34&id=o-AFgzn9Zdn1KSGbuO09ZkpRma9GzWqWUApkavXXN93_F6&itag=251&source=youtube&requiressl=yes&mh=e_&mm=31%2C29&mn=sn-ab5l6n67%2Csn-ab5szne7&ms=au%2Crdu&mv=m&mvi=2&pl=24&initcwndbps=14958750&vprv=1&mime=audio%2Fwebm&ns=hoSqVT_3ust7ILej5iYoT40F&gir=yes&clen=5324207&dur=299.201&lmt=1617980855722369&mt=1623931501&fvip=2&keepalive=yes&fexp=24001373%2C24007246&c=WEB&txp=5531432&n=l6PFqwM4uREk1JKwP&sparams=expire%2Cei%2Cip%2Cid%2Citag%2Csource%2Crequiressl%2Cvprv%2Cmime%2Cns%2Cgir%2Cclen%2Cdur%2Clmt&lsparams=mh%2Cmm%2Cmn%2Cms%2Cmv%2Cmvi%2Cpl%2Cinitcwndbps&lsig=AG3C_xAwRgIhAIqCBWf7PmHH8y1wnG8QvB-0vxKzRG26qCAIWdgOAT1PAiEAwTSl4J0e9L7emiYUDKV_YjfApo2gchge3iVfrYH76lo%3D&sig=AOq0QJ8wRQIgODhcEL0uT0u1nXP41IARrB63CfmpDmUzl6HhPwrXOTwCIQD-OUE152N7yzYXcgU_tAPCP0YdRdfVyFlHE4kIYyyoew=='
[download] Destination: Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f251.webm
[download] 100% of 5.08MiB in 00:00
[ffmpeg] Merging formats into "Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.webm"
[debug] ffmpeg command line: ffmpeg -y -loglevel 'repeat+info' -i 'file:Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f313.webm' -i 'file:Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f251.webm' -c copy -map '0:v:0' -map '1:a:0' 'file:Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.temp.webm'
Deleting original file Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f313.webm (pass -k to keep)
Deleting original file Hatik_-_daron_des_demain_session_acoustique-8PecfdkEM2Y.f251.webm (pass -k to keep)

Description

Hello,

Important: The problem is random, maybe a 1-in-8 chance of reproducing it. You have to download several videos in a row (about 10) to notice it.

For a few weeks now, a YouTube video download is randomly slowed to around 48 KiB/s, so a short 5-minute video takes 5-10 minutes to download instead of 4-5 seconds; often the download does not succeed and stops after a few minutes.

This happens on several servers and with several internet providers, as well as on my private connection.

I even managed to launch a second download in parallel while the first one was taking a long time: the second one finished in a few seconds, the first one took 5 minutes and was interrupted before the end. Same video, same connection, same command. (Tested only with IPv4 because I don’t have IPv6 on my servers or on my internet connection.)

Attached is an extraction of the results with --dump-pages. ko.txt ok.txt

About this issue

  • Original URL
  • State: closed
  • Created 3 years ago
  • Reactions: 168
  • Comments: 342 (50 by maintainers)


Most upvoted comments

Since there haven’t been any takers, I thought I’d poke at it and write down what I found (while writing a simple transcoder for another scripting language, for my own use - it already works, but it would count as an external dependency). I will explain what Youtube’s n token function does. This is not a patch, and it is only of interest to other devs if they wish to implement it themselves or improve the current interpreter. But if you ever wanted to know what useless job Google developers get paid for, continue reading!

Here’s your keygen music while reading. Sort of.

Overview of the control flow (already known):

  1. The Youtube API provides you with n - your video access token
  2. If their new changes apply to your client (they do for “web”) then it is expected your client will modify n based on internal logic. This logic is inside player...base.js
  3. n is modified by a cryptic function
  4. Modified n is sent back to server as proof that we’re an official client. If you send n unmodified, the server will eventually throttle you.

This cryptic function is currently named “lha” at line 1162 of the current player. My explanation is based on this player. The function code is randomly generated with each player release. It consists of:

  1. input (argument “a”), takes n as string
  2. b = a.split("") this is where n will be modified letter by letter
  3. c - the really long array that’s supposed to confuse you
  4. a lot of calls into random indexes of c[]. This is where magic happens
  5. return: new value of n that holds the key to Narnia. If anything goes wrong, an enhanced_except_ is returned along with original n
    1. This is interesting and did not exist in very early players. The developer who created this ran into problems and needed a way to debug them. They came up with the idea to return all values needed for debugging: n and the seed needed to generate the cryptic function: "enhanced_except_nJMB4-z-_w8_" + original_n - the random base64 letters are the parameters used by the code generator to create this function. Seed = nJMB4-z- (changed with each player) and w8 is probably the generator version (hadn’t changed yet).

The c array contains 3 references to itself and 3 references to b. The b’s are assigned at the time of array creation; the c’s are filled in where the placeholder null was placed (indexes 3, 4, 9). Are three different references to the same variable supposed to make correlation harder? They didn’t. I don’t know what that’s useful for.

(4) Function calls look like this: c[39](c[3], c[22]) where:

  • c[39] is one of the functions in array to shuffle data with
  • c[3] is either b or c - to which the function will apply
  • c[22] is an integer/string argument. The only function that doesn’t need a 2nd argument is Array.reverse()

What are the long numbers for? Numbers like 1222607021 and -1989164346 are indexes into either b or c: simply something like 1222607021 % 52 = 25, aka c[25].
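That reduction is just modular arithmetic; here is a small Python sketch (the sign-safe normalisation mirrors the (e % d.length + d.length) % d.length form the player’s own helpers use, since JS’s % can return negative values):

```python
def js_mod(e, length):
    # sign-safe modulo, mirroring (e % d.length + d.length) % d.length
    # from the player; JS's % keeps the sign of the dividend
    return (e % length + length) % length

# the literals mentioned above, resolved against a 52-element c[]
assert js_mod(1222607021, 52) == 25   # i.e. c[25]
assert js_mod(-1989164346, 52) == 26
```

(In Python, e % length is already non-negative for a positive length, so the extra normalisation only matters when porting the JS expression literally.)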

If the argument is a string, it is used for “translateA/B” functions (see below). Nothing is eval'ed or anything (even if you see javascript keywords)

Here’s a visual representation (for player 2840754e, Aug 4/5th 2021) of what the shuffling of data looks like:

[diagram “diag-4x”: visual representation of the shuffle sequence]

Arrows track objects of c between shuffles. If all arrows are straight on this diagram, it means a shuffle on b was performed instead (The above diagram describes this order: CCC, BBB, C, BB, C, BB, CC, B, CCCCC, B, CCCCC, BB, CCC, BBBBB). There’re only 3 kinds of shuffle operations: swap two elements, reverse entire array and move all elements by E positions.

Now it’s time to explain what kind of shuffle functions there are.

  • “unshiftPop” aka moveElements: shifts elements up/down by E positions in the array D, wrapping around (last element becomes first etc.)
    • for (e = (e % d.length + d.length) % d.length; e--;) d.unshift(d.pop())
  • “spliceReverseUnshift” - same as above
    • d.splice(-e).reverse().forEach(function(f) { d.unshift(f) })
  • “translateA/B”: both functions with switch-case: these generate a base64 alphabet as a LookUpTable and use it to modify N
  • “spliceOnce”: deletes one element at position E, like Python’s .pop(E)
    • d.splice(e, 1)
  • “spliceTwice” aka swap0: swaps index 0 with index E
    • d.splice(0, 1, d.splice(e, 1, d[0])[0])
  • a function that calls Array.reverse on the passed array
  • “pushSplice” aka literally Array.reverse again
    • for (var e = d.length; e;) d.push(d.splice(--e, 1)[0])
  • The latest player has a simple Array.push function too.
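In Python these primitives collapse to a handful of list operations. A minimal sketch (function names are mine; treat it as illustration, not a drop-in implementation):

```python
def norm(e, n):
    # sign-safe modulo, as in the player's (e % d.length + d.length) % d.length
    return (e % n + n) % n

def move_elements(d, e):
    # "unshiftPop"/"spliceReverseUnshift": rotate right by e, wrapping around
    e = norm(e, len(d))
    if e:
        d[:] = d[-e:] + d[:-e]

def splice_once(d, e):
    # deletes one element at position e, like Python's list.pop(e)
    d.pop(e)

def swap0(d, e):
    # "spliceTwice": swap index 0 with index e
    e = norm(e, len(d))
    d[0], d[e] = d[e], d[0]

def reverse(d, e=None):
    # both reverse variants boil down to exactly this
    d.reverse()
```

For example, move_elements([1, 2, 3, 4, 5], 7) rotates by 7 % 5 = 2, giving [4, 5, 1, 2, 3], which matches repeating d.unshift(d.pop()) twice.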

As you can see some functions are implemented TWICE with different code. It is beyond me why anyone would do this. I really hope it was a “puzzle task” for an intern and not a real attempt.

Alright, so you call some simple functions that shuffle both c and b, nothing complicated. While you could incrementally deduce and reduce the “randomized” index values inside the try-catch block without executing any code, the functions translateA and translateB require you to execute their code (either interpreter or reimplementation).

However you do NOT need to implement the switch-case statement. Whoever created this Rube-Goldberg machine was not as smart as they imagine. Let’s take a look at the full function code for “translateA/B”

function(d, e) {
	for (var f = 64, h = []; ++f - h.length - 32;) {
		switch (f) {
			case 91:
				f = 44;
				continue;
			case 123:
				f = 65;
				break;
			case 65:
				f -= 18;
				continue;
			case 58:
				f = 96;
				continue;
			case 46:
				f = 95
		}
		h.push(String.fromCharCode(f))
	}
	d.forEach(function(l, m, n) {
		this.push(n[m] = h[(h.indexOf(l) - h.indexOf(this[m]) + m - 32 + f--) % h.length])
	}, e.split(""))
}

Here’s the equivalent function with my comment:

function(d, e) {
// look, there's really no need for overengineering even if your IQ is below average
// and you can't write iterators to generate this, like I show below:
	h = ["0", "1" ... "a", "b", ... "A", ... "Y", "Z", "-", "_"]; // shortened array for readability
	d.forEach(function(l, m, n) {
		this.push(n[m] = h[(h.indexOf(l) - h.indexOf(this[m]) + 64) % h.length])
	}, e.split(""))
}
  1. At the end of the for-loop, f is always = 96
  2. IF and only IF there’s a case 65: in the switch then the array is base64 in the following order: 0aA-_, otherwise it is Aa0-_ (A-Za-z0-9-_)
  3. Since m increases while f-- decreases and -32 remains constant, the expression + m - 32 + f-- can be replaced with a constant +64
  4. I don’t wish to know how many hours/days the developer wasted to create and debug this useless control flow.
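Put together, the whole translate step then fits in a few lines of Python. This is a sketch assuming the “0aA-_” alphabet variant of this player; translate and ALPHABET are my own names:

```python
ALPHABET = (
    [chr(c) for c in range(ord("0"), ord("9") + 1)]
    + [chr(c) for c in range(ord("a"), ord("z") + 1)]
    + [chr(c) for c in range(ord("A"), ord("Z") + 1)]
    + ["-", "_"]
)  # 64 characters, in the 0aA-_ order the switch-case loop builds

def translate(d, e):
    # d: list of n's characters (modified in place, as in the JS);
    # e: the key string passed from the c[] array
    key = list(e)
    for m, ch in enumerate(d):
        # +64 only guards against a negative JS modulo; mod 64 it is a no-op
        c = ALPHABET[(ALPHABET.index(ch) - ALPHABET.index(key[m]) + 64) % 64]
        d[m] = c
        key.append(c)  # this.push(...): each output extends the key itself
    return d
```

The self-extending key is why the result depends on earlier output characters once m exceeds the original key length; that is the part that makes this step stateful.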

The translateA/B functions modify both N and themselves (this), hence modifying the LUT and hence requiring an interpreter; I do not think it is possible to reduce this one (unlike everything else).

At the end of all these, in essence, trivial transformations, you get the new N. That’s all this long-winded function does. I hope you had fun reading ❤️

The real problem is the reimplementation of youtube-dl’s basic interpreter, because its current approach is a dead end: I tried it in the other scripting language and it works… with ever more hacks applied. Trust me and start with a basic parser. Or use a library; that’s what they’re for 😉

PS: If you want to have fun with Google engineers, let their cryptic function deliberately “crash” by sending the “enhanced_except” message to their endpoint. They track these messages to react to bugs. Why not send them a message? 😎 Here’s the command to get the current “enhanced_except” marker:

curl -L "https://youtube.com/$(curl -L https://youtube.com/ | grep -oP 's/player/([^/]+)/player_ias\.vflset/[\w_]+/base\.js' | head -n 1)" | grep -oP 'enhanced_except[\w_-]+'

Then all you need to do is modify and repeat the /videoplayback? call from F12 (Browser’s Network tab): Replace value of &n with enhanced_except_..._YourMessage (don’t use special symbols)

byebye~

PPS: Bonus containing 81 base.js players - collection-of-81-youtube-players.zip

I noticed this too. I have always been using it together with aria2, but I have the same problem as you. I hope more people will provide helpful information.

This is incredibly helpful. wow. Thanks.

Deliberate sabotage of YouTube-dl by Google. Sucks to see this continue.

Please, Googlers, you are reading this. We are just trying to download a small fraction of YouTube videos for offline playback, personal use. Please don’t sabotage us.

On Tue, 22 Jun 2021, 11:29 pm Aaron Wojnowski wrote:

I have the solution for this issue. I do not have the bandwidth to actually implement it in the source, but this should be more than enough information to do so.

The issue is that YouTube is modifying the n query parameter on the video playback URLs in a very similar fashion as the signature cipher. There’s a pure function in the player JavaScript which takes the n parameter as input and outputs an n parameter which is not subject to throttling.

As an example, let’s look at https://www.youtube.com/s/player/52dacbe2/player_ias.vflset/et_EE/base.js. The code in question which modifies n is as follows:

a.C&&(b=a.get("n"))&&(b=Dea(b),a.set("n",b))}};

In this case, Dea is the function we are looking for:

function(a){var b=a.split(""),c=[-704589781,1347684200,618483978,1439350859,null,63715372,function(d){d.reverse()},

159924259,-312652635,function(d,e){for(e=(e%d.length+d.length)%d.length;e--;)d.unshift(d.pop())}, -1208266546,function(d,e){d.push(e)},

-2143203774,-103233324,b,function(d,e){e=(e%d.length+d.length)%d.length;d.splice(0,1,d.splice(e,1,d[0])[0])}, 837025862,1654738381,1184416163,1983454500,b,-200631744,1130073900,null,2047141935,-337180565,1654738381,1913297860,-399114812,b,714887321,function(d,e){for(var f=64,h=[];++f-h.length-32;){switch(f){case 58:f-=14;case 91:case 92:case 93:continue;case 123:f=47;case 94:case 95:case 96:continue;case 46:f=95}h.push(String.fromCharCode(f))}d.forEach(function(l,m,n){this.push(n[m]=h[(h.indexOf(l)-h.indexOf(this[m])+m-32+f--)%h.length])},e.split(""))},

626129880,"pop",1331847507,-103233324,2092257394,function(d,e){for(e=(e%d.length+d.length)%d.length;e--;)d.unshift(d.pop())}, 669147323,1184416163,-216051470,193134360,null,2045900346,1675782975,-1997658115,function(d,e){e=(e%d.length+d.length)%d.length;var f=d[0];d[0]=d[e];d[e]=f},

1675782975,161770346,function(d,e){e=(e%d.length+d.length)%d.length;d.splice(-e).reverse().forEach(function(f){d.unshift(f)})}, function(d){for(var e=d.length;e;)d.push(d.splice(--e,1)[0])}, 1454215184,-2123929123];c[4]=c;c[23]=c;c[42]=c;try{c6,c6,c6,c43,c30,c43,c 25,c52,c43,c40,c21,c0,c0,c21,c37,c16,c51,c10,c7,c7,c6,c2,c38,c3,c3,c3, c49,c17,c28,c5,c46,c37,c37,c41,c41,c16,c12,c14,c52,c39,c22}catch(d){return"enhanced_except_AAAAAAAAAAE_"+a}return b.join("")};

This does change with different player versions, so youtube-dl will need to extract this for every video that it fetches and then modify the n parameter as such.
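For the extraction step, here is a hedged sketch of locating the function name in the player JS, keyed on the a.get("n") pattern quoted above (illustrative only; other player revisions may reference the function through an array index, which this regex will not catch):

```python
import re

def find_n_function_name(player_js):
    # matches ...a.get("n"))&&(b=Dea(b)... and captures "Dea"
    m = re.search(r'\.get\("n"\)\)&&\(b=([a-zA-Z0-9$]+)\(b\)', player_js)
    if m is None:
        raise ValueError("n-transform pattern not found; player layout changed")
    return m.group(1)

snippet = 'a.C&&(b=a.get("n"))&&(b=Dea(b),a.set("n",b))}};'
assert find_n_function_name(snippet) == "Dea"
```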

Hope this is helpful.


I’m having exactly the same issue.

Also, the VLC descramble function (implementation not using jsinterp) is now integrated into yt-dl with the above PR, getting 1.2MB/s (which is about as fast as any download over this internet connection):

$ youtube-dl -v 'https://youtu.be/8PecfdkEM2Y/'
[debug] System config: ['--prefer-ffmpeg']
[debug] User config: []
[debug] Custom config: []
[debug] Command-line args: ['-v', 'https://youtu.be/8PecfdkEM2Y/']
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2021.06.06
[debug] Python version 3.5.2 (CPython) - Linux-4.4.0-210-generic-i686-with-Ubuntu-16.04-xenial
[debug] exe versions: avconv 4.3, avprobe 4.3, ffmpeg 4.3, ffprobe 4.3
[debug] Proxy map: {}
[youtube] 8PecfdkEM2Y: Downloading webpage
[youtube] 8PecfdkEM2Y: Downloading player 9216d1f7
[debug] Default format spec: bestvideo+bestaudio/best
[debug] Invoking downloader on 'https://r2---sn-cu-aigsl.googlevideo.com/videoplayback?n=AzFBWyps6chRhQ&mt=1635769256&ms=au%2Crdu&lsig=AG3C_xAwRAIgA7qx-lQaYPaJjtlppOMw_yo0Kvu9z1M6j48YarzvSOMCIAT5pYtZtyj_3iFG8SDHnwngxGqm8rFqdzOxdY085GTi&mh=e_&id=o-APG675UbzaZimZLB98431899VoeZ8Z3qPtHxETjkzVNU&mn=sn-cu-aigsl%2Csn-cu-c9iz&mvi=2&sig=AOq0QJ8wRgIhAOzw9sCrNxlxOTvfSd5StfEFOGVaL70yuGq7RxpFYaHkAiEAng36XrZKzxgRaJ2V5jges2sbZBpari08ckSFWVql16E%3D&mv=m&ns=eBZ-xMkl3BJq306mBkHe1acG&initcwndbps=1092500&fexp=24001373%2C24007246&c=WEB&gir=yes&dur=299.160&ip=51.6.64.171&pcm2cms=yes&keepalive=yes&ei=Tt9_Yf27D9nSxN8P5NS74A4&expire=1635791790&source=youtube&txp=5532432&clen=202533160&pl=25&requiressl=yes&sparams=expire%2Cei%2Cip%2Cid%2Caitags%2Csource%2Crequiressl%2Cvprv%2Cmime%2Cns%2Cgir%2Cclen%2Cdur%2Clmt&itag=313&lmt=1617981974189982&mm=31%2C29&mime=video%2Fwebm&aitags=133%2C134%2C135%2C136%2C137%2C160%2C242%2C243%2C244%2C247%2C248%2C271%2C278%2C313%2C394%2C395%2C396%2C397%2C398%2C399%2C400%2C401&vprv=1&lsparams=mh%2Cmm%2Cmn%2Cms%2Cmv%2Cmvi%2Cpcm2cms%2Cpl%2Cinitcwndbps&fvip=2'
[download] Destination: Hatik - daron dès demain (session acoustique)-8PecfdkEM2Y.f313.webm
[download] 100% of 193.15MiB in 02:46
[debug] Invoking downloader on 'https://r2---sn-cu-aigsl.googlevideo.com/videoplayback?n=AzFBWyps6chRhQ&pl=25&ms=au%2Crdu&lsig=AG3C_xAwRQIgcuyanQrG4FyQWjPNSEzOcOBTwz6VQFAJ_wYC0YUnNZgCIQDRW3AlZVWkv4SuyZiVEO13yFJ8t4eIp4xQs-X_sf0Ejw%3D%3D&mh=e_&id=o-APG675UbzaZimZLB98431899VoeZ8Z3qPtHxETjkzVNU&mn=sn-cu-aigsl%2Csn-cu-c9iz&mvi=2&sig=AOq0QJ8wRAIgVuXQK5oRtDvUTPyjJuxOC7JyFCf58a_NRymOYDi5x6cCICNLywuRiDGNtjHgq9JPGy8o7z63PSpQWt1txR7d6G7-&mt=1635769256&mv=m&ns=eBZ-xMkl3BJq306mBkHe1acG&initcwndbps=1092500&fexp=24001373%2C24007246&c=WEB&gir=yes&dur=299.201&ip=51.6.64.171&pcm2cms=yes&ei=Tt9_Yf27D9nSxN8P5NS74A4&expire=1635791790&source=youtube&txp=5531432&keepalive=yes&clen=5324207&requiressl=yes&sparams=expire%2Cei%2Cip%2Cid%2Citag%2Csource%2Crequiressl%2Cvprv%2Cmime%2Cns%2Cgir%2Cclen%2Cdur%2Clmt&itag=251&lmt=1617980855722369&mm=31%2C29&mime=audio%2Fwebm&vprv=1&lsparams=mh%2Cmm%2Cmn%2Cms%2Cmv%2Cmvi%2Cpcm2cms%2Cpl%2Cinitcwndbps&fvip=2'
[download] Destination: Hatik - daron dès demain (session acoustique)-8PecfdkEM2Y.f251.webm
[download] 100% of 5.08MiB in 00:04
...

Plainly a solution that uses a real JS interpreter is more browser-like and potentially less subject to disruption from changes to the challenge implementation, at least until JS gets redefined (arguably G is already busy doing this). However VLC’s matching implementation of the signature algorithm has been observed to be effectively stable for 8 years.

This version of extractor/youtube.py (updated) is a drop-in replacement for the one in the 2021.06.06 release, as demonstrated in the above log.


They hate people of China.

This YouTube download throttling has to stop. I bought a new HDD to watch my YouTube videos offline and what happened? Download speed at 70 kB/s!!! That takes forever. It is terrible. Everyone has the right to watch their videos offline. It is the only way to download playlists and more, automatically.

YouTube is also able to block youtube-dl completely. YouTube already did this during the music copyright affair: it didn’t matter which youtube-dl version you used, nothing worked, everything was blocked. Google can do it again anytime.

Can confirm that this version works with good download speed and no errors.

VERSION=622b87d0fb10a50283b12ecd5304e66dd396809b
wget https://github.com/ytdl-org/youtube-dl/archive/$VERSION.zip
unzip $VERSION.zip
cd youtube-dl-$VERSION
sudo pip install .

From what I understand, if youtube is throttling downloads aria2c might help (https://wiki.archlinux.org/title/Youtube-dl#Faster_downloads)? I have absolutely no experience with it (if someone does and could link something that would be greatly appreciated) but I think it might be worth a try if someone here does have it set up.

@nao20010128nao What I find suspicious is the download speed: for me it is always around 77 kB/s when this happens. This is way too slow even for playback, and it happens independently of the format; f137 takes forever like this. If this were purely a problem on YouTube’s side, I’d expect to hit it in the browser too at times, but I never experience buffering. My hunch would be some change on YouTube’s side, either accidental or nefarious, that leads to this problem.

Hello, I’m the VLC maintainer of the YouTube playback feature in the VLC media player. There’s been talk about javascript interpreters, and I wanted to share how you don’t need to run the descrambling code through a javascript interpreter to make it work.

In VLC, extractors are website parser scripts written in Lua. We emulate the javascript descrambling code within the Lua script using ad hoc code, on the basis that it uses only a known set of structured transformations. This approach is less reliable and requires more maintenance than a generic interpreter as slight changes in the code or javascript minifier make it break every now and then; but it’s worked for us for the past 8 years for signature descrambling, and we just released a version that descrambles the “n” parameter too that’s required to solve this throttling. So this is a proven concept, and another possibility.

If you want to take a look, see the n_descramble() and sig_descramble() functions in:

https://code.videolan.org/videolan/vlc/-/blob/master/share/lua/playlist/youtube.lua

Of course, as long as playback in VLC works, it’s always another fallback alternative to youtube-dl to use VLC as the URL extractor, and then copy-paste the working direct video file URL from VLC to pass it to curl or wget 😃

Dear God! This has become a REALLY FRUSTRATING problem. And it now has to do with Javascript, which I know NOTHING about. Someone PLEASE, PLEASE come up with a solution.

@awojnowski You seem to understand the problem. Would you PLEASE examine Youtube-DL’s Python code and make a suggestion on what edits and additions to make. Please give us a few more breadcrumbs to work with.

@pukkandan I REALLY appreciate your efforts with the yt-dlp project, but for this particular problem, is there a better solution than --throttled-rate, which just doesn’t do it for me? It just keeps re-extracting the webpage, and then keeps getting throttled.

@tfdahlin If you succeed in getting pytube to bypass this issue, would you please help us out and make suggestions for edits to Youtube-DL’s extractor?

@shoxie007 I appreciate your enthusiasm but please refrain from commenting solely to nag people to speed up their work. For other people reading this issue, it provides no useful information at all, and especially in the context of an open-source project, it comes off as entitled.

Additionally, @awojnowski does not need to provide any more breadcrumbs; asking for them shows that you are unfamiliar with the current state of the issue. The JavaScript function YouTube uses to unscramble the parameter is complex, and it seems that it cannot be directly implemented in Python (i.e. without manually rewriting it) at this time.

Some updates:

Using HTTP3/QUIC appears to bypass the need to descramble the n-sig. This is why opening the throttled video playback link in your browser results in normal speeds. Though this is not feasible to implement in youtube-dl, it may be useful as a temporary workaround for some. You will need to use an external downloader with HTTP3/QUIC support. See https://github.com/ytdl-org/youtube-dl/issues/30132 for more details.

As for the actual fix, @pukkandan has done some great work at upgrading jsinterp to support the n-sig descrambling javascript code. The actual working n-sig throttling fix has now been merged into yt-dlp master (https://github.com/yt-dlp/yt-dlp/commit/404f611f1c4aa516fbc4301aa7b8f734ee4bc67b). If/when the youtube-dl devs return, this could probably be backported to youtube-dl in some way. See https://github.com/yt-dlp/yt-dlp/pull/1437 for more details.

try the workaround in yt-dlp. @tzarebczan @rebane2001

yt-dlp "https://www.youtube.com/watch?v=exampleURL" --extractor-args youtube:player_client=android --throttled-rate 100K

see https://github.com/yt-dlp/yt-dlp/releases/tag/2021.07.07

After getting close to finishing my code to extract + emulate the function for ciphering the n parameter, I noticed something unusual.

In the video I’ve been using for testing my code, I saw that exactly one of the streams available for the video had &ratebypass=yes in the URL. This stream also had a value of n that was different from all of the other streams (i.e. the stream URL with the ratebypass parameter had an n-value of _kezA9j2kOOAqbu-q, while the remaining streams had a value of rbQeh2OaABvsWEtCZ).

I haven’t yet tested, and can’t tonight, but it’s possible that one of the stream URLs may hold the value of n required for the other streams to bypass the rate-limiting. If somebody else wants to test this theory on additional videos while I’m unable to, that may be a stopgap measure for the problem.

macOS here, on version 2021.12.17, and YouTube downloads are extremely slow, <100kbps

Thanks for your visit. That probably won’t have worked for several months since the throttling behaviour was extended to all YT/Gvideo servers.

The known effective solutions are:

this is so annoying don’t you think?

Don’t know if it helps, but since downloading with youtube-dl didn’t work anymore (because of the speed limit), I started using ClipGrab two days ago, which I had always used on Windows.

I downloaded the whole Rambalac channel with ClipGrab 3.9.6, hundreds of high-quality 4K-8K videos, over 1.2 TB. And there was not a single speed drop: non-stop 110 MB/s (1-gigabit fiber).

And that surprises me, because if I try the same with youtube-dl, I get speed-limited.

What surprises me most is that ClipGrab uses youtube-dl for downloading, 2021.06.06 to be exact.

Maybe it would help if you contacted the ClipGrab developer and asked why it works with his tool? I know him from linuxforen.de; 10-12 years ago he presented this tool on the forum and asked for help, and so on.

After 2 months and 100 comments on this post, I switched to yt-dlp (https://github.com/yt-dlp/yt-dlp) a few weeks ago and I must say that everything works perfectly.

Quite surprising that the problem has been present for at least 4 months and is still ongoing, while YouTube remains the most-used site with youtube-dl.

It’s also surprising that we didn’t get any comment from @dstftw.

Nevertheless, I’m leaving this post open for those who are interested. Cya, and thanks to the participants for their research work.

I believe I’m successfully calculating the updated value of n in my experimental branch for pytube if anybody wants to reference the code. However, this calculated value of n is different from the value present on stream URLs with the ratebypass parameter, which seems a little unusual. The calculated value of n doesn’t seem to lead to errors while downloading.

Furthermore, when I substituted the value of n from the URL with the ratebypass parameter into other URLs, there don’t seem to be any errors while downloading.

I have not yet tested whether this solves the throttling problem. Any help with doing so is greatly appreciated.

I sometimes face the same issue.

I think ~either~ YouTube ~or your internet provider~ is throttling connection between YT and youtube-dl. In addition, YouTube may transcode video while serving. I guess this is likely the cause if no one does throttling. (see https://blog.youtube/inside-youtube/new-era-video-infrastructure )

In both cases, there is nothing youtube-dl can do to fix it.

Are you saying that youtube-dl will become obsolete for youtube.com because of this update, once it spreads to all videos?

Honestly, I have my doubts. I think it’s more of a bug, or maybe related to this update, but I imagine (and hope) that the contributors of youtube-dl will be able to find a solution, as they have throughout the project’s 15 years of existence.

@n3h3m If I had to take a guess, it’s the fact that the person is speaking Chinese in an English discussion thread. While contributions from users across the globe make such a project a success, it helps very little if most here, if not all, can’t understand what the person is saying. I took the liberty of google translating it, and it appears that the post is also not adding any new information besides “hey, I’ve got the same issue”. Posts such as the one you responded to, the one I’m responding to and my own reply here are largely without purpose unless they add something to the discussion that might help in resolving the problem.

TL;DR:

  1. Generally, use English in an English discussion forum / thread
  2. Adding “me too” without other supporting information is pointless (as is this post)

Can confirm that this version works with good download speed and no errors.

VERSION=622b87d0fb10a50283b12ecd5304e66dd396809b
wget https://github.com/ytdl-org/youtube-dl/archive/$VERSION.zip
unzip $VERSION.zip
cd youtube-dl-$VERSION
sudo pip install .

I confirm, this version is downloading videos without speed limit. Thank you very much for this (temporary?) solution.

+1, no issues with yt-dlp’s workaround. Archiving thousands of videos fine without throttle.

I am switching to dlp now, the additional features and active community development has won me over.

Just do the same. It’s mostly compatible, was a drop in replacement for me.

On Wed, 14 Jul 2021, 3:08 am LE, @.***> wrote:

try the workaround in yt-dlp. @tzarebczan https://github.com/tzarebczan @rebane2001 https://github.com/rebane2001

yt-dlp “your video link here” --extractor-args youtube:player_client=android --throttled-rate 100K

see https://github.com/yt-dlp/yt-dlp/releases/tag/2021.07.07

— You are receiving this because you were mentioned. Reply to this email directly, view it on GitHub https://github.com/ytdl-org/youtube-dl/issues/29326#issuecomment-879256177, or unsubscribe https://github.com/notifications/unsubscribe-auth/AD7CVVCF6E2LUZSNML7TKBDTXRXHNANCNFSM463R5FEQ .


I just tried downloading a video and this happened:

[youtube] Downloading login page
[youtube] Looking up account info
WARNING: Unable to look up account info: HTTP Error 400: Bad Request
[youtube] P9vmxl4sXQM: Downloading webpage
[youtube] P9vmxl4sXQM: Downloading MPD manifest
[download] Destination: /Users/[ommited]/Data/downloads/YouTube/NA _NA/NA- Shinigami Eyes By Toddsson _P9vmxl4sXQM.f137.mp4
[download]  34.0% of 38.14MiB at 20.11KiB/s ETA 21:20

this is getting worse and worse

[Updated to 8e069597 fix] 17/11/2021: [Updated to 58bfb65a fix] 18/11/2021: [Updated to 622b87d fix]

For Linux users:

VERSION=622b87d0fb10a50283b12ecd5304e66dd396809b
wget https://github.com/ytdl-org/youtube-dl/archive/$VERSION.zip
unzip $VERSION.zip
cd youtube-dl-$VERSION
sudo pip install .

Works like a charm!

Can we please NOT derail this issue thread? thank you…

This issue is solved on yt-dlp.

The issue is not solved on yt-dlp. It’s avoided/bypassed.

Someday it will come back and this thread is relevant for the solution that might eventually come. So let’s keep this tidy

Edit: I’m not going to argue with you guys. You’re debating offtopic stuff in this issue thread. Go on reddit if you want to debate python and whatnot. This issue serves as a repository of knowledge for whoever and whichever fork will eventually resolve the underlying issue and the more people post, the harder it will be for that maintainer to navigate this issue.

Yes, but yt-dlp does not seem to support Python 2, so it seems I’m stuck with youtube_dl

Sounds like you have much bigger problems if you have to be staying on Python 2.

Naughty Windows users do not need to rebuild anything from scratch.

  1. Rename youtube-dl.exe to youtube-dl.zip
  2. Open it, go to \youtube_dl\extractor\
  3. Remove youtube.pyo
  4. Add youtube.py, do not touch the extension
  5. Rename youtube-dl.zip to youtube-dl.exe
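For anyone who prefers to script those steps, here is a hedged Python sketch using the standard zipfile module. It assumes the exe behaves as a plain zip archive, as the manual rename steps imply; the member paths come from step 2, and any non-zip stub in the real binary is not preserved by this sketch.

```python
import zipfile

def patch_exe_zip(exe_path, out_path, patched_py):
    # Rebuild the zip, dropping the stale compiled youtube.pyo and adding
    # a fixed youtube.py. Member paths mirror the manual steps above and
    # are assumptions about the package layout.
    drop = 'youtube_dl/extractor/youtube.pyo'
    add = 'youtube_dl/extractor/youtube.py'
    with zipfile.ZipFile(exe_path) as src, \
         zipfile.ZipFile(out_path, 'w', zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            if info.filename != drop:          # copy everything else as-is
                dst.writestr(info, src.read(info.filename))
        dst.writestr(add, patched_py)          # insert the patched source
```

Writing to a fresh archive avoids the read-only/in-place-edit problems some archivers have with members of an existing zip.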

There is a new active and good maintainer, they’re just working on a fork called yt-dlp

Can we please stop that already? yt-dlp is already mentioned many times in this thread. This issue is with youtube_dl, and to keep suggesting yt-dlp implies that we are all so stupid that we need reminding! Also, it’s not solving this issue!!

Are you saying that youtube-dl will become obsolete for youtube.com because of this update, once it spreads to all videos?

This is a common issue with browser automation (i.e. Selenium or Puppeteer). The most common solution there is to recompile the files under a different name, because YouTube is most likely familiar with youtube-dl. Like most bot-protection systems, when bizarre behavior occurs (such as bulk downloading), YouTube will reverse-engineer the process and look for file names that appear in the youtube-dl package (that’s what bot-protection systems do with Selenium and Puppeteer). Thus, a possible solution would be to recompile the files as mentioned; that way YouTube won’t recognize the youtube-dl files and won’t lower the download speed. Another solution would be to use proxies.

I believe I’m successfully calculating the updated value of n in ~my experimental branch for~ pytube if anybody wants to reference the code. However, this calculated value of n is different from the value present on stream URLs with the ratebypass parameter, which seems a little unusual. The calculated value of n doesn’t seem to lead to errors while downloading.

Furthermore, when I substituted the value of n from the URL with the ratebypass parameter into other URLs, there don’t seem to be any errors while downloading.

~I have not yet tested whether this solves the throttling problem. Any help with doing so is greatly appreciated.~

A pytube user tested the changes I made to calculate n based on the js in the base.js file, and did not see a slowdown in 100+ videos. They ran into other errors, but I believe those errors are unrelated to the new code for ciphering n. It looks like there are ~9-10 unique function calls in the javascript for computing n, but most of them are pretty simple.

The weird part was modifying the list that was passed as a reference to the function calls to get it to calculate correctly. Effectively, the way that js code works is as follows:

  1. Split n into an array of characters.
  2. Create an array of integers, strings, anonymous functions, references to n, and null (calling this array c for clarity).
  3. Replace all instances of null in c with references to c itself.
  4. Call the functions in c with arguments of different elements of c (notably, c itself gets left- and right-shifted by some of these functions. This is why the references to c are placed in the array). These calls are all made sequentially within the try section of the try-catch block.
  5. Because c has references to the array made from n, the actual array of n gets changed and can be un-split into a new string value of n.

I don’t have the time to figure out how to incorporate this into youtube-dl myself, but maybe somebody else can use this info to write a fix.
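To make the five steps above concrete, here is a minimal Python sketch of the pattern. The helper functions and the layout of the array c are illustrative toys, not the real player code; the real array has dozens of entries and changes with every player version.

```python
# Toy model of the five-step n-descrambling structure described above.

def descramble_n(n):
    b = list(n)                       # 1. split n into a char array

    def reverse(d):                   # typical helper found in the array
        d.reverse()

    def rotate(d, e):                 # JS: while(e--) d.unshift(d.pop())
        e = (e % len(d) + len(d)) % len(d)
        for _ in range(e):
            d.insert(0, d.pop())

    def swap(d, e):                   # swap d[0] with d[e]
        e = (e % len(d) + len(d)) % len(d)
        d[0], d[e] = d[e], d[0]

    # 2. mixed array of ints, functions, references to b, and None
    c = [reverse, 2, b, rotate, None, swap, 1]

    # 3. replace None entries with references to c itself, in place,
    #    so the self-reference survives
    for i, x in enumerate(c):
        if x is None:
            c[i] = c

    # 4. sequential calls, mirroring the try-block in the player JS
    try:
        c[0](c[2])        # reverse(b)
        c[3](c[2], c[1])  # rotate(b, 2)
        c[5](c[2], c[6])  # swap(b, 1)
    except Exception:
        return "enhanced_except_" + n  # the JS returns a sentinel on error

    # 5. b was mutated through the references held in c
    return "".join(b)
```

For the real thing, both the array contents and the sequence of calls would have to be extracted from base.js per player version.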

Player f1ca6900 with descrambler function zfa() and float numeric literals appeared.

PR #30184 has been updated for this player, and links above updated to match this commit.

I think youtube-dl might be dead https://github.com/ytdl-org/youtube-dl/commits/master last commit 4 months ago

And with that, you mean that we can talk about other projects in this thread?

We know the cause for the throttling and two workarounds. There is nothing else that can be said in this thread by anybody but the original maintainer coming back after 4 months and copying one of the solutions:

* spoof android

* correctly calculate &n and modify the way we send range requests [Fix throttled streaming by using a new route anxdpanic/plugin.video.youtube#215 (comment)](https://github.com/anxdpanic/plugin.video.youtube/pull/215#issuecomment-942752516)

Also, it’s not solving this issue!!

The issue is that, no matter what anyone says here, youtube-dl won’t get updated. It’s best to route people to a working fork at this point. The yt-dlp/yt-dlp fork supports one of the workarounds and migration is painless. Same codebase, same parameters; you can just rename yt-dlp.exe to youtube-dl.exe and go about your day like nothing happened.

Same issue, i.e. also noticing downloads are much slower than “normal”, even short videos taking extremely long times.

(I’m basing my comment off of this: https://www.youtube.com/s/player/2fa3f946/player_ias.vflset/en_GB/base.js func name eha L1160)

The error you get comes from multiple variables being declared on one line:

eha = function (a)
{
	var b = a.split(""),
		c = [... long array begins

Given that, after the minifier, the local variable names are deterministic (a.split("") appears to be constant across players as well), this can be fixed with a simple find-and-replace on the function code. Crude: var b=a.split(""),c= → var b=a.split("");var c=

Further, the regex you wrote assumes a bit too much, e.g. that the function name will be 3 characters long: [a-zA-Z0-9$]{3}; theoretically this can vary depending on the minifier. Your regex can either be simplified:

\.get\("n"\)\)&&\(b=(?P<nfunc>[a-zA-Z0-9]+)\(b) because you already rely on b

Or made stricter, since b is assigned from a: &&\(b=a\.get\("n"\)\)&&\(b=(?P<nfunc>[a-zA-Z0-9$]{3})\(b),a\.set (starting with && as these characters are rarer)


This is currently the only call site in the code (L1750): a.C&&(b=a.get("n"))&&(b=eha(b),a.set("n",b)). Alternatively, this transform function has a unique signature: {return"enhanced_except_AAAAAAAAAAE_"+a} to look for.
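As an illustration of the two fixes above, here is a small Python sketch. The call-site string mimics the one quoted above, and the regex is the simplified variant from the comment (broadened to allow $ in names); both are assumptions that may need updating for other player versions.

```python
import re

def extract_nfunc(callsite):
    # Pull the name of the n-transform function out of its call site.
    m = re.search(r'\.get\("n"\)\)&&\(b=(?P<nfunc>[a-zA-Z0-9$]+)\(b',
                  callsite)
    return m.group('nfunc') if m else None

def split_declaration(func_code):
    # Crude find-and-replace that turns the multi-variable declaration
    # into two statements, as suggested above.
    return func_code.replace('var b=a.split(""),c=',
                             'var b=a.split("");var c=')
```

The deterministic minifier output is what makes the plain-string replace viable at all; a more robust fix would parse the declaration instead.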

They really want you to use a JS interpreter; they’re using a couple of basic primitives in a randomized order in this function (i.e. the order differs from player to player, but the overall structure is the same). An automatic transpilation to another interpreted scripting language would still be possible, but not trivial.

PS: Thanks, @awojnowski. Contrary to others’ observations, this has been going on for many months; I can’t tell exactly when it started. I’d guess they replaced the 429 rate-limiting with this. I hadn’t seen a 429 in a long time thanks to cookies, but now the downloads continue fine regardless of cookies, albeit sometimes at throttled speed.

UPD: No, the 429 rate-limiting is still in place. I think it has a higher tolerance now, and due to the slow download speeds it was less noticeable too.

Yes. Updated version works for me.

Links above updated.

I’ve noticed this too. I’ve always been using it together with aria2, and I have the same problem as you. I hope more people will provide helpful information.

Why so many dislikes, lol

Can someone explain why the comment I quoted has so many dislikes? Is it due to the language they used, or what?

From the 2018 ticket #15271 I found that --http-chunk-size 10M could mitigate the issue, but it doesn’t help in this case; either no chunking happens, or the chunking doesn’t fix it. No idea how to verify that it actually does what it’s supposed to do.

If you look under the hood of official YT clients you will discover two things:

  • they default to a 2097152-byte (2 MiB) max chunk size.
  • they don’t use standard HTTP Range headers; standards are for others to follow, not for Google!

There is no “Range: bytes=”. The YT client appends “&range=0-xxx&rn=y&rbuf=z” to requested content URLs instead. rn goes up by 1 with every request; rbuf is less obvious, and at this moment you can safely omit both and it still works. This means even “--http-chunk-size 1M” won’t help, and neither will aria2c, because the server heavily throttles any request to the same URL after the initial 2-4 MB, regardless of retries.

Looking closer at the inconsistencies and the randomly slow speeds, I find some googlevideo.com servers haven’t been “upgraded” yet and do not throttle at all, while more and more of them restrict data output depending on this magic incantation of sending a custom &range= URL parameter instead.

The issue is that YouTube is modifying the n query parameter on the video playback URLs in a very similar fashion as the signature cipher. There’s a pure function in the player JavaScript which takes the n parameter as input and outputs an n parameter which is not subject to throttling.

&n is only part of the puzzle. While a bad or missing &n will indeed trigger 50 KB/s throttling, even a correct &n only lets you download at most a couple of megabytes at good speed. Try any video in the official YT client and you will see repeated URL requests with different &range= parameters all using the same &n, but trying to download that URL all at once will always throttle after the initial ~2-4 MB.

The correct solution (after generating a correct &n) is to start using custom &range= URL parameters instead of normal HTTP Range headers, and to default to downloading in 2 MB chunks.

Personally I use mplayer to stream YT and am currently on the lookout for a simple proxy server I could modify to do the above (divide into chunks, rewrite the HTTP Range header into a URL parameter) transparently in the background.
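As a rough sketch of that chunked-download idea, assuming the &range=/&rn= behavior described above (rbuf omitted, which reportedly still works): this is not youtube-dl code, and the scheme is untested here.

```python
import urllib.request

CHUNK = 2 * 1024 * 1024  # the 2 MiB default chunk size reported above

def ranged_urls(url, total_size, chunk=CHUNK):
    # Build the sequence of per-chunk URLs, mimicking the official client:
    # a custom &range= URL parameter plus an incrementing &rn= counter,
    # instead of an HTTP Range header.
    urls, start, rn = [], 0, 0
    while start < total_size:
        end = min(start + chunk - 1, total_size - 1)
        urls.append('%s&range=%d-%d&rn=%d' % (url, start, end, rn))
        start, rn = end + 1, rn + 1
    return urls

def download_ranged(url, total_size, out_path):
    # Fetch each chunk with a plain GET (no Range header) and concatenate.
    with open(out_path, 'wb') as f:
        for chunk_url in ranged_urls(url, total_size):
            with urllib.request.urlopen(chunk_url) as resp:
                f.write(resp.read())
```

A transparent proxy would do the same rewrite per incoming Range request rather than driving the whole download itself.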

For anyone who’s experiencing this bug: it has been fixed in the spiritual successor to youtube-dl, yt-dlp.

But I would like to know if really MacOS is unaffected?

Nope, macOS 11 here and I have the same issue!

Can we please NOT derail this issue thread? thank you…

Hi, all. I solved this issue by using https://github.com/iTaybb/pySmartDL . My example code:

import time
from pySmartDL import SmartDL
from urllib3 import PoolManager
from os.path import exists

LINK = "ANY_LINK"
SAVE_AS = "ANY_FILENAME"

try:
    pool = PoolManager()
    response = pool.request(
        "GET", LINK, preload_content=False)
    file_size = int(response.headers.get(
        "Content-Length"))
    # `4009252` bytes per thread is optimal for 100 Mbit/s
    threadsVideo = max(1, file_size // 4009252)  # guard against 0 threads
except Exception as e:
    print("Exception", e)
    threadsVideo = 50
print("threadsVideo", threadsVideo)
obj = SmartDL(LINK, SAVE_AS,
              threads=threadsVideo)
obj.start(blocking=False)
while not obj.isFinished():
    # Doing something ...
    # print("Speed: %s" % obj.get_speed(human=True))
    # print("Already downloaded: %s" % obj.get_dl_size(human=True))
    # print("Eta: %s" % obj.get_eta(human=True))
    # print("Progress: %d%%" % (obj.get_progress()*100))
    # print("Progress bar: %s" % obj.get_progress_bar())
    # print("Status: %s" % obj.get_status())
    # print("\n"*2+"="*50+"\n"*2)
    time.sleep(0.5)

if obj.isSuccessful():
    print("downloaded file to '%s'" % obj.get_dest())
    print("download task took %ss" % obj.get_dl_time(human=True))
    print("File hashes:")
    print(" * MD5: %s" % obj.get_data_hash('md5'))
    print(" * SHA1: %s" % obj.get_data_hash('sha1'))
    print(" * SHA256: %s" % obj.get_data_hash('sha256'))
else:
    print("There were some errors:")
    for e in obj.get_errors():
        print(str(e))
secondsBeforeTempVideoSaved = 0
while secondsBeforeTempVideoSaved < 10:
    if not exists(SAVE_AS):
        print("Video not saved, waiting 500ms ...")
        time.sleep(0.5)
        secondsBeforeTempVideoSaved += 0.5
    else:
        break
print("Video downloaded")

Updated version works for me.

Links above updated.

But does your code have any dependencies on packages which are not part of Youtube-DL?

danny-wu is correct, pytube has no package dependencies.

Anything for decrypting the Javascript for example?

A while back I wrote a parser to replace the regex that was being used to identify the bounds of JavaScript objects, and was able to use that for parsing the different pieces of the JavaScript. From there, I use regexes to identify individual functions within the arrays and map them to the relevant ciphering function.
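For illustration, a toy brace-matching scanner in the spirit of the approach described (not pytube’s actual parser): it handles string literals, but not regex literals or comments.

```python
def find_object_bounds(js, start):
    # Given the index of an opening '{' in js, return the index just past
    # its matching '}'. Tracks nesting depth and skips braces that occur
    # inside single-, double-, or backtick-quoted string literals.
    depth, i, in_str, quote = 0, start, False, ''
    while i < len(js):
        ch = js[i]
        if in_str:
            if ch == '\\':          # skip escaped character inside string
                i += 2
                continue
            if ch == quote:
                in_str = False
        elif ch in '"\'`':
            in_str, quote = True, ch
        elif ch == '{':
            depth += 1
        elif ch == '}':
            depth -= 1
            if depth == 0:
                return i + 1
        i += 1
    raise ValueError('unbalanced braces')
```

This is exactly the class of input where a naive regex fails, e.g. a "}" inside a string literal.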

@danny-wu In my case, this happens even during the first download on my private internet connection. I also noticed it on a Linux server.

It was fixed a long time ago. Just use the master source code. It (together with the fix) has been iterated on so many times since then that your fix probably isn’t relevant anymore anyway.

If you still have the issue using the master branch, please open a new issue with details instead of necroing an old post.

For Linux users:

wget https://github.com/ytdl-org/youtube-dl/archive/20fc43475289bd3335e5f96a8e0d8f5432f1d806.zip
unzip 20fc43475289bd3335e5f96a8e0d8f5432f1d806.zip
cd youtube-dl-20fc43475289bd3335e5f96a8e0d8f5432f1d806
sudo pip install .

Works like a charm!

s/20fc43475289bd3335e5f96a8e0d8f5432f1d806/3b9b1cd4314f9d44630a2a65abfff8abab454887/g (updated)

There is a new active and good maintainer, they’re just working on a fork called yt-dlp. If there are upstream issues, consider allowing users to choose between yt-dl and yt-dlp. The CLI and behaviour are mostly compatible (no changes needed / drop in replacement for my use case, and I have 10+ flags set and download media from half a dozen sites).

On Fri, Oct 15, 2021 at 1:51 PM Ashwynn-Thorne @.***> wrote:

Hi all, I created this account to write that we all agree the throttling is disagreeable to the point of being barely usable. It’s been written that the maintainer is busy with other things -- we all know the stories of underfunded open-source backbone projects -- but this isn’t a backbone. Perhaps we can hand it off to another ‘best capable’ candidate instead of ignoring it to the point of unusability. We’ve been ignoring the fact that this failure has been affecting other open-source projects such as mpv. As has been done, one can argue the hypotheticals of what would happen (regressions et al.), but I don’t feel that’s a good argument that there can’t be an active and good maintainer.



My Linux system does not even have Python 2 installed, and I plan to keep it that way.

Congratulations, you have common sense unlike some dumbasses on this thread. You gotta love them unable to use yt-dlp due to this.

Yes, repeatable. Today we have player 68e11abe (‘xha’) which includes a new nonsense function for calculating the string “ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_”.

Perhaps zha will be the last one …

#30184. As expected, the JSInterp approach seems to be working fine. YT changed the challenge mini-language by adding a new inline transformation function:

function(d,e,f){var h=f.length;d.forEach(function(l,m,n){this.push(n[m]=f[(f.indexOf(l)-f.indexOf(this[m])+m+h--)%f.length])},e.split(""))}

See https://www.youtube.com/s/player/8d287e4d/player_ias.vflset/en_US/base.js
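For reference, here is an approximate Python port of that inline transform (untested against real players). The argument names mirror the JS: d is the char array being mutated, e is a seed string, f is the alphabet (the real one is the 64-character string mentioned earlier; the example alphabet in the test is made up).

```python
def alphabet_transform(d, e, f):
    # Port of: function(d,e,f){var h=f.length;d.forEach(function(l,m,n){
    #   this.push(n[m]=f[(f.indexOf(l)-f.indexOf(this[m])+m+h--)%f.length])
    # },e.split(""))}
    h = len(f)
    this = list(e)                    # e.split("") bound as `this`
    for m in range(len(d)):
        # JS h-- is a post-decrement: the current h is used, then reduced
        idx = (f.index(d[m]) - f.index(this[m]) + m + h) % len(f)
        h -= 1
        d[m] = f[idx]                 # n[m] = ...
        this.append(d[m])             # this.push(...)
    return d
```

Each output character also gets appended to the seed list, so later positions depend on earlier outputs; that is what makes these transforms order-sensitive and annoying to reimplement by hand.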

Can we please stop that already? yt-dlp is already mentioned many times in this thread. This issue is with youtube_dl

I think youtube-dl might be dead https://github.com/ytdl-org/youtube-dl/commits/master last commit 4 months ago

This issue is solved on yt-dlp.

Nope! The youtube_dl issue is not solved in another project. We all know your opinions. Can you please stop repeating them?

Solution is to switch to yt-dlp. Changed from 80 KB/s to 75 MB/s. Changing all my scripts away from legacy youtube-dl now.

Unfortunately, development on this project seems to have stalled. The yt-dlp project is aggressively addressing this throttling problem.

Yes, but yt-dlp does not seem to support Python 2, so it seems I’m stuck with youtube_dl… I manually added an exception to my projects to work around this problem for now.

Same issue here on Linux.

I sometimes face the same issue.

I think either YouTube or your internet provider is throttling the connection between YT and youtube-dl. In addition, YouTube may transcode video while serving it. I guess this is the likely cause if no one is doing the throttling. (see https://blog.youtube/inside-youtube/new-era-video-infrastructure )

In either case, there is nothing youtube-dl can do to fix it.

Can confirm that this version works with good download speed and no errors.

VERSION=622b87d0fb10a50283b12ecd5304e66dd396809b
wget https://github.com/ytdl-org/youtube-dl/archive/$VERSION.zip
unzip $VERSION.zip
cd youtube-dl-$VERSION
sudo pip install .

This works very well for me using Python 3.10 on Ubuntu 21.10, thanks.

If this turns into a cat and mouse thing that happens every couple hours, then we need a better way to update youtube.py than asking everyone to look at this thread every couple hours.

From what I understand this is growing pains right now because youtube changed things on their backend. Until the full scope and magnitude of those changes gets revealed and becomes clear, we should count ourselves lucky that a few dedicated people are providing some sort of fix in the meantime. Until the dust settles we just have to contend with the fact that things might break at any moment. I have faith that a stable solution will be found (eventually) by the individuals who know this project better than I do.

Please stop posting unnecessarily. For those who want to, switch to yt-dlp in the meantime (and we don’t care about your problems with Python 2/3; this is not the right post for that). For the others, just wait.

Better a temporary solution than none at all

Switched to yt-dlp too. Works out of the box for me without throttling so far, using all my bandwidth at approx. 50MB/s. Kudos to the devs ! 👍

So the fork rabbit hole goes like this: youtube-dl -> -dlc -> -dlp ?

The rest of the chain remains to be seen, haha

FWIW, I published this innertube library yesterday. It’s for Node.js, though. It does the n transform, URL decipher & SAPISID hashing (for logged-in accounts) by extracting the functions from the YT player. Over the past month I have tested it on thousands of videos with no slowdowns. I know it’s not a fix for this project, but it may help some frustrated by this issue.

To be frank, I have noticed that all I have to do to stop the 50 KB/s downloads for a day or two is visit the YouTube page and re-download the cookies from there. So it’s probably a problem with how the cookies are updated in the cookie file.

Unfortunately, development on this project seems to have stalled. The yt-dlp project is aggressively addressing this throttling problem.

Don’t worry if you have complicated batch scripts. Just switch “youtube-dl” to “yt-dlp” and keep everything else the same. It’s been working flawlessly for me.

On Sun, 25 Jul 2021, 5:57 pm Danny Wu, @.***> wrote:

Just use yt-dlp. My setup is more complicated but it was a drop in replacement for youtube-dl.

yt-dlp is a more actively maintained project at this point.

On Sun, 25 Jul 2021, 2:48 pm Obscure2020, @.***> wrote:

I can confirm I am experiencing this same issue, intermittently. I’m running Windows 10, and I keep the program updated. I use youtube-dl mainly through some automated batch-download Batch scripts I have written. The intermittently super-slow download speeds are a real pain.



@shoxie007 I appreciate your enthusiasm but please refrain from commenting to nag people to speed up their work. Especially for an open-source project, it comes off as rude and entitled.

Additionally, @awojnowski does not need to provide any more breadcrumbs; this shows that you are totally unfamiliar with the current state of the issue. The JavaScript function YouTube uses to unscramble the parameter is complex, and it seems that it cannot be directly implemented in Python (i.e. without manually rewriting it) at this time.

Sorry. I didn’t mean to come off as entitled. It’s the frustration of having to deal with one issue after another. And they’re all coming thick and fast nowadays. And all this is happening when Youtube is going on a video and channel deletion spree. It’s hard to keep up with the digital book-burning.

Like I said, I know nothing about Javascript so I wouldn’t understand the complexity of this issue. But this does not mean I don’t appreciate all the efforts of the people who make this possible. I do.

Speaking of servers, are you all using one from a hosting company? That’s likely the reason; my dedicated server was banned altogether, but that was ages ago and I haven’t tried since. Since the IP is from a hosting company, they can easily run a lookup to figure that out => IP ban.

And I will necro the hell out of any “closed” issue for as long as it isn’t actually closed. All I saw was one failed PR. I think it was fair of me to assume, then, that this issue was marked as closed spuriously. Now I see that is not the case, but I stand by my methods.

Anyway, thanks for integrating a fix. You are too late, though. I already made my own client, and I like it better. :trollface: I can use it to grep YouTube comments and search results.

It’s not letting me delete the “youtube.pyo” file; how do I work around this? 7-Zip says the file is read-only, and File Explorer denies the deletion with an unknown error.

How is that related to this bug? And why are you trying to delete something inside the archive? Extract it somewhere first.

It’s not really important (you can study the dependency hell that I mentioned yourself), but “use tool x instead” is not a very useful contribution to a marginally on-topic sub-thread, especially when the utility of tool x has already been discussed in previous posts among the 300 or so in this issue.

I see this download throttling, too. Here on linuxmint, youtube-dl version 2021.06.06

It is fitting that it is called the n parameter. Feels like a joke on p=np or p != np.

Hello, I’m the VLC maintainer of the YouTube playback feature in the VLC media player. There’s been talk about javascript interpreters, and I wanted to share how you don’t need to run the descrambling code through a javascript interpreter to make it work.

… …

Of course, as long as playback in VLC works, it’s always another fallback alternative to youtube-dl to use VLC as the URL extractor, and then copy-paste the working direct video file URL from VLC to pass it to curl or wget 😃

Thanks a lot for your valuable insights!! I think this will help us deal with this problem, though not here in youtube-dl (this project seems to have been abandoned) but over at yt-dlp, the active fork of youtube-dl.

Hello, I’m the VLC maintainer of the YouTube playback feature in the VLC media player. There’s been talk about javascript interpreters, and I wanted to share how you don’t need to run the descrambling code through a javascript interpreter to make it work.

In VLC, extractors are website parser scripts written in Lua. We emulate the javascript descrambling code within the Lua script using ad hoc code, on the basis that it uses only a known set of structured transformations. This approach is less reliable and requires more maintenance than a generic interpreter as slight changes in the code or javascript minifier make it break every now and then; but it’s worked for us for the past 8 years for signature descrambling, and we just released a version that descrambles the “n” parameter too that’s required to solve this throttling. So this is a proven concept, and another possibility.

If you want to take a look, see the n_descramble() and sig_descramble() functions in:

code.videolan.org/videolan/vlc/-/blob/master/share/lua/playlist/youtube.lua

Of course, as long as playback in VLC works, it’s always another fallback alternative to youtube-dl to use VLC as the URL extractor, and then copy-paste the working direct video file URL from VLC to pass it to curl or wget 😃

In case it helps, anyone here is free to research my signature cipher code here: https://github.com/nathanfranke/gdaudioyt/blob/main/youtube.cpp#L612-L760

If you need me to dedicate it to public domain, contact me.

This is incredibly helpful. Wow, thanks. Deliberate sabotage of youtube-dl by Google. Sucks to see this continue. Please, Googlers, you are reading this: we are just trying to download a small fraction of YouTube videos for offline playback, personal use. Please don’t sabotage us.

On Tue, 22 Jun 2021, 11:29 pm Aaron Wojnowski, @.***> wrote:

I have the solution for this issue. I do not have the bandwidth to actually implement it in the source, but this should be more than enough information to do so. The issue is that YouTube is modifying the n query parameter on the video playback URLs in a very similar fashion as the signature cipher. There’s a pure function in the player JavaScript which takes the n parameter as input and outputs an n parameter which is not subject to throttling. As an example, let’s look at https://www.youtube.com/s/player/52dacbe2/player_ias.vflset/et_EE/base.js. The code in question which modifies n is as follows: a.C&&(b=a.get("n"))&&(b=Dea(b),a.set("n",b))}}; In this case, Dea is the function we are looking for: a long scrambling routine built around an array of integers, string literals, anonymous shuffle/splice functions, references to the split n, and self-references, which returns "enhanced_except_AAAAAAAAAAE_"+a on failure. This does change with different player versions, so youtube-dl will need to extract this for every video that it fetches and then modify the n parameter as such. Hope this is helpful.

How i can implement this in Ubuntu, Please suggest me

I think youtube-dl might be dead — https://github.com/ytdl-org/youtube-dl/commits/master (last commit 4 months ago)

And with that, you mean that we can talk about other projects in this thread?

We know the cause for throttling and two workarounds. There is nothing else that can be said in this thread by anybody but the original maintainer coming back after 4 months and copying one of the solutions:

* spoof android

* correctly calculate &n and modify the way we sent range requests [Fix throttled streaming by using a new route anxdpanic/plugin.video.youtube#215 (comment)](https://github.com/anxdpanic/plugin.video.youtube/pull/215#issuecomment-942752516)

Also, it’s not solving this issue!

The issue is, no matter what anyone says here, youtube-dl won’t get updated. It’s best to route people to a working fork at this point. The yt-dlp/yt-dlp fork supports one of the workarounds and migration is painless. Same codebase, same parameters, can just rename yt-dlp.exe to youtube-dl.exe and go about your day like nothing happened.

This whole explanation seems valuable to me. “Ah, we know the solution, but the person in charge is busy.” And it’s inconceivable that someone else can be in charge in the meantime. This is silly. Is it really the case that there’s no one willing to be second in command? Is it really so surprising that open source software could be made better by bringing people into the fold? It seems like a problem with understanding what management is.

@noembryo

The issue is no matter what anyone says here youtube-dl wont get updated.

And you know that from where?

feel free to contact Sergey (https://github.com/dstftw), the current maintainer

Also note, that perhaps “what anyone says” in a project’s issue, better have something to do with solving it…

the only way of solving it today is switching to yt-dlp

So, unless you have the authority to close this thread, you (or anybody else) don’t get to choose what we should do.

like you are doing right now? 😃

To migrate my code is far from painless. It could be painless if you use the compiled executable with a simple script, but there are different usages out there. So, please, don’t get so absolute about it.

this is a self-inflicted problem. You can always backport yt-dlp changes to your local copy

Wait until somebody does?

somebody already did, it’s called yt-dlp

Same issue, I don’t exceed 77KiB/s, averaging 53 (using WSL on Win10-64). I’m not sure if it’s coming from youtube-dl, because I have the same problem with the Videoder application (https://www.videoder.com/), which was not the case some time ago.

Use yt-dlp then see if the problem remains: https://github.com/yt-dlp/yt-dlp

I’ll be trying yt-dlp, but there’s also youtube-dlc (or yt-dlc). So many… Is -dlp the best one right now?

I think yt-dlc is no longer maintained and has merged with yt-dlp. I highly recommend yt-dlp as an alternative. It is updated often and the community is active.

yt-dlp is superior to both, they even say in the readme it combines both.

YouTube has an algorithm that detects based on user agent and then reduces the speed. This is not dynamic: during a windowing period it delivers full speed, and once it learns, YouTube throttles.

Another point: your Google account is not used, which is another parameter for throttling.

I could not find a way to add a user agent or user ID to authenticate.

Does yt-dlp have the same -U command-line auto update feature?

Yes. Please read the documentation before asking questions.

Is there a reason why youtube-dl can’t do the same thing they did to go around the throttling?

Likely because it’s not a proper solution. It’s a workaround until someone implements decoding the n parameter, which is the real problem.

A workaround that works is better than no solution at all imo, but I suppose anyone who thinks that way can just use the fork, it’s a drop-in replacement after all.

Could it be quicker to just take the JS and execute it as a sub-process, like youtube-dl does with ffmpeg?

Building on top of @colethedj’s work, I was able to make commas work as well as improve JSInterpreter.extract_function to correctly capture the entire function. However, the jsinterp doesn’t seem to support for, switch and nested functions 😕

I have the solution for this issue. I do not have the bandwidth to actually implement it in the source, but this should be more than enough information to do so.

The issue is that YouTube is modifying the n query parameter on the video playback URLs in a very similar fashion as the signature cipher. There’s a pure function in the player JavaScript which takes the n parameter as input and outputs an n parameter which is not subject to throttling.

As an example, let’s look at https://www.youtube.com/s/player/52dacbe2/player_ias.vflset/et_EE/base.js. The code in question which modifies n is as follows:

a.C&&(b=a.get("n"))&&(b=Dea(b),a.set("n",b))}};

In this case, Dea is the function we are looking for:

function(a){var b=a.split(""),c=[-704589781,1347684200,618483978,1439350859,null,63715372,function(d){d.reverse()},
159924259,-312652635,function(d,e){for(e=(e%d.length+d.length)%d.length;e--;)d.unshift(d.pop())},
-1208266546,function(d,e){d.push(e)},
-2143203774,-103233324,b,function(d,e){e=(e%d.length+d.length)%d.length;d.splice(0,1,d.splice(e,1,d[0])[0])},
837025862,1654738381,1184416163,1983454500,b,-200631744,1130073900,null,2047141935,-337180565,1654738381,1913297860,-399114812,b,714887321,function(d,e){for(var f=64,h=[];++f-h.length-32;){switch(f){case 58:f-=14;case 91:case 92:case 93:continue;case 123:f=47;case 94:case 95:case 96:continue;case 46:f=95}h.push(String.fromCharCode(f))}d.forEach(function(l,m,n){this.push(n[m]=h[(h.indexOf(l)-h.indexOf(this[m])+m-32+f--)%h.length])},e.split(""))},
626129880,"pop",1331847507,-103233324,2092257394,function(d,e){for(e=(e%d.length+d.length)%d.length;e--;)d.unshift(d.pop())},
669147323,1184416163,-216051470,193134360,null,2045900346,1675782975,-1997658115,function(d,e){e=(e%d.length+d.length)%d.length;var f=d[0];d[0]=d[e];d[e]=f},
1675782975,161770346,function(d,e){e=(e%d.length+d.length)%d.length;d.splice(-e).reverse().forEach(function(f){d.unshift(f)})},
function(d){for(var e=d.length;e;)d.push(d.splice(--e,1)[0])},
1454215184,-2123929123];c[4]=c;c[23]=c;c[42]=c;try{c[6](c[4]),c[6](c[23],c[22]),c[6](c[38],c[12]),c[43](c[10],c[5]),c[30](c[16]),c[43](c[22],c[36]),c[25](c[47],c[51]),c[52](c[7],c[26]),c[43](c[32],c[8]),c[40](c[7],c[11]),c[21](c[22],c[29]),c[0](c[22],c[17]),c[0](c[47],c[52]),c[21](c[32],c[19]),c[37](c[23],c[31]),c[16](c[9],c[34]),c[51](c[44],c[43]),c[10](c[34],c[15]),c[7](c[43],c[5]),c[7](c[53],c[41]),c[6](c[39],c[40]),c[2](c[24]),c[38](c[39],c[14]),c[3](c[24],c[52]),c[3](c[39],c[0]),c[3](c[49],c[7]),
c[49](c[41],c[37]),c[17](c[44],c[52]),c[28](c[41],c[32]),c[5](c[48],c[50]),c[46](c[23]),c[37](c[38],c[26]),c[37](c[38],c[18]),c[41](c[9],c[48]),c[41](c[48],c[6]),c[16](c[48],c[3]),c[12](c[8],c[38]),c[14](c[18],c[53]),c[52](c[1],c[13]),c[39](c[10],c[11]),c[22](c[33],c[48])}catch(d){return"enhanced_except_AAAAAAAAAAE_"+a}return b.join("")};

This does change with different player versions, so youtube-dl will need to extract this for every video that it fetches and then modify the n parameter as such.

Hope this is helpful.
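As a rough illustration of what this implies for a downloader: once the per-player transform is available (extracted from base.js and run through a JS interpreter), the playback URL's n parameter has to be rewritten before downloading. Below is a minimal sketch under that assumption; the lambda is a placeholder standing in for the real transform, and unthrottle_url is a hypothetical helper, not youtube-dl API:

```python
# Hypothetical sketch: swap the "n" parameter in a googlevideo playback URL.
# The real transform must be extracted from the player's base.js (e.g. Dea
# above); a placeholder lambda stands in for it here.
from urllib.parse import urlsplit, urlunsplit, parse_qs, urlencode

def unthrottle_url(url, n_transform):
    """Return `url` with its 'n' query parameter run through `n_transform`."""
    parts = urlsplit(url)
    query = parse_qs(parts.query)
    if 'n' in query:
        query['n'] = [n_transform(query['n'][0])]
    return urlunsplit(parts._replace(query=urlencode(query, doseq=True)))
```

The missing piece is producing that transform function per player version, which is exactly what the JSInterpreter work in the following comments attempts.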

Great find.

FYI for devs, it seems like the built-in jsinterp can’t interpret this function?

I wrote up this quickly in youtube.YouTubeIE

def test_n_js(self, player_url):
    player_id = self._extract_player_info(player_url)
    if player_id not in self._code_cache:
        self._code_cache[player_id] = self._download_webpage(
            player_url, None,
            note='Downloading player ' + player_id,
            errnote='Download of %s failed' % player_url)

    jscode = self._code_cache[player_id]
    funcname = self._search_regex(
        (r'\.get\("n"\)\)&&\(b=(?P<nfunc>[a-zA-Z0-9$]{3})\([a-zA-Z0-9]\)',),
        jscode, 'Initial JS player n function name', group='nfunc'
    )
    jsi = JSInterpreter(jscode)
    initial_function = jsi.extract_function(funcname)
    return lambda s: initial_function([s])

It extracts the function name correctly, however when interpreting, you get youtube_dl.utils.ExtractorError: Unsupported JS expression 'a.split(""),c=[';

I don’t really know javascript so about all I can say.

I can confirm I am also experiencing this, since July 14th. I am either noticing ~77kb/s, or ~50kb/s.

The throttling is consistent across all of my servers and seems to only occur if I download a lot of videos; making me think it is an intentional form of throttling on youtube’s side.

I have increased the timeouts and reduced the frequency of my downloads, and I am noticing far fewer throttles. If you are hitting this, I recommend you increase your timeouts.

Can confirm, this works!!

https://github.com/ytdl-org/youtube-dl/issues/29326#issuecomment-981108888 — install WinRAR 6.10beta2 64-bit

https://github.com/ytdl-org/youtube-dl/issues/29326#issuecomment-975153883 —

1. Rename youtube-dl.exe to youtube-dl.zip
2. Open it, go to \youtube_dl\extractor\
3. Remove youtube.pyo
4. Add youtube.py, do not touch the extension
5. Rename youtube-dl.zip to youtube-dl.exe
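The renaming trick works because the .exe is a zip archive with an executable stub prefixed, a layout Python's zipfile can read directly. As a quick sanity check before and after patching, this sketch (has_member is just an illustrative helper, and the demo file is a fake stand-in for the real .exe) lists what is inside:

```python
# Sketch: youtube-dl.exe is an exe stub followed by a zip archive, which
# zipfile can read directly (it finds the central directory at the end).
import io
import os
import tempfile
import zipfile

def has_member(archive_path, name):
    """Return True if `name` is a member of the (possibly stub-prefixed) zip."""
    with zipfile.ZipFile(archive_path) as zf:
        return name in zf.namelist()

# Demo with a fake stub-prefixed archive standing in for the real .exe:
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    zf.writestr('youtube_dl/extractor/youtube.pyo', b'bytecode')
demo = os.path.join(tempfile.mkdtemp(), 'youtube-dl.exe')
with open(demo, 'wb') as f:
    f.write(b'MZ fake exe stub')   # stand-in for the real launcher stub
    f.write(buf.getvalue())
```

This is also why some archivers fail with "opened with offset": they refuse to edit an archive that does not start at byte 0, while WinRAR preserves the stub.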

@Vangelis66 OK I found the problem. I’ve never downloaded youtube-dl.exe from the releases here. I was referring to the "...\Python\Scripts\youtube-dl.exe" file that I got in my system and probably is compiled when the youtube_dl gets installed/upgraded… 😉

OTOH, using the youtube-dl.exe I got from the releases, I can open, edit, add or remove files using any WinRAR version, even the oldest I could find, v3.93!

Thank you for your help.

I can confirm standard Windows Explorer native unzip feature and/or latest 7-zip (21.06) don’t work for the task outlined previously here; however, latest WinRAR (6.10b2) does !

NB: You should not decompress the renamed (.exe -> .zip) file; simply open it with WinRAR. Depending on the .zip file-extension association on your system, you may have to open WinRAR itself, then from its top menu: File => Open (or right-click the archive and select Open with WinRAR from Explorer’s context menu, if available). Browse to youtube-dl.zip\youtube_dl\extractor and locate/select youtube.pyo; use the Delete toolbar button to remove file youtube.pyo; do NOT exit WinRAR!


Copy the previously downloaded file youtube.py (kindly offered/linked to by dirkf), then in WinRAR: Top menu -> File -> Paste ; you should be able to see file youtube.py placed inside youtube-dl.zip\youtube_dl\extractor (in lieu of previous file youtube.pyo). Now exit WinRAR; rename youtube-dl.zip back to youtube-dl.exe

For those inquisitive: .pyo is just an optimised, compiled version of a .py (Python script) file. If you somehow have a Python 3.4.x installation available, save the cmd file below (as PythonOptimizeCompile.cmd)

@echo off
if %1x==x goto usage
%~d1
cd "%~p1"
echo on
python -O -c "import py_compile;py_compile.compile('%~n1%~x1','%~n1.pyo')"
@echo off
goto end
:usage
echo drag+drop the .py file onto this to compile to a .pyo file.
:end
pause

and in Explorer drag-and-drop the downloaded, patched youtube.py file onto that .cmd file; use the newly created youtube.pyo (adjacent to the former) to replace/overwrite the original inside youtube-dl.zip, per the instructions above… But you don’t actually need to create a new youtube.pyo file; replacing the old one with just the plain .py script works!
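For anyone without the .cmd (or not on Windows), here is a cross-platform sketch of the same compile step; the throwaway module and paths are made up for the demo, and in practice you would point src at the patched youtube.py:

```python
# Sketch: compile a .py to optimised bytecode, as the .cmd above does with
# `python -O`. Python 3 normally writes __pycache__/*.pyc instead, so the
# explicit cfile reproduces the old flat .pyo naming the exe layout expects.
import os
import py_compile
import tempfile

src = os.path.join(tempfile.mkdtemp(), 'youtube.py')
with open(src, 'w') as f:
    f.write('VERSION = "patched"\n')   # stand-in module body for the demo

out = src + 'o'                        # youtube.pyo alongside youtube.py
py_compile.compile(src, cfile=out, optimize=1)
```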

@Kano97

It’s not letting me delete the “youtube.pyo” file

If you are having troubles doing that try WinRAR. It’s basically freeware for individual users.

If this turns into a cat and mouse thing that happens every couple hours, then we need a better way to update youtube.py than asking everyone to look at this thread every couple hours.

so you are saying if I use

wget https://github.com/ytdl-org/youtube-dl/archive/20fc43475289bd3335e5f96a8e0d8f5432f1d806.zip
unzip 20fc43475289bd3335e5f96a8e0d8f5432f1d806.zip
cd youtube-dl-20fc43475289bd3335e5f96a8e0d8f5432f1d806
sudo pip install .

Should not be throttled anymore?

Need to update to 8e069597 fix, already update the original script

OK, this seems to do the trick.

Links earlier in the thread updated.

As for implementing this patch, can anyone clarify the process? I’m running on macOS, installed via curl.

Just find the Python installation directory and then replace the PYTHON/Lib/site-packages/youtube_dl/extractor/youtube.py file with the provided one…

Outstanding! Looks like pukkandan and coletdjnz are on the verge of a final and explicit solution to this throttling issue, with a full-fledged decrypter for the n value. No need for the android-client workaround. Anyone who still has a problem with throttling, yt-dlp is the way to go.

So here’s something I don’t get: I could switch to yt-dlp, which is faster, but it also seems format selection is very different, even if I use -f bv+ba/b to force what should be the current youtube-dl behavior.

youtube-dl -f bestvideo+bestaudio/best (which I manually specified to be sure):
filename.f137.mp4
filename.f140.m4a
youtube-dl --get-format -f bestvideo+bestaudio/best $_ID
137 - 1920x1080 (1080p)+140 - audio only (tiny)

yt-dlp -f bv+ba/b:
filename.f248.webm
filename.f251.webm
yt-dlp --get-format -f bestvideo+bestaudio/best $_ID
248 - 1920x1080 (1080p)+251 - audio only (medium)

I’m not sure what’s going on, or which exactly is higher quality (I’m tempted to say the MP4 is since that’s more likely to be the original file from the uploader???), or why the audio stream is labelled “tiny” (but other issues seem to say higher-quality audio is “tiny” in general so IDK). But this difference does make me wary about switching, or whether either program is even giving me high-quality archives, or if I should just download all formats and multistream merge them (lots of disk space!).

I will say that as someone who also has projects that look like I have abandoned them (I haven’t; I’ve just been distracted and brain-fried) but for which those projects are significantly smaller and have significantly smaller user bases, I do have to sympathize with the youtube-dl authors here. They have a lot on their plate, and I understand fighting this good fight is indeed tiring. I for one am very much not cool with the belligerent copy-pasted claims that yt-dlp is the “replacement” or implied designated successor for this project or that it’s formally “dead” that aren’t coming from any source of authority on either side.

After trying a number of workarounds and external downloaders, and still getting throttled for several weeks now, I wrote my own external downloader with aggressive parallelism and retrying:

https://github.com/CyberShadow/turbotuber

It avoids throttling by continuously creating new connections. Worked extremely well for me (500x speed-up), maybe it will be useful to others.

Can we please NOT derail this issue thread? thank you…

This issue is solved on yt-dlp.

At least it’s a good rabbit hole 😄 I wouldn’t have archived even a fraction of all the things if the original ytdl dev hadn’t made it Unlicense to allow other ppl with the skills take the challenge.

FWIW, I wouldn’t be able to do many of the fixes cuz my fairly good understanding of networking ain’t THAT good, without going into the other problems. Like, if you read the YT code, it’s bordering on wizardry.

I have implemented the same concept as ytdl (YouTube only) for my college mini project, as an Android app. I’ve been facing this throttling problem since last December. A week back I removed the downloader that I created long ago and switched over to the Fetch library. Now the throttling issue is solved. I have tested the URL from response.json with ADM (Advanced Download Manager) and Download Navi (open source). There are no issues with speed now. If you have enough time, try the same thing with an external downloader for Windows and find out the cause.

@OmniSexualCofeeBean Outstanding analysis! So I take it that Youtube-DL as it is currently designed cannot be modified/extended to accommodate this new change. What other package can one use, which Youtube-DL can import, to outsource this decryption algorithm to?

If you can come up with a complete working solution for us (maybe your own fork of youtube-dl?), @tzarebczan has put up a crypto bounty for anyone who cracks the problem. I’d be willing to add a small amount to this. I don’t have much in my Bitcoin wallet. He didn’t state an amount, but whatever it is, you can claim it if you can solve this problem for us. We would all be very grateful. Would it be worth your while?

Just use yt-dlp. My setup is more complicated but it was a drop in replacement for youtube-dl.

yt-dlp is a more actively maintained project at this point.

On Sun, 25 Jul 2021, 2:48 pm Obscure2020, @.***> wrote:

I can confirm I am experiencing this same issue, intermittently. I’m running Windows 10, and I keep the program updated. I use youtube-dl mainly through some automated batch-download Batch scripts I have written. The intermittently super-slow download speeds are a real pain.


It went away for us for 2 weeks and is now back. Any other updates? Would be willing to put a bounty in crypto for this.

That’s the spirit!! A bounty would work wonders.

@tfdahlin appears to have solved the problem with his pytube project. I’m just waiting for someone to implement in youtube-dl. If you don’t mind using another repo, then use pytube to download the actual video files. https://github.com/pytube/pytube/issues/1033

can confirm it’s still an issue (screenshot attached)

I have multiple internet connections with fixed IP addresses. I mainly run youtube-dl on one of the IP. Here is my test result.

On the main IP, which has been used by youtube-dl for a long time, the speed is mostly limited to 1MB/s. Sometimes it is limited to 50KB/s. When that happens, I break it with CTRL-C, open a browser to watch that video on YouTube directly, then restart the download. It resumes at 1MB/s.

At the same time, if I run youtube-dl from another IP, the speed is good, without limitation.

All my IP addresses are from the same ISP, over the same fiber link, so it should not be a problem on my ISP’s side. I GUESS YouTube has a blacklist of IP addresses that download video other than through the webpage.

I’ve worked around this issue by checking the status of a playlist download, and made use of control-c to terminate if it’s like 80 KB/s and then running it again.

Since this seems to vary from video to video, this isn’t the best option, but it makes youtube-dl usable.

This app is one of my favorites; I hope it can be patched soon.
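The kill-and-restart workaround described above can be automated; below is a sketch of the detection half. The progress-line format and the 100 KiB/s threshold are assumptions about youtube-dl's console output, not a documented interface:

```python
# Sketch: decide from a youtube-dl progress line whether the download looks
# throttled (below ~100 KiB/s), so a wrapper can kill the process and resume
# it with -c. The line format is an assumption, not a stable API.
import re

SPEED_RE = re.compile(r'at\s+([\d.]+)([KM])iB/s')

def is_throttled(progress_line, threshold_kib=100.0):
    m = SPEED_RE.search(progress_line)
    if not m:
        return False   # not a progress line; make no decision
    kib_per_s = float(m.group(1)) * (1024.0 if m.group(2) == 'M' else 1.0)
    return kib_per_s < threshold_kib
```

A wrapper would run youtube-dl -c in a loop, watch its output with this check, and terminate/restart the process whenever it returns True.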

Win10 x64 here. But I would like to know: is macOS really unaffected?

Well, I have a Big Sur build that works flawlessly with multiple downloads per day. But maybe it’s because those are newly added videos?

Seems so. I don’t notice buffering or quality loss on new videos. Buffering/throttling on several months old videos only. Maybe they introduced some new encoding codec and are reencoding old videos and new videos are already good? Maybe so.

It’s not letting me delete the “youtube.pyo” file, how do I work around this? It’s saying the file is read only in 7zip, and then is denying it with an unknown error in file explorer.

622b87d0fb10a50283b12ecd5304e66dd396809b works, master does not


Same for me. I’m getting an “opened with offset” issue in 7-Zip, so I can’t do anything but extract everything. That’s fine, but I can’t rebuild it to .exe afterwards. Even the (chocolatey) make makefile gives me an error 255 “d not expected at this time” 😦

With @dirkf’s fixed py file in place, I can call main.py and it all works. But damn, it’s a comparatively long-winded process. I need my .exe back. Can anybody help?

Naughty Windows users do not need to rebuild anything from scratch.

1. Rename youtube-dl.exe to youtube-dl.zip

2. Open it, go to \youtube_dl\extractor\

3. Remove youtube.pyo

4. Add youtube.py, do not touch the extension

5. Rename youtube-dl.zip to youtube-dl.exe

emmm, when I opened the .zip I didn’t find anything other than main.py…

Yeah, I can open the .exe in 7z, but it won’t let me delete/add anything…

I agree with you. It’s so much easier on linux. Unfortunately the youtube-dl tool is used in a windows environment. So I have to deal with it.

The standalone exe version exists because Windows systems typically don’t have Python already. If you have Python you can just use Pythonic installation tools like pip, and your installations will look much more like those on Unix-like systems that do come with Python by default (or for which yt-dl demands Python as a pre-req).

@unityconstruct Thank you! I agree with you. It’s so much easier on Linux. Unfortunately the youtube-dl tool is used in a Windows environment, so I have to deal with it. I am currently trying with the setup.py file as suggested by @dirkf. The generation of the executable is done by Py2exe. It doesn’t seem to work at the moment. I have the impression that there is a compatibility problem between Py2exe and the version of Python on the Windows computer (3.9.5). I will look in that direction again. I will then follow your advice with make.

Needs a patch to pretend you’re viewing the video’s page. It is a new JavaScript human check

On Tue, Nov 16, 2021, 12:53 AM Whodiduexpect @.***> wrote:

That version just gives me this error and it continues to get throttled as the warning messages describe.

WARNING: [youtube] Couldn’t parse unidentified YouTube video throttling parameter descrambling data
Near: "e.split(""))}, 257572101,b,null,-940104274,b,function(d,e){e=(e%"
WARNING: [youtube] Invalid data type encountered during YouTube video throttling parameter descrambling transformation chain, aborting
Couldn’t descramble YouTube throttling URL parameter: data transfer will be throttled
WARNING: [youtube] Couldn’t process youtube video URL, please check for updates to this script
WARNING: [youtube] Couldn’t parse unidentified YouTube video throttling parameter descrambling data
Near: "e.split(""))}, 257572101,b,null,-940104274,b,function(d,e){e=(e%"
WARNING: [youtube] Invalid data type encountered during YouTube video throttling parameter descrambling transformation chain, aborting
Couldn’t descramble YouTube throttling URL parameter: data transfer will be throttled
WARNING: [youtube] Couldn’t process youtube video URL, please check for updates to this script


Do YouTube developers watch this thread and are determined to stop youtube_dl from working?

In today’s case, YouTube developers haven’t really changed anything. This was to be expected: the last fix hardcoded a parameter when it turns out something a little more generic is needed. It worked yesterday but not today.

Yeap… Throttling here too! 😞 What is happening? Do YouTube developers watch this thread and are determined to stop youtube_dl from working? These, are a lot of changes in a very short time. … But of course they do … 👿

I’m experiencing the same thing as @noembryo it seems. I initially thought that was because I was trying to download VxxO material. Here’s a part of the output of me trying to list available formats for a given video (not a VxxO one, it’s this one):

[youtube] sxmsR02TAl0: Downloading webpage
[youtube] sxmsR02TAl0: Downloading player 8d287e4d
WARNING: [youtube] Couldn’t parse unidentified YouTube video throttling parameter descrambling data
Near: "e.split(""))}, -1628334556,-296171745,b,363030319,701629700,1102"
WARNING: [youtube] Invalid data type encountered during YouTube video throttling parameter descrambling transformation chain, aborting
Couldn’t descramble YouTube throttling URL parameter: data transfer will be throttled
WARNING: [youtube] Couldn’t process youtube video URL, please check for updates to this script
WARNING: [youtube] Couldn’t parse unidentified YouTube video throttling parameter descrambling data
Near: "e.split(""))}, -1628334556,-296171745,b,363030319,701629700,1102"
WARNING: [youtube] Invalid data type encountered during YouTube video throttling parameter descrambling transformation chain, aborting
Couldn’t descramble YouTube throttling URL parameter: data transfer will be throttled
WARNING: [youtube] Couldn’t process youtube video URL, please check for updates to this script

Then the download attempt of said video effectively caps at 70-80Kbps.

@hiephm

Works on MacOS as well after uninstalling the homebrew binary.

Thanks so much!

@Butterfly-Dragon You sound like you don’t exactly know how Python packages work, so I would recommend you get the latest yt-dlp build instead of attempting to patch whatever you have at the moment

Not sure how you arrived at the decoded value, but in my experience correctly transformed n values are 2 characters shorter than the original values. If I plug KHy-yHaPN1HvjA into my session test, it gives a transformed value of -AUE3QPaqlIT.

But it really depends on the n-transform function in the actual base.js file that is linked to the url you are downloading.

@StefH Not all videos require signature decipher. Maybe 1 in 8.

The format streams from YouTube that require deciphering are not given an url string but a signatureCipher string that contain s, sp and url parameters that need to be deciphered and merged. The s field is the signature that must be deciphered. The sp field defines the tag that should be given to the deciphered signature when added to final url.
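A sketch of that merging step, with the player-specific decipher left as a placeholder (resolve_cipher is an illustrative helper, not pytube or youtube-dl API):

```python
# Sketch: split a signatureCipher value into its s / sp / url fields, run the
# signature through a (placeholder) decipher, and append it to the url under
# the tag named by sp (older players defaulted to "signature").
from urllib.parse import parse_qs

def resolve_cipher(signature_cipher, decipher):
    fields = parse_qs(signature_cipher)
    url = fields['url'][0]                        # parse_qs already unquotes
    sp = fields.get('sp', ['signature'])[0]
    return '%s&%s=%s' % (url, sp, decipher(fields['s'][0]))
```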

But I have this issue like every time, not at all randomly. I generate the video playback stream URL through youtube-dl -g <video-link> and stream it in a browser; I have no issues streaming it. But when streamed through mpv, ffmpeg or vlc from the terminal, it’s always slow (< 100Kbps).

It already exists. You can either use a switch (--compat-options format-sort) or just manually use -S to override the default lang,quality,res,fps,hdr:12,codec:vp9.2,size,br,asr,proto,ext,hasaud,source,id to restore youtube-dl’s behavior (basically just remove the codec:vp9.2 part).

(TBH this is kinda off-topic. yt-dlp’s readme has all the information already about the difference between it and youtube-dl.)

I wondered the same thing. It was helpful that a standard test like using aria2 was reported early on as being unhelpful for finding a fix. It’s really confusing, considering we need to tell no one here about translate.google.

Same codebase, same parameters, can just rename yt-dlp.exe to youtube-dl.exe and go about your day like nothing happened.

Wow, THANK YOU! It was so simple, amazing. I was completely lost; I don’t even know how my conf is working, so I was very sad that ytdl was not working properly anymore on YouTube. I just found out about MPV and ytdl like ~2 months ago and I can’t live without it. It’s my video player of choice now. It’s the only way my old notebook can play 1080p 60fps video; 30fps is the max everywhere else (streamed/local).

Anyway, I’m way off topic, sorry. I’m just very happy that the change was so simple (I’m on Win10 btw and know nothing about Linux/Unix, but I have some DOS knowledge ;sadface;). I had no idea how to integrate yt-dlp.exe with my mpv.net, but I found this place, subscribed in case of something new, and it worked!

Just to tell you what my issue was, in case someone is like me: I use it to watch YouTube videos, and I was struggling to find a video that had more than 1s of cache; now it’s working almost flawlessly. Sometimes the cache falls to 0 and the video stops, but that’s the exception, while before it was the other way around. I still need to find a way to log into my YouTube account, and I would like mpv.net/yt-dlp to download every video I watch to my HDD, but I guess that’s for another time. Thanks again for the solution, I was getting mad furious about the situation lol

This is an open source project, nobody is getting paid to do this work, and we’re all volunteers here. If you have special needs that aren’t supported by yt-dlp, that’s fine, but I think you should implement this functionality yourself instead of harassing other commentators

Can we please NOT derail this issue thread? thank you…

This issue is solved on yt-dlp.

No it isn’t, I’m having this exact same issue on yt-dlp.

I keep seeing people saying this occasionally, with no evidence.

There are some very rare edge cases where this issue may still persist in yt-dlp. But likely you are seeing something different.

YouTube has other “kinds” of throttling/slowdowns too. ex: https://github.com/yt-dlp/yt-dlp/issues/985

ofc, if you are certain then open an issue in the yt-dlp issue tracker with verbose logs.

Can we please NOT derail this issue thread? thank you…

This issue is solved on yt-dlp.

This has been well established enough already. I think we should just leave the youtube-dl folks alone to come up with their own solution. If that ever even happens.


Python 2.7 lost security support almost two years ago, you have bigger problems if you insist on using that. Consider containing the packages you need that require 2.7 in a docker or VM isolated from the rest of the network. There’s a reason distributions like Fedora and others removed Python 2.7 from their repositories, it is not secure.

Whether this is relevant is entirely dependent on what you’re using Python for. If you’re using Python 2 to run the backend of a public website, you need to stop that right the $%&# now! If you’re using Python 2 to crunch numbers, batch edit text files or download Youtube videos, there’s really no cause for concern.

Is there any way to decipher the &n parameter?

pytube has an implementation

@OmniSexualCofeeBean Thanks for the clarification. I saw the arbitrary shuffling of data (CCC, BBB, C, BB, C, BB etc) and assumed that it would need a new n when this happens. A little confused. I am NOT adept with Javascript and the inner workings of streaming and media players. Hence the reason I haven’t come up with a solution myself.

@shoxie007 youtube-dl has a JSInterpreter which is used in signature decryption. The same can be done for the “n” parameter.
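As a rough illustration of that idea (the JS snippet and regex below are simplified stand-ins, not YouTube’s actual obfuscated player code): the extractor would first locate the name of the function the player applies to the `n` query value, then extract that function’s body and evaluate it with a JS interpreter such as youtube-dl’s `JSInterpreter`.

```python
import re

# Toy stand-in for a fragment of the player JS. The real code is obfuscated
# and the pattern changes between player versions -- illustrative only.
PLAYER_JS = 'var c=a.get("n"))&&(b=nfunc(b),a.set("n",b))'

def find_n_function_name(player_js):
    # Hypothetical pattern: find the function applied to the "n" query value.
    m = re.search(r'\.get\("n"\)\)&&\(b=([a-zA-Z0-9$]+)\(b\)', player_js)
    return m.group(1) if m else None

name = find_n_function_name(PLAYER_JS)
# A real fix would now pull that function's body out of the player JS and
# run it on the n value via a JS interpreter (e.g. youtube_dl's JSInterpreter),
# then substitute the transformed n back into the videoplayback URL.
```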

I have this too, and I find that if I set youtube-dl to resume and restart the script, it regains full speed!

try the workaround in yt-dlp. @tzarebczan @rebane2001

yt-dlp "https://www.youtube.com/watch?v=exampleURL" --extractor-args youtube:player_client=android --throttled-rate 100K

see https://github.com/yt-dlp/yt-dlp/releases/tag/2021.07.07

I can vouch for this workaround as well. Switched to using it a few hours ago and already a mind-blowing difference. Downloading is so much more efficient now. Perfect!

It went away for us for 2 weeks and is now back. Any other updates? Would be willing to put a bounty in crypto for this.

Here is one reprieve for anyone who only wants to download a few videos to their home computer:

  • Open the video(s) in your browser
  • Then use the option --cookies Youtube_Cookie.txt with youtube-dl, where Youtube_Cookie.txt is the Youtube cookie file you exported from your browser (e.g. with a cookies.txt extension)
  • You should get full speed without any throttling.
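In script form, the steps above boil down to a single invocation (the cookie filename and URL are placeholders from the comment, and the actual download call is commented out, so this only shows the command construction):

```python
import subprocess

def build_cookie_download(url, cookie_file="Youtube_Cookie.txt"):
    """Build the youtube-dl command for the cookie workaround.

    cookie_file must be a Netscape-format cookies.txt exported from the
    same browser session in which the video was opened.
    """
    return ["youtube-dl", "--cookies", cookie_file, url]

cmd = build_cookie_download("https://youtu.be/8PecfdkEM2Y")
# subprocess.run(cmd, check=True)  # uncomment to actually download
```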

@putara seems to have come up with a decryption solution to resolve the Javascript interpretation: #2222 for the Invidious project. I hope this can somehow be ported to Python for youtube-dl.

UPDATE: I’m having success by varying IP addresses, in conjunction with the --rm-cache-dir option in youtube-dl. I have a subscription for 10 instantproxies.com proxies.

Before downloading each video, I run youtube-dl --rm-cache-dir to remove any data previously cached by youtube-dl. Then I route the connection for the next video download through a different proxy from the last: youtube-dl --proxy XXX.XXX.XXX.XXX:PORT …

This way, though the download does not consume my full available bandwidth, it’s at least not the sinfully slow 70 KB/s. I think that whenever a youtube-dl request presents itself to Youtube as a fresh connection, it is not throttled, or at least not as much. I also use the --force-ipv4 option to add further variation in the IP which Youtube sees.
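A sketch of that rotation as a script (the proxy addresses are placeholders, and the subprocess calls are commented out so this only shows how the commands are built, cycling to a new proxy for each video):

```python
import itertools
import subprocess

# Placeholder proxy pool -- substitute your own HTTP proxies here.
PROXIES = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]

def commands_for(urls, proxies=PROXIES):
    """Yield (cache-clear, download) command pairs, one pair per video,
    rotating through the proxy pool so each download looks like a fresh
    connection to YouTube."""
    pool = itertools.cycle(proxies)
    for url in urls:
        proxy = next(pool)
        yield (
            ["youtube-dl", "--rm-cache-dir"],
            ["youtube-dl", "--proxy", proxy, "--force-ipv4", url],
        )

# for clear_cmd, dl_cmd in commands_for(["https://youtu.be/8PecfdkEM2Y"]):
#     subprocess.run(clear_cmd, check=True)
#     subprocess.run(dl_cmd, check=True)
```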

Gotcha, thanks! Hopefully it gets fixed soon!

@JJ840 No. How to solve this has been figured out in https://github.com/ytdl-org/youtube-dl/issues/29326#issuecomment-865985377, but there is no actual implementation yet