With HTTP/2 gaining popularity, some of the optimization techniques developed for HTTP/1 are becoming counter-productive. Specifically, because connections are multiplexed, HTTP requests are cheap in HTTP/2, so combining unlike files can actually hurt performance by reducing the effectiveness of caching.
It strikes me that the original purpose and design of this script fits quite well in an HTTP/2 environment, but I'm curious to hear what the developers think.
Here's a whitepaper on performance in HTTP/2 that suggests unwinding some HTTP/1-based optimizations going forward.
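To make the caching argument concrete, here's a minimal sketch (not part of this project; file names and contents are hypothetical) showing why concatenation hurts cache hit rates: with content-hashed cache keys, editing one small file only invalidates that file when assets are served individually, whereas a combined bundle's key changes and the whole bundle must be re-downloaded.

```ts
// Hypothetical illustration of cache keys for separate files vs. a concatenated bundle.
import { createHash } from "node:crypto";

const hash = (s: string) => createHash("sha256").update(s).digest("hex").slice(0, 8);

const files: Record<string, string> = {
  "vendor.js": "/* large, rarely-changing library code */",
  "app.js":    "/* small, frequently-changing app code v1 */",
};

// Cache keys when files are served individually (HTTP/2 multiplexing makes the
// extra requests cheap): each file is cached and invalidated on its own.
const separateKeys = Object.fromEntries(
  Object.entries(files).map(([name, body]) => [name, `${name}?v=${hash(body)}`])
);

// Cache key for an HTTP/1-style concatenated bundle: any change to any input
// changes the bundle's key, so the entire payload is fetched again.
const bundleKey = `bundle.js?v=${hash(Object.values(files).join("\n"))}`;

console.log(separateKeys, bundleKey);
// After editing only app.js, vendor.js keeps its key (still cached),
// but bundle.js gets a new key and must be re-downloaded in full.
```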