Breaking Web: Mozilla plans to deprecate non-secure HTTP
Mozilla plans to make fundamental changes to the Firefox web browser with regard to non-secure HTTP content on the web.
According to a new post by Richard Barnes on the organization's security blog, Mozilla plans to make new features only available to secure websites in the future and phase out features for non-secure sites gradually as well.
The reason behind the decision is "a broad agreement that HTTPS is the way forward for the web".
The organization acknowledges that there will be trade-offs between security and web compatibility when features are blocked from running on HTTP sites.
While this does not mean -- yet -- that HTTP support is removed completely from Firefox, it is a first step in that direction.
So what does this mean for users and webmasters?
It is fair to say that there will be sites that won't be upgraded to HTTPS. Even if certificates are available for free, it still requires time and the necessary infrastructure to implement it.
While this may work for self-hosted and virtual servers, solutions like Let's Encrypt won't work for shared hosting accounts.
Most web hosting companies offer upgrades to make sites HTTPS, but at an additional cost that not everyone is willing to commit to.
If you run a personal blog for instance or an informational site on a shared hosting account, you may not want to pay $20 or more per year extra just to keep it compatible with certain web browsers.
Webmasters who cannot afford to buy these certificates or install them for their sites -- it is still a technical process which usually requires quite some troubleshooting on the site itself to get it right -- will face an uphill battle against feature deprecation in future browser versions.
Internet users may benefit from improved security on the web in the long run, but they may also run into websites that no longer work properly, or at all, due to feature deprecations. This raises the question of whether this is the best method to move the web towards HTTPS.
Several commenters have already mentioned that they dislike Mozilla strong-arming webmasters by removing feature support for HTTP websites in the Firefox browser.
Mozilla is not the only organization that plans to do away with HTTP. Google will mark non-HTTPS sites as insecure in Chrome in 2015, but that is far removed from Mozilla's proposal to limit the functionality of HTTP sites on the Internet.
The core idea behind the announcement on Mozilla's Security Blog is to give webmasters and companies enough time -- years -- to make the necessary changes to their web properties before features are removed for HTTP sites.
This could backfire big time, especially if Mozilla's vision of an ideal Internet collides with reality.
The word “secure” doesn’t mean what it should in this day and age. Mozilla has no business forcing Orwellian concepts onto users who don’t know better. It’s like putting a bandage on a bloated corpse that has been lying in the tropical sun for three weeks and telling everyone it’s a revolutionary medical technique they just came up with.
My comment was either deleted or is waiting for approval on their page, so I’m posting it here :)
btw, Pale Moon FFS
Are we going to end up like in the movie “Idiocracy”, where all the people in charge are dumb?
– MS wanted to change the way people look at computers with Windows 8, found out it was their way only, and people decided to remain on Windows 7.
– Mozilla wanted to join the “identical interface” club (Chrome, Opera) with Australis and lost people (I switched to Pale Moon; while I like Chrome’s interface, I also loved the previous Firefox, just not the new one).
Now Mozilla moves to cripple sites with its browser. Enjoy your drop in users.
PS: MS had the advantage that there is no real OS that can compete with Windows, and the choices were to stay with the old Windows 7 or move to Windows 8, so there was no real drop in “Windows” users. In the browser arena, however, there are some very good choices.
About time. All browsers should be moving in this direction. I still find it incredible that in 2015 we’re still sitting with websites that don’t use https by default. I really expected the push for forced https to start in earnest around 2010 – 2012, but we’re still stuck in the dark ages it seems.
Why should BROWSERS do that? They need to make the web accessible, not decide how the web looks.
Yeah, I agree. It should have been standard right from the very start, but nowadays, with so many people using public Wi-Fi, it really is long overdue. Although PKI has its flaws, at least it provides some way of authenticating that the website you’re visiting really is that website and that its contents haven’t been intercepted/modified by a third party. With certificate pinning also looking to become the norm, I’m all for it. Guarantee those ad-injecting scammers will start bleating though. Oh well, I guess they’ll just have to pay Lenovo to install malware at the factory instead to get round it… :O
Next step, end-to-end encryption for email. Due to the vast amount of confidential information that’s sent via email, it needs tightening up as well.
As a plan, Mozilla’s idealism is at this time at least an incentive for websites to switch to HTTPS. But as Rott Weiller (Hello France!) mentioned above, I’m not sure that idealism will be carried by the people, web administrators included. In a democracy, the truth is that of the masses, be they wrong.
Wait and see. I don’t know whether Mozilla’s idealism is stronger than its pragmatism, but should it harden into stubbornness, many (unconvinced) users would likely move on to other browsers. I’d prefer a worldwide communication plan exhorting users to avoid non-secured domains as much as possible, one that calls upon them to exert pressure on those domains… but saying so, I may well be an idealist myself!
People at Mozilla, Google and Microsoft should realize that they don’t own the Internet and should have no say in what it looks like. Their concern should be building quality browser software, not what protocols are used on the Internet.
Who cares? When they do this Firefox will be forked – if it isn’t forked before then over the add-on signing thing.
And no Pale Moon isn’t a viable alternative. Pale Moon is a mess.
Pale Moon is a godsend for people who dislike Australis. SeaMonkey too.
The browser that is a real mess is Firefox itself!
+1 on Pale Moon & SeaMonkey
Oh well… What about people who host small projects and don’t have the knowledge to move everything to HTTPS?
This is clear discrimination through standards. There are many reasons why someone will not switch to HTTPS – and if it is, for example, a static-only webpage, I see no problem with staying on HTTP. HTTPS makes sense where there is dynamic content – forums, mail providers, online banking – anywhere the user needs a password or a secure connection.
But not for small little websites that, for example, just present a single simple page or something like that.
My not existing respect for Mozilla is shrinking more and more. Well done morons!
HTTPS is protection against modification of the page by a third party, and it hinders the ability of others to know which exact page of the site you looked at. So there is some benefit.
However, things like self-signed HTTPS aren’t accepted – so in the big scheme of things, moving to HTTPS isn’t easy for everyone, and Mozilla’s move is foolish.
I agree with David, Pale Moon has once been a true alternative to those people who didn’t want to use a Mozilla-branded Firefox browser, yet remain closely tied to the code-base nevertheless. With its continued development this gap has widened, until suddenly users were directly affected by it in the wake of extension incompatibilities of Australis and beyond. Nowadays one cannot recommend Pale Moon without reservations anymore, which is a sad truth in my opinion.
The Firefox market share is definitely shrinking, but it’s not as fast as some people make it out to be. Globally, about 12 to 20 percent of all Internet users do so with a Mozilla Firefox browser. These numbers vary so wildly, because all the different market research companies can only use small snapshots of data to build their estimations and the companies don’t always use the same criteria to arrive at the final statistics. StatCounter for example counts billions of page hits, which means that a single Firefox user visiting a webpage daily counts 7 to 8 times more toward the statistic than an Internet Explorer user doing so only once per week. While NetMarketShare counts unique page views (per IP, cookie, whatever), it multiplies/divides these raw results by global user data per country. This means, that any national or regional trends are always going to be biased. Here’s a good example for those regional deviations: I’m a regular visitor over at ComputerBase, a German language IT online magazine, where they only published data of their own user statistics, which show Firefox in the lead with 42.1 percent down from 44.3 percent the year before.
But back to topic: no matter if the market share of their browser is 12, 20 or 40 percent, Mozilla needs to be concerned with keeping users and not inventing new methods of annoying them. Although their losses in market share have slowed down to a trickle, now that Google Chrome has become an established market leader, they should have learned from the much criticized Australis debacle and think about what would a) keep users loyal and b) bring old users back. At one point in time, about ten years ago, they were on their way to take the lead away from Internet Explorer. Now they’re in a similar position, having long lost that lead of course, but they don’t have the advantages of a huge company like Microsoft to support their battle back to the top.
That they’re even leading a discussion about the removal of features from unencrypted browsing via HTTP is unconscionable. I think Google’s way is definitely the best, because a website accessed via the HTTP protocol is by definition not secure and it should be marked like that. Nevertheless, there is plenty of data on our world wide web which can safely be accessed without an encrypted connection, and I for one don’t need Mozilla to scold me that I shouldn’t use it anymore. As long as the data transferred is not sensitive, as long as at most only a throwaway account is affected, there’s no reason to panic.
Instead of thinking of the glory days and ignoring the facts, Mozilla must begin looking to the future once again. Their mobile browser is not a bad idea and I personally like the extension capability, but with less than 1 percent market share in 2014, their mobile browser was counted among the two to five percent of “other browser” in many statistics. Sadly, this hasn’t changed in 2015. Even the niche browser Opera and Internet Explorer, the latter of which has a really small distribution because it’s only available on Windows-based mobile devices, are loads better than that.
And again… for a sane browser with sane developers…
These guys know that it is not good to anger users, and they support the more advanced ones too!
Yeah, this is a really bizarre idea. Somebody thought they’d get ahead of the curve and look “cool” beating everyone to HTTPS. I can tell you that my own sites won’t be HTTPS unless my provider makes it free. I run a player finder for tabletop gamers, not a banking service.
I think it is not that difficult.
Just change the DNS name servers to CloudFlare ones and you get free HTTPS on your website in addition to other types of protection and CDN for your site. You may have to add code to redirect HTTP pages to HTTPS but it works without any trouble.
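The redirect mentioned above is simple in principle. Here is a minimal, hypothetical sketch of the idea in Python's standard library (the class and function names are my own; in practice you would configure this in the web server or CDN rather than run a script):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToHTTPS(BaseHTTPRequestHandler):
    """Answer every GET with a 301 pointing at the HTTPS version
    of the same host and path."""

    def do_GET(self):
        # Reuse the Host header the client sent, minus any port.
        host = self.headers.get("Host", "example.com").split(":")[0]
        # 301 tells browsers and search engines the move is permanent.
        self.send_response(301)
        self.send_header("Location", f"https://{host}{self.path}")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet

def serve(port=8080):
    # Listen on all interfaces and redirect until interrupted.
    HTTPServer(("", port), RedirectToHTTPS).serve_forever()
```

A request for http://example.com/page would be answered with `Location: https://example.com/page`, after which the browser retries over TLS.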
>Just change the DNS name servers to CloudFlare ones
I’ve found Cloudflare is incompatible with Tor and some VPN nodes. It’s a good way of limiting traffic to sites.
What about this browser: https://www.whitehatsec.com/aviator/. Has anyone ever heard of it?
Just a Chromium rebuild.
If you want Chromium with real modifications…
http://Opera.com – I dislike Opera because they betrayed power users much like Mozilla did, but it still offers more than Chrome.
Yes I’ve heard of it, but I don’t think this Chromium fork is all that interesting or recommendable. I could say a lot, but the text on their website should already tell everything you need to know. Aviator is not under active development anymore! By now it is 5 versions out-of-date compared to Chromium, the last commit on their GitHub project page is from February 9th and there has been no official release either since the version based on Chromium 37 (which is still available as download from their webpage).
Anyway, the only obvious things Aviator does different from Chrome are those two: first, it ships with two extensions (Disconnect and a user-agent switcher) and second, it seems to run in protected/incognito mode by default. You can install the extensions easily by yourself and if you always want to run in incognito mode, you can do so by adding a command-line switch to the Chrome shortcut (-incognito).
“If you run a personal blog for instance or an informational site on a shared hosting account, you may not want to pay $20 or more per year extra just to keep it compatible with certain web browsers.”
This… I have no idea why I would need HTTPS, it’s not like they’re going to send personal information to me… I just use my site to post random crap.
Two important points here:
* It’s more than encryption: this move has the added advantage of signing, meaning you have some sense that what you’ve asked for and where you’ve arrived are one and the same. If you go to http://www.yourbank.com your browser can tell you if your connection has been hijacked. Wider usage of this tool is good for the web as a whole.
* If you’re running a static webpage with not much on it, this won’t affect you. It’s right in Martin’s second paragraph: Mozilla is setting this up for NEW features. The small time bloggers and developers that can’t afford the extra cash for a certificate won’t be impacted. If you have advanced tools and functionality (the sort that might include vulnerabilities) you SHOULD have some layer of security on top of that. If ongoing Flash vulnerabilities have taught us anything, it’s that cool stuff often has a catch.
webfork, HTTPS does not protect you against vulnerabilities, it just prevents intercepting your traffic.
If HTTPS were just encryption, you’re right it would only help prevent interception. However, the protocol helps make sure that the website you visit is the one you want. So for example, lots of websites spoof other websites (almost every phishing attack) by behaving like an intermediary, and you’re more likely to be attacked by a website you don’t know than a website you do.
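The authentication step webfork describes can be illustrated with Python's `ssl` module, whose default context performs the same chain verification and hostname check a browser runs before showing the padlock. This is a sketch with helper names of my own choosing:

```python
import socket
import ssl

def flatten_subject(cert):
    """Flatten the nested 'subject' tuples that ssl.getpeercert()
    returns into a plain dict, e.g. {'commonName': 'example.com'}."""
    return {name: value for rdn in cert.get("subject", ()) for name, value in rdn}

def verified_subject(hostname, port=443):
    """Connect over TLS and return the server certificate's subject.
    create_default_context() refuses the handshake unless the chain
    verifies against the system trust store AND the name matches --
    a spoofed or intercepted site raises SSLCertVerificationError
    instead of silently connecting."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return flatten_subject(tls.getpeercert())
```

The point is that the failure mode is loud: a phishing host presenting someone else's certificate cannot complete the handshake, which is the property plain HTTP lacks entirely.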
webfork, I agree, HTTPS helps you identify the site you are browsing, but that still doesn’t help you to avoid vulnerabilities. And that is because a vulnerability can be present on a legit site and on a malicious site…
I’m torn on this move. Obviously, promoting HTTPS and security is good, but removing all HTTP support could really kill the browser, since I think that many, many sites will never be upgraded. Compatibility will be a problem, and for non-power users, this might cause them to leave Firefox. Heck, probably many power users will, although I personally would just use Microsoft’s new Edge for HTTP websites. It would be annoying to have to use a different browser just to access some sites though. It seems like a bold move ready for a big backfire.
Personally I think it’s a good step forward to retire protocols that are 20 years old or more but still in use. By the way, it wasn’t Mozilla that started this, it was Google (remember that HTTPS sites are preferred over non-secure sites, and SPDY was developed by and with Google).
I think the problem is not that sites need to be upgraded; it’s the providers who are forced to upgrade, since they provide the features. As long as IPv6/SPDY/IPsec and others are not widely used, it’s a shame that we need to “fix” such things on the browser side and not directly in the protocols. The problem is that as long as the user has control over options, there can be security risks; if a protocol were well designed to really secure something, the user wouldn’t need to change anything (except to choose a browser that supports it). But updated software/hardware is always a must if we are talking about security (the stay-up-to-date concept).
About the costs and the extra traffic for authentication, I don’t see that as so critical. The whole HTTPS push doesn’t come out of the blue; we have already had enough time to update, and money shouldn’t be a problem if we really want security – we are not talking about that much more, and mostly the upgrade is worth it. And of course, if everyone used it, the costs would be much lower than they are today.
Personally, I don’t think it makes sense to encrypt everything, but on critical sites like banking it’s definitely recommended to encrypt all traffic to prevent MITM attacks and the like. On login pages, at minimum the form fields should be encrypted, but “securing” every picture and the like really makes no sense.
As much as I love Firefox, I hate the draconian attitude of the core developers.
I’m going to have to wait and see if I’ll still be able to use it to develop my own website. My provider is geared towards vanity sites like mine and doesn’t offer HTTPS at all.
I tried Palemoon last week, but the Greasemonkey extension isn’t compatible. Apparently it’s due to some functionality Palemoon doesn’t offer. Neither developer seems willing to budge on the issue.
It’s very possible that I’ll have to abandon Firefox and all its siblings if Mozilla manages to either break HTTP badly enough or deny me the ability to implement new technologies.
Try out http://vivaldi.com – all what a power user will love :)
SeaMonkey is Gecko-based, has none of the Firefox annoyances, and you can use many Firefox add-ons thanks to the MozillaZine add-on converter; there may also be a SeaMonkey port of the add-on you use available, so I recommend checking it out :)
I’m reasonably sure SeaMonkey will inherit this policy, albeit with a few months of delay. I don’t know about Vivaldi’s security plans.
Oddly enough, I found the SeaMonkey port a few minutes ago. Thanks for pointing out the MozillaZine converter. So far it has worked on Pocket and Scrapbook X.
I’m a bit hesitant to switch over to Vivaldi this early in its development, because I use a lot of extensions that are going to be difficult to replace.
Vivaldi is already capable of extensions, but installing them might take some manual work and those extensions requiring an icon to use/configure will be missing most if not all of their functionality. Once the developers have reached the point where extensions work properly, the browser should be ready for personal use at least. The engine it is based on is rather stable after all.
Heartbleed, anyone? HTTPS can itself become a vulnerability, and there are other HTTPS exploits around – there have to be, given the complex nature of encryption.
Not to mention that high-traffic websites will need massive upgrades, and anonymity will also suffer.
Feature removal, permanent add-on API deprecation and other idiocies – Mozilla never ceases to amaze me, boneheadedness at its best.
It is a shame that there really is very little in terms of alternatives. Vivaldi is still based on Chrome, with its countless tracking and other exploitative features at its core – e.g. WebSockets, an open port to control the browser from afar; it may have uses, but it’s a vulnerability by design. Chrome also lacks the customizability of Firefox with its thousands of options.
Vivaldi will be like old Firefox and old Opera.
If you really want a browser which you can customize to the bone, Vivaldi will be the one, but first it has to reach a stable state. :)
This is going to cause problems on web hosts that use virtual Apache hosting instead of full virtual servers. Some older, yet still supported, versions of long-life Linux distributions ship a version of Apache that does NOT support virtual Apache hosts over HTTPS on the same IP address. Many cPanel websites are based upon Red Hat Enterprise Linux 5, which itself has this issue.
More info can be read here:
RHEL 5 has a version of Apache that is 2.2.3-x, far below the required 2.2.12 minimum.
Other than voting with their feet, the customers of those websites have no control over this.
Mozilla has become infected with ‘spend other people’s money’ syndrome (hey you, just go pay for an upgrade!) that is so prevalent in open-source vendors and support.
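For context on the Apache version threshold the commenter cites: what 2.2.12 added is client support for SNI (Server Name Indication), the TLS extension that lets one IP address serve many HTTPS sites by telling the server which certificate to present during the handshake. A rough client-side sketch, with helper names that are my own invention:

```python
import socket
import ssl

def sni_context():
    """A client-side TLS context with chain verification and
    hostname checking enabled, as a browser would use."""
    return ssl.create_default_context()

def handshake_with_sni(ip, hostname, port=443):
    """Handshake against `ip` while sending `hostname` in the SNI
    extension. Without SNI the server can only present one
    certificate per IP:port, which is why pre-2.2.12 Apache builds
    (such as the 2.2.3-x shipped with RHEL 5) cannot host multiple
    HTTPS sites on a shared address. `ip` and `hostname` are
    caller-supplied placeholders."""
    ctx = sni_context()
    with socket.create_connection((ip, port), timeout=10) as sock:
        # server_hostname doubles as the SNI value and the name the
        # returned certificate is checked against.
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.version()  # negotiated protocol on success
```

Note the flip side for the shared-hosting sites discussed above: some older browsers (notably IE on Windows XP) never sent SNI, so SNI-based HTTPS hosting was not universally safe to rely on at the time either.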
History repeats too quickly in Firefox development…!
Why do they have to *force* users and webmasters into another of their ideas? In my opinion, it’s just unnecessary to use HTTPS everywhere (e.g. on gHacks). Why don’t they just add an option (and set it as default for new installations if they want…) that the user can disable if he/she doesn’t need HTTPS forced?
And it’s not like HTTPS is a solution to everything… It encrypts traffic end to end, but it only authenticates the destination web server insofar as you trust the certificate authority (would you really trust all the CAs in your browser’s default list?). HTTPS proxies can simply hand out fake certificates the browser won’t complain about if a CA certificate has been installed (e.g. on company networks…).
At the risk of people severely disagreeing with me, I think my opinion might clarify my (and Pale Moon’s) stance on the matter:
– HTTPS, although great for preventing eavesdropping, is not a magic wand. Using it indiscriminately for everything (especially public data normally served over HTTP) is actually detrimental to the web as a whole, because due to its end-to-end nature, too much data will have to be retransmitted and retransmitted, and retransmitted again; analogous to having individual armored trucks delivering the morning newspaper to all houses in the street, instead of just one paperboy.
– What about CDNs? Does anyone even think about the potential implications of having to know the exact infrastructure of your CDN and the maintenance involved in dynamic infrastructures of them if you want to securely serve CDN-hosted material over HTTPS?
– Wanting to move all this public information to https “or else be crippled” is total insanity. What is Mozilla thinking – that web users are going to stay with their browser if it starts failing to render the websites as-intended just because it’s served over HTTP? What about all the users who, due to their location, can’t use HTTPS?
– Many website owners and bloggers and even company websites are not going to be worried about their data being altered in-transit, because, honestly, that kind of attack isn’t free to pull off and needs an incentive – and setting up something like this kind of attack for simple, public, open-web information has no incentive. People are not going to do this “just because”, even if it would be rather trivial if you have access to a conveniently located transparent proxy that can do this.
– In my opinion, making secure servers commonplace will also train people the wrong way: to no longer pay attention to the secure state of a website when it actually matters.
– It will likely also push people to start using self-signed certificates on the open web. This is BAD. Don’t do this.
A very clear, sane and well explained opinion. Thanks, Moonchild!
I am not internet-educated. But that being said, I remember reading years ago that the web was so terribly insecure because it was designed by universities and the govt. to be “open”. OK, the “open” part was fabulously successful, and now the whole world lives on the web. But that “open” thing has caused the HUGE hacking (& related problems) and it has exposed us all to really bad stuff – ID theft being just one of those bad “stuffs”. So, before we get even deeper into trouble, what would it take to build a new web infrastructure designed for the security of our data – health data, financial data, ID data, and the like, with heavy encryption required everywhere it was appropriate? Would this make more sense than all the security patching, security software, password stuff, and all the rest that has proven to be a gigantic and costly failure? Just asking. Seems like we’re just pouring in good money after bad, and it’s getting worse.