Let's face facts, as much as we like to think we are protected, many times we are not. Our computers get viruses.
First of all, Mac OS X is vulnerable to attack; it is simply attacked less. Apple software was rated the most vulnerable in the world, even over Oracle's software! If you don't believe me, look at this: http://www.ahitagni.com/?p=282. While I do agree that some servers are extremely insecure, most major companies' networks are well protected, and they spend millions if not billions to keep it that way. That doesn't mean they are not vulnerable to attack; even companies that follow best security practices get hacked. It happens to everyone, so you just have to be prepared and have an incident response team ready to clean up and fix whatever happened (it's not IF it happens, it's WHEN it happens).

The defensive side of computer security is much harder than the attacking side, because when you're defending you have to protect a large number of servers, workstations, etc., while all the hacker needs to do is find one way into your network. Think of it like Homeland Security: they have to protect all the major cities, airports, etc. from terrorists, but all the terrorist needs to do is attack one city, airport, or subway and his mission will be successful.
Would it bother you if the internet went down forever?
It is extremely unlikely, if not impossible, that the internet would go down forever (and if there were some major catastrophe, it would be fixed and the internet would come back up). But theoretically, if it did, yes, it would bother me. If it happened, the world would probably end up with a lot of smaller WANs (e.g., the United States has its own network, Russia has theirs, China has theirs, etc.), so the web wouldn't be worldwide. Even without the internet, companies would still use computers; they would use them to send email, etc., internally on their own networks. But it wouldn't be as nice as having access to the internet, because employees couldn't access the corporate network from home since they wouldn't have internet access, or the company would have to lay cabling to all the employees' houses (expensive) so they could work from home.

I actually wouldn't mind it if all the countries (Russia, China, the United States, etc.) had their own private country-specific networks, because it would help prevent other countries from attacking our government's networks remotely; they would have to somehow get into the United States' country-wide network, which isn't accessible from the outside world. It would make it much harder for countries to wage cyber warfare against each other, because they wouldn't have access to each other's networks unless they were able to sneak into the country and access its country-wide network (we could abbreviate it CWAN, haha).
Now for those who are RUDE, think about this. When a website goes down, it is often the result of problems with the mainframe. Many times this is an easy fix, but what if 100 mainframes went down at once? What if 1,000 mainframes went down at once? While I hear people state there is no way the internet could ever go completely down, it is possible. And many times this possibility comes from our own computer programming.
First of all, most businesses don't run everything on a single mainframe; they usually run lots of smaller servers and combine that with load balancing, so even if quite a few servers go down, their website will still be up, even if their entire data center gets destroyed (which I seriously doubt, considering I have seen data centers that can withstand an F5 tornado, have built-in power generators, etc.). Even if a data center did somehow get destroyed, companies have backup data centers that are used in emergencies like this. I agree with you that it is POSSIBLE for the internet to go down completely, but it is EXTREMELY unlikely, and even if it did go down, people would fix it and it would be back up again.
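To give a rough idea of what load balancing buys you, here is a minimal Python sketch (the backend addresses are made up, and this isn't any particular company's setup): requests are handed out to a pool of servers in round-robin order, and any backend that fails a basic health check is skipped, so the site keeps answering even if a few machines drop out.

    import itertools
    import socket

    BACKENDS = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]  # hypothetical backend web servers
    _pool = itertools.cycle(BACKENDS)                   # round-robin order

    def is_healthy(host, port=80, timeout=1.0):
        # Crude health check: can we open a TCP connection to the backend at all?
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def next_backend():
        # Hand back the next healthy backend, skipping the ones that are down.
        for _ in range(len(BACKENDS)):
            candidate = next(_pool)
            if is_healthy(candidate):
                return candidate
        # Every backend failed its health check -- this is the point where you
        # fail over to the backup data center mentioned above.
        raise RuntimeError("no healthy backends available")

    # Each incoming request gets forwarded to next_backend() instead of one big mainframe.

Real load balancers (hardware appliances, nginx, HAProxy, etc.) do this with far more sophistication, but the principle is the same: no single machine is a single point of failure.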
many times this possibility comes from our own computer programming.
I disagree with you on this, because most of the software running on servers is ROCK solid. On servers you don't run the nightly developer build of software; you run the tried-and-true versions that have been widely used and tested. Major bugs or problems mostly show up in the more current versions of software, and on servers you usually want to run the stable, long-term-support versions (e.g., on servers I run the 2.6.x or 3.2.x versions of the Linux kernel, not the most current version, 3.9-rc1). Not that there aren't any vulnerabilities in stable versions of software; all I am saying is that there are very few, and almost no reliability problems, since the software has been tested time and time again and there usually aren't new features being thrown in like there are in the most recent development versions.
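If you wanted to enforce that kind of policy, the check is simple enough to script. Here is a rough Python sketch (the list of "long-term" series is just the two examples above, not an authoritative LTS list): it looks at the kernel release string, rejects release candidates like 3.9-rc1, and accepts only the stable series you've decided to trust.

    import platform
    import re

    LONG_TERM_SERIES = ("2.6", "3.2")  # example stable series from above, not an official LTS list

    def kernel_is_conservative(release=None):
        # Return True if the kernel release string looks like a stable long-term build.
        release = release or platform.release()    # e.g. "3.2.51-generic" on the running box
        if "-rc" in release:                        # reject development release candidates like "3.9-rc1"
            return False
        match = re.match(r"(\d+\.\d+)", release)    # pull out the major.minor series
        return bool(match) and match.group(1) in LONG_TERM_SERIES

    print(kernel_is_conservative("3.2.51"))    # True  - stable long-term series
    print(kernel_is_conservative("3.9-rc1"))   # False - development release candidate

It's a toy check, but it captures the point: boring, well-tested versions on the servers, bleeding-edge versions on your test box.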