theory: data corruption

I've been told I'm wrong... but I just keep looping back...

Here's an article I looked at... the link is right here:
http://www.zdnet.com/article/data-corruption-is-worse-than-you-know/

Now, I'm not so sure I have all the technical bits right, but in summary: the more data you feed the machine, the more likely you are to get an error somewhere in it.
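To put rough numbers on that idea: if every bit you read has some tiny, independent chance of coming back wrong, the chance of at least one error grows with the amount of data. Here's a back-of-the-envelope sketch in JavaScript... the per-bit error rate is a number I made up for illustration, not one from the article:

// Chance of at least one bad bit when reading N bits, if each bit
// independently has probability p of being read back wrong.
// p is an assumed, illustrative figure, not a measured one.
const p = 1e-14; // assumed per-bit error probability

for (const tb of [1, 10, 100]) {
  const bits = tb * 8e12; // bits in tb terabytes
  // 1 - (1 - p)^bits, computed with log1p/expm1 so it doesn't round to 0
  const pError = -Math.expm1(bits * Math.log1p(-p));
  console.log(tb + " TB -> chance of at least one error: " + pError.toFixed(3));
}

The exact rate doesn't matter much; the shape does. Double the data and you roughly double the exposure, which is the article's point as I understand it.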

My theory... speed.   The faster and the more you manipulate something, the more likely it is to become corrupted from its original state.  It just is.  In nature too...  run a piece of equipment enough, even with constant maintenance, and it will need parts replaced; it will be altered from its original state.

An animal, a plant, a living thing... it wears, even with all the things in place in its body to keep it alive and healthy. Things alter it, make it sick... it changes... age, time....

If you push something beyond its capacity, it will be corrupted.

I worked on a paper machine; it was a massive machine, and it ran 24-7.  The goal was to not have a 'paper break'.   That occurred when there was a flaw in the paper running through the machine and it ripped or tore, so it could no longer flow through the machine, had to be re-threaded, and the machine had to be 're-started' with a new continuous sheet.

If you ran the machine too fast, that continuous sheet was more prone to flaws; it became too weak.  There were of course other factors that could make the sheet break, but the goal was to make as much paper as the machine could handle, as fast as possible.

But there was a limit.  Our machine was not the fastest.... we ran slower than the fastest machines.  But our machine constantly won awards for the fewest paper breaks, and we produced around 350 tons of paper a day... we had a record 17-day stretch without a paper break....
Which was a huge deal.... paper breaks meant a loss of profit. A machine running paper was all profit; a machine not running paper through it meant loss.

I guess I understand that there are things we want to accomplish with computers that require massive amounts of speed.....  but I'm saying that maybe the speed is what produces all these errors.  It's a machine.  It's made up of hardware and software and electricity..... and it was programmed by humans, built by humans.

In conclusion....
It's a machine... all its components are required to make the end goal of the user happen.  If it's being pushed beyond its capability....  if we're asking it to go so fast, and store so much, that it can't possibly keep the data safe from corruption... well, we built it. Humans built it. Humans designed it. Humans programmed it.  Maybe we're just running things too fast.

The more you edit/manipulate/copy/alter a piece of data, the more likely it is to become corrupted.
corrupted: altered in a way that no longer functions as intended... unnatural.
Therefore, the faster we process data, the more chances there are for errors to occur. Corruption can be linked directly to the speed being too great in relation to the number of processes being done.
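That claim is easy to play with in code. Here's a minimal toy model in JavaScript: each copy of a piece of data gets a small, independent chance of silently corrupting it, and 'faster' just means more copies happen in the same stretch of time. The 0.1% per-copy error rate is a number I invented for illustration:

// Toy model: each copy operation has a small chance of silently
// corrupting the data. More operations = more chances for corruption.
function chanceOfCorruption(copies, perCopyErrorRate, trials) {
  let corrupted = 0;
  for (let t = 0; t < trials; t++) {
    for (let c = 0; c < copies; c++) {
      if (Math.random() < perCopyErrorRate) {
        corrupted++;
        break; // once corrupted, this trial counts and we stop
      }
    }
  }
  return corrupted / trials;
}

for (const copies of [10, 100, 1000]) {
  console.log(copies + " copies -> ~" +
    chanceOfCorruption(copies, 0.001, 20000).toFixed(3) +
    " chance the data got corrupted");
}

Running it, 10 copies come out near 1%, 100 near 10%, and 1000 near 63%, matching the closed form 1 - (1 - p)^n. In this toy it's the count of operations, not the raw speed, that drives the odds... but more speed means more operations in the same amount of time, which is the connection I'm making.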

Like my sister said to me: "It's not the computer's fault.  It does exactly what you tell it to."
