General Question


If people did their jobs too well, would there be a very high rate of unemployment?

Asked by flo (13313 points) June 15th, 2011

It is just a suspicion that I have heard two people voice. I don’t believe it for a second. There is too much yet to be done! Even if man masters one area, there are other areas. There is nature to battle. Do you have a website that explains why that is an erroneous conclusion? And/or would you explain why you believe it is one?


36 Answers

zenvelo:

That’s a false presumption. Increased productivity helps the economy grow, which increases the demand for additional workers.

Coloma:

Things change, people change, therefore job performance changes.

No right/wrong dichotomy, just is what it is.

krrazypassions:

Good work only leads to good things.

“If people did their jobs too well, would there be a very high rate of unemployment?” That would be the case if a lot of people seemed to be doing their jobs well but actually used unfair means, shortcuts, and corrupt practices to fake good results!

Example: if a lot of students used unfair means to score high on the exam, then you would have a lot of “undeserving” candidates taking up the good seats in the higher educational institutes. As a result, the new batch of graduates will be of low quality, because many of them are corrupt individuals who don’t believe in hard work or dedication to their field but instead use unfair means and shortcuts to gain access to avenues much better than they deserve. This will ultimately lead to low-quality products and services, and the economy will tumble. Society will be in turmoil. And it is! Corruption in all fields has caused a lot of trouble for our society and is continuing to do so.

On the other hand, if people are actually working hard and doing their jobs well, it will only lead to the solving of the numerous problems that plague all spheres of mankind. The world is running fine, but there is a lot of mess, which means there is a lot of work to do! New jobs will be created as our standards and efficiency go on increasing and as we are faced with new challenges that we may have overlooked because of our existing mess.

lillycoyote:

Increased productivity is one of the factors that leads to economic growth, so no, I don’t think so. And you’re right, there is too much to be done.

the100thmonkey:

I watched an old edition of Horizon (a BBC documentary show) from the ’70s about the microprocessor. It was introduced with the (rather punchy) thesis: “This is the microprocessor, and it is the reason your children will be out of work.”

As it stands, the increases in workplace efficiency enabled by the microprocessor contributed to a significant increase in both employment and the potential for employability: it opened up whole new areas of work.

I don’t see how ‘perfect’ efficiency from people (the concept is incoherent, in my opinion) would be any different.

Thammuz:

Hm. I can’t really follow the reasoning behind such a fear. It rests on the flawed premise that jobs are a necessary condition for people’s survival, when this couldn’t be further from the truth.

When we finally manage to create abundance, in the sense of creating something that does everything for us without needing anything in return, like superintelligent robots or an inexhaustible source of energy, there will be no point to jobs anymore.

But utopian speculation aside, having better workers would mean fewer firings (obviously, you don’t fire good workers) and would probably reduce the turnover between employed and unemployed people, because the unemployed would be even less desirable, being probably less skilled than the current employees, on top of everything else.

I don’t see how this would lead to more unemployment, rather than to stagnation in the unemployment lines.

LuckyGuy:

Nope. Work better, make more efficient and better products that more people can afford.

Statements like that come from people with limited skills who fear the work ethic and the need to learn new things. Take note.

Thammuz:

Addendum: there is something to be said about planned obsolescence, though.

While skilled workers are doubtlessly beneficial, there are people who argue that too good a product can in fact kill the market.

Personally, not giving a shit about “the market” and considering the customer the foremost reason for a product’s existence, I couldn’t care less if a product does indeed kill the market, as long as said product is worth losing the market to. Like, for instance, the Intel standard of processors.

It does make sense to imagine, though, that in a hypothetical scenario of a company making only a select range of products that are too good (meaning they rarely break and are also ahead of their time, and thus won’t become obsolete anytime soon), the company could saturate the market, earning a lot in the short term but then ending up without customers unless it manages to find areas it hasn’t already flooded itself.

This is a ludicrous scenario, obviously, but that’s the only way I could imagine too much competence being detrimental rather than positive.

the100thmonkey:

@Thammuz – Microsoft.

This is not to suggest that they’re “too good” (Windows is deeply flawed in many ways), but that they have defined the OS market in such a way that they both own it and are hurt by it. Microsoft’s biggest competitor in the PC market is… Windows XP.

I take your point about planned obsolescence, but it doesn’t seem to hold with XP.

Thammuz:

@the100thmonkey That’s because they botched their planned obsolescence with XP. They overestimated how much hardware would change in the years after XP, expecting changes significant enough to warrant the next big release, as had happened with XP itself.

The problem is that most people don’t need a 64-bit system to go about their daily PC usage, nor do they need an incredible amount of RAM.

The need for higher specs in common PC usage reached a plateau during the XP era, and since there hasn’t been any significant change in how people browse the internet, write in Word, or look at their email, there is no good reason to trade up to a new computer and therefore get the next Windows OS. And, let’s face it, that’s how most people change systems.

Even gaming is starting to plateau at this point, partly thanks to the Xbox 360, which forces most developers to make the best of not-so-stellar hardware instead of just expecting people to have PCs that won’t be out for another 5 to 7 years (I’m looking at you, Crysis).

Plus, there are the companies, which are understandably pissed about having to change a system that still works perfectly well for what they need to do, and which are therefore unlikely to shell out whatever ludicrously high sum Microsoft wants for its enterprise licenses.

To sum it up: Microsoft did try to make XP obsolete in time for their Vista release, or Seven at worst, as they did with ’98, ME, and 2000. The problem is that they failed: they expected the desire for upgrades and the production of upgrades to keep growing at the same rate, while the former slowed dramatically in most sectors. So, in a sense, it’s not that XP is too good; it’s that it is good enough and, for the most part, needs no upgrading.

EDIT: Now that I read my own post again, I think it fits Microsoft perfectly. After all, it was during the XP era that they put out the Xbox and thus stepped into the console market, completely untouched by their sweaty fingers up to that point. And, as you said, they did end up shooting their own foot off with XP. So apparently the product needn’t be “too good”; it just needs to be “better than what is needed.”

The_Inquisitor:

If people did their jobs well, that would just be growth in human capital. It contributes to an increase in labor productivity, which also increases the real wage. If people do their jobs well and productivity rises, firms will be willing to pay more per hour of work, which increases both the demand for labor and the labor supply.
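
In textbook terms: a competitive firm keeps hiring until the real wage equals the marginal product of labor,

    w / p = MPL

so when productivity (the MPL) rises, the real wage the firm can profitably pay rises with it, rather than employment falling.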

flo:

What inspired my question was part of an answer in one of my latest OPs: “…On that note, if computers didn’t suck (or if they were easier to use), how many people would be out of a job?”

the100thmonkey:

If computers didn’t suck, almost everyone would have a job.

Thammuz:

@flo If computers were easier to use we could hire baboons to do what people do now.

Thammuz:

@flo I don’t know, but from the blurb it sounds like a presumptuous unemployed programmer wanted to get snarky.

Software sucks, sure, no arguing about that. But a person arguing that he has found the answer as to why (besides human fallibility, which wouldn’t be something to write a book about), and that there is a way to, basically, make humans not commit mistakes, is a fucking moron, no ifs, no buts.

Software development is the work of a team; teams are made of people, and people make mistakes, end of story. Actually, it’s impressive that we can make programs as complex as 3D Studio Max and Photoshop and have them crash so little, given the complexity of what they’re doing.

flo:

@Thammuz “Software sucks, sure, no arguing about that…” and “...people make mistakes, end of story” are not exactly ringing endorsements, any more than your earlier post: “If computers were easier to use we could hire baboons to do what people do now.”

I read this (I’m paraphrasing):

“The writer sounds like he is willing to be more honest than so many people who are too busy showing off: ‘Look how smart I am! Loooooook, just look! ...What? This faulty programming? Well, if you, a layperson, can’t figure out the booboos I make, then… it is your fault.’”

By the way, how necessary are 3D Studio Max and Photoshop? Shouldn’t they be busy perfecting the necessary things instead?

flo:

…By the way, the people who are creating Photoshop, 3D Studio Max, the Xbox, ... may be a different group of people, but I’m talking about the industry as a whole.

Thammuz:

@flo Now, I’m studying computer science, so I could just pull rank on this one, but I won’t, when there’s perfectly good logic I can explain this with.

In short: you’re mistaking usability for reliability.

In long:

On Software sucking:

Using computers has little to do with computers working properly. Most of the time, computers don’t have problems, which is why they are so consistently relied upon by the whole of western civilization; and when they do, the problems are minor inconveniences, assuming those using them aren’t drunk chimps. If you’re working on an important project, you should be saving once every ten minutes, and you should also have a daily backup on at least two separate devices. Why? Because just as you can ruin a piece of paper beyond repair by spilling ink on it, you can lose a file by accident: the hard drive can get fucked up, or you could accidentally delete the file itself. These are normal precautions, and people who take them rarely complain that the computer crashed and made them lose “all their work.” If you don’t take them, you have nobody to blame but yourself; that’s not an “ease of use” issue.

These problems also have nothing to do with usability; they are reliability problems.
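
To make the “save often, back up to two devices” routine concrete, here is a minimal sketch in Python (the file name and destination paths are made up for illustration; any copy tool or scheduled backup program does the same job):

    # Minimal backup sketch: copy a working file to two separate
    # destinations under a timestamped name, keeping old copies.
    import shutil
    import time
    from pathlib import Path

    def back_up(work_file, destinations):
        src = Path(work_file)
        stamp = time.strftime("%Y%m%d-%H%M%S")
        for dest in destinations:
            target = Path(dest)
            target.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, target / f"{src.stem}-{stamp}{src.suffix}")

    # Hypothetical example: one copy on a second drive, one on a network share.
    back_up("thesis.doc", ["D:/backups", "//nas/backups"])

Run something like that on a schedule (or by hand at the end of the day) and a crash costs you minutes, not the whole project.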

As for usability, that’s the only thing that determines who can or cannot use a computer: how easy it is to use. For instance, I can use a command-line program, no problem; people I know just freeze up in front of a command line because they have no fucking clue what to do without a mouse. Both systems are reliable, but one is easier to use than the other. The problem is that the easier one is also much more complicated to program, and that’s why the earliest examples were ginormous pieces of shit; see Windows 3.1.

That’s why we started coming up with methods to efficiently and reliably program mouse-based interfaces. Mind you, we had to come up with them. Computers as we know them have less than half a century of development behind them, and already we’ve reached the point where they can render close-to-photorealistic images in real time, 60 frames per second. And this starting from jack shit. That’s a fucking achievement if I ever saw one. And we’re actually progressing very fast. The problem is that the programs that do fuck up are much more complex than that, in no small part because we also have to make them comprehensible and intuitive. Most people don’t know how a computer actually works and wouldn’t use a program that required them to. Which means we need to (metaphorically) make a guy who knows nothing about pen and paper able to write books, without teaching him about pen and paper.

So, because of this, yes, user-oriented software does suck. User-oriented software is less reliable, but easier to use. It’s the tradeoff you get for wanting stuff to be easier: the difficulty is still there, only you don’t see it, because it’s all on our shoulders.

The programs that crash the least are the ones that do the least, because they’re simpler, easier to write, and easier to debug. The bigger, more complex, and easier to use a program is, the more work falls on the programming team, and thus the more errors slip through the cracks.

That’s why an unreliable yet easy-to-use system like Windows rules the market, while a much more reliable yet much harder-to-use free OS (Linux) is consistently shunned for everything that doesn’t have the utmost reliability as a priority.

As for the “necessary things”:

What is a necessary computer program? You can do pretty much anything a computer allows you to do using pen and paper; computers only make it a fuck of a lot easier to do, reproduce, distribute, and preserve. Which computer programs are necessary depends entirely on the individual user. Photographers want Photoshop to work, 3D artists need Studio Max, and I only need Notepad and gcc, because I program. Which of these is necessary from an objective standpoint? None.

What is necessary is the OS, and even then, no machine works indefinitely without routine maintenance, so even a perfect OS ends up fucking up as time passes, because errors accumulate and compound over time.

That’s why every company produces a different set of programs: programs are made for niche markets, and the only program that everyone needs is the OS itself.

And since humans are fallible, the OS is going to have problems. The more complex the OS, the more problems it’s going to have, mainly because the more stuff there is, the more room for errors there is. If you write a sentence, you can easily spot inconsistencies and errors in it; if you write a book, it’s harder; if you write a book on separate pieces of paper with no page numbers, it’s hard to find mistakes and even harder to spot inconsistencies. And when the OS has problems, the other programs are going to have them as well, plus more of their own.

As for the industry:

You seem to work under the assumption that there is a homogeneous “industry” in computing, as there is in, say, food production. There is no such thing.

The three major OSs (Windows, Mac, and Linux) are completely incompatible with each other (except through emulation, which can’t be relied upon anyway) and, with the exception of Linux, they’re products owned by companies that are in competition with each other, at least in theory. Companies that don’t even share similar development philosophies.

So there is no such thing as “focusing on solving the necessary things”.

FIAT isn’t going to help Ford solve the gas-mileage problems its cars have out of sheer goodwill, because they’re both in the same market and an enemy’s weakness is your strength. And in a market dominated by two titans, like computer OSs, there is no alliance.

To close everything up:

What is transparent from the questions you raised is that you have no idea what software design entails or how the computing “industry” works.

You need to realize that everything a computer does could very well be typed in by hand in 0s and 1s; all software does is make it more accessible (“higher,” in jargon).

The “higher” a program is (i.e., the less actual computer knowledge it takes to operate), the more complex its code is, the heavier it is on the OS and the machine itself, and, ultimately, the more unreliable it is.

That’s simply how it is, and it’s still much more efficient than bashing out line after line of code yourself every time, because a) you do it once and then it’s always there, and b) it requires no knowledge from the end user.
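
A toy illustration of that point, nothing more: the same two characters seen at two levels of abstraction. The higher levels exist so you never have to deal with the bottom row.

    # The same data at two levels of abstraction.
    text = "Hi"
    raw = text.encode("ascii")                      # the bytes the machine stores
    bits = " ".join(f"{byte:08b}" for byte in raw)  # those bytes as 0s and 1s

    print(text)  # Hi
    print(raw)   # b'Hi'
    print(bits)  # 01001000 01101001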

That’s why I mentioned Photoshop and 3D Studio Max: they’re programs targeted at people who may well know nothing about a computer’s inner workings. Photographers aren’t required to know how to program in C++ or Java, and they shouldn’t be.
They also shouldn’t complain if software they understand nothing about crashes sometimes, because if it were up to people with their skillset they’d be out of a job: film and cameras weren’t invented by photographers but by engineers of various fields.

Therefore, just as they can’t honestly complain about the top of the range in cameras without sounding like insatiable whiny twats, they can’t complain about the best software we can put together. Why? Because they had no input in actually making it, nobody is forcing them to use it, and, this being a market based on competition, they can bet their asses that if any company could seriously jump forward and upstage its competition, it would. What we have now is what we can manage now. What we will have tomorrow is what we will be able to manage tomorrow. Simple as that.

flo:

I hope computer experts will respond to that.
But let me ask you this: Let’s say I have to use someone else’s computer. I am reading one of my Yahoo emails, and I open a new window to use their Q&A page instead of finishing reading the email; and then something comes up, and I decide to finish reading the email next time. I will let you finish what comes after that. What should I do?

flo:

@Thammuz By the way, it could be Gmail, or maybe another email service, not necessarily just Yahoo.

flo:

@Thammuz Let me correct myself: I hope other people will respond to that.

Thammuz:

@flo I have to say, I’m usually not one to stop and read a sentence twice because I think I might have misread it, but I had to do it with this one. Several times.

What’s your point? I seriously don’t get it. Are you referring to a specific bug and asking me why it happens? Are you asking me to see if I can read your mind and tell what you would do after opening an email and leaving it only partially read? What?

“Let’s say I have to use someone else’s computer. I am reading one of my Yahoo emails,”
Ok.

“and I open a new window to use their Q&A page instead of finishing reading the email;”
Still ok.

“and then something comes up, and I decide to finish reading the email next time.”
Fair enough. What is the problem with leaving an email only half read, as long as you remember you did?

“I will let you finish what comes after that. What should I do?”
Well, making sense would be a start! A) What in the previous sentences would allow me to figure out what comes next? B) What should you do about what? And C) How in the world would I know? I don’t even know what you’re talking about!

flo:

@Thammuz I rest my case.

flo:

@Thammuz Forget the “what comes after that” part. What is the minimum I should do before I leave the computer? Sign out? There is more.

Thammuz:

@flo “What is the minimum I should do before I leave the computer?”

Ok, NOW I know what you want to know.

The minimum you should do is log out. Most people don’t know or care that their email is still in the temp files, and that pressing “back” or checking the history in offline mode is probably enough to reload their email page (not all providers allow this, though; it depends highly on how well their site is structured).

What you should do to protect your personal data is, if you’re using Firefox, activate private browsing mode, which doesn’t save any files on the computer and clears the temp files after you stop using it, before checking your email; or else manually clear cookies, cache, history, and temp files after you’re done.

Then, if you’re really paranoid, you have to scrub them by overwriting the sectors a few times to prevent their recovery.
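
The “overwrite a few times” idea looks roughly like this in Python (a sketch only; the file name is hypothetical, and on journaling filesystems and SSDs with wear-leveling an in-place overwrite is not guaranteed to hit the original sectors, so dedicated wiping tools or full-disk encryption are safer bets):

    # Sketch of overwrite-then-delete scrubbing. Not a guarantee:
    # the filesystem may not rewrite the original sectors in place.
    import os

    def scrub(path, passes=3):
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(size))  # overwrite with random bytes
                f.flush()
                os.fsync(f.fileno())       # force the pass onto the disk
        os.remove(path)                    # then unlink the file

    scrub("cached_email.html")  # hypothetical leftover temp file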

On a side note: this has fuck-all to do with how much I know about computers. I’m a programmer; I know how fucking software design and programming work, and, while I have no idea how a programmer could not know this stuff, I have to inform you that even if I did not, my point would still stand. This is menial shit that most people don’t care about, even those who work in “the industry.”

I have a laptop; I don’t need to use another person’s PC to check my email. If it’s that urgent a matter, I will have my laptop with me! Furthermore, even if you scrub the data from that particular PC, someone could well have sniffed the data from your connection, or they could have something that relays the data from either your end or the other end (Trojan horses, keyloggers, and shit like that).

When using PCs, you have to realize, your data will never be completely safe. It wouldn’t be even in the best possible scenario, with the best computer and the best software in the galaxy. Sorry.

And one more thing: if you’re trying to use this as evidence of bad design, I’ve got to tell you, this is all so you can have it easy, again. Cookies are there so you don’t have to log back in a billion times in the same session; site histories are there so fuckwits who don’t add a page they need to their favourites (or just jot down the address, for that matter) don’t lose it and have to look it up again; temp files are there to speed up your browsing, because they save the data you already downloaded so you don’t have to download it again every time you go back to a page you’ve already visited; and caches just speed up the process.

So basically, it’s that tradeoff rearing its head again.

flo:

@Thammuz There is more to it than just logging out.

flo:

@Thammuz In your second-to-last post: “you’re mistaking usability for reliability.”
Doesn’t reliability increase usability? I mean, if people find something unreliable, they would be less inclined to want to use it, no?

Thammuz:

@flo “There is more to it than just logging out.”
Yeah, there is: all the other stuff I listed. Clearing cookies, cache, temp files, history. Are you even reading the goddamned posts? And if you think I missed something, could you just get on with it and spit it out?

“Doesn’t reliability increase usability?”
No, it does not.

Usability == ease of use: comprehensibility of the interface, comprehensiveness of the functionality, streamlining of the options. It radically clutters up the code and makes it much more error-prone.

Reliability == the probability of the program not crashing while you’re working. It needs as few errors as possible, and hence more streamlined code. Further proof of this is that server programs generally don’t come with a graphical interface, because they need to be always up and working with as little maintenance as possible.

“I mean, if people find something unreliable, they would be less inclined to want to use it, no?”
People who pick which program to use =/= people who use the program, 90% of the time.

Regardless, apparently not, since Windows is unreliable and still dominates the market hands down.

flo:

@Thammuz Not the clearing of cookies, cache, temp files, and history, but just the simple act of logging out. There is more detail to it.

Thammuz:

@flo Which is? Because, as far as I know, you log out. What the process entails changes from one provider to the next.

Also, what does this have to do with the topic at hand, exactly?

flo:

@Thammuz “Which is?” I will let you find out, just because you probably like challenges.

Thammuz:

@flo Trust me, I don’t. And I couldn’t care less about the question, because it’s completely irrelevant and has nothing to do with the topic, which you are consistently refusing to address.

flo:

@Thammuz It is software-related, but it is “irrelevant”? You don’t like challenges?
Okay.
“And I couldn’t care less about the question.” Sorry, I thought you had pursued the question this far.
“Also, what does this have to do with the topic at hand, exactly?”
The topic is about doing a job well enough, or too well, and you responded regarding flaws when it comes to software, so that is what it has to do with the topic at hand.

Thammuz:

@flo Web security is not relevant as far as software design is concerned, because there are factors that are impossible to fix. I don’t care about that specific question because it has no bearing on the overarching point: the reasons software sucks have nothing to do with the programmers’ skill.

flo:

@Thammuz That is just impressive.
