It Happens
Posted Thursday, September 26, 2013 9:48 AM


Dave62 (9/25/2013)

I wrote my first program on punch cards in the late 1970s.

Really? That's unusual. I never wrote on punch cards or on paper tape; I used to punch holes in them instead or, sometimes, have someone else do that for me after I had written the program on paper.
(Sorry, couldn't resist that)
I don't think the view that our programs will have bugs or may not work entirely as intended is too gloomy. I don't think that's an excuse to be lazy either. I just think it shows the maturity that's required to realize we're not perfect.

It's delusional to arrogantly deploy a solution and then walk away thinking it's perfect and will never need to be touched again.

As soon as we have perfect humans we can then start expecting perfect programs. It may be a long wait...

All that makes perfect sense. Well said.


Tom
Post #1498938
Posted Thursday, September 26, 2013 10:09 AM
L' Eomot Inversé (9/26/2013)
Really? That's unusual. I never wrote on punch cards or on paper tape; I used to punch holes in them instead or, sometimes, have someone else do that for me after I had written the program on paper.


Tom,

You have made me think of those old green and white coding forms we used for Assembler and RPG back in the day. I remember desk-checking a pile of those sheets, trying to verify the correctness of the process and flow, before I had a deck of cards punched. My, how times have changed!





Not all gray hairs are Dinosaurs!
Post #1498945
Posted Thursday, September 26, 2013 10:21 AM


Gary Varga (9/25/2013)
simon.crick (9/25/2013)
Steve Jones - SSC Editor (9/25/2013)
simon.crick (9/25/2013)

I was lucky. I had teachers who drummed this into me from an early age, and I have written some fairly complex systems that have not needed a single bug fix or enhancement in 20+ years.


I can't argue with that. I hope you're well compensated, as you must be one of the best developers around. I've never met anyone who could make this claim.


I sense a degree of disbelief, and I can't blame you given the culture of not caring about the odd bug here and there that seems to have invaded the computing industry, but it is 100% true. The first system I worked on was a client database and commission tracking system for a financial advisor. I finished it in the early 1990s, and it is still in use today; it hasn't been touched for over 20 years.

Simon


I struggle with "fairly complex systems" and "not needed a single [...] enhancement". No changing requirements? Are you sure? Where in this world is a place where neither financial regulation nor personal data regulation has changed in 20+ years?

Surely nowhere in North America, Scandinavia, the European Union, Australia, New Zealand, or South Africa. Unless Mr Crick did that work in a very primitive country, he is evidently equipped with a crystal ball which allowed him in the early 90s to anticipate, and program to comply with, all the regulatory changes that were going to occur in the next 20 years.


Tom
Post #1498953
Posted Thursday, September 26, 2013 10:38 AM


L' Eomot Inversé (9/26/2013)
Dave62 (9/25/2013)

I wrote my first program on punch cards in the late 1970s.

Really? That's unusual. I never wrote on punch cards or on paper tape; I used to punch holes in them instead or, sometimes, have someone else do that for me after I had written the program on paper.
(Sorry, couldn't resist that) ...

That's a good one, Tom!

I guess technically we still don't "write" programs even today. We just type them into an IDE or editor of some kind. I haven't found a compiler yet that can make heads or tails of my chicken scratch.
Post #1498958
Posted Thursday, September 26, 2013 10:38 AM
L' Eomot Inversé (9/26/2013)
Gary Varga (9/25/2013)
simon.crick (9/25/2013)
Steve Jones - SSC Editor (9/25/2013)
simon.crick (9/25/2013)

I was lucky. I had teachers who drummed this into me from an early age, and I have written some fairly complex systems that have not needed a single bug fix or enhancement in 20+ years.


I can't argue with that. I hope you're well compensated, as you must be one of the best developers around. I've never met anyone who could make this claim.


I sense a degree of disbelief, and I can't blame you given the culture of not caring about the odd bug here and there that seems to have invaded the computing industry, but it is 100% true. The first system I worked on was a client database and commission tracking system for a financial advisor. I finished it in the early 1990s, and it is still in use today; it hasn't been touched for over 20 years.

Simon


I struggle with "fairly complex systems" and "not needed a single [...] enhancement". No changing requirements? Are you sure? Where in this world is a place where neither financial regulation nor personal data regulation has changed in 20+ years?

Surely nowhere in North America, Scandinavia, the European Union, Australia, New Zealand, or South Africa. Unless Mr Crick did that work in a very primitive country, he is evidently equipped with a crystal ball which allowed him in the early 90s to anticipate, and program to comply with, all the regulatory changes that were going to occur in the next 20 years.


It was in the UK. The key to making it future-proof was NOT to try to predict the future, i.e. NOT to make any assumptions, but instead to identify the fundamentals that would never change.

It's a good strategy that I try to apply to all my software.

I thoroughly recommend it.

Simon
Post #1498959
Posted Thursday, September 26, 2013 11:04 AM


simon.crick (9/25/2013)
I believe technology also has a role to play in the prevalence of bugs in modern software.

I certainly agree. Many bugs are caused by appallingly ill-conceived technology. The C++ language is perhaps the best example of this. C# has of course inherited all C++'s idiocies, and although it is possible to work in a mode that leaves the worst of the C++ rubbish out, people have got used to using that rubbish and are too lazy to learn to avoid it. The rot started with the spread of C, but C's fault was that people treated it as a high-level language rather than realising that it was an extremely low-level language.

I have no hard evidence to back this up, but I would be prepared to bet quite a lot of money that modern languages like C# with lots of "advanced" programming concepts suffer from way more bugs than earlier languages like C++ that are more minimalistic.
Advanced programming concepts are fine in principle, but they can lead to code with lots of unnecessary layers and levels of abstraction that can obscure the underlying logic, resulting in misunderstandings that would not have occurred if simpler programming concepts had been used.

The idea that abstraction is a bad idea is just plain crazy. Abstraction is one of the main tools for decomposing a problem into smaller parts. Simple advanced programming concepts like reduction, function mapping, and unification are going to cause far less damage than hideous primitive complexity like C++'s friend concept.
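
To give a minimal sketch of what I mean by a concept that simplifies (the figures and names here are invented for illustration): mapping a function over a sequence abstracts the iteration away entirely, so there is no loop machinery left to get wrong.

// Function mapping in C++: apply one operation to every element.
// The iteration itself is abstracted away by std::transform.
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> prices{10, 20, 30};
    std::vector<int> taxed(prices.size());
    std::transform(prices.begin(), prices.end(), taxed.begin(),
                   [](int p) { return p * 110 / 100; });    // add 10% tax
    for (int t : taxed) std::cout << t << '\n';             // 11 22 33
}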

I firmly believe that computer science will eventually realize this and there will eventually be a return to simpler and more minimalistic languages and technologies.

I think computer science is way ahead of you.

But this won't happen until computer scientists get over their inferiority complex (controversial, I know, but a well recognised phenomenon) and start thinking about how they can help us build more reliable systems rather than how they can impress their physics/engineering/mathematics colleagues with difficult-to-understand research papers.

My view is that software with fewer bugs will be developed only when unscientific developers stop turning their noses up at science and actually take the trouble to learn some. You'll find that software written in languages like Haskell, ML, Erlang, SCCS and verified using tools like the Scott Calculus or Z have far lower bug rates than software written in C++ or Visual Basic. You'll also discover that developers who have been taught by computer scientists to consider error management as an essential part of every program manage to write programs that cope well with environmental problems (OS and compiler bugs, hardware failures, incorrect input, etcetera), while those that haven't will at best have a try at dealing with invalid input.


Tom
Post #1498967
Posted Thursday, September 26, 2013 11:29 AM


One quick note:

I don't see the idea of accepting we'll have bugs in our software as implying that we allow those bugs to live or that we don't work to eliminate them. It's not an excuse to write shoddy code.

It's a reality and a challenge to work to improve our skills and software.







Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1498977
Posted Friday, September 27, 2013 8:53 AM
L' Eomot Inversé (9/26/2013)
The idea that abstraction is a bad idea is just plain crazy. Abstraction is one of the main tools for decomposing a problem into smaller parts. Simple advanced programming concepts like reduction, function mapping, and unification are going to cause far less damage than hideous primitive complexity like C++'s friend concept.


Abstraction is good when used appropriately, but all too often I find myself debugging a ridiculously simple piece of logic that could have been written in 10 lines of code that someone has somehow turned into 1000 lines of services, factories, handlers, delegates, extensions, overrides and god knows what else. I trace it through the 17 layers of junk only to find that it has done "x = a + b" instead of "x = a * b".
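
A contrived C++ sketch of the pattern (every name in it is invented for illustration, and real cases run to far more layers than this):

// The over-engineered route: a factory producing a strategy that
// delegates to the actual operation, which is where the bug hides.
#include <iostream>
#include <memory>

struct IOperation {
    virtual ~IOperation() = default;
    virtual int apply(int a, int b) const = 0;
};

struct AddOperation : IOperation {               // bug: should multiply
    int apply(int a, int b) const override { return a + b; }
};

struct OperationFactory {
    static std::unique_ptr<IOperation> create() {
        return std::make_unique<AddOperation>();
    }
};

int scaled(int a, int b) {
    auto op = OperationFactory::create();        // several hops later...
    return op->apply(a, b);                      // ...the wrong operator is in here
}

// The direct route: the mistake would be obvious on sight.
int scaledDirect(int a, int b) { return a * b; }

int main() {
    std::cout << scaled(6, 7) << " vs " << scaledDirect(6, 7) << '\n';  // 13 vs 42
}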

Such elementary bugs would never occur if the logic was not obscured by vast amounts of unnecessary abstraction.

This is not directly the fault of the programming language, as programmers do not have to use these techniques, but the fact that so much emphasis is placed on learning "advanced" programming concepts seems to make a lot of new programmers think they should be using the advanced concepts all the time.

Personally, I would like to see more computer scientists working on ways to prevent unnecessary complexity, e.g. automatic compiler warnings when the amount of wrapper code exceeds some multiple of worker code.
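
As a very rough sketch of the kind of check I mean (a toy heuristic, not a real analysis; the line-classification rules and the 3x threshold are invented purely for illustration):

// Toy wrapper-vs-worker line counter. A "wrapper" line merely forwards
// a call; a "worker" line contains arithmetic or comparison logic.
#include <fstream>
#include <iostream>
#include <regex>
#include <string>

int main(int argc, char** argv) {
    if (argc < 2) { std::cerr << "usage: wrapcheck <file.cpp>\n"; return 1; }
    std::ifstream in(argv[1]);
    std::regex wrapper(R"(^\s*return\s+\w+(::\w+|\.\w+|->\w+)*\s*\(.*\)\s*;\s*$)");
    std::regex worker(R"([-+*/%<>]|==|!=)");
    int wrap = 0, work = 0;
    for (std::string line; std::getline(in, line); ) {
        if (std::regex_match(line, wrapper)) ++wrap;
        else if (std::regex_search(line, worker)) ++work;
    }
    std::cout << wrap << " wrapper lines, " << work << " worker lines\n";
    if (work > 0 && wrap > 3 * work)             // invented ratio
        std::cout << "warning: wrapper code exceeds 3x worker code\n";
}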

Simon


Post #1499443
Posted Friday, September 27, 2013 12:33 PM


simon.crick (9/27/2013)
L' Eomot Inversé (9/26/2013)
The idea that abstraction is a bad idea is just plain crazy. Abstraction is one of the main tools for decomposing a problem into smaller parts. Simple advanced programming concepts like reduction, function mapping, and unification are going to cause far less damage than hideous primitive complexity like C++'s friend concept.

Abstraction is good when used appropriately, but all too often I find myself debugging a ridiculously simple piece of logic that could have been written in 10 lines of code that someone has somehow turned into 1000 lines of services, factories, handlers, delegates, extensions, overrides and god knows what else. I trace it through the 17 layers of junk only to find that it has done "x = a + b" instead of "x = a * b".

I couldn't agree more. In fact, one of the things I really dislike about C++ is the way that sort of idiocy is encouraged in the C++ world. I've no particular objection to template libraries, factories, post offices, services, and so on, but I have an intense dislike of their misuse, which appears to be much more common than sensible use of them; and certainly overuse implies misuse.

Such elementary bugs would never occur if the logic was not obscured by vast amounts of unnecessary abstraction.

I don't regard that sort of misuse of tools as abstraction; I call it stupidity. The purpose of abstraction is to simplify, to reduce complexity, to make understanding easier. For example, using STL iterators to loop over the numbers from 1 to 7 instead of a simple C-style loop is not abstraction, it's obfuscation, because it makes the code harder to understand, not easier. Using Prolog for a task which requires unification, instead of reinventing the wheel in C++, is abstraction, because implementing unification in C++ is probably going to be a nightmare, whereas in Prolog unification happens behind the scenes, so you've abstracted away all the detail of its implementation (that's why we have logic programming languages: because they can do serious unification and constraint resolution without asking us to implement it in a language not designed for it). Using Prolog for a task that doesn't require unification is not abstraction, it's just idiocy.
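
To make that concrete (a contrived snippet; both loops print the numbers 1 to 7, but only one of them makes you decode machinery first):

// Obfuscation: materialise a container just to count from 1 to 7.
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<int> nums(7);
    std::iota(nums.begin(), nums.end(), 1);      // fill with 1..7
    for (std::vector<int>::const_iterator it = nums.begin(); it != nums.end(); ++it)
        std::cout << *it << '\n';

    // The plain C-style loop says exactly what it does.
    for (int i = 1; i <= 7; ++i)
        std::cout << i << '\n';
}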

This is not directly the fault of the programming language, as programmers do not have to use these techniques, but the fact that so much emphasis is placed on learning "advanced" programming concepts seems to make a lot of new programmers think they should be using the advanced concepts all the time.

Yes, that's often true. But I can't understand anyone who thinks that. Surely anyone who has been taught to programme anything in whatever language (except perhaps C++ or Java) knows that you pick the tools and off-the-shelf components that will give you the easiest design, programming, testing, integration, validation, rollout and support task? If they haven't been made to understand that, they surely shouldn't be allowed to do anything except under very close supervision until they have learnt it, because letting a developer ignore the impact of what he does on any part of that chain is giving him carte blanche to write unusable rubbish. Anyway, in some cases it is the fault of the programming language (or the development system, or the available components/libraries for that language). For example, you probably can't write anything useful in Java without using a lot of stuff from run-time libraries, and in the case of Java it seems to be the norm for those libraries to wrap things up in such a way that it is sometimes well-nigh impossible to understand what they are supposed to be an abstraction of (and some, but not all, of the C++ template libraries are as bad or even worse, or were last time I looked, which admittedly is a while back).

Personally, I would like to see more computer scientists working on ways to prevent unnecessary complexity, e.g. automatic compiler warnings when the amount of wrapper code exceeds some multiple of worker code.

Unfortunately it's not the computer scientists who provide the development tools; it's companies like IBM and Microsoft and Oracle and Apple and so on. Abstraction is where I don't need to know the exact details of something my program does because my development tool kit, or a component I get off the shelf, deals with that for me; a wrapper sometimes is an abstraction and sometimes isn't. A wrapper can indeed be an abstraction: SQL Agent is a wrapper that abstracts the handling of scheduling for me, so that I don't have to worry about it, and by its job-step mechanism abstracts some mixed-language programming issues for me, because I can have a job with steps using a dozen different subsystems which give me several different languages (though I've never actually written a job that used more than 3). A wrapper can also be nothing but obfuscation, something that's used because the developer thinks it's a cute thing to do.


Tom
Post #1499545
Posted Friday, September 27, 2013 2:46 PM


A great example of complication for the sake of it in .NET is the overuse of LINQ.

Gaz

-- Stop your grinnin' and drop your linen...they're everywhere!!!
Post #1499595