distributed collective mind · 435 stories · 4 followers

It Doesn’t Have to be Perfect

Feel the Legacy Code flowing through you…

Legacy code. It’s something the “other developers” left behind. You know those guys. They were unprofessional. They were lazy. They didn’t know how to write good code. Or, if we’re being generous, we might say they were rushed for time. They had to get the product to market fast, so they cut corners.

The very term “Legacy Code” encourages this view. The code is a Legacy. It’s something we inherited. It’s our responsibility, but it’s not our fault. This paradigm gives us somewhere to shine the spotlight of blame — and a justification we can leverage to do a rewrite!

But legacy code isn’t only something we inherit from other developers. It’s also the code we saddle ourselves with. I can’t count the times I’ve looked back at code I’ve written and thought: “What the hell was I thinking?” (usually followed by a refactor of said code).

If we’re honest, this is a more accurate definition of Legacy Code:

Legacy code is anything we’re not writing right now.

Respect

That legacy code you hate is paying your salary. Those stupid developers who hacked it together did a good enough job that you get a cool office, a new Mac PowerBook, and free Club Mate. Were they lazy? Were they unprofessional, or not as senior as you are? Maybe. Maybe not. But they deserve a little respect.

I’ve rarely met an engineer who didn’t care about their code. Until proven guilty, assume everybody’s trying to do a good job. Assume that everyone put their best effort forth.

This doesn’t mean that we can’t get frustrated by legacy code. This doesn’t mean that we can’t say we know more than the developers who came before us. But having a respectful attitude about the code base we inherited, the coders who developed it, and the problems it was trying to solve can transform our relationship with the code, and with the organization that created it. And, if that legacy code was written by the devs on the team you work on, having empathy for them, instead of contempt, will make the difference between helping that team turn around and creating a scared team of embarrassed developers who try to hide their mistakes from you.

Listen to the Legacy Code

Teams produce software systems that mimic the structures of their organizations — Paraphrase of Conway’s Law

We disdain and discard legacy code at our peril. Without paying attention to it, we have no chance to learn from it. When we get caught up in the “right” way of doing things, it becomes easy to forget:

  • The quality of our code is a symptom of our organization. It’s not just the developers.
  • “Right” is often a matter of style or preference — and can have nothing to do with whether the code is good.
  • We’re always learning, and evolving — and so are our teammates. That’s why the code we wrote one month ago often looks horrible to us.
  • The code is the way it is because people were trying to solve a particular problem. Learning about this problem — which may not be technical — can provide important insights.
  • Our product is always changing.
  • Legacy code is inevitable, because… (see bullet below)
  • Ultimately, all code is legacy code.

What Makes Good Code?

In the long run every program becomes rococo — then rubble — Alan Perlis

As I said above, I believe almost every engineer wants to do a good job. They want to follow “best practices” and write good code. Driven by stories of the perfect code base that is bug free, thoroughly tested, easily deploys into production, and has the esteem of its peers, we look to build a Utopia. For what it’s worth, “Utopia” literally means “nowhere”.

Software systems break down over time. There are many, many reasons for this. They’re interesting, and understanding them can be important. But, what is more important is accepting the inevitability of imperfection and decline.

Why is this helpful? It moves us to focus on the important things. And these almost invariably turn out to have nothing to do with software.

Most of us write software for Organizations: Corporations, Small-to-mid-size Businesses, Governments, Non-profits, &c. Our software is supposed to solve problems. If our software solves the problem it’s supposed to solve, it’s good software.

If our software solves the problem it’s supposed to solve, it’s good software.

Or, should I say good enough? To be completely fair, that’s an insufficient definition — but it’s not designed to be perfect. It’s trying to make a point:

  • It doesn’t have to be perfect. It never will be perfect. The most important thing is that it works.
  • It has to be maintainable, and able to evolve. It doesn’t have to be totally easy to maintain.
  • It doesn’t always have to follow “best” practices — which change over time, anyway, and are easily scrapped as “Legacy Code” at a later date (CORBA, anyone?)
  • It doesn’t have to use the latest, or coolest tech.

The Power of Honesty

There are code bases we hate, and those that are literally unworkable. However, if your software is functioning and has an active user base, there’s a good chance your software is good enough to refactor. If the software was unworkable, your business would be noticeably suffering.

  • You’d be losing customers.
  • You’d be trailing the competition.
  • Adding new features would be nearly impossible — either because you couldn’t figure out how the code worked, or touching anything could cause the whole edifice to collapse.
  • Bug reports would come in on a daily basis, bugs would stick around for months unsolved, and fixing one bug would predictably lead to two or three new bugs being created.

I’ve encountered systems like these. They bear all the signs of the Titanic after the iceberg: it’s pretty clear they’re going down. Then there are systems that are just frustrating. They’re a pain to work in. The code is ugly, unclear, poorly documented. It’s here where we need to be careful.

When we have to live with code that:

  • Is hard to change
  • Is buggy
  • Is slower than we’d like
  • etc.

It can be tempting to rewrite the whole thing from scratch. If nothing else, pause before you make that decision, and at least take the time to really look into the scope of what you’re doing.

The Power of Acceptance

It wasn’t until I accepted myself, just as I was, that I was free to change — Carl Rogers

Importantly, this is NOT a call to laziness. It’s often said that good developers are lazy developers. I get what’s intended here, but I think it’s just not true. Good developers care deeply about their code. They are thorough, and thoughtful. They work hard.

This is also NOT a call to mediocrity. Rather, this is a call to reality. Systems are complex. They’re “messy” by nature. Much of a coder’s job is akin to herding cats. It’s trying to reason about complex relationships between multiple moving parts and information that changes over time. It is about working inside a complex human system — one that has a history, a future, competing interests, and ingrained ways of doing things.

This all means that our job is difficult. Because systems are complex, it will always be hard. It will always be easy to make mistakes. There’s always something new to learn. Because of the transient nature of our profession, there will often be multiple ways of doing things scattered across a code base — many of which we don’t like. There will be code written by people more senior than us that we don’t understand, and code written by juniors who made predictable mistakes. In a profession riddled with bad documentation — when it exists at all — there will be important decisions documented by cryptic lines of code.

Rather than throwing out the baby with the bathwater and losing a lot in the process, rather than getting angry and frustrated with the crap code we have to deal with, rather than recreating the behemoth we swore to destroy: we can accept that our apps will never be perfect. We can accept the imperfections we have to live with now. We can take the time we should to learn from the legacy code we live with.

Acceptance and patience are antidotes to reactive thinking. They help us create the space we need to think clearly. When we have this, we can more wisely proceed to actually evolve a code base we can be proud of.

Summary

  • Learn first. If you’re frustrated with your current code base, don’t rush into a rewrite. The code may be complex. It might be slow. It might be hard to maintain or evolve. But, there could be good reasons for this. Learn them.
  • Refactor over Rewrite. Refactoring is considerably cheaper than rewriting. If you have working software, it’s a LOT less risky, as well. You make smaller changes, with less impact, and have more ability to pivot easily if your newer, better code has problems.
  • Learn to be at peace with imperfection. This helps you avoid making reactive decisions that could cost a lot in the long run.
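As a toy illustration of “refactor over rewrite” — all names here are hypothetical, not from any real code base — a behavior-preserving “extract function” change can be made in small, verifiable steps, with the old behavior checked at every point:

```python
# Hypothetical legacy function: works, but mixes parsing and totaling.
def invoice_total(lines):
    total = 0.0
    for line in lines:
        qty, price = line.split(",")
        total += int(qty) * float(price)
    return total

# One small refactor: extract the parsing step. Behavior is unchanged,
# so existing tests still pass and we can stop (or revert) at any point.
def parse_line(line):
    qty, price = line.split(",")
    return int(qty) * float(price)

def invoice_total_refactored(lines):
    return sum(parse_line(line) for line in lines)

# The old and new versions agree, so the change carries almost no risk.
assert invoice_total(["2,1.50", "1,3.00"]) == invoice_total_refactored(["2,1.50", "1,3.00"])
```

Each step like this is individually boring, which is exactly the point: a rewrite bets everything at once, while a refactor lets you pivot the moment something breaks.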

It Doesn’t Have to be Perfect was originally published in ITNEXT on Medium, where people are continuing the conversation by highlighting and responding to this story.


AI Overhyped?

The problem is the term “AI” itself, and the assumption that it is far more than it is. This does not mean you should not think about how smarter capabilities could be inserted into code to augment our abilities. For a while we were using the term “Cognitive Systems” to indicate methods closer to, and even mimicking, human perception and abilities. It would probably be better to ditch “AI” and go with “Cognitive”, though even the latter requires too much explanation and can be over-emphasized. Our Cognitive Systems Institute, monitored here, attempts to emphasize cognitive aspects. Beware over-marketing.

Artificial intelligence is often overhyped—and here’s why that’s dangerous

AI has huge potential to transform our lives, but the term itself is being abused in very worrying ways, says Zachary Lipton, an assistant professor at Carnegie Mellon University.
by Martin Giles

To those with long memories, the hype surrounding artificial intelligence is becoming ever more reminiscent of the dot-com boom.

Billions of dollars are being invested into AI startups and AI projects at giant companies. The trouble, says Zachary Lipton, is that the opportunity is being overshadowed by opportunists making overblown claims about the technology’s capabilities.

During a talk at MIT Technology Review’s EmTech conference today, Lipton warned that the hype is blinding people to its limitations. “It’s getting harder and harder to distinguish what’s a real advance and what is snake oil,” he said.

AI technology known as deep learning has proved very powerful at performing tasks like image recognition and voice translation, and it’s now helping to power everything from self-driving cars to translation apps on smartphones.

But the technology still has significant limitations. Many deep-learning models only work well when fed vast amounts of data, and they often struggle to adapt to fast-changing real-world conditions.

In his presentation, Lipton also highlighted the tendency of AI boosters to claim human-like capabilities for the technology. The risk is that the AI bubble will lead people to place too much faith in algorithms governing things like autonomous vehicles and clinical diagnoses.

“Policymakers don’t read the scientific literature,” warned Lipton, “but they do read the clickbait that goes around.” The media business, he says, is complicit here because it’s not doing a good enough job of distinguishing between real advances in the field and PR fluff.

Lipton isn’t the only academic sounding the alarm: in a recent blog post, “Artificial Intelligence—The Revolution Hasn’t Happened Yet,” Michael Jordan, a professor at the University of California, Berkeley, says that AI is all too often bandied about as “an intellectual wildcard,” and this makes it harder to think critically about the technology’s potential impact.

sness, 8 days ago:
this is why i decided to not go into machine learning as a job after my Ph.D. Maybe my move to learn software engineering at a deep level will turn out to be a good decision after all. still it sucks to have missed that bubble, i was right there :)

British Airways site had credit card skimming code injected


Thousands of BA customers had their credit card data “skimmed” by malicious JavaScript code inserted into the airline’s website. (credit: Alf van Beem)

Last week, British Airways revealed that all the payment information processed through the airline's website and mobile app between August 21 and September 5 had been exposed. As many as 380,000 British Airways customers may have had their contact and financial information stolen in the breach, which evidence suggests was the result of malicious JavaScript code planted within British Airways' website.

According to a report by RiskIQ's Head Researcher Yonathan Klijnsma published Tuesday, RiskIQ detected the use of a script associated with a "threat group" RiskIQ calls Magecart, the same set of actors believed to be behind a recent credit card breach at Ticketmaster UK. While the Ticketmaster UK breach was the result of JavaScript being injected through a third-party service used by the Ticketmaster website, the British Airways breach was actually the result of a compromise of BA's own Web server, according to the RiskIQ analysis.

"This attack is a highly targeted approach compared to what we’ve seen in the past with the Magecart skimmer,” said Klijnsma. "This skimmer is attuned to how British Airways’ payment page is set up, which tells us that the attackers carefully considered how to target this site in particular."

The suspect scripts were detected based on a daily crawl of websites conducted by RiskIQ, which gathers data on more than two billion pages a day. Focusing on how the scripts on the BA site changed over time, the RiskIQ researchers found a modified script within the BA site. Code added to a JavaScript library utilized by the BA site called an API on a malicious Web server at baways.com—a virtual private server hosted by a provider in Lithuania, using a TLS certificate registered through Comodo (apparently to raise its appearance of legitimacy) on August 15.

The 22 lines of code were designed to export the data entered in the BA website's payment form to the malicious server when the "submit" button was clicked by a customer, with the data sent as a JSON object. As a result, the transaction would go through for the customer without any errors, while the attackers received a full copy of the customer's payment information, despite the payment apparently taking place over a secure session. The attackers also added a "touchend" callback to the script, which made the attack work for users of BA's mobile app—which called the same, modified script.
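The reported mechanism — read the payment form's fields when "submit" fires and ship them off as a single JSON object — can be sketched as follows. This is purely illustrative Python, not the actual injected JavaScript (which was not published); the field names and helper function are hypothetical, and nothing here sends data anywhere:

```python
import json

def build_skim_payload(form_fields: dict) -> str:
    """Serialize captured form fields the way the skimmer reportedly did:
    as one JSON object (which was then POSTed to baways.com)."""
    return json.dumps(form_fields)

# Hypothetical payment-form contents, for illustration only:
payload = build_skim_payload({
    "cardNumber": "4111111111111111",
    "expiry": "12/20",
    "cvv": "123",
})
```

The point of the sketch is how little code the theft requires: everything the customer typed is already sitting in the form, so a one-line serialization plus a request to an attacker-controlled host is enough, and the legitimate submission proceeds without visible errors.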

While the modified script file's timestamp matches with the beginning of the attack reported by British Airways, the registration date for the malicious site's certificate, Klijnsma said, "indicates [the attackers] likely had access to the British Airways site before the reported start date of the attack on August 21st—possibly long before. Without visibility into its Internet-facing web assets, British Airways were not able to detect this compromise before it was too late."

British Airways would not comment on the RiskIQ report, as a criminal investigation is still underway.

acdha, 12 days ago:
If you run a website, ask how you’d detect an attack like this. Things like SRI and CSP offer some countermeasures but it’s a hard problem.
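As an example of the SRI countermeasure mentioned above: Subresource Integrity lets a page pin a script or stylesheet to a cryptographic hash, so a tampered file fails the browser's integrity check and never runs. The integrity value can be computed with a short sketch like this (the helper name is my own):

```python
import base64
import hashlib

def sri_hash(content: bytes, alg: str = "sha384") -> str:
    """Compute a Subresource Integrity value ("sha384-<base64 digest>")
    over a resource's exact bytes."""
    digest = hashlib.new(alg, content).digest()
    return f"{alg}-{base64.b64encode(digest).decode('ascii')}"

# The result goes in the tag that loads the resource, e.g.:
#   <script src="https://cdn.example/lib.js"
#           integrity="sha384-..." crossorigin="anonymous"></script>
print(sri_hash(b"console.log('hello');"))
```

If the page had pinned a hash for the modified JavaScript library, changing the file alone would have broken the integrity check — though an attacker with access to the Web server could often update the referencing page as well, which is part of why this remains a hard problem.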

Ultima in View

New Horizons has made its first detection of its next flyby target, the Kuiper Belt object nicknamed Ultima Thule, more than four months ahead of its New Year's 2019 close encounter.

FURminator De-Shedding Tool


The FURminator ($22+) is the only really functional cat-grooming tool I’ve ever found. The stiff steel rake grabs the undercoat while leaving the topcoat intact. It does a tremendous job of removing loose fur. Be prepared, especially the first time you brush your cat. For my cats, the big difference between the FURminator and regular brushes is that the softer bristles of standard brushes just get hair from the surface — the topcoat, and a bit of undercoat — whereas the stiffer teeth of the FURminator primarily snag the undercoat (and lots of it!) as well as loose hairs of the topcoat. The best part is that all that fur goes in the trash, and not on your sofa, bed, or carpet. The environment of my apartment has been improved dramatically, and I no longer need to spend a lot of time vacuuming up cat hair.

While the FURminator is expensive for a grooming tool, it’s solidly constructed and ergonomically designed, and best of all, it really works. My vet used it on my cats while they were in for a visit. I was shocked at how much hair came off in just a few strokes, so I bought one to take home and have been using it for several months. I then threw out the other standard cat/slicker brushes I had acquired over the years, and bought two more FURminators to give to cat-owning friends. The one I use is 1.75″ and is intended for cats, so although the FURminator comes in larger sizes for dogs, I can really only speak to its utility when it comes to cats.

[This is a Cool Tools Favorite from 2008]

Here's a dog getting de-fur'd.


CIA Network Exposed through Insecure Communications System


Interesting story of a CIA intelligence network in China that was exposed partly because of a computer security failure:

Although they used some of the same coding, the interim system and the main covert communication platform used in China at this time were supposed to be clearly separated. In theory, if the interim system were discovered or turned over to Chinese intelligence, people using the main system would still be protected -- and there would be no way to trace the communication back to the CIA. But the CIA's interim system contained a technical error: It connected back architecturally to the CIA's main covert communications platform. When the compromise was suspected, the FBI and NSA both ran "penetration tests" to determine the security of the interim system. They found that cyber experts with access to the interim system could also access the broader covert communications system the agency was using to interact with its vetted sources, according to the former officials.

In the words of one of the former officials, the CIA had "fucked up the firewall" between the two systems.

U.S. intelligence officers were also able to identify digital links between the covert communications system and the U.S. government itself, according to one former official -- links the Chinese agencies almost certainly found as well. These digital links would have made it relatively easy for China to deduce that the covert communications system was being used by the CIA. In fact, some of these links pointed back to parts of the CIA's own website, according to the former official.

People died because of that mistake.

The moral -- which is to go back to pre-computer systems in these high-risk sophisticated-adversary circumstances -- is the right one, I think.
