Why Less (Code) Is More


Writing less code helps developers build clean, functional software that’s easy to maintain over time.

Ask any industry expert what makes a good developer and they’ll offer a variety of answers.

Broad technical experience, good communication skills, and excellent time management head the list. Those are all useful characteristics.

However, there is one trait that usually gets overlooked, something that has an enormous impact on both the development process and final quality: the ability to write lean, concise code.

The best developers know how to get more mileage from less code.

It’s an especially important skill in this era of reusable code, when the availability of ready-made components provides so many shortcuts for busy developers.

Those components represent a huge step forward by cutting the amount of tedious programming required in the early stages of a project.

The downside is that these development tools make it easy for inexperienced developers to write bulky code.

By flipping the script – focusing on writing less code instead of faster code – developers can build reliable software with low technical debt.

What Do Developers Do (Really)?

Developers are subject matter experts with the technical skills to build and maintain software.

They work to understand technology in order to create technology-based solutions for real world problems.

Nowhere in that description does it say, “write code”.

Code is obviously how the technology gets built, but it should be seen as a means to an end rather than the whole job description.

Developers need to combine business sense, problem-solving skills, and technical knowledge if they want to deliver the best value to their clients.

Too often, developers forget their true purpose.

They write code according to habit, personal style, or excitement over a new tool they’ve been hoping to use instead of prioritizing ease of maintenance and scalability.

The result is software with a huge code base and a shortened usable life.

The Code Economy Advantage

When it comes to code, more is actually less. The odds of project failure go up as the code size increases.

After all, more code translates into more chances for something to go wrong.

One area of concern is bugs that make it into the final software. The industry average ranges from 15 to 50 errors per 1,000 lines of delivered code.

Projects with sizable code bases are going to have more flaws as a flat statistical reality.

Denser code is also less likely to be read thoroughly, which pushes the error ratio towards the higher end of that scale.

Having more lines of code also leads to higher technical debt.

Future maintainers (and anyone trying to update the software) must navigate that code to fix bugs, add features, and integrate the software with future systems.

Labor is a significant expense in software development. When time spent on development and maintenance rises with a program’s size, IT spending rises right along with it.

Adding new team members brings a further increase in developer overhead from additional meetings and longer onboarding.

Considering all of this, there are clear advantages to emphasizing concise code, both in cost and quality.

Code written efficiently and directly is:

  • Simple to maintain
  • Easy to understand
  • More flexible
  • Likely to age better
  • Easier to update and build onto
  • Reusable & elegant

Developers should work to write as much code as they need to get the job done correctly – and no more.

Why Developers Get “Code Happy”

If writing less code has such a powerful effect, why do developers continue to write far more code than is actually needed?

There are a few typical motivations:

Desire for Productivity

In some agencies, lines of code – often abbreviated to LoC – are used as a measure of productivity. The thinking goes that more LoC equals more work done.

This is particularly common when running distributed teams who typically work without in-person direction.

The problem is that measuring productivity by LoC completed leads to sloppy writing and a focus on quantity over quality.

It’s like measuring a hotel’s success by how many towels it uses; the number has some bearing on success but can be very misleading.

Misaligned Priorities

Software development always involves trade-offs: simplicity versus feature richness, security versus performance, speed versus space.

Sometimes the value of writing less code gets lost in the shuffle.

No component of development is unimportant.

Every project has different demands that require a tailored approach.

However, brevity is important enough that it should always be a high priority.

Personal Preference

Developers tend to be “code geeks”. They like to write code and try new things purely for the sake of trying them.

It’s a great quality when they’re learning or experimenting.

However, it’s not the best idea when working on enterprise software. The approach often results in four times as many lines of code as a task needs.

Developers need to direct their talent towards building software that meets the product owner’s goals even when that conflicts with their personal preferences.

Lack of Skill

Writing clean, concise code takes skill and practice.

Often less-experienced developers don’t know how to reduce LoC without cutting features or impacting performance.

Almost everyone does this in the beginning, but getting past it is part of honing developer skills.

Convention

Developers all learned their trade somewhere.

Every school of thought and development philosophy imposes certain ideas about how to make code more readable.

The issue arises when holding to convention comes at the expense of code economy.

As much as 54% of LoC exist because of convention rather than utility.

Examples include added whitespace, intentionally skipped lines, and verbose linguistic keywords.

There are ways to improve readability without conventions that pad out the code base.

How to Write Less Code?

Making something complex look simple is hard, but it’s very easy to complicate simple things. Code economy is like that.

These are a few straightforward guidelines that can help developers get past complexity and write less code.

Build on an Existing Foundation

There’s no reason to reinvent the wheel. Use libraries instead of recreating what others have done well countless times.

Innovation should be saved for problems that don’t already have good solutions.
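
As a small, hypothetical illustration of leaning on an existing foundation, the sketch below deduplicates and sorts a list with JavaScript’s built-in Set and Array methods instead of a hand-written loop (the data is invented):

```typescript
// A minimal sketch of "don't reinvent the wheel": the signup list and its
// contents are purely illustrative.
const signups: string[] = [
  "ana@example.com",
  "bo@example.com",
  "ana@example.com", // duplicate entry
];

// One line of standard-library code...
const uniqueSignups = [...new Set(signups)].sort();

// ...replaces the manual loop, lookup table, and comparison logic a
// from-scratch version would need.
console.log(uniqueSignups); // ["ana@example.com", "bo@example.com"]
```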

The Right Black Boxes Are Good

Choose tools that enable code efficiency.

The Chrome V8 JavaScript engine, for example, powers Node.js, React Native, Electron, and the Chrome browser – roughly 1,884,670 lines of code that developers can treat as a dependable black box instead of writing themselves.

Be Careful Selecting Dependencies & Frameworks

Anything used should be mature, well-supported, and preferably have already proven its worth in practice.

Prioritize lean and simple frameworks whenever possible.

It’s also important to consider the size of the community and strength of company backing so future developers will have an easier time working with the code.

Reframe “LoC written” as “LoC spent”

Break down the connection between LoC and assumed productivity by changing the way it’s measured.

Instead of counting how many lines a developer was able to write, measure how many they needed to get the job done.

Compare it to a golf score: the fewer LoC that can be written while still delivering a good product and meeting sprint deadlines, the better.

Spend some time during the planning phase to brainstorm how less code can be written.

Planning ahead instead of charging in allows more opportunities for code economy.

The Code Economy Mindset

A huge part of writing less code is maintaining a direct, economical mindset.

Optimize code for correctness, simplicity, and brevity.

Don’t depend on assumptions that aren’t contained in the code, and never use three lines where one is just as readable and effective.

Make consistent style choices that are easy to understand. Avoid “run-on coding sentences” by breaking different thoughts and concepts into separate LoC.
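
As a hypothetical sketch of both points – one clear line instead of three, and one thought per line – the function and rule below are invented purely to illustrate:

```typescript
// Verbose version: three statements to express a single idea.
function isEligibleVerbose(age: number, hasConsent: boolean): boolean {
  let eligible = false;
  if (age >= 18 && hasConsent) {
    eligible = true;
  }
  return eligible;
}

// Concise version: one readable line with the same behavior.
const isEligible = (age: number, hasConsent: boolean): boolean =>
  age >= 18 && hasConsent;

// The reverse also applies – unrelated operations crammed onto one line
// (a "run-on coding sentence") should be split into separate lines.
```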

Only add code that will be used now, too.

There’s a tendency to try and prepare for the future by guessing what might be needed and adding in the foundations of those tools.

It seems like a smart approach, but in practice it causes problems, mainly:

  • Guesses about what may be useful in the future may be wrong.
  • Time spent writing code that won’t be used yet can delay the product’s launch.
  • Extra work translates into a higher investment before the product has even proven its worth or begun generating ROI.

Abstracting is another practice that should only be done according to present need. Do not abstract for the future.

Along the same lines, don’t lean on scattered comments or TODOs.

Relying on them encourages sloppy code that can’t be understood on its own.

If an explanation is genuinely needed, it’s more accessible as inline documentation, placed where everyone can find and read it.

Reusable code is a major asset, but make sure it’s fully understood instead of blindly copying and pasting to save time.

Try to choose options that follow good code economy guidelines.

Finally, don’t rush through development with the idea of refactoring later on.

The assumption that refactoring is “inevitable” leads to software that already needs work at launch.

It is possible to create a solid product with low technical debt by writing clean, concise code up front.

Keep a Hand on the Reins

Most importantly, don’t go too far by prioritizing the absolute minimum LoC over more practical concerns.

As discussed earlier, a developer’s job is to solve problems.

When minimalist code is forced to override other needs, it becomes part of the problem instead of a solution.

Writing less code helps our developers create technology-based enterprise solutions with a long shelf life. Set up a free consultation to find out how we can solve your company’s most urgent business problems!


Concepta Welcomes CTO Leo Farias as CEO


Leonardo Farias, Concepta’s Co-founder and Chief Technology Officer, succeeds Humberto Farias as CEO. Humberto Farias will continue as chairman of the board.

Leo Farias has been part of the Concepta team since founding the company with his partner in 2006. He has an MPS in Business of Art & Design from the Maryland Institute College of Art.

Earlier this year he was recognized with the Orlando Business Journal’s “Innovations in Technology Award”, and the OBJ went on to honor him as one of their “40 Under 40” most influential business leaders of 2018.

The awards are due in part to his active interest in the community and in mentoring young developers.

Leo sits on the advisory committee for the Valencia College Graphic & Interactive Advisory Board, where he helps ensure the curriculum prepares students for real-world tech industry careers.

He shares his experience with other developers at local technology meetups, as well as offering mentorship to entrepreneurs looking to launch their own start-ups.

At heart, though, Leo is a programmer and a problem solver. His technical skills were a major part of building Concepta’s strong foundations.

In the company’s first year of operation, it was approached with a major project: find a way to fix an outdated, unscalable piece of software critical to the State of Texas’s emergency response system.

It seemed an impossible task, especially working under the strain of two hurricanes which had struck the state in rapid succession.

Leo didn’t shrink from the challenge. He pulled together the necessary resources to diagnose the problems with the system, find a solution, and put a plan into action that aided FEMA and the State of Texas during their relief efforts.

It was Concepta’s first major contract. Thanks to Leo’s technical guidance, it wasn’t the last.

Concepta has grown from a small start-up into a local leader in technology solutions, with a client list that includes Fortune 500 companies like Disney and Warner Music.

Now, as the company is poised for another surge of growth, Leo’s forward-thinking brand of leadership is more welcome than ever.


Is JSON Schema the Tool of the Future?


JSON Schema is a vocabulary for describing and validating the structure of JSON documents. It generates clear, easy-to-understand documentation, making validation and testing easier.

JSON Schema is used to describe the structure and validation constraints of JSON documents.

Some have called it “the future for well-developed systems that have nested structures”.

There’s some weight to those claims; it’s definitely become a go-to tool for those who get past its steep learning curve.

Reviewing the Basics

JSON, an acronym for JavaScript Object Notation, is a lightweight data-interchange format.

It’s easy for humans to read and write, and equally easy for machines to parse and generate.

JSON Schema is a declarative language for validating the format and structure of a JSON Object.

It describes how data should look for a specific application and how it can be modified.
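
For example, a minimal schema for a hypothetical “user” object might look like the sketch below; the field names are invented for illustration:

```typescript
// A small JSON Schema (draft-07) describing what a "user" object should look like.
const userSchema = {
  $schema: "http://json-schema.org/draft-07/schema#",
  type: "object",
  properties: {
    name: { type: "string" },
    email: { type: "string" },
    age: { type: "integer", minimum: 0 },
  },
  required: ["name", "email"],   // these fields must be present
  additionalProperties: false,   // reject fields the schema doesn't describe
};
```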

There are three main parts to JSON Schema:

JSON Schema Core

This is the specification where the terminology for a schema is defined.

Schema Validation

The JSON Schema Validation specification explains how validation constraints may be defined. It lists and defines the set of keywords that can be used to specify validations for a JSON API.

Hyper-schema

This is where keywords associated with hyperlinks and hypermedia are defined.

What Problem Does JSON Schema Solve?

Schemas in general are used to validate files before use to prevent (or at least lower the risk of) software failing in unexpected ways.

If there’s an error in the data, validation fails immediately. Schemas can serve as an extra quality filter for client-supplied data.

Using JSON Schema solves most of the communication problems between the front-end and the back-end, as well as between ETL (Extract, Transform and Load) and data consumption flows.

It creates a process for detailing the format of JSON messages in a language both humans and machines understand. This is especially useful in test automation.
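
As a rough sketch of that validation step, the example below checks an incoming payload against the userSchema sketched earlier using Ajv, one widely used JSON Schema validator for JavaScript/TypeScript (assuming `npm install ajv`; the payload is invented):

```typescript
import Ajv from "ajv";

const ajv = new Ajv();
const validate = ajv.compile(userSchema); // userSchema from the earlier sketch

const payload = { name: "Ana", email: "ana@example.com", age: 34 };

if (!validate(payload)) {
  // Fail fast: report the violations instead of letting bad data flow
  // further into the system.
  console.error(validate.errors);
} else {
  console.log("Payload is valid");
}
```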

Strengths of JSON Schema

The primary strength of JSON Schema is that it generates clear, human- and machine-readable documentation.

It’s easy to accurately describe the structure of data in a way that developers can use for automating validation.

This makes work easier for developers and testers, but the benefits go beyond productivity.

Clearer language allows developers to spot potential problems faster, and good documentation leads to more economical maintenance over time.

Weaknesses of JSON Schema

JSON Schema has a surprisingly sharp learning curve.

Some developers feel it’s hard to work with, dismissing it as “too verbose”. That criticism has kept it from becoming better known.

Using JSON Schema makes projects grow quickly. For example, every nested level of JSON adds two levels of JSON Schema to the project.

This is a weakness common to schemas, though, and depending on the project it may be outweighed by the benefits. It’s also worth considering that JSON Schema has features which keep the size expansion down.

For example, objects can be described in the “definitions section” and simply referenced later.
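
A rough sketch of that reuse (the order and address shapes are hypothetical):

```typescript
// The address structure is declared once under "definitions" and referenced
// twice with $ref instead of being written out in full each time.
const orderSchema = {
  $schema: "http://json-schema.org/draft-07/schema#",
  type: "object",
  definitions: {
    address: {
      type: "object",
      properties: {
        street: { type: "string" },
        city: { type: "string" },
        zip: { type: "string" },
      },
      required: ["street", "city", "zip"],
    },
  },
  properties: {
    billingAddress: { $ref: "#/definitions/address" },
    shippingAddress: { $ref: "#/definitions/address" },
  },
};
```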

What Else Is There?

Some developers prefer to use Mongoose, an Object Document Mapper (ODM) that allows them to define schemas, then create models based on those schemas.

The obvious drawback is that an extra abstraction layer delivers a hit to performance.

Another option is Joi, a validation library used to create schemas for controlling JavaScript objects. The syntax is completely different, though, and Joi works best for small projects.
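
For comparison, here is a rough sketch of the same kind of user validation written with Joi (assuming `npm install joi`; the fields mirror the hypothetical user object above):

```typescript
import Joi from "joi";

// Joi builds the schema with chained method calls rather than a JSON document.
const userJoiSchema = Joi.object({
  name: Joi.string().required(),
  email: Joi.string().email().required(),
  age: Joi.number().integer().min(0),
});

const { error } = userJoiSchema.validate({ name: "Ana", email: "not-an-email" });
if (error) {
  console.error(error.details); // explains why validation failed
}
```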

Sometimes developers jump into a new MongoDB project with a very flexible schema. That all too often dooms them to “schema hell”, where they lose control as the project grows.

When JSON Schema Is the Right Choice

Performance is undeniably important. However, there are times when the cost of recovering from mistakes is far higher than the cost of taking the speed hit that comes with schema validation.

In those cases the performance drop is a small price to pay to keep bad data out of the system, and that’s where JSON Schema comes into play.

JSON Schema is proving itself as a development option, but there’s no single “best tool” for every project. Concepta takes pride in designing a business-oriented solution that focuses on delivering value for our clients. To see what that solution might look like for your company, reserve your free consultation today!


Which Business Analytics Trends Can Be Put To Use Today?


Originally published April 6, 2017, updated Dec. 18, 2018.

The BI technologies which offer the best chance of success today are those that allow companies to take advantage of time-sensitive opportunities while providing more responsive customer service.  

One of the most important parts of developing a digital strategy is knowing when not to jump on a high-tech bandwagon.

Some technologies show potential in small-scale trials but haven’t had enough real-world usage to prove their worth to enterprise. Adopting too early puts companies at risk of losing their investment.

On the other hand, waiting too long leaves them in their competition’s shadow.

There’s a lot at stake in this balancing act. Digital transformation isn’t a luxury anymore. It’s critical for companies who want to stay competitive.

Even chains of three or four locations can fall behind their peers if they aren’t maximizing their data usage.

Of course, building up digital infrastructure costs money. Choosing the right technology is the best way to ensure a smooth return on that investment.

A number of business analytics trends are already picking up speed coming into 2019.

Some are years away from being able to deliver on their promises. Others have reached the stage where a company can reliably use them to gain a competitive edge while side-stepping the risks inherent to early adoption.

The best of this second group are outlined below. These are the trends to adopt for enterprises seeking to improve their data agility.

Predictive analytics

Predictive analytics as a field has existed since the late 1600s, when Lloyd’s of London used it to estimate insurance rates on seagoing vessels.

Until the rise of computers, though, it wasn’t a practical means of steering business.

There were too many variables for a human to consider in time to form more than broad predictions.

Widely available cloud storage and increased processing power changed that.

The field has seen a resurgence as the most efficient way to maximize data usage and feed a data-driven decision making process.

73% of companies consider themselves to be analytically driven, and predictive analytics are behind the most successful of these.

Predictive analytics detect deviations in patterns, generate insights based on evolving activity, and predict future outcomes from gathered data.

The benefits of predictive analytics are clearly demonstrated by the variety of practical applications in use today. One unexpected example is human resources.

Retaining experienced workers is a constant challenge for employers who must cope with turnover rates of nearly 20% (averaged across US industries).

The tech sector suffers from even higher turnover. Replacing lost workers can cost as much as half their annual salary, not counting lost productivity during the training process.

Using predictive analytics, HR managers can find patterns in their employment data that highlight the reasons good employees leave and suggest the incentives most likely to make them stay: higher salaries, additional training, more appealing benefits packages, or in some cases transfers to more engaging positions.

The data also predicts which employees are most desirable to hire and retain.

There’s still a long way to go before the full potential of predictive analytics is realized. That said, the technology is maturing much faster than experts predicted.

Its current capabilities are more than reliable enough to justify making an investment.

Real time analytics

Real time analytics (also known as streaming analytics) give enterprises an up-to-date visualization of their operations.

It was a growing trend back in 2017, and today it’s living up to that promise.

In the traditional analytics model, information is stored in a data warehouse before analysis is applied.

This causes a gap between collection and results where perishable opportunities are lost.

There’s no rule that says data has to be stored first. It can be analyzed mid-stream to sift out data that will only stay relevant for a short time.

Companies then have the chance to make the most of the opportunity through swift action.

Information gathered by real-time analytics is usually displayed in a dynamic graphic format that doesn’t require a data science degree to understand, too. That makes it easy to act on quickly.

A business that can spot opportunities in time to take action makes far better use of its data than those left playing catch-up on trends.

The one caveat about streaming analytics is that they work best in data-driven cultures. Be sure to provide both technical training and executive support when launching a real-time analytics tool.

Chatbots and Natural Language Processing

Natural Language Processing (NLP) has grown from an internet novelty to a reasonably robust tool.

While it hasn’t seen as much use in the corporate world as its cousin, Natural Language Generation (NLG), it has developed enough for enterprise use.

The most relevant NLP application right now is employing chatbots to provide 24/7 customer support availability. Customers can interact with a chatbot using normal, everyday language.

The sophistication of a bot varies widely. Some have very basic account support capabilities; others can guide a customer from selecting a product all the way through checkout.

At this stage of maturity users generally know they’re speaking to a chatbot, though NLP has evolved to the point where the bot doesn’t frustrate users by getting stuck or spitting out garbled answers.

Instead, bots provide a clear, straightforward path to resolving common customer issues. The convenience of having uninterrupted access to routine account services tends to negate any annoyance.

Virtual assistants also fall under the heading of NLP. These let users request analytics and services using natural language and receive replies either out loud or projected to a specific device.

There are virtual assistant integrations for a huge variety of popular enterprise programs. Some even provide a path for the assistants to complete purchases using pre-approved sites.

Looking forward

Some interesting trends are gaining traction right now. Connected “multi-cloud” strategies are maturing, and research firms like Gartner have been tracking the application of real-time analytics to automated insight generation.

For now, though, the trends outlined above are the ones which have demonstrated their utility and staying power.

While nothing is guaranteed in the tech world, even tech-averse companies can expect at least a reasonable ROI from their adoption.

Are you interested in adding to your business analytics toolkit? Get a full assessment of your BI needs!


Channeling Chaos into Better Software


Chaos theory suggests small changes in initial conditions can result in vast differences in the future.

It implies that massive, unpredictable events can be directed with a few small early changes in the right place. While this is a simplification, it’s a useful one when it comes to enterprise software development.

Consider how many times a tiny decision has snowballed into a major situation.

It rarely seems like a significant decision when it’s made, but by the time developers spot the issue it’s an avalanche that threatens the entire project.

It seems impossible to avoid those random setbacks. After all, no developer can see the future.

In practice, though, developers can head off the majority of unpleasant surprises by embracing and preparing for chaos.

Why does Chaos happen?

The seeds of chaos are planted by dangerous mindsets that might seem like positives in the beginning: faith, optimism & bliss.

In this context faith refers to a belief that all initial assumptions were correct.

It’s an unrealistic confidence in one’s own skills – thinking that experience means the team has all the answers and there’s no edge case they might be missing. The truth is that no one can foresee every potential problem.

Good developers are always open to learning and overcoming their limitations.

Optimism is the general feeling that the easiest, most fluid path will cover 99% of situations. It’s a mistake to assume that a basic implementation is enough to cover most scenarios.

Operating under the belief that nothing too disruptive will happen removes the incentive to create functional contingency plans.

If ignorance is bliss, the reverse is true as well. Bliss here describes a cheerful lack of understanding of the technology stack, the project’s scope, and business requirements.

Every tool has strengths and weaknesses. Developers have to know those weaknesses to compensate for them.

These mindsets invite chaos into the development process right from the start. Left unchecked, they increase the risk of a major oversight.

Embracing Chaos

There are ways for developers to stay on top of potential chaos without knowing exactly what form it will take. Start by prioritizing evidence over faith.

Turn the unknown into the known through a solid discovery phase. Best-case scenarios are rare in the software development industry; anything that can go wrong, will. It’s better to be over-prepared than caught off guard.

Don’t make assumptions without strong supporting evidence.

If no evidence is available, commit to a course of action as late as possible to allow room for change.

Practice “inversion thinking” during development. Game out the potential hazards ahead of time. What are all the negative things that can happen? How likely are they?

Brainstorming worst case scenarios provides the chance to create viable contingency plans.

It’s also a good idea to communicate thoroughly with the product owner about the impact of certain requirements.

Make sure everyone knows which options are riskiest. Provide a full risk-benefits analysis to guide product owners in making decisions about feature priorities and change orders.

Rolling with the Punches

Be alert for early signs of chaos and head them off at the pass.

Defensive programming is key. Test early and often. Every time a bug is found, write a test against that bug.
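
A minimal sketch of that habit, using Jest-style syntax – the pricing module, function, and bug scenario are all hypothetical:

```typescript
import { calculateDiscount } from "./pricing"; // hypothetical module under test

describe("calculateDiscount", () => {
  // Regression test: the function once threw on an empty cart. Pinning the
  // fixed behavior down in a test keeps the bug from silently returning.
  test("returns 0 for an empty cart instead of throwing", () => {
    expect(calculateDiscount([])).toBe(0);
  });
});
```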

Knowledge is the currency of success. Have a clear understanding of the requirements before writing a single line of code.

Always understand one level below the level being worked on. Never stop learning. The technology steamroller is constantly moving, and it can roll over those who don’t keep up.

Finally, remember this: developers aren’t paid to write code (although they do). Developers are paid to think and solve problems. Don’t just patch based on assumptions.

Work through the actual problem to create a solution aligned with the product owner’s business requirements and technology stack.

A problem half stated is a problem half solved, so understand the actual problem from its roots before taking action.

Stay Alert, Stay on Schedule

Pessimism in development allows optimism in production.

Controlling for the mindsets that feed chaos leads to fewer and more manageable disruptions later in the game.

There’s always an element of chaos in software development, but the best teams know how to channel it into better software. Schedule your free consultation to hear our plan for your next project!

Concepta Expands Its Mobile App Capabilities with Teeps App Development Consulting Business Acquisition


The acquisition transfers the Teeps brand, projects, and digital assets to Concepta.

Orlando, FL (December 6, 2018) – Concepta Technologies is proud to announce its acquisition of fellow Orlando-based company Teeps’ consulting business.

The mobile app development company’s brand will become part of the Concepta family of technology service providers.

Teeps was established in 2012 by co-founders Terrence Donnelly and Joshua Imel. Early on, they established a reputation for building apps that combine functionality and highly intuitive user interfaces.

The user-friendly nature of their apps appeals to product owners, who benefit from shorter training times and higher adoption rates.

Over the past five years Teeps grew from a two-person shop to a business employing nearly two dozen creative professionals.

Past projects include a Virtual Visit system to help Orlando Health patients securely access remote care and a mobile iOS app for online training company Code School.

Teeps also worked with the Orlando Magic on the sports team’s popular Mobile Rewards platform.

Now, ready for a new challenge, Donnelly and Imel have agreed on terms to merge Teeps’ consulting business – including the Teeps brand, services, and ongoing projects – into Concepta Technologies.

The deal is an exciting one for the Teeps co-founders and the Concepta team, whose mobile application services were previously in competition with Teeps.

“They have a great reputation and have experienced substantial growth over the past several years,” said Leo Farias, Concepta Co-Founder and CEO.

“Concepta is growing, too. Bringing Teeps into the fold accelerates our growth and allows us to continue offering personalized services to our larger client base.”

“This represents a significant step forward for us,” added Concepta’s Co-Founder Humberto Farias.

“Concepta and Teeps are two of the top development companies in Orlando, so this deal positions Concepta as a local leader in technology solutions.”

Concepta already offers a wide range of enterprise technology services including web and mobile development, data intelligence, and software development.

Adding Teeps increases its capacity for mobile app development as the company continues to establish itself as one of Florida’s leading technology service providers.

“Stay tuned,” Leo Farias advised. “We plan to hire more technology talent and developers next year to handle the expanded operations. Orlando attracts some of the best talent around, so we’re confident we’ll be able to find the right additions to our team.”
