"Digital" for den Dummkopfen - Part 1
December 03, 2019

Terry Critchley
Author of "Making It in IT"


The word "digital" is today thrown around in word and phrase like rice at a wedding and never do two utterances thereof have the same meaning. Common phrases like "digital skills" and "digital transformation" are explained in 101 different ways. As Humpty Dumpty said to Alice; "When I use a word, it means just what I choose it to mean — neither more nor less." This applies to "digital."

The outcome is a predictable cycle of confusion, especially at business management level, where the answer to business issues is too often "more technology."

Digital

A term applied to computers which understand only 1s and 0s, to differentiate them from their analogue cousins, which work with waveforms, graphs and other continuous representations of numeric values. No arguments here, except for quantum computers, which have many more atomic configuration options to play with, not just two. I suspect, though, that it is possible to be "digital" on those too.
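As a throwaway illustration of the "1s and 0s" point (my own Python sketch, not part of the original definition), here is what an ordinary number and an ordinary word look like once the machine has hold of them:

# Everything a digital computer stores ends up as patterns of 1s and 0s.
number = 2019
word = "digital"

print(bin(number))                                        # 0b11111100011
print(" ".join(f"{b:08b}" for b in word.encode("utf-8")))
# 01100100 01101001 01100111 01101001 01110100 01100001 01101100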

Digital Skills Shortage

A term applied to shameful shortfalls of Dummkopfen who don't have digital skills and should. However, acquiring IT (digital) skills does not mean there is a single standard level of competence across IT which you either have or you don't. Skill is not binary in that sense, and I have a personal mental picture of IT skill sitting at four levels for the business-level user:

1. Awareness of the basic use of IT at the level of computers, data and the connections between computers; school students should have this, as should members of the public who interact with computer services. Teaching and learning coding at this level is a waste of time; the view from 10,000 feet is far better.

2. Acquaintance with these elements and the ability to enter a discussion about the use of computing. This, I feel, is the level that non-IT managers in enterprises undergoing digital transformation should possess. For too long they have been spoon-fed systems and applications, only to complain at the end that the deliverables are not fit for purpose.

This might be avoided if they were in at the start of the project process, with the ability to question computing projects sensibly at a pragmatic level. The days when such managers might say, 'Oh, I leave all that stuff to my techie chaps!' are over.

3. Overall IT knowledge, analogous to the type acquired by medics at medical school but not at specialist level. It is a mandatory precursor to any IT specialisation, as I repeat ad nauseam to anyone who will listen.

4. Specialist knowledge in a particular area, but only acquired once the person has traversed level 3 above. Ideally, level 4 should be provided by the employer, since its requirements of any specialisation will vary from some perceived 'standard' for that specialisation. This is only partially recognised by employers.

For levels 3 and 4, the education should include enough material to give a level 1 or 2 understanding of topics peripheral to the main themes of IT, such that the person can follow a discussion on these topics. Examples are GDPR (General Data Protection Regulation), TCO (total cost of ownership), ROI (return on investment), RCA (root cause analysis) and so on.
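To show how modest that level 1 or 2 understanding needs to be, here is a back-of-an-envelope sketch (my own figures, purely illustrative and not from the original) of the arithmetic behind TCO and ROI:

# Hypothetical figures for a small system purchase, used only to illustrate
# the level of arithmetic a non-IT manager should be able to follow.
hardware_cost = 50_000         # one-off purchase
running_cost_per_year = 8_000  # licences, support, power
years = 3
annual_benefit = 40_000        # saved labour, extra revenue

tco = hardware_cost + running_cost_per_year * years  # total cost of ownership
roi = (annual_benefit * years - tco) / tco           # return on investment

print(f"TCO over {years} years: {tco}")  # 74000
print(f"ROI: {roi:.0%}")                 # 62%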

This sort of knowledge is essential in digital transformations, depending on the part each person plays in it. In addition, broad IT knowledge is the icing on the cake which gives a person the edge over others in the job and career progression stakes.

Specialisation

There is a tendency these days to isolate specialist subject training as if it stood alone in the computing environment. Many courses and training paths today are aimed at a specialist subject, such as cybersecurity or data science, without specifying any prerequisite knowledge. This is a mistake, similar to teaching someone how to sail a boat while neglecting to tell them about the sea, its hazards and its navigation.

To take an analogy, can you imagine a cardiologist reaching his/her exalted position without any other medical training, that is, without general medical school? The following quotation explains this to a "T":

John Muir said: "When we try to pick out anything by itself, we find it hitched to everything else in the Universe."

For example, cybersecurity requires knowledge of networks and their protocols for forensics and wire data, plus knowledge of the access points for that data and, further, some knowledge of data structures and so on. This knowledge has to be pragmatic, not purely theoretical as it tends to be in computer science education, both in schools and in universities.
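As a small, pragmatic sketch of what that means in practice (my own illustration, not drawn from the article), here is the sort of exercise a cybersecurity specialist should find routine: pulling a few fields out of a raw IPv4 header, which calls on exactly the protocol and data-structure knowledge mentioned above.

import struct

def parse_ipv4_header(raw: bytes) -> dict:
    """Decode the fixed 20-byte part of an IPv4 header taken off the wire."""
    (version_ihl, _tos, total_len, _ident, _flags_frag,
     ttl, proto, _checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw[:20])
    return {
        "version": version_ihl >> 4,
        "header_bytes": (version_ihl & 0x0F) * 4,
        "total_length": total_len,
        "ttl": ttl,
        "protocol": proto,                      # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# Hand-built sample header: a TCP packet from 192.168.0.2 to 10.0.0.1
sample = struct.pack("!BBHHHBBH4s4s",
                     (4 << 4) | 5, 0, 40, 1, 0, 64, 6, 0,
                     bytes([192, 168, 0, 2]), bytes([10, 0, 0, 1]))
print(parse_ipv4_header(sample))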

The specialist, too, will need to take part in team discussions where part of a project involves his/her specialism, but without broader IT knowledge will find it very hard to communicate with other members of the team, who will often be speaking a different language. (Modern computing is a team game, not a game of tennis singles.)

Other Facts about "Digital Skills"

Another factor mandating broad IT skills is that the use of these new "toys" will eventually be judged by executives on their usefulness to the business, either in ££/$$s or in other tangible benefits, and NOT on the degree of "Gee Whizziness" or trendiness; those criteria are the province of the "geek" and of the IT-unwashed media and politicians. In addition, there may well come a time when these tools have run their course for a business and become business as usual (BAU), which leaves the holder of those skills on a sticky wicket. I have seen it happen in my decades-long sojourn in IT across various roles and industry environments.

In any case, which specialised graduate would want to be writing AI algorithms or Python programs from the age of 20 until retirement at 65 or beyond? They can't shift jobs easily in the interim without a solid IT background, and the alternative to such a shift is leaving. Without a doubt, computer job migration is happening in most industries, and so the "one-subject" person is highly exposed.

Also, it is sometimes said that the half-life of any job in IT is 18-24 months, which can mean anything from a complete job change to a job morphing into something slightly different. Either way, broad skills are necessary to ride this wave of change.

Read Part 2 of this blog.

Dr. Terry Critchley is an IT consultant and author who previously worked for IBM, Oracle and Sun Microsystems.