I believe that in the UK and US there is a lack, nay an absence, of pragmatic computing education that matches the needs of the current business world of information technology (IT). Current computer education, at school and university, appears to me to be computer-science based and very theoretical, and it does not follow the logical sequence of activity in the development, use and management of business applications that I observed in my long IT career spanning many industries.
(In this blog I use "business" in its broadest sense to mean the "world of work," be it commercial, scientific, medical or industrial.)
In fact, the curricula appear to me to be a collection of topics with little synergy and none of the end-to-end flow that IT projects have. As an analogy, consider the following scenario.
A technical course on the motor car is run at Knowalot College, covering the internals of the car: the Carnot cycle, adiabatic expansion, electronic ignition and so on; very detailed and demanding. At the end of the course, the student will probably have no concept of the motor car as a vehicle, and might not know how to drive, read a map or plan a journey from A to B. It is almost certain that he/she will not know how to decide which car, van or lorry to recommend for the business he/she works for. In short, he/she is doomed to be a head-under-the-bonnet techie forever. That job is of course necessary, but it cannot be classed as covering "motor transport"; it is simply a technical corner of it.
Not only that, but the words "business" and "requirements" do not appear anywhere in the CS curricula I have searched. Only under the title "problem solving" could one guess that business is being referred to. This is not to say CS education per se is bad; it just isn't a comfortable fit for the current computing world, although it is gradually finding a niche in various areas of computing. These areas include big data, data science, cognitive and similar computing, and cybersecurity.
However, a broader knowledge across key IT concepts and architectures is needed, since no person in IT is an island, and anyone totally specialized will find it difficult to communicate where his/her field overlaps with another, particularly in meetings or when presenting to the business.
What Are the Differences?
In this part of the blog, I will try to demonstrate this CS vs. IT dichotomy, but first, an outside view of the differences between CS and IT:
The proposition I put to CS people as to what modern IT is goes roughly as follows:
■ IT needs to be presented as a sequence of related activities within a framework, not a simple collection of topics.
The flow of IT projects can be represented as:
- Business idea/need
- Specification of business flow
- IT Architecture (product-free)
- Populate the design with Technology
- Code/Buy software
- Implement
- Manage
- Update
- Retire systems and Start again
(There will of course be reviews and the like throughout this sequence of activity.)
You can see "coding" in context here; students and teachers cannot see this far.
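To make that context concrete, here is a minimal sketch in Python (my own illustration, with hypothetical names; nothing here comes from any curriculum or from the original post) that lays the flow out as an ordered sequence of stages and reports where "coding" sits within it:

# Illustrative only: the stage names come from the list above; the helper is hypothetical.
IT_PROJECT_FLOW = [
    "Business idea/need",
    "Specification of business flow",
    "IT architecture (product-free)",
    "Populate the design with technology",
    "Code/Buy software",
    "Implement",
    "Manage",
    "Update",
    "Retire systems and start again",
]

def position_of(stage: str) -> str:
    """Report where a stage sits within the end-to-end flow."""
    index = IT_PROJECT_FLOW.index(stage)
    return f'"{stage}" is step {index + 1} of {len(IT_PROJECT_FLOW)}'

# Coding is only one step, roughly midway through the lifecycle.
print(position_of("Code/Buy software"))

The point of the exercise is not the code itself but the ordering: coding is one stage among nine, with business and architecture work before it and management after it.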
■ There should be a pragmatic, contextual "wrapping" around major topics, for example, "this is used in the oil industry to map the subsea strata in the search for oil deposits" – the "so what?" test.
■ Emphasize important aspects of IT as a framework in which to teach topics. Over the years I have decided that FUMPAS represents the key elements (others can be found within these):
FUNCTIONALITY
USABILITY
MANAGEABILITY
PERFORMANCE
AVAILABILITY
SECURITY.
These are the criteria to map onto any business IT project, to whatever degree of detail the business decides (reflecting the project's importance).
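As a purely illustrative sketch of how that mapping might be done (my own assumption, not a prescribed method), the Python below lets the business weight each FUMPAS criterion by importance and then scores a candidate design against those weights; the weights, ratings and function names are invented for the example:

# Hypothetical example: the six criteria come from the FUMPAS list above.
FUMPAS = ["Functionality", "Usability", "Manageability",
          "Performance", "Availability", "Security"]

def weighted_score(weights: dict, ratings: dict) -> float:
    """Combine per-criterion ratings (0-10) using business-chosen weights."""
    total_weight = sum(weights[c] for c in FUMPAS)
    return sum(weights[c] * ratings.get(c, 0) for c in FUMPAS) / total_weight

# An imagined online banking project where availability and security dominate.
weights = {"Functionality": 3, "Usability": 2, "Manageability": 2,
           "Performance": 3, "Availability": 5, "Security": 5}
ratings = {"Functionality": 8, "Usability": 7, "Manageability": 6,
           "Performance": 7, "Availability": 9, "Security": 9}
print(f"Weighted FUMPAS score: {weighted_score(weights, ratings):.1f} out of 10")

The detail can of course go far deeper than a single number; the point is that every one of the six criteria is considered explicitly and weighted by the business, not left implicit.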
■ Two large topics totally absent from CS curricula are mainframes, with their operating systems, and high performance computing (HPC). Much of the world's financial work is done on mainframes, and their influence is growing, believe it or not. HPC is now a big field and is expanding beyond pure science into medicine, financial modelling, AI and other power-hungry areas. Not even to mention them is a dereliction of IT teaching duty, whatever the syllabus mandates. This sort of add-on could be handled by selecting a suitable reading list, even if it is not in the syllabus.
The CS school and university syllabuses I have studied do not fit the "real world" IT scene in breadth, depth or velocity of change, and I therefore generated a keyword list to demonstrate this dichotomy. The list then developed into a learning Glossary, now on Amazon Kindle (check tomorrow for Part 2 of this blog), to show where IT fits in the business world and the topics which make it tick. The CS world can then see whether its output matches these requirements.
So what? The world has gone mad on the "digital revolution" impacting nearly all business. I believe this issue needs to be addressed vigorously and quickly to tackle the much discussed "IT skills shortage." The current computer education, at least in the UK, will not achieve this aim, still less cater for the skills needed post-Brexit. I see no difference between UK CS and US CS, ergo much of what I say also applies to the US.
Finally, nowhere have I found a syllabus that remotely covers IT as demonstrated by the list and, subsequently, the Glossary. I see this as a start in resolving the "IT skills issue," a mantra that has been trotted out since the year 2000, if not earlier.
As Mark Twain said: "Everybody is talking about the weather, nobody is doing anything about it." I hope the Glossary is a beginning.
Read Computer Science (CS) and Information Technology (IT): Part 2