Data architecture: the digital backbone of a business

We are each the sum of our parts, and, in our modern technological age, that includes data. Our search queries, clicking compulsions, subscription patterns and online shopping habits – even the evidence collected from wearable fitness tech – feed into our digital footprint. And, wherever we choose to venture on our online quests, we are constantly being tracked.

Experts claim that we create 2.5 quintillion bytes of data per day through our shared use of digital devices. With the big data analytics market projected to reach a value of $103 billion by 2027, there are no signs of data creation slowing down.

But it’s less about acquisition than application and integration: poor data quality costs the US economy an estimated $3.1 trillion per year, according to market research firm IDC. While device-driven data may be fairly easy to organise and catalogue, human-driven data is more complex, existing in various formats and reliant on far more developed tools for adequate processing. Around 95% of companies report that their inability to understand and manage unstructured data is holding them back.

Effective data collection should be conceptual, logical, intentional and secure, and with numerous facets of business intelligence relying on consumer marketplace information, the data processed needs to be refined, relevant, meaningful, easily accessible and up-to-date. Evidently, an airtight infrastructure of many moving parts is needed.

That’s where data architecture comes into the equation.

What is data architecture?

As the term would imply, data architecture is a framework or model of rules, policies and standards that dictate how data is collected, processed, arranged, secured and stored within a database or data system.

It’s an important data management tool that lays an essential foundation for an organisation’s data strategy, acting as a blueprint of how data assets are acquired, the systems this data flows through and how this data is being used.

Companies employ data architecture to dictate and facilitate the mining of key data sets that can help inform business needs, decisions and direction. Essentially, when collected, cleaned and analysed, the data catalogues acquired through the data architecture framework allow key stakeholders to better understand their users, clients or consumers and make data-driven decisions to capitalise on business opportunities.

For example, e-commerce companies such as Amazon might specifically monitor online marketing analytics (such as buyer personas and product purchases) to personalise customer journeys and boost sales. On the other hand, finance companies collect big data (such as voice and facial recognition data) to enhance online security measures.

When data becomes the lifeblood of a company’s potential reach, engagement and impact, having functional and adaptable data architecture can mean the difference between an agile, informed and future-proofed organisation and one that is constantly playing catch-up.

Building blocks: key components of data architecture

We can better visualise data architecture by addressing some of the key components, which act like the building blocks of this infrastructure.

Artificial intelligence (AI) and machine learning (ML) models

Data architecture relies on strong IT solutions. AI and ML models are innovative technologies designed to make calculated decisions and automate tasks such as data collection and labelling.

Data pipelines

Data architecture is built upon data pipelines, which encompass the entire data-movement process, from collection through to storage, analysis and delivery. This component is essential to the smooth running of any business. Data pipelines also establish how the data is processed (that is, through a data stream or batch processing) and the end point to which the data is moved (such as a data lake or application).
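To make this concrete, below is a minimal, illustrative sketch of a batch pipeline in Python: records are collected from a source, cleaned, and delivered to a storage end point. The file and field names are hypothetical.

```python
import csv

def extract(path):
    """Collect raw records from a source (here, a hypothetical CSV export)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(records):
    """Clean and reshape the raw records before they move downstream."""
    cleaned = []
    for row in records:
        if not row.get("customer_id"):            # drop incomplete rows
            continue
        row["country"] = row.get("country", "").strip().upper()
        cleaned.append(row)
    return cleaned

def load(records, path):
    """Deliver the processed batch to its end point (here, another CSV file)."""
    if not records:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)

# One batch run of the pipeline: extract -> transform -> load.
load(transform(extract("raw_orders.csv")), "clean_orders.csv")
```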

Data streaming

In addition to data pipelines, the architecture may also employ data streaming: data flows that feed continuously from a source to a designated destination, to be processed and analysed in near real-time (as in media/video streaming and real-time analytics).

APIs (Application Programming Interfaces)

A method of communication between a requester and a host (usually accessible through an IP address), which can increase the usability and exposure of a service.
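As a brief illustration, the sketch below requests data from a hypothetical REST endpoint using Python’s widely used requests library; the URL and response format are assumptions made for the example.

```python
import requests

# Request a resource from a hypothetical host; the endpoint is illustrative only.
response = requests.get("https://api.example.com/v1/customers", timeout=10)
response.raise_for_status()      # fail loudly if the host returns an error status
customers = response.json()      # parse the JSON payload returned by the API
print(len(customers), "records received")
```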

Cloud storage

A networked computing model, which allows either public or private access to programs, apps and data via the internet.

Kubernetes

A container or microservice platform that orchestrates computing, networking, and storage infrastructure workloads.

Setting the standard: key principles of effective data architecture

As we’ve learned, data architecture is a model that sets the standards and rules that pertain to data collection. According to Simplilearn, effective data architecture consists of the following core principles.

  • Validate all data at point of entry: data architecture should be designed to flag and correct errors as soon as possible.
  • Strive for consistency: shared data assets should use common vocabulary to help users collaborate and maintain control of data governance.
  • Everything should be documented: all parts of the data process should be documented, to keep data visible and standardised across an organisation.
  • Avoid data duplication and movement: this reduces cost, improves data freshness and optimises data agility. 
  • Users need adequate access to data. 
  • Security and access controls are essential.

The implementation and upkeep of data architecture is facilitated by the data architect, a data management professional who provides the critical link between business needs and wider technological requirements.

How is data architecture used?

Data architecture facilitates complex data collection that enables organisations to deepen their understanding of their sector marketplace and their own end-user experience. Companies also use these frameworks to translate their business needs into data and system requirements, which helps them prepare strategically for growth and transformation.

The more any business understands their audience’s behaviours, the more nimble they can become in adapting to ever-evolving client needs. Big data can be used to improve upon customer service, cultivate brand loyalty, and ensure companies are marketing to the right people.

And, it’s not all about pushing products. In terms of real-world impact, a shifting relationship to quality data could improve upon patient-centric healthcare, for example.

Take a dive into big data

Broaden your knowledge of all facets of data science when you enrol on the University of York’s 100% online MSc Computer Science with Data Analytics.

You’ll get to grips with data mining, big data, text analysis, software development and programming, arming you with the robust theoretical knowledge needed to step into the data sector.

The real world impact of facial detection and recognition

From visual confirmation of rare diseases to securing smartphones, facial detection and recognition technologies have become embedded in both the background of our daily lives and the forefront of solving real-world problems. 

But is the resulting impact an invasive appropriation of personal data, or a benchmark in life-saving security and surveillance? Wherever you stand on the deep-learning divide, there is no denying the ways in which this ground-breaking biometric development is influencing the landscape of artificial intelligence (AI) application.

What is facial detection and recognition technology?

Facial detection and recognition systems are forms of AI that use algorithms to identify the human face in digital images. Trained to capture more detail than the human eye, they fall under the category of ‘neural networks’: aptly named computer software modelled on the human brain, built to recognise relationships and patterns in given datasets.

Key differences to note

Face detection is a broader term given to any system that can identify the presence of a human face in a visual image. Face detection has numerous applications, including people-counting, online marketing, and even the auto-focus of a camera lens. Its core purpose is to flag the presence of a face. Facial recognition, however, is more specialised, and relates specifically to softwares primed for individual authentication. Its job is to identify whose face is present.

How does it work?

Facial recognition software follows a three-part process. Here’s a more granular overview, according to Toolbox:

Detection

A face is detected and extracted from a digital image. By marking a vast array of facial features (such as eye distance, nose shape, ethnicity and demographic data, and even facial expressions), the software creates a unique code called a ‘faceprint’ that identifies the assigned individual.

Matching

This faceprint is then fed through a database, which utilises several layers of technology to match against other templates stored on the system. The algorithms are trained to capture nuance and consider differences in lighting, angle and human emotion.

Identification

This step depends on what the facial recognition software is used for — surveillance or authentication. The technology should ideally produce a one-to-one match for the subject, passing through various complex layers to narrow down options. (For example, some software providers even analyse skin texture along with facial recognition algorithms to increase accuracy.)
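To make the detection step more tangible, here is a minimal sketch using OpenCV’s bundled, pre-trained Haar cascade face detector. It only flags the presence of faces (detection) rather than identifying whose they are (recognition), and the image file name is hypothetical.

```python
import cv2

# Load OpenCV's bundled, pre-trained frontal-face Haar cascade.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("photo.jpg")                   # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)    # cascades operate on greyscale
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Draw a box around each detected face and save the annotated image.
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces_detected.jpg", image)
```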

Biometrics in action

If you’re an iPhone X user, you’ll be familiar with Apple’s Face ID authentication system as an example of this process. The gadget’s camera captures a face map using specific data points, allowing the stored user to unlock their device with a simple glance.

Some other notable face recognition software includes:

  • Amazon Rekognition: features include user verification, people counting and content moderation, often used by media houses, market analytics firms, ecommerce sites and credit solutions
  • BioID: GDPR-compliant solution used to prevent online fraud and identity theft
  • Cognitec: recognises faces in live video streams, with clients ranging from law enforcement to border control
  • FaceFirst: a security solution which aims to use DigitalID to replace cards and passwords
  • Trueface.ai: services extend to weapon detection, utilised by numerous sectors including education and security

Real-world applications

As outlined in the list above, reliance on this mode of machine learning has permeated almost all areas of society, extending wider still to healthcare and law enforcement agencies. This illustrates a prominent reliance on harvesting biometric data to solve large-scale global problems, spanning – at the extreme – the life-threatening and severe.

Medical diagnoses

We are beginning to see documented cases of physicians using these AI algorithms to detect the presence of rare and compromising diseases in children. According to The UK Rare Diseases Framework, 75% of rare diseases affect children, while more than 30% of children with a rare disease die before their fifth birthday. With 6% of people expected to be affected by a difficult-to-diagnose condition in their lifetime, this particular application of deep learning is vital.

Criminal capture

It was recently reported that the Metropolitan Police deployed facial recognition technology in Westminster, resulting in the arrests of four people. The force announced that this was part of a ‘wider operation to tackle serious and violent crime’ in the London borough. The software used was a vehicle-mounted live facial recognition (LFR) system, which enables police departments to identify passers-by in real-time by scanning their faces and matching them against a database of stored facial images. According to the Met Police website, other applications of face identification include locating individuals on their ‘watchlist’ and providing essential information when there is an unconscious, non-communicative or seriously injured party on the scene.

Surveillance and compliance

A less intensive example, but one that could prove essential to our pandemic reality: surveillance cameras equipped with facial detection were used to monitor face mask compliance at a school in Atlanta, while similar technology has been applied elsewhere to aid gun control.

Implications of procuring biometric information

Of course, no form of emerging or evolving technology comes without pitfalls. According to Analytics Insight, the accuracy rates of facial recognition algorithms are notably low for minorities, women and children, which is dangerously problematic. Controversy surrounding data protection, public monitoring and user privacy persists, while the generation of deepfake media (and software like it), used to replicate, transpose and project one individual’s face in place of another’s, gives rise to damaging – and potentially dangerous – authentication implications. Returning to the aforementioned Met Police arrests, even in this isolated sample there were reports of false positives, sparking outcry among civil rights groups.

At the centre of this debate, however, one truth is abundantly clear: as a society, we are becoming rapidly reliant on artificial intelligence to function, and the inception of these recognition algorithms is certainly creating an all-new norm for interacting with technology.

Want to learn more about facial detection software?

Dive deeper into the benefits, harms and real-world applications of this mode of machine learning (and more) as part of our MSc Computer Science with Artificial Intelligence.

On this course, you’ll develop core abilities of computational thinking, computational problem solving and software development, while acquiring specialist knowledge across increasingly sought-after skill sets spanning neural networks, genetic algorithms and data analytics. You’ll even undertake your own independent artificial intelligence project.

With employer demand for this expertise at an all-time high, enrol now and be part of this thrillingly fast-paced, far-reaching and ground-breaking field.

What are flexible work arrangements?

Flexible work arrangements are working hours, work locations, or working patterns that are altered to suit an employee’s individual circumstances. They are an increasingly popular way for employees to balance their personal and professional lives, and for employers to attract and retain talented people.

Types of flexible working arrangements include:

Remote working

This is one of the most common and well-known forms of flexible working. With the coronavirus pandemic necessitating a work-from-home mandate for the majority of workplaces, businesses and employees had to quickly adapt to this flexible working arrangement. Telecommuting and telework – an employee completing their work from outside their traditional office using tools such as email, phones, and video apps like Zoom or Teams – swiftly became the norm. Now, even as the working world begins to return to offices and other workplaces, many are opting to work remotely, either on a full-time basis, or some of the time, which is known as hybrid working. Many remote workers have found that they save time and money on their commutes, have fewer distractions while working, and have increased their productivity.

Staggered hours

Staggered hours are when an employee has a start time, finish time, or break time that differs from their colleagues’ hours at the organisation. For example, someone may request to work from 12:00 until 20:00 every day, even though the typical working hours at the business are 09:00 until 17:00, to accommodate their personal circumstances.

Compressed hours

If a person works full-time hours over fewer days, this is known as compressed hours. For example, an employee might choose to work what’s called a nine-day fortnight – the employee works a little later than other employees every day in order to have every other Friday off work.

Job sharing

Job sharing is when one role is split between two people. For example, one employee may work Monday and Tuesday, while the second employee has a Wednesday-to-Friday work week, but both do the same job when at work.

Part-time hours

Part-time work is often requested when an employee wants to work reduced hours during the day, or work fewer days a week. For example, a parent or guardian may request a working day of 9am until 3pm every day so that they can be home for their children or dependants before and after school during term-time. This can also reduce the likelihood of a parent needing to take parental leave.

Flexitime hours

Not to be confused with staggered hours, flexitime (or flextime) allows an employee to choose their start and finish times, provided they work the organisation’s “core hours” – for example, between 10am and 2pm every day.

Annualised hours

Annualised hours mean that an employee works a certain number of hours during a year, but has some flexibility over when those hours are worked. For example, agency workers may work certain core hours, and then complete the rest of their hours when needed for projects or by clients, and so on.

The increasing popularity of flexible work arrangements has prompted the UK government to complete a consultation on ‘Making flexible working the default’. While the results of the consultation are still pending, the government has noted that flexible working can be particularly useful for people who need to balance their personal and working lives. For example, people with carer responsibilities may be better able to access the labour market, or stay in work, with flexible working options. The government has also noted that flexible working arrangements can help employers by attracting more applicants to new roles, as well as by increasing productivity and motivation levels within workplaces.

How does flexible working affect a business?

The impact of flexible working on businesses is overwhelmingly positive for both employers and employees. The UK government’s consultation document for changes to flexible working states that by “removing the invisible restrictions to jobs, flexible working fosters a more diverse workforce – and the evidence shows that this leads to improved financial returns for businesses.”

Meanwhile, the Chartered Institute of Personnel and Development (CIPD), the UK’s association for human resources professionals, says that quality flexible arrangements and flexible work schedules can also help businesses to:

  • improve employee work-life balance, job satisfaction, loyalty, and well-being
  • increase staff retention
  • reduce absenteeism
  • become more responsive to change

However, it’s worth noting that research conducted by the CIPD suggests that not all employers offer flexible working practices. In fact, 46% of employees say they do not have flexible working arrangements in their current roles.

The CIPD also notes that while working from home, or remote work, has increased during the COVID-19 pandemic, 44% of people did not work from home at all during the past two years. So while remote working is a popular flexible working arrangement, it’s just one of the options available – and 75% of employees say it’s important that people who can’t work from home have choices to work flexibly in other ways.

How to implement flexible work arrangements

All employees are entitled to request flexible working in the UK, as long as they have 26 weeks of service with their employer. Requests can be made once every 12 months and must be submitted in writing.

The Advisory, Conciliation and Arbitration Service (ACAS), which offers advice and support on flexible working arrangements, recommends that employers:

  • offer clear guidance about what information is needed when an employee submits their flexible working request
  • talk to the employee requesting flexible working as soon as possible after receiving the request. This conversation should be in a private place, and determine how the request might benefit the employee and the business
  • allow an employee to be accompanied by a work colleague for any discussions, and make sure the employee knows they have this option
  • let the employee know the decision about their request as soon as possible, in writing
  • allow the employee to appeal the decision if their request is denied

It’s also worth noting that a request for flexible working can only be rejected for one of the following reasons:

  • the burden of additional costs
  • an inability to reorganise work among existing staff
  • an inability to recruit additional staff
  • a detrimental impact on quality
  • a detrimental impact on performance
  • a detrimental effect on ability to meet customer demand
  • insufficient work for the periods the employee proposes to work
  • a planned structural change to the business

Become a business leader

When leading people within a business, it’s clear that flexible working initiatives can be a fantastic motivator – but they’re just one of the ways that talented people managers and leaders can create high-performing work environments.

Gain all of the skills and qualifications you need to become a business leader with the MSc in International Business, Leadership and Management at the University of York. This flexible master’s degree is offered 100% online, so you can fit your studies around your full-time work and personal commitments.

Embracing the technological revolution: launching a career in computer programming

With our modern, globalised world so heavily reliant on data and technology, it is now almost impossible to comprehend the impact its absence would have on our lives. The prevalence of data and technology is advancing at an unprecedented speed and scale, fundamentally transforming the ways in which we live and work.

Supporting our increasingly automated lives and lifestyles through data collection, information analysis and knowledge sharing – in an effort to continuously advance and innovate upon existing processes and structures – is of strategic importance.

The UK digital skills shortage

The UK suffers from a critical digital skills shortage. Reports from a number of sources – including the latest report from the Department for Digital, Culture, Media and Sport (DCMS) – reveal that:

  • almost 20% of UK companies have a skills vacancy, with 14.1% reporting a lack of digital know-how
  • 66% of digital leaders in the UK are unable to keep up with changes due to a lack of talent
  • the UK tech industry is facing its greatest shortages in cybersecurity, data architecture, and big data and data analysis
  • only 11% of tech leaders believe the UK is currently capable of competing on a global scale
  • data analysis is the fastest-growing skills cluster in tech, set to expand by 33% over the next five years
  • 80% of digital leaders feel retention is more difficult post-pandemic due to shifting employee priorities

Evidently, there is a stark need for individuals with the skills and fundamentals necessary to harness technology’s potential, using it to guide, improve and provide insights into today’s global business environments. Millions are being invested to encourage more people to train for roles which require skills such as coding, data analytics, artificial intelligence (AI) and cybersecurity.

Digital skills are considered vital to post-pandemic economic recovery; in competitive, crowded marketplaces, evidence and data are key to guiding decision-making and business efforts. For those considering a career in computer science – whether in big data, web development, application development, programming, or any number of other fields – there has never been a better time to get involved.

Computer programming as a career

Depending on the role, industry and specialism, programmers can expect to undertake a wide-ranging array of tasks. For example:

  • designing, developing and testing software
  • debugging, to ensure that operating systems meet industry standards and are secure, reliable and perform as required
  • integrating systems and software
  • working alongside designers and other stakeholders to plan software engineering efforts
  • training end-users
  • analysing algorithms
  • scripting and writing code in different languages

The applications for this specialist skill set are vast – and the skills are required in almost every industry and sector. Individuals can work across, for example, websites and web applications, mobile and tablet applications, data structures and video games. Most of us will be familiar with the global, household names of Microsoft, Google and IBM – titans of the computing and technology industry. However, the technological skills and expertise gained from a computer science degree can open doors to careers in any number of businesses and sectors.

Potential career paths and roles could include:

  • computer programmer
  • software application developer
  • front-end/back-end web developer
  • computer systems engineer
  • database administrator
  • computer systems analyst
  • software quality assurance engineer
  • business intelligence analyst
  • network system administrator
  • data analyst

It’s a lucrative business. The current average salary for a programmer in the UK is £57,500 – a figure that can be well exceeded with further experience and specialisation. It’s also a career with longevity: while computer programming is of paramount importance today, as the data and digital landscape continues to evolve, it’s only going to become more important in the future.

What skills are needed as a computer programmer?

In the role of a programmer, it’s essential to combine creativity with the more technical and analytical elements of information systems. It’s a skilled discipline which requires artistry, science, mathematics and logic.

Indeed lists a number of the more common skills required by computer programmers:

  • Proficiency with programming languages and syntax: While JavaScript is currently the most commonly used programming language, there are also many others, including Python, HTML/CSS, SQL, C++, Java, and PHP. Learning at least two computer programming languages will help to boost employability. Most programmers choose their area of computing specialism and then focus on the most appropriate language for that field.
  • Learning concepts and applying them to other problems: Take the example of CSS, where styles that are applied to a top-level webpage are then cascaded to other elements on this page. By understanding how programming concepts can be translated elsewhere, multiple issues can be resolved more efficiently.
  • Solid knowledge of mathematics: For the most part, programming relies on an understanding of mathematics that goes beyond the basics. Possessing solid working knowledge of arithmetic and algebra underpins many aspects of programming proficiency.
  • Problem-solving abilities: Code is often written and developed in order to create a solution to a problem. As such, having the capabilities to identify and solve problems in the most efficient way possible is a key skill for those working in programming (see the short sketch after this list).
  • Communication and written skills: Demonstrating how certain processes and results are produced – for example, to stakeholders who may have limited or no programming and technical knowledge – is often a necessary part of the role. The ability to coherently communicate work is vital.

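As a simple, purely illustrative example of the problem-solving and mathematical skills described above, the short Python sketch below defines one general-purpose function and reuses it on two unrelated problems – the same idea of transferring a concept across contexts mentioned earlier.

```python
def mean(values):
    """Return the arithmetic mean of a sequence of numbers."""
    return sum(values) / len(values)

# The same general-purpose function answers two unrelated questions.
daily_visits = [132, 149, 155, 128, 160]
exam_scores = [67, 74, 81, 59]

print("Average daily visits:", mean(daily_visits))
print("Average exam score:", mean(exam_scores))
```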
For those interested in developing their skill set, there exists a wealth of interactive, online courses and certifications to get started. Typical entry requirements include an undergraduate/bachelor’s degree.

Launch a new, fulfilling career in information technology and programming

Kickstart your career in the computing sector with the University of York’s online MSc Computer Science with Data Analytics programme – designed for those without a background in computer science.

This flexible course offers you in-depth knowledge and skills – including data mining and analytics, software development, machine learning and computational thinking – which will help you to excel in a wide variety of technological careers. You’ll also become proficient in a number of programming languages, all available to start from beginner’s level. Your studies will be supported by our experts, and you’ll graduate with a wide array of practical, specialist tools and know-how – ready to capitalise on the current skills shortage.

The Internet of Things in the age of interconnectivity

Global online interconnectivity has woven itself seamlessly into our lives. How many of us can conceive of a modern life without the internet?

Going about our daily lives both personally and professionally, we reach for our mobile phones and devices for news, information, entertainment, and to communicate with each other. The ease and expectation of accessing information online 24/7 is taken as a matter of course. What most people may not consider, however, is how all this information technology is delivered to us. Digital transformation, due to emerging technologies, continues to grow exponentially. The Internet of Things (IoT) is an essential, and integral, element in ensuring current and future business success.

What is the Internet of Things and how did it evolve?

Simply put, the IoT is the concept of networking connected devices so that they can collect and transmit data. Nowadays, it enables digital technology to be embedded in our physical world, such as in our homes, cars, and buildings, via vast networks connecting to computers.

Historically, the concept of IoT devices has an interesting timeline, and its early pioneers are names that remain well-known to many of us today:

  • 1832. Baron Schilling creates the electromagnetic telegraph.
  • 1833. Carl Friedrich Gauss and Wilhelm Weber invent a code enabling telegraphic communication.
  • 1844. Samuel Morse transmits the first Morse code public message from Washington D.C. to Baltimore.
  • 1926. Nikola Tesla conceives of a time when what we know as a mobile phone will become a reality.
  • 1950. Alan Turing foresees the advent of artificial intelligence.
  • 1989. Tim Berners-Lee develops the concept of the World Wide Web.

Even ordinary physical objects became the subject of IoT applications:

  • 1982. Carnegie Mellon University students install micro-switches in a Coca-Cola vending machine to check its inventory levels and to see whether the drinks inside were cold.
  • 1990. John Romkey and Simon Hackett connect a toaster to the internet.

As the technology and research grew exponentially from the 1960s onwards, the actual term ‘Internet of Things’ was coined in 1999 by Procter & Gamble’s Kevin Ashton. In 2008, the first international conference on IoT was held in Switzerland. By 2021, it was reported that there were 35.82 billion IoT devices installed globally, with projections of 75.44 billion worldwide by 2025.

Real-world application

Given the huge potential of IoT technology, the scale of its cross-sector assimilation is unsurprising. For example, it impacts:

  • The consumer market. Designed to make life easier, consider the sheer number of internet-enabled smart devices – including wearables and other goods – that are in daily use. Common examples include smartphones and smartwatches, fitness trackers, home assistants, kitchen appliances, boilers, and home security cameras. We interact with internet connectivity every day; increasingly, many of us are already living in ‘smart’ homes. Optimising the customer experience is key to business success. Whether related to data-collecting thermostats which monitor energy consumption, or wifi providers which supply the best Bluetooth packages, all are driven by IoT systems.
  • Physical world infrastructure. On a grander scale, IoT technology is now focused on developing smart buildings and, in the long run, smart cities. In buildings, elements such as heating, lighting, lifts and security are already directed by automation. In the outside world, real-time, data-gathering traffic systems and networks rely on IoT to share data using machine learning and artificial intelligence.
  • Industrial and domestic sectors. Where, previously, many items and goods were manufactured and serviced off-grid, everything is now internet-connected. Items used domestically include washing machines, doorbells, thermostats, and gadgets and virtual assistant technology such as Alexa and Siri. Amazon distribution centres, factories and international mail delivery systems are all examples of environments that are reliant on IoT platforms.
  • Transportation. In an area of highly complex logistics, keeping the supply chain moving and reaching its destination is critical. The same can be applied to all other modes of transport, such as aeroplanes, ships, trains and vehicles. For the individual, connected cars are already a reality. Many vehicles have the ability to communicate with other systems and devices, sharing both internet access and data.
  • Healthcare. The impact of the global Covid pandemic has taken a huge toll on our lives. The stresses on worldwide healthcare and medical business models have become ever more pressing. The need for strategies and solutions to deliver optimal healthcare, as modelled on IoT, is being researched by many organisations including Microsoft. Devices such as pacemakers, cochlear implants, digital pills, and wearable tech such as diabetic control sensors, are making invaluable contributions to patients across the sector.

The technology behind the Internet of Things

IoT technology presents immeasurable benefits in our lives, and its scope is seemingly limitless. IoT platforms are interconnected networks of devices which constantly source, exchange, gather and share big data using cloud computing or physical databases. They consist of:

  • Devices. These connect to the internet and incorporate sensors and software which connect with other devices. For example, the Apple Watch connects to the internet, uses cloud computing, and also connects with the Apple iPhone.
  • Communications. Examples include Bluetooth, MQTT, wifi and Zigbee (a minimal MQTT example follows this list).
  • Cloud computing. This refers to the internet-based network on which data from IoT devices and applications is stored.
  • Edge computing. Tools such as IBM’s edge computing offerings use artificial intelligence to help solve business problems, increase security, and enhance both capacity and resilience.
  • Maintenance and monitoring. Monitoring and troubleshooting these devices and communications is essential to ensure optimum functionality.
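As a minimal sketch of one of the communication protocols listed above, the example below publishes a single sensor reading over MQTT. It assumes the open-source paho-mqtt client library (1.x API); the broker address and topic are hypothetical.

```python
import paho.mqtt.client as mqtt

# Connect to a hypothetical MQTT broker and publish one temperature reading.
client = mqtt.Client()                         # paho-mqtt 1.x-style client
client.connect("broker.example.com", 1883)     # default, unencrypted MQTT port
client.publish("home/livingroom/temperature", "21.5")
client.disconnect()
```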

Inevitably, while the benefits to both international businesses and organisations are immense, IoT technology also attracts cybercrime and hackers. Cyber security threats target all areas of IoT – from businesses to individual users.

IoT has been hailed as the fourth Industrial Revolution. Future technology is already blending artificial intelligence with IoT, with the aim of enabling our personal and professional lives to become simpler, safer and more personalised. In fact, in terms of IoT security, artificial intelligence can be used for:

  • Evaluating information for optimisation
  • Learning previous routines
  • Decreasing down times in functionality
  • Increasing the efficiency and efficacy of procedures
  • Creating solutions to ward off potential threats, thus enhancing security

Career prospects

Career opportunities within the computer science and artificial intelligence fields may include, but are not limited to:

  • Natural language processing
  • Machine learning engineering
  • Semantic technology
  • Data science
  • Business intelligence development
  • Research science
  • Big data engineering/architecture

Choosing the right AI and computer science course for you

If you’re looking for the qualifications to help you succeed in the fast-paced and highly rewarding field of IoT, then choose the University of York’s online MSc Computer Science with Artificial Intelligence programme.

Intellectual capital: driving business growth and innovation

How can a business maximise its growth and development? What can be done to increase competitive advantage? Are businesses making the best possible use of all their assets?

In an increasingly crowded global economy, all businesses must work hard to remain relevant, competitive and profitable. Innovation is key to maximising business growth, and many businesses already possess the means to achieve it. Alongside this, developing customer-focused, personalised experiences – and adding value through the customer journey – is essential. An organisation’s intellectual capital has the potential to achieve both aims, and add significant economic benefit – but what is it, and how is it best utilised?

What is intellectual capital?

Intellectual capital (IC) refers to the value of an organisation’s collective knowledge and resources that can provide it with some form of economic benefit. It encompasses employee knowledge, skill sets and professional training, as well as information and data.

In this way, IC identifies intangible assets, separating them into distinct, meaningful categories. Although not accounted for on a balance sheet, these non-monetary assets remain central to decision making and can have a profound impact on a company’s bottom line. More than ever, IC is recognised as one of the most critical strategic assets for businesses.

Broadly speaking, there are three main categories:

  • Human capital: the ‘engine’ and ‘brain’ of a company is its workforce. Human capital is an umbrella term, referring to the skills, expertise, education and knowledge of an organisation’s staff – including how effectively such resources are used by those in management and leadership positions. A pool of talented employees, with a wealth of both professional and personal skills, adds significant value to a workplace. Companies who prioritise investing in the training, development and wellbeing of their teams are actively investing in their human capital. It can bring a host of benefits, including increased productivity and profitability.
  • Relational capital: this category refers to any useful relationships an organisation maintains – for example, with suppliers, customers, business partners and other stakeholders – as well as brand, reputation and trademarks. Customer capital is adjacent to this, and refers to current and future revenues from customer relationships.
  • Structural capital: structural capital relates to system functionality. It encompasses the processes, organisation and operations by which human and relational capital are supported. This may include intellectual property and innovation capital, data and databases, culture, hierarchy, non-physical infrastructure and more.

Each area offers the means for value creation – which is integral to increasing competitiveness. As such, business leaders should prioritise intellectual capital, and its role within operational strategy, in both short-term and long-term planning.

How is intellectual capital measured?

As stated, while IC is counted among a company’s assets, it is not included in its balance sheet. While there are various ways to measure intellectual capital, there isn’t one widely agreed, consistent method for doing so. Together, these aspects mean that quantifying it can be challenging.

Three main methods are generally used to measure IC:

  • The balanced scorecard method examines four key areas of a business to identify whether they are ‘balanced’. They are:
    1. customer perspective – how customers view the business; 
    2. internal perspective – how a company perceives its own strengths; 
    3. innovation and learning perspective – examining growth, development and shortfalls;
    4. financial perspective – whether shareholder commitments are being met. 

A visual tool which communicates organisational structure and strategic metrics, the scorecard provides a detailed overview without overwhelming leaders with information.

  • The Skandia Navigator method uses a series of markers to develop a well-rounded overview of organisational performance. It focuses on five key areas: 
    1. financial focus – referring to overall financial health; 
    2. customer focus – including aspects such as returning customers and satisfaction scores; 
    3. process focus – how efficient and fit-for-purpose businesses processes are; 
    4. renewal and development focus – which looks at long-term business strategy and sustainability;
    5. human focus – sitting at the centre of the others, human focus encompasses employee wellbeing, experience, expertise and skills.
  • Market value-to-book value ratio is calculated by comparing a company’s book value with its market value, and aims to identify both undervalued and overvalued assets. A ratio above one indicates that there may be undervalued assets which are not being fully utilised; a ratio below one suggests that assets may be overvalued and in need of strengthening (see the sketch below).
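As a rough, illustrative sketch of the third method, using made-up figures:

```python
def market_to_book(market_value, book_value):
    """Market value-to-book value ratio: a ratio above one hints at undervalued,
    often intangible, assets not captured on the balance sheet."""
    return market_value / book_value

# Hypothetical company: £500m market capitalisation against £200m book value.
print(round(market_to_book(500_000_000, 200_000_000), 2))  # -> 2.5
```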

How can a business increase its intellectual capital?

Intellectual capital acts as a value-driver in our twenty-first-century economy. As such, it’s no surprise that many businesses are pivoting to focus on human, relational and structural assets over others. Given both its relative importance and the returns an organisation can expect, finding ways to increase IC could be central to achieving key business goals.

For Forbes, efforts to increase IC mean adopting either a solution-focused or perspective-focused approach. The first refers to the methods by which specific results can be achieved – the what, when, why and where. The second refers to how IC can utilise industry and marketplace trends, forecasts and insights to seize opportunities. Whichever approach a business opts for, there are a number of ways in which to boost intellectual capital efforts. These include:

  • Improving employee satisfaction to increase retention rates
  • Recruiting individuals with specific knowledge, competencies and skill sets that are currently lacking among the existing workforce
  • Auditing and enhancing systems and processes
  • Gathering research and data to inform decision making
  • Investing in training and development opportunities for employees
  • Improving employer branding to both attract and retain the best talent
  • Creating new products, services and initiatives through innovation

Influential contributors and further reading

Early and current proponents and authors of intellectual capital thinking include:

  • Patrick H Sullivan, who wrote ‘A Brief History of the Intellectual Capital Movement’. He presented a concise overview of the beginnings of the discipline, tracing it back to three origins: Hiroyuki Itami, who studied invisible assets pertaining to Japanese operational management; the work of various economists (Penrose, Rumelt, Wernerfelt et al) which was included in Dr David J. Teece’s 1986 article relating to the commercialisation of technology; and Karl-Erik Sveiby, who focused on human capital in terms of employee competences and knowledge base. Sveiby’s model of intellectual capital, published in 1997, was a seminal contribution to the field.
  • Dr David J Teece published ‘Managing Intellectual Capital’ in 2002, and further publications by him are available on Google Scholar.
  • Leif Edvinsson’s 2002 book, ‘Corporate Longitude’, concerned itself with the measurement, valuation and economic impact of the knowledge economy.
  • Thomas A Stewart, a pioneer in the field, authored ‘Intellectual Capital: The New Wealth of Organizations’ in 1997. He delved into areas such as unlocking potential hidden assets, spotting and mentoring talented employees, and investigating methods to identify and retain customer and brand loyalty.

The field of intellectual capital continues to expand and evolve globally. Many well-known international figures such as Johan Roos and Nick Bontis continue to explore both its ramifications and applications.

Develop the specialist skills to succeed in fast-paced, global business environments

Become adept at the management of intellectual capital – alongside a wide variety of other business and leadership skills – with the University of York’s 100% online MSc International Business Leadership and Management programme.

You’ll gain in-depth, real-world know-how and tools to navigate the global business marketplace, exploring the challenges and opportunities associated with leadership and business management. Supported by our experts, you’ll further your knowledge in marketing, operations, strategy, project management, finance, people management and more. 

As well as providing a broad overview of management disciplines, this flexible programme will develop vital decision-making, critical-thinking, problem-solving and communication skills.

The importance of innovation management in business

In a constantly changing commercial world, the challenge is to not be left behind. Gaining and sustaining a competitive edge is key to thriving in today’s global marketplace. Innovation management has become an essential component in navigating this increasingly complex and international business environment.

What is innovation management?

Applied to business, innovation is all about generating new ways of solving problems, using different models, theories and frameworks. It is a creative process which uses techniques such as brainstorming and prototyping, and plays a critical role in the design thinking process. 

There are as many ways to innovate as there are problems to solve. The goal is to introduce either new or improved products or services to gain competitive advantage. By developing a sustainable and ongoing innovation process, a company’s brand image and advancement is set on an upward trajectory.

How innovation management happens

Coming up with innovative ideas, products and services is directly down to the pool of talent available in the workforce. Traditionally, companies would generate ideas in-house, but many are now turning to open innovation. This refers to companies and organisations working with external agencies such as academic and research institutions, suppliers and clients. It fosters a working model very different from the traditional one, but is advantageous to all parties.

Initiatives are carried out by an organisation with the aim of identifying and creating new business openings through:

  • generating ideas
  • exploring future areas of growth
  • modelling products and services
  • experimenting and testing new concepts.

Not everything needs to start from scratch. Many existing products or services may already work well, and simply need to be approached differently – for example, through adaptation and modification.

Successful innovation in business relies on certain criteria, including:

  • Business models. Your company must be flexible enough to rethink the business and find new revenue streams. Companies may resist looking at new ways of managing existing systems and operations. However, actively challenging current and long-held assumptions is important in order to discover potential opportunities. 
  • Employee engagement. The human resources element available to businesses is invaluable. By tapping innovative ideas directly from the workforce, and engaging employees in showcasing skills and knowledge, ideation and innovation can be disseminated to everyone’s benefit.
  • Use of technology. Most of us have accepted the seamless integration of technological innovation in our professional lives. Although not every innovative idea will involve costly technological input and outlay, in today’s global, fast-moving market many will. Much of the world’s commercial thrust is reliant on the acquisition of data and knowledge. Google, for example, invests heavily in managing the innovation process.
  • Marketing. Brand awareness and visibility is a vital part of a company’s profile. There is no point in developing or producing a product or service if people are unaware of it. Marketing is one of the major factors in driving international sales and profitability.

Key aspects of innovation management

Different types of innovation have been identified within the innovation management process:

  • Incremental innovation. As its name suggests, in this strategy an existing product or service is subject to continual improvement and updates. Although such changes may be small or large, they still require defined methodologies and strategies to ensure continuous improvement. Starting out with its prototype in the early twentieth century, Gillette is a high-profile brand which continually upgrades its razors with new features while retaining its core design. Likewise, in the current mobile phone market, innovation is delivered through frequent small updates to software.
  • Disruptive innovation. This occurs when product development results in a paradigm shift which has a radical impact on a business market. It can take a long time to get to the creation stage – often months and years in planning and execution. A great deal of project management, research, testing and evaluation is required. A classic example of disruptive innovation is demonstrated by Apple. When the iPhone was introduced in June 2007, it was an instant global success. It wasn’t the first mobile phone, but it overtook the existing competition and effectively launched the smartphone revolution.  
  • Architectural innovation. Introduced by Professor Rebecca Henderson and Dean Kim Clark of Harvard Business School in 1990, architectural innovation involves reconfiguring components of in-use products or services. Whether seeking a new target audience, or adding value to the existing market, it makes changes without radically altering either technologies or parts. As with the other innovation strategies, alterations must be questioned, evaluated and tested to determine whether clients and customers would value any changes.
  • Major innovation. This business process is arguably the ultimate achievement: it seeks to introduce a brand new sector or industry. Inventions such as the printing press, the telephone and the internal combustion engine have literally changed the world. Forbes has listed some of the top innovation companies in recent times. All are focused on attaining the pinnacle of product and service provision.

Ways of participating in innovation management

Opportunities in innovation culture are limitless; global trade, marketing and sales continue to grow exponentially. The sector is populated not only with more ‘traditional’ business models – small or large organisations with employees – but also attracts those who are motivated by entrepreneurship and prefer to set up start-ups.

Global commerce both fosters and demands cross-cultural management and organisation. Awareness and servicing of contemporary issues in international business requires professionals with the knowledge and skill set to tackle any and all situations. Areas of interest may include:

  • Providing consultancy services
  • Sourcing new business and sales
  • Forming partnerships with external organisations using open innovation
  • Working with stakeholders, including shareholders, customers, suppliers and employees
  • Dealing with legal matters such as intellectual property and ethical concerns
  • Portfolio management

Choosing the right course for you

Gain the qualifications to help you succeed in the international business sector with the University of York’s online MSc International Business Leadership and Management programme. All practical information regarding the MSc programme – such as modules, topics, entry requirements, tuition fees and English language qualifications – can be found on our course page.

Artificial intelligence and its impact on everyday life

In recent years, artificial intelligence (AI) has woven itself into our daily lives in ways we may not even be aware of. It has become so pervasive that many remain unaware of both its impact and our reliance upon it. 

From morning to night, going about our everyday routines, AI technology drives much of what we do. When we wake, many of us reach for our mobile phone or laptop to start our day. Doing so has become automatic, and integral to how we function in terms of our decision-making, planning and information-seeking.

Once we’ve switched on our devices, we instantly plug into AI functionality such as:

  • face ID and image recognition
  • emails
  • apps
  • social media
  • Google search
  • digital voice assistants like Apple’s Siri and Amazon’s Alexa
  • online banking
  • driving aids – route mapping, traffic updates, weather conditions
  • shopping
  • leisure downtime – such as Netflix and Amazon for films and programmes

AI touches every aspect of our personal and professional online lives today. Global communication and interconnectivity in business is, and continues to be, a hugely important area. Capitalising on artificial intelligence and data science is essential, and its potential growth trajectory is limitless.

Whilst AI is accepted as almost commonplace, what exactly is it and how did it originate?

What is artificial intelligence?

AI is the intelligence demonstrated by machines, as opposed to the natural intelligence displayed by both animals and humans. 

The human brain is the most complex organ, controlling all functions of the body and interpreting information from the outside world. Its neural networks comprise approximately 86 billion neurons, all woven together by an estimated 100 trillion synapses. Even now, neuroscientists are yet to unravel and understand many of its ramifications and capabilities. 

The human being is constantly evolving and learning; this mirrors how AI functions at its core. Human intelligence, creativity, knowledge, experience and innovation are the drivers for expansion in current, and future, machine intelligence technologies.

When was artificial intelligence invented?

During the Second World War, work by Alan Turing at Bletchley Park on code-breaking German messages heralded a seminal scientific turning point. His groundbreaking work helped develop some of the basics of computer science. 

By the 1950s, Turing posited whether machines could think for themselves. This radical idea, together with the growing implications of machine learning in problem solving, led to many breakthroughs in the field. Research explored the fundamental possibilities of whether machines could be directed and instructed to:

  • think
  • understand
  • learn
  • apply their own ‘intelligence’ in solving problems like humans.

Computer and cognitive scientists, such as Marvin Minsky and John McCarthy, recognised this potential in the 1950s. Their research, which built on Turing’s, fuelled exponential growth in this area. Attendees at a 1956 workshop, held at Dartmouth College, USA – recognised as one of the world’s most prestigious academic research universities – laid the foundations for what we now consider the field of AI. Many of those present went on to become artificial intelligence leaders and innovators over the coming decades.

In testimony to his groundbreaking research, the Turing Test – in its updated form – is still applied in today’s AI research, and is used to gauge the success of AI developments and projects.

This infographic detailing the history of AI offers a useful snapshot of these main events.

How does artificial intelligence work?

AI is built upon acquiring vast amounts of data. This data can then be manipulated to determine knowledge, patterns and insights. The aim is to create and build upon all these blocks, applying the results to new and unfamiliar scenarios.

Such technology relies on advanced machine learning algorithms and extremely high-level programming, datasets, databases and computer architecture. The success of specific tasks is, amongst other things, down to computational thinking, software engineering and a focus on problem solving.

Artificial intelligence comes in many forms, ranging from simple tools like chatbots in customer services applications, through to complex machine learning systems for huge business organisations. The field is vast, incorporating technologies such as:

  • Machine Learning (ML). Using algorithms and statistical models, ML refers to computer systems which are able to learn and adapt without following explicit instructions. In ML, inferences and analysis are discerned from data patterns, and approaches split into three main types: supervised, unsupervised and reinforcement learning (a short supervised learning sketch follows this list).
  • Narrow AI. This is integral to modern computer systems, referring to those which have been taught, or have learned, to undertake specific tasks without being explicitly programmed to do so. Examples of narrow AI include: virtual assistants on mobile phones, such as those found on Apple iPhone and Android personal assistants on Google Assistant; and recommendation engines which make suggestions based on search or buying history.
  • Artificial General Intelligence (AGI). At times, the worlds of science fiction and reality appear to blur. Hypothetically, AGI – exemplified by the robots in programmes such as Westworld, The Matrix, and Star Trek – has come to represent the ability of intelligent machines which understand and learn any task or process usually undertaken by a human being.
  • Strong AI. This term is often used interchangeably with AGI. However, some artificial intelligence academics and researchers believe it should apply only once machines achieve sentience or consciousness.
  • Natural Language Processing (NLP). This is a challenging area of AI within computer science, as it requires enormous amounts of data. Expert systems and data interpretation are required to teach intelligent machines how to understand the way in which humans write and speak. NLP applications are increasingly used, for example, within healthcare and call centre settings.
  • DeepMind and Watson. As major technology organisations seek to capture the machine learning market, they are developing cloud services to tap into sectors such as leisure and recreation. For example, Google's DeepMind created AlphaGo, a computer program that plays the board game Go, while IBM's Watson is a supercomputer which famously took part in a televised Jeopardy! challenge. Using NLP, Watson recognised and responded to the quiz questions, causing a stir in public awareness regarding the potential future of AI.
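
To make the supervised flavour of machine learning concrete, here is a minimal sketch – assuming Python with the scikit-learn library installed – that trains a classifier on labelled examples from the bundled iris dataset and then checks how well it generalises to examples it has not seen:

  # A minimal supervised-learning sketch using scikit-learn's bundled iris dataset.
  from sklearn.datasets import load_iris
  from sklearn.model_selection import train_test_split
  from sklearn.tree import DecisionTreeClassifier
  from sklearn.metrics import accuracy_score

  X, y = load_iris(return_X_y=True)                      # 150 labelled flower measurements
  X_train, X_test, y_train, y_test = train_test_split(
      X, y, test_size=0.25, random_state=0)              # hold out a quarter for evaluation

  model = DecisionTreeClassifier(random_state=0)         # learn rules from labelled examples
  model.fit(X_train, y_train)

  predictions = model.predict(X_test)                    # apply them to unseen data
  print(f"Accuracy: {accuracy_score(y_test, predictions):.2f}")

Unsupervised and reinforcement learning follow the same broad pattern – learn from data, then apply what has been learned – but without labelled answers and with reward signals respectively.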

Artificial intelligence career prospects

Automation, data science and the use of AI will only continue to expand. Forecasts for the data analytics industry up to 2023 predict rapid expansion of the big data sector: in The Global Big Data Analytics Forecast to 2023, Frost & Sullivan project growth of 29.7%, with the market worth a staggering $40.6 billion.

As such, there exists much as-yet-untapped potential, with growing career prospects. Many top employers seek professionals with the skills, expertise and knowledge to propel their organisational aims forward. Career pathways may include:

  • Robotics and self-driving/autonomous cars (such as Waymo, Nissan, Renault)
  • Healthcare (for instance, multiple applications in genetic sequencing research, treating tumours, and developing tools to speed up diagnoses including Alzheimer’s disease)
  • Academia (leading universities in AI research include MIT, Stanford, Harvard and Cambridge)
  • Retail (AmazonGo shops and other innovative shopping options)
  • Banking
  • Finance

What is certain is that with every technological shift, new jobs and careers will be created to replace those lost.

Gain the qualifications to succeed in the data science and artificial intelligence sector

Are you ready to take your next step towards a challenging, and professionally rewarding, career?

The University of York’s online MSc Computer Science with Data Analytics programme will give you the theoretical and practical knowledge needed to succeed in this growing field.

Digital influences on the way we live and work

The exponential growth of digital connection in our world is all-pervasive and touches on every aspect of our daily lives, both personally and professionally.

Today, many people would be hard-pressed to imagine life before the advent of digital technology. It has blended and integrated seamlessly into everyday living. Global interconnectivity and the ability to communicate and network instantly is now a ‘given’. This expectation has markedly transformed the human experience across all areas, and the ‘always on’ culture has led to the creation of a vast online and computerised world. 

Creatively, this has resulted in phenomenal new ways of working and of interpreting the world through data. Artificial intelligence (AI) and its attendant strands of computational science are a vital link in the chain of twenty-first-century life.

Whose choice is it anyway?

As the everyday world becomes saturated with digital information, choice and decision-making become harder to navigate. The sheer amount of data available has led to the development of programs designed to help end-users make these choices in a way tailored to them. Examples can be simple – choosing the best shampoo for your hair type, or which restaurant to book for a special night out – or more complex, such as looking for a new home in a different area.

Recommender systems are built into many technological platforms and are used for both individual and ecommerce purposes. Although choice availability appears straightforward, the process behind it is remarkably elaborate and sophisticated.

The science behind the experience

A recommender system is an information filtering system run by machine learning algorithms programmed to predict user interest, preferences and ratings in relation to products and/or information browsed online.

Currently, there are three main types of recommender systems:

  1. Content-based filtering. This is driven by user behaviour, drawing on what has been searched for previously or is being browsed now. It is keyword- and feature-dependent: items similar to those the user has already shown interest in are surfaced to inform decision making.
  2. Collaborative filtering recommender. This takes a more advanced approach: users who have chosen similar items in the past are treated as having similar tastes, and their interactions are compared like-for-like so that items favoured by one user can be recommended to others with matching preferences (see the sketch after this list). 
  3. Hybrid recommender. An amalgam of the two previous types, this system combines content-based and collaborative signals once candidate recommendations have been generated. 
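
As a rough illustration of the collaborative idea, the sketch below – written in Python with a tiny, made-up ratings matrix – scores one user's unrated items by weighting other users' ratings according to how similar their tastes are, measured with cosine similarity:

  # A toy user-based collaborative filtering sketch with a hypothetical ratings matrix.
  import numpy as np

  # Rows are users, columns are items; 0 means "not yet rated".
  ratings = np.array([
      [5, 4, 0, 1],
      [4, 5, 1, 0],
      [1, 0, 5, 4],
  ], dtype=float)

  def cosine_similarity(a, b):
      """Similarity between two rating vectors (1 = identical taste)."""
      return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

  target = 0                                          # recommend for the first user
  similarities = np.array([cosine_similarity(ratings[target], ratings[u])
                           for u in range(len(ratings))])
  similarities[target] = 0                            # ignore the user's own row

  # Predict scores for unrated items as a similarity-weighted average of other users' ratings.
  weights = similarities / similarities.sum()
  predicted = weights @ ratings
  unrated = ratings[target] == 0
  print("Predicted scores for unrated items:", predicted[unrated])

Production systems work on millions of users and items rather than three rows and four columns, but the underlying comparison of like-for-like user behaviour is the same.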

The optimal functionality of recommendation engines depends upon information and raw data extracted from user experience and user ratings. When combined, these facilitate the building of user profiles to inform ecommerce targets and aims.

Multiple commonly accessed corporations and e-markets are highly visible and instantly recognisable on the online stage. Household names such as Amazon and Netflix are brands that immediately spring to mind. These platforms invest massively in state-of-the-art operations and big data collection to constantly improve, evolve and calibrate their commercial aims and marketing.

Computer architecture and system software are built around a myriad of sources and needs, and rely heavily on machine learning and deep learning. These two terms are often treated as interchangeable buzzwords, but deep learning is an evolution of machine learning: using programmable neural networks, machines can make accurate and precise decisions without human intervention. Within machine learning, 'nearest neighbour' refers to an essential classification algorithm – not to be confused with the term's everyday, pre-computer meaning – and is sketched briefly below.
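
For the curious, here is a from-scratch sketch of the nearest-neighbour idea, using a handful of made-up two-dimensional points; real systems use optimised libraries, but the principle of labelling a new point by its closest known example is the same:

  # A from-scratch sketch of 1-nearest-neighbour classification on made-up 2D points.
  import numpy as np

  # Hypothetical labelled points: two features each, with class labels 0 or 1.
  X_train = np.array([[1.0, 1.2], [0.8, 0.9], [4.0, 4.2], [4.3, 3.9]])
  y_train = np.array([0, 0, 1, 1])

  def predict(x):
      """Label a new point with the class of its nearest training neighbour."""
      distances = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to each example
      return y_train[np.argmin(distances)]

  print(predict(np.array([1.1, 1.0])))   # -> 0, closest to the first cluster
  print(predict(np.array([3.9, 4.0])))   # -> 1, closest to the second cluster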

Servicing the enabling protocols, technologies and real-world applications requires in-depth skills and knowledge across multiple disciplines. By no means an exhaustive list, familiarity with – and ideally specialist knowledge of – the following terms is integral to optimising recommendation algorithms and the different types of recommendation model:

  • Matrix factorization. This refers to a family of collaborative filtering algorithms used in recommender systems. The large, sparse user-item interaction matrix is decomposed into the product of two lower-dimensional rectangular matrices – one describing users, the other describing items – whose latent features capture what drives the interactions between different users and items. Multiplying the factors back together yields predicted ratings, from which product recommendations are generated (a minimal sketch follows this list).
  • Cold-start problem. A frequently addressed issue in recommender systems and in machine learning more broadly: when a new user or item first appears, there is little or no interaction history from which to draw reliable inferences or recommendations.
  • Cosine similarity. A way of measuring the similarity between two non-zero vectors, used, for example, to find the users nearest in taste to a given user when generating recommendations.
  • Data sparsity. Many commercial recommender systems are built around large datasets, so the user-item matrices used in collaborative filtering can be very large and sparse. This sparsity can present a challenge for optimal recommendation performance.
  • Data science. IBM's overview offers a comprehensive explanation of, and introduction to, data science and its use of data mining and complex metadata.
  • Programming languages. Globally used programming languages include Scala, Perl, SQL, C++ and Python, with Python among the foremost. MovieLens, managed by GroupLens Research at the University of Minnesota, uses Python for collaborative filtering: its system predicts film ratings based on user profiles and previous user ratings. 
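
As a very small illustration of the matrix factorization entry above, the following Python sketch – with a made-up ratings matrix and arbitrarily chosen learning rate and regularisation values – learns two low-dimensional factor matrices by gradient descent and multiplies them back together to predict the missing ratings:

  # A toy matrix-factorization sketch: learn two small factor matrices whose product
  # approximates the observed entries of a hypothetical user-item ratings matrix.
  import numpy as np

  rng = np.random.default_rng(0)
  R = np.array([            # rows = users, columns = items, 0 = unobserved
      [5, 3, 0, 1],
      [4, 0, 0, 1],
      [1, 1, 0, 5],
      [0, 1, 5, 4],
  ], dtype=float)
  observed = R > 0

  k = 2                                             # number of latent features
  P = rng.normal(scale=0.1, size=(R.shape[0], k))   # user factors
  Q = rng.normal(scale=0.1, size=(R.shape[1], k))   # item factors

  lr, reg = 0.01, 0.02
  for _ in range(5000):                     # simple gradient descent on observed entries
      E = observed * (R - P @ Q.T)          # errors only where ratings exist
      P += lr * (E @ Q - reg * P)
      Q += lr * (E.T @ P - reg * Q)

  predictions = P @ Q.T                     # filled-in matrix: use it to rank unrated items
  print(np.round(predictions, 1))

The previously empty cells of the matrix now hold estimated ratings, and the highest-scoring unrated items become the recommendations.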

What’s happening with social media?

In recent years, recommender systems have become integral to the continued growth of social media. Because the online community is interconnected across locations and social demographics, a higher volume of traffic is both generated and triggered by recommendations reinforced by likes and shares.

Online shopping has exploded as a result of the global pandemic. Platforms such as Facebook (owned by Meta) and Etsy have been joined by new e-businesses and 'shop fronts', all of which incorporate the latest recommender technology. Targeted focus centres on building richer user profiles by analysing purchase history and the browsing of new items, with the aim of both attracting new users and retaining existing ones. These capabilities are made possible through the use of recommender systems.

Careers in artificial intelligence and computer science      

Professional associations such as the Institute of Electrical and Electronics Engineers (IEEE), and digital libraries such as that of the Association for Computing Machinery (ACM), exist to provide further knowledge and support to those working in this fascinating field. 

Whichever specialisation appeals – computer science, software development, programming, or AI-oriented solutions development – there are many pathways to building a rewarding career. In-demand roles abound, and there is no shortage of successful and creative organisations in which to work, as evidenced in 50 Artificial Intelligence Companies to Watch in 2022.

Further your learning in this fast-paced field

If you’re looking for a university course offering up-to-date theoretical and practical knowledge with holistic, pedagogical and real-world expertise, then choose the University of York’s online MSc Computer Science with Artificial Intelligence course and take your next step towards a fulfilling and stimulating career.

What is data visualisation?

Data visualisation, sometimes abbreviated to dataviz, is a step in the data science process. Once data has been collected, processed, and modelled, it must be visualised for patterns, trends, and conclusions to be identified from large data sets.

Used interchangeably with the terms 'information graphics', 'information visualisation' and 'statistical graphs', data visualisation translates raw data into visual elements. This can take a variety of forms, including charts, graphs and maps.

The use of big data is on the rise, and many businesses across all sectors use data to drive efficient decision making in their operations. As the use of data continues to grow in popularity, so too does the need to be able to clearly communicate data findings to stakeholders across a company.

The importance of effective data visualisation

When data is presented to us in a spreadsheet or in its raw form, it can be hard to draw quick conclusions without spending time and patience on a deep dive into the numbers. However, when information is presented visually, we can quickly see trends and outliers. 

A visual representation of data allows us to internalise it and understand the story the numbers tell. This is why data visualisation matters in business – a good visual communicates clearly, grabs our interest quickly, and tells us what we need to know instantly.

In order for data visualisation to work effectively, the data and the visual must work in tandem. Rather than choosing a stimulating visual which fails to convey the right message, or a plain graph which doesn’t show the full extent of the data findings, a balance must be found. 

Every data analysis is unique, and so a one-size-fits-all approach doesn’t work for data visualisation. Choosing the right visual method to communicate a particular dataset is important.

Choosing the right data visualisation method

There are many different types of data visualisation methods, so there is something to suit every type of data. While your knowledge of some of these methods may date back to your school days, there may be some you have yet to encounter.

There are also many data visualisation tools available, with free options including Google Charts and Tableau Public.

Examples of data visualisation methods:

  • Charts: data is represented by symbols – such as bars in a bar chart, lines in a line chart, or slices in a pie chart (see the simple plotting sketch after this list). 
  • Tables: data is held in a table format within a database, consisting of columns and rows – this format is seen most commonly in Microsoft Excel sheets.
  • Graphs: diagrams which show the relation between two variable quantities which are measured along two axes (usually x-axis and y-axis) at right angles.
  • Maps: used most often to display location data; advancements in technology mean that maps are often digital and interactive, which offers more valuable context for the data.
  • Infographics: a visual representation of information, infographics can include a variety of elements – images, icons, text and charts – to convey more than one key piece of information quickly and clearly.
  • Dashboards: graphical user interfaces which provide at-a-glance views of key performance indicators relevant to a particular objective or business process.
  • Scatter plots: represent values for two different numerical variables by using dots to indicate values for individual data points on a graph with a horizontal and vertical axis.
  • Bubble charts: an extension of scatter plots which displays three dimensions of data – two values through dot placement, and a third through dot size.
  • Histograms: a graphical representation which looks similar to a bar graph but condenses large data sets by grouping data points into logical ranges.
  • Heat maps: show the magnitude of a phenomenon as a variation of two colour dimensions which gives cues on how the phenomenon is clustered or varied over physical space.
  • Treemaps: use nested figures – typically rectangles – to display large amounts of hierarchical data.
  • Gantt charts: a type of bar chart which illustrates a project schedule, showing the dependency relationships between activities and current schedule status.
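
For a sense of how little code a basic visualisation needs, here is a minimal sketch using Python's matplotlib library and made-up monthly sales figures, producing a bar chart and a scatter plot side by side:

  # A minimal plotting sketch with matplotlib and made-up monthly sales figures.
  import matplotlib.pyplot as plt

  months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
  sales = [120, 135, 150, 145, 170, 190]          # hypothetical values
  ad_spend = [10, 12, 15, 14, 18, 22]             # hypothetical values

  fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

  ax1.bar(months, sales)                          # bar chart: sales by month
  ax1.set_title("Monthly sales")
  ax1.set_ylabel("Units sold")

  ax2.scatter(ad_spend, sales)                    # scatter plot: spend vs sales
  ax2.set_title("Ad spend vs sales")
  ax2.set_xlabel("Ad spend (£k)")
  ax2.set_ylabel("Units sold")

  plt.tight_layout()
  plt.show()

The same handful of lines would swap to a line chart, histogram or heat map simply by changing the plotting call – which is why choosing the right visual for the dataset matters more than the mechanics of drawing it.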

Data visualisation and the Covid-19 pandemic

The Covid-19 outbreak was an unprecedented event on a scale not seen in our lifetimes. Because of the reach of the virus, its impact on our daily lives, and the abruptness of the change it brought, public health messages and evolving information about the situation were often communicated through data visualisation.

Being able to visually see the effects of Covid-19 enabled us to try to make sense of a situation we weren’t prepared for. 

As Eye Magazine outlines in the article ‘The pandemic that launched a thousand visualisations’: ‘Covid-19 has generated a growth in information design and an opportunity to compare different ways of visualising data’. 

The Johns Hopkins University (JHU) Covid-19 Dashboard included key statistics alongside a bubble map to indicate the spread of the virus. A diagram from the Imperial College London Covid-19 Response Team was influential in communicating the need to 'flatten the curve'. Line graphs from the Financial Times visually represented how values such as case numbers by country changed from the start of the outbreak onwards. 

On top of this, data scientists within the NHS digital team built their capabilities in data and analytics, business intelligence, and data dashboards quickly to evaluate the rates of shielded patients, e-Referrals, and Covid-19 testing across the UK. 

The use of data visualisation during the pandemic is a case study which will likely hold a place in history. Not only did these visualisations capture new data as it emerged and translate it for the rest of the world, they will also live on as scientists continue to make sense of the outbreak and work to prevent it happening again.

Make your mark with data visualisation

If you have ambitions to become a data analyst who could play an important role in influencing decision making within a business, an online MSc Computer Science with Data Analytics will give you the skills you need to take a step into this exciting industry.

This University of York Masters programme is studied part-time around your current commitments, giving you the knowledge you need to succeed. Skilled data analytics professionals are in high demand as big data continues to boom, and we'll prepare you for a successful future.

What is computer vision?

Research has shown that 84% of UK adults own a smartphone. As a result, taking a photo or recording a video and sharing it with friends has never been easier. Whether sharing directly with friends on the popular messaging app WhatsApp, or uploading to booming social media platforms such as Instagram, TikTok or YouTube, the digital world is more visual than ever before.

Internet algorithms index and search text with ease. When you use Google to search for something, chances are the results are fairly accurate or answer your question. However, images and videos aren’t indexed or searchable in the same way. 

When uploading an image or video, the owner has the option to add meta descriptions. This is a text string which isn’t visible on screen but which tells algorithms what is in that particular piece of media. However, not all rich media has associated meta descriptions and they aren’t always accurate.

Computer vision is the field of study focused on making computers see by developing methods that reproduce the capability of human vision, with the aim of enabling computers to understand the content of digital images. It is a multidisciplinary field encompassing artificial intelligence, machine learning, statistical methods, and other areas of engineering and computer science.

How computer vision applications operate

Many computer vision applications involve trying to identify and classify objects from image data. They do this by answering questions such as the following (a minimal classification sketch follows the list).

  • Object classification: What broad category of object is in this photograph?
  • Object identification: Which type of a given object is in this photograph?
  • Object verification: Is the object in the photograph?
  • Object detection: Where are the objects in the photograph?
  • Object landmark detection: What are the key points for the object in the photograph?
  • Object segmentation: What pixels belong to the object in the image?
  • Object recognition: What objects are in this photograph and where are they?
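
To make the classification question concrete, here is a minimal, hedged sketch using a pretrained model from the PyTorch torchvision library. The file name example.jpg is a placeholder, and the particular network, weights and preprocessing shown are just one common choice rather than the way computer vision systems must be built:

  # A minimal image-classification sketch with a pretrained torchvision model.
  from PIL import Image
  import torch
  from torchvision import models

  weights = models.ResNet18_Weights.DEFAULT            # ImageNet-pretrained weights
  model = models.resnet18(weights=weights).eval()      # classifier in inference mode
  preprocess = weights.transforms()                    # matching resize/crop/normalisation

  image = Image.open("example.jpg")                    # placeholder file name
  batch = preprocess(image).unsqueeze(0)               # shape: (1, 3, 224, 224)

  with torch.no_grad():
      probabilities = model(batch).softmax(dim=1)
  top_prob, top_idx = probabilities.max(dim=1)
  print(weights.meta["categories"][top_idx.item()], round(top_prob.item(), 3))

The printed label answers "what broad category of object is in this photograph?"; detection, segmentation and recognition extend the same idea to where objects are and which pixels they occupy.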

Other methods of analysis used in computer vision include:

  • video motion analysis to estimate the velocity of objects in a video or the camera itself;
  • image segmentation, where algorithms partition an image into multiple regions or segments (see the simple sketch after this list);
  • scene reconstruction which creates a 3D model of a scene inputted through image or video; and
  • image restoration where blurring is removed from photos using machine learning filters.
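
As a small illustration of the segmentation idea, the following sketch – assuming the OpenCV library is installed and using a placeholder image file called scene.jpg – splits an image into two regions with Otsu thresholding. It is far simpler than the learned segmentation models used in practice, but it shows the principle of partitioning pixels:

  # A simple image-segmentation sketch: split an image into foreground and background
  # with Otsu thresholding ("scene.jpg" is a hypothetical placeholder file).
  import cv2

  image = cv2.imread("scene.jpg")                          # load the image (BGR)
  gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)           # work on intensity only
  blurred = cv2.GaussianBlur(gray, (5, 5), 0)              # suppress noise before thresholding

  # Otsu's method picks the threshold automatically from the intensity histogram.
  threshold, mask = cv2.threshold(
      blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

  print(f"Chosen threshold: {threshold}")
  cv2.imwrite("scene_mask.png", mask)                      # white = one segment, black = the other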

Why computer vision is difficult to solve

The earliest experiments in computer vision began in the 1950s. Since then the field has spanned robotics and mobile robot navigation, military intelligence, human-computer interaction, image retrieval in digital libraries, and the rendering of realistic scenes in computer graphics.

Despite decades of research, computer vision remains an unsolved problem. While some strides have been made, specialists are yet to reach the same level of success in computers as is innate in humans.

For fully-sighted humans, seeing and understanding what we’re looking at is effortless. Because of this ease, computer vision engineers originally believed that reproducing this behaviour within machines would also be a fairly simple problem to solve. That, it turns out, has not been the case.

While human vision feels simple to us, psychologists and biologists do not yet have a complete understanding of why and how that is. There is still a knowledge gap in explaining the complete workings of our eyes and how our brains interpret what our eyes see. 

As humans, we are also able to interpret what we see under a variety of different conditions – different lighting, angles, and distances. With a range of variables, we can still reach the same conclusion and correctly identify an object. 

Without understanding the complexities of human vision as a whole, it’s difficult to replicate or adapt for success in computer vision.

Recent progress in computer vision

While the problem of computer vision doesn’t yet have an entire solution, progress has been made in the field due to innovations in artificial intelligence – particularly in deep learning and neural networks. 

As the amount of data generated every day continues to grow, so do the capabilities in computer vision. Visual data is booming, with over three billion images being shared online per day, and computer science advancements mean the computing power to analyse this data is now available. Computer vision algorithms and hardware have evolved in their complexity, resulting in higher accuracy rates for object identification.

Facial recognition in smartphones has become a key feature of unlocking our mobile devices in recent years, a success which is down to computer vision. 

Other problems which have been solved in this vast field include:

  • optical character recognition (OCR) which allows software to read the text from within an image, PDF, or a handwritten scanned document
  • 3D model building, or photogrammetry, which may be a stepping stone to reproducing the identification of images from different angles
  • safety in autonomous vehicles, or self-driving cars, where lane line and object detection has been developed
  • revolutionising healthcare with image analysis features to detect symptoms in medical imaging and X-rays
  • augmented reality and mixed reality, which uses object tracking in the real world to determine the location of a virtual object on the device’s display 

The ultra-fast computing machines available today, along with quick and reliable internet connections and cloud networks, make the process of deciphering an image using computer vision much faster than when the field was first investigated. Plus, with companies like Meta, Google, IBM and Microsoft sharing their artificial intelligence research through open sourcing, it's likely that computer vision research and discoveries will progress more quickly than in the past.

The computer vision and hardware market is expected to be worth $48.6 billion, making it a lucrative industry where the pace of change is accelerating.

Specialise in artificial intelligence

If you have an interest in computer vision, expanding your skills and knowledge in artificial intelligence is the place to start. With this grounding, you could be the key that solves many unanswered questions in computer vision – a field with potential for huge growth.

The University of York’s online MSc Computer Science with Artificial Intelligence will set you up for success. You’ll study entirely online and part-time around your current commitments. Whether you already have experience in computer science or you’re looking to move your career into this exciting industry, this master’s degree is for you.

What is a franchise?

Franchises are a good option for people who want to be their own boss and run their own business, but lack the knowledge or resources to launch a new product or service on their own. Franchising is also a fairly financially safe way of being your own boss, as franchises have a much higher survival rate than new independent businesses and startups.

At its core, a franchise is a partnership between an individual (the franchisee) and an existing organisation (the franchisor).

There are three types of franchise systems:

  • Product: This is when a franchisor gives a franchisee permission to sell a product using their logo, trademark, and brand name.
  • Manufacturing: This is when a franchisor partners with a franchisee to manufacture and sell their products using their logo, trademark, and brand name.
  • Business: This is when a franchisor licences their brand to a franchisee and provides regulations around how the business operates and is managed.

How franchises work

A franchisor grants a franchisee the right to market and/or trade their products and services. When a franchisee purchases a franchise, there is an initial fee to the corporation they’re going into partnership with, and they will usually also pay regular royalties to cover the cost of initial and/or ongoing training, business support and marketing. 

By paying for the organisation to manage these areas of the business, the franchisee can concentrate on the day-to-day running of their business. They also avoid the cost of organising these services in-house.

The franchise agreement is a contract which governs the partnership between franchisee and franchisor. Within this, the partnership is tied in for a set period of time – generally between five and twenty years – and the contract is usually renewable once that period is up.

The history of franchises

The franchising model was created in the 1800s by Isaac Singer, inventor of the widely used Singer sewing machine. After the US Civil War in the 1860s, he was mass-producing his famous machines, but needed a system in place that would enable repairs and maintenance to cover the whole country. Initially, local merchants across the US were sold licences that permitted them to service the machines. This then grew to enable the merchants to sell the machines, too. The contract used was the earliest form of franchise agreement.

During the Second World War, companies like Coca-Cola and Pepsi looked to expand quickly and began franchising. As the 1950s and 1960s saw growth in population, economic output and social change, franchising grew in popularity in the UK, especially amongst food retailers such as the fast-food chains Wimpy, McDonald’s and KFC, and ice cream brands Lyons Maid and Mr Softee.

Today, franchising is a function of many established brands across multiple sectors, with franchise opportunities in food, pet grooming, homecare agencies, beauty salons, recruitment companies and many more.

How popular are franchises?

The British Franchise Association’s 2018 bfa NatWest Franchise Survey found that the franchise industry is growing more than ever before. At the time of the survey, there were 48,600 franchise units in the UK and 935 business-format franchise systems – around double the number that existed twenty years prior.

It’s been widely reported that Millennials are turning to self-employment at a faster rate than any previous generation, so it is no surprise that the survey shows franchising to be an attractive option for this group: 18% of franchisees were found to be under the age of 30 – a significant rise in recent years. 

With 4 in 10 franchise systems operational from a home office, following the work from home orders issued throughout the recent Covid-19 pandemic, is it possible that this way of working could continue to thrive?

Are franchises a good investment?

The cost of owning a franchise can vary wildly, with the initial fee ranging from £1,000 to £500,000. On top of this, you also need to have budgeted for start-up costs, working capital, monthly rent, salaries, inventory, software and utilities.

The franchise industry contributes £17.2 billion per annum to UK GDP and employs 710,000 people. The 2018 bfa NatWest Franchise Survey also found that franchising is a largely successful business model, with 93% of franchisees claiming profitability and 60% reporting an annual turnover of more than £250,000.

To ensure franchise business success, a franchisee must first do their due diligence: making sure there is room for expansion in the territory they’ll be working in, understanding how much training and ongoing support will be offered, researching the success of other franchisees, and budgeting and planning for fee payments. With this knowledge, a solid business plan can be written and a successful future awaits.

Prepare for success in business

If you have been considering starting your own franchise and are looking to increase your business acumen, the University of York’s 100% online MSc International Business Leadership and Management could equip you with the skills and knowledge you need to succeed. 

This online master’s degree will give you a thorough grounding of multiple areas of business, so you will be prepared to take your career to the next level – whether your ambitions lie in opening a franchise or progressing at an existing company.

With us you’ll develop an understanding of business strategy, operations management, finance, leading and managing people, marketing and sales. As you study part-time, you can continue to earn while you learn, applying the knowledge you gain to your existing role. You’ll connect with a global network of peers as you study alongside professionals from all over the world.