What is neuromorphic computing?

Compared with first-generation artificial intelligence (AI), neuromorphic computing allows AI learning and decision-making to become more autonomous. Currently, neuromorphic systems are focused on the deep learning that underpins sensing and perception skills used in, for example, speech recognition and complex strategic games like chess and Go. Next-generation AI will mimic the human brain in its ability to interpret and adapt to situations, rather than simply working from formulaic algorithms.

Rather than simply looking for patterns, neuromorphic computing systems will be able to apply common sense and context to what they are processing. Google famously demonstrated the limitations of computer systems that simply use algorithms when its DeepDream AI was trained to look for dog faces: it ended up converting any imagery that looked like it might contain dog faces into dog faces.

How does neuromorphic computing work?

This third generation of AI computation aims to imitate the complex network of neurons in the human brain. This requires AI to compute and analyse unstructured data with an efficiency that rivals the highly energy-efficient biological brain. The human brain can consume less than 20 watts of power and still outperform supercomputers at many tasks, demonstrating its unique energy efficiency. The AI equivalent of our biological network of neurons and synapses is the spiking neural network (SNN). Artificial neurons are arranged in layers, and each spiking neuron can fire independently and communicate with the others, setting in motion a cascade of change in response to stimuli.
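As a rough illustration of how a single spiking neuron behaves, here is a minimal leaky integrate-and-fire model in Python. It is a sketch only – the time constant, threshold and input values are illustrative assumptions rather than parameters from any particular neuromorphic chip.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks back towards
    rest, integrates incoming current, and emits a spike when it crosses threshold."""
    v = v_rest
    spike_times = []
    for t, current in enumerate(input_current):
        v += dt * (-(v - v_rest) + current) / tau   # leak plus integration
        if v >= v_thresh:
            spike_times.append(t)                   # the neuron 'fires'
            v = v_reset                             # and resets, ready to fire again
    return spike_times

# Feed the neuron noisy input and see at which time steps it spikes
print(lif_neuron(np.random.uniform(0.0, 2.5, size=500)))
```

In a full SNN, many such neurons are wired together so that one neuron's spikes become another's input, producing the cascade of activity described above.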

Most AI neural network structures are based on what is known as von Neumann architecture – meaning that the network uses separate memory and processing units. Computers currently work by retrieving data from memory, moving it to the processing unit, processing it, and then moving it back to memory. This back and forth is both time-consuming and energy-consuming, and it creates a bottleneck that is exacerbated when large datasets need processing.

In 2017, IBM demonstrated in-memory computing using one million phase change memory (PCM) devices, which both stored and processed information. This was a natural progression from IBM’s TrueNorth neuromorphic chip which they unveiled in 2014. A major step in reducing neuromorphic computers’ power consumption, the massively parallel SNN chip uses one million programmable neurons and 256 million programmable synapses. Dharmendra Modha, IBM fellow and chief scientist for brain-inspired computing, described it as “literally a supercomputer the size of a postage stamp, light like a feather, and low power like a hearing aid.”

An analogue revolution was triggered by the successful building of nanoscale memristive devices, also known as memristors. They offer the possibility of building neuromorphic hardware that performs computational tasks in place and at scale. Unlike silicon complementary metal-oxide-semiconductor (CMOS) circuitry, memristors are switches that store information in their resistance/conductance states. They can also modulate conductivity based on their programming history, which they can recall even if they lose power. Their function is similar to that of human synapses.

Memristive devices need to demonstrate synaptic efficacy and plasticity. Synaptic efficacy refers to the need for low power consumption to carry out the task. Synaptic plasticity is similar to brain plasticity, which we understand through neuroscience. This is the brain’s ability to forge new pathways based on new learnings or, in the case of memristors, new information.
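To make the idea of a non-volatile, tunable conductance concrete, here is a deliberately simplified toy model in Python – not a physical device model. The step size and bounds are illustrative assumptions.

```python
class ToyMemristor:
    """Toy memristive synapse: conductance (the 'weight') is nudged up or down by
    programming pulses and is retained between calls, mimicking non-volatility."""

    def __init__(self, g=0.5, g_min=0.0, g_max=1.0, step=0.05):
        self.g, self.g_min, self.g_max, self.step = g, g_min, g_max, step

    def pulse(self, polarity):
        # polarity +1 potentiates (raises conductance), -1 depresses it
        self.g = min(self.g_max, max(self.g_min, self.g + polarity * self.step))
        return self.g

    def read_current(self, voltage):
        # Ohm's law: the stored state modulates how much current flows
        return self.g * voltage

synapse = ToyMemristor()
for _ in range(4):
    synapse.pulse(+1)             # repeated 'learning' pulses strengthen the synapse
print(synapse.read_current(0.2))  # the strengthened state shapes the output current
```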

These devices contribute to the realisation of what is known as a massively parallel, manycore supercomputer architecture like SpiNNaker (spiking neural network architecture). SpiNNaker is the largest artificial neural network platform of its kind, using a million general-purpose processors. Despite the high number of processors, it is a low-power, low-latency architecture and, more importantly, highly scalable. To save energy, individual chips and whole boards can be switched off. The project is supported by the European Human Brain Project (HBP), and its creators hope to model up to a billion biological neurons in real time. To understand the scale, one billion neurons is only around 1% of the human brain. The HBP grew out of BrainScaleS, an EU-funded research project which began in 2011 and benefitted from the collaboration of 19 research groups from 10 European countries. With neuromorphic technology evolving fast, it seems the race is on. In 2020, Intel Corp announced that it was working on a three-year project with Sandia National Laboratories to build a brain-based computer of one billion or more artificial neurons.

We will see neuromorphic devices used more and more to complement and enhance CPUs (central processing units), GPUs (graphics processing units) and FPGAs (field programmable gate arrays). Neuromorphic devices can carry out complex, high-performance tasks – for example, learning, searching and sensing – using extremely low power. A real-world example would be instant voice recognition on a mobile phone without the processor having to communicate with the cloud.

Why do we need neuromorphic computing?

Neuromorphic architectures, although informed by the workings of the brain, may help uncover the many things we don’t know about the brain by allowing us to see the behaviour of synapses in action. This could lead to huge strides in neuroscience and medicine. Although advances in neuromorphic processors that power supercomputers continue at unprecedented levels, there is still some way to go in achieving the full potential of neuromorphic technology.

A project like SpiNNaker, although large-scale, can only simulate relatively small regions of the brain. However, even with its current capabilities, it has been able to simulate a part of the brain known as the basal ganglia, a region that we know is affected in Parkinson's disease. Further study of the simulated activity with the assistance of machine learning could provide scientific breakthroughs in understanding why and how Parkinson's happens.

Intel Labs is a key player in neuromorphic computer science. Researchers from Intel Labs and Cornell University were able to use Intel's neuromorphic chip, known as Loihi, so that AI could recognise the odour of hazardous chemicals. Loihi chips use an asynchronous spiking neural network to implement adaptive, fine-grained computations in parallel that are self-modifying and event-driven. By imitating the architecture of the human olfactory bulb, this kind of computation allows odour recognition even when surrounded by ‘noise’. The neuroscience involved in the sense of smell is notoriously complex, so this is a huge first for AI and wouldn't be possible with old-style transistor-based processing. This kind of discovery could lead to further understanding around memory and illnesses like Alzheimer's, which has been linked to loss of smell.

Learn more about neuromorphic computing and its applications

Artificial intelligence is already helping us to make strides in everyday life from e-commerce to medicine, finance to security. There is so much more that supercomputers could potentially unlock to help us with society’s biggest challenges. 

Interested to know more? Find out about the University of York’s MSc Computer Science with Artificial Intelligence.

 

Why big data is so important to data science

What is big data?

Big data is the term for the ever-increasing amount of data collected for analysis. Every day, vast amounts of unsorted data are drawn from apps and social media, all of which require processing.

Creating data sets for such a volume of data is more complex than creating those used in traditional data sorting. This is because the value of the data needs defining; without a definition it is just a lot of detail with no real meaning. Despite the term only relatively recently coming into everyday usage, big data has been around since the 1960s with the development of the relational database. It was the exponential rise in the amount and speed of data being gathered through sites like Facebook and YouTube that created the drive for big data analytics amongst tech companies. The ‘Three Vs’ model characterises big data by volume, variety and velocity (with veracity and variability sometimes added as fourth and fifth Vs). Hadoop appeared in 2005, offering an open-source framework to store and analyse big data. NoSQL databases, designed for data without a defined structure, also rose in prominence around this time. From that point, big data has been a major focus of data science.

What is big data analytics?

Big data analytics is the sorting of data to uncover valuable insights. Before we had the technology to sort through huge volumes of large data sets using artificial intelligence, this would have been a much more laborious and slower task. The kind of deep learning we can now access through data mining is thanks to machine learning. Data management is much more streamlined now, but it still needs data analysts to define inputs and make sense of outputs. Advances like natural language processing (NLP) may offer the next leap for data analytics: NLP allows machines to simulate the ability to understand language in the way that humans do. This means machines can read content and understand sentences rather than simply scanning for keywords and phrases.

In 2016, Cisco estimated annual internet traffic had, for the first time, surpassed one zettabyte (1000⁷, or 1,000,000,000,000,000,000,000, bytes) of data. Big data analysis can run into data sets reaching into terabytes (1000⁴ bytes) and petabytes (1000⁵ bytes). Organisations store these huge amounts of data in what are known as data lakes and data warehouses. Data warehouses store structured data – data points relating to one another that have been filtered for a specific purpose – and support fast SQL (structured query language) queries, which stakeholders can use for things like operational reporting. Data lakes contain raw data drawn from apps, social media and Internet of Things devices that awaits definition and cataloguing before it can be analysed.

The data flow of usable data usually involves capture, pre-processing, storage, retrieval, post-processing, analysis and visualisation. Data visualisation is important because people tend to grasp concepts more quickly through representations like graphs, diagrams and tables.
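As a sketch of how those stages might hang together in code, the snippet below pushes a tiny synthetic dataset through capture, pre-processing, storage, retrieval, analysis and a simple text ‘visualisation’. The file name, field names and figures are invented for illustration.

```python
import pandas as pd

def capture():
    # Stand-in for pulling raw events from an app or sensor feed
    return pd.DataFrame({"user": ["a", "b", "a", "c"],
                         "ms_on_page": [1200, 300, 4500, -5]})

def pre_process(df):
    return df[df["ms_on_page"] > 0]              # basic validation and cleaning

def store(df, path="events.csv"):
    df.to_csv(path, index=False)                 # storage
    return path

def retrieve(path):
    return pd.read_csv(path)                     # retrieval

def analyse(df):
    return df.groupby("user")["ms_on_page"].mean()   # analysis

def visualise(result):
    print(result.to_string())                    # a table stands in for a chart here

visualise(analyse(retrieve(store(pre_process(capture())))))
```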

What is Spark in big data?

Spark is a leading big data processing engine with built-in support for SQL, streaming and machine learning. Like Hadoop before it, Spark is a data processing framework, but it works faster and allows stream processing (or real-time processing) as well as batch processing. Spark uses in-memory processing, which can make it up to 100 times faster than Hadoop's disk-based approach for some workloads. Whereas Hadoop is written in Java, Spark is written primarily in Scala, with APIs for Scala, Java, Python and R; Scala's concise syntax means the same job can often be expressed in far fewer lines of code.
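For a flavour of how Spark's in-memory DataFrame API looks from Python, here is a minimal PySpark sketch. It assumes a local installation of `pyspark` (and Java); the sensor readings are invented.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-sketch").getOrCreate()

readings = spark.createDataFrame(
    [("sensor-1", 20.5), ("sensor-2", 21.3), ("sensor-1", 19.8)],
    ["device", "temperature"],
)

# Transformations are planned lazily and executed in memory across the cluster
readings.groupBy("device").avg("temperature").show()

spark.stop()
```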

Both Hadoop and Spark are Apache Software Foundation projects; Spark was originally developed at the University of California, Berkeley's AMPLab before being donated to Apache. Using the two in tandem often gives the best results – Spark for speed and Hadoop for its distributed storage and security capabilities, amongst others.

How is big data used?

Big data is important because it provides business value that can help companies lead in their sector – it gives a competitive advantage when used correctly.

Increasingly, big data is being used across a wide range of sectors including e-commerce, healthcare, and media and entertainment. Everyday big data uses include eBay using a customer’s purchase history to target them with relevant discounts and offers. As an online retailer, eBay’s use of big data is not new. Yet, within the retail sphere, McKinsey & Company estimate that up to 30% of retailers’ decision-making when it comes to pricing fails to deliver the best price. On average, what feels like a small increase in price of just 1% translates to an impressive 8.7% increase in operating profits (when we assume no loss in volume). Retailers are missing out on these kinds of profits based on a relatively small adjustment by not using big data technologies for price analysis and optimisation.
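As a rough, hedged illustration of the arithmetic behind that claim – assuming an operating margin of roughly 11.5% and no change in costs or sales volume – a 1% price rise flows straight through to operating profit:

```python
revenue = 100_000_000           # illustrative figures only
operating_margin = 0.115        # assumed margin that reproduces the quoted effect
costs = revenue * (1 - operating_margin)

profit_before = revenue - costs                    # 11.5m
profit_after = revenue * 1.01 - costs              # price up 1%, costs and volume flat

uplift = (profit_after - profit_before) / profit_before
print(f"Operating profit increase: {uplift:.1%}")  # roughly 8.7%
```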

In healthcare, apps on mobile devices and fitness trackers can track movement, sleep, diet and hormones, creating new data sources. All this personal data is fed into big data analysis for further insights into behaviours and habits related to health. Big data can also provide huge strides in some of healthcare's biggest challenges, like treating cancer. During his time as President of the United States, Barack Obama set up the Cancer Moonshot program. Pooling data from genetically sequenced cancer tissue samples is key to its aim of investigating, learning and maybe finding a cure for cancer. One of the unexpected results of using these types of data was the discovery that the antidepressant desipramine may help treat certain types of lung cancer.

Within the home, energy consumption can certainly be managed more efficiently with the predictive analytics that a smart meter can provide. Smart meters are potentially part of a larger Internet of Things (IoT) – an interconnected system of objects, which are embedded with sensors and software that feeds data back and forth. This data is specifically referred to as sensor data. As more ‘Things’ become connected to one another, in theory, the IoT can optimise everything from shopping to travel. Some buildings are designed to be smart ecosystems, where devices throughout are connected and feeding back data to make a more efficient environment. This is already seen in offices where data collection helps manage lighting, heating, storage, meeting room scheduling, and parking.

Which companies use big data?

Jeff Bezos, the founder of Amazon, has become the richest man in the world by making sure big data was core to the Amazon business model from the start. Through this initial investment in machine learning, Amazon has come to dominate the market by getting its prices right for the company and the customer, and managing its supply chains in the leanest way possible.

Netflix, the popular streaming service, takes a successful big data approach to content curation. It uses algorithms to suggest films and shows you might like to watch based on your viewing history, as well as to understand which film productions the company should fund. Once a humble DVD-rental service, Netflix led the 2021 Academy Awards with 35 nominations. In 2020, Netflix overtook Disney as the world's most valuable media company.

These are just some of the many examples of harnessing the value of big data across entertainment, energy, insurance, finance, and telecommunications.

How to become a big data engineer

With so much potential for big data in business, there is great interest in professionals like big data engineers and data scientists who can guide an organisation with its data strategy. 

Gaining a master’s that focuses on data science is the perfect first step to a career in data science. Find out more about getting started in this field with University of York’s MSc Computer Science with Data Analytics. You don’t need a background in computer science and the course is 100% online so you can fit it around your current commitments. 

What you need to know about mergers and acquisitions

Mergers and acquisitions (M&A) is the term used for the business of merging or acquiring limited companies (also known as private companies) and public limited companies (PLCs). It’s considered a specialism of corporate finance.

What is the difference between a merger and an acquisition?

A merger is when two separate entities combine. The most common structures are either a vertical merger or a horizontal merger. A vertical merger is when two or more companies operating at different stages of the same supply chain – but contributing to the same end product or service – come together. This kind of merger often results in synergies, leading to reduced costs and increased productivity through greater control of the supply chain.

A horizontal merger usually happens between competitors operating in the same space that want to increase their market share by joining forces and becoming one entity. A joint venture is slightly different – it involves two companies creating a new entity in which they both invest and share profit, loss and control. This business is entirely separate from both parties’ other companies.

Acquisition is when a larger acquiring company selects a target company to acquire through a buyout. Usually these are friendly acquisitions, but there can be what is known as a hostile takeover. This is when the acquirer aims to buy a controlling interest directly from a company's shareholders without the consent of its directors. It is a completely legal M&A process, but because of its ‘unfriendly’ nature it can affect morale and damage company culture.

Consolidation can refer specifically to an amalgamation, which is the acquiring – and sometimes merging – of many smaller companies that then become part of a larger holding group. This is often seen in the creative industries or with startups.

Are mergers and acquisitions good for the economy?

Mergers and acquisitions tend to be good for the economy because they stimulate business growth, create new jobs and offer investment opportunities for all. Cross-border transactions can also help brands and businesses grow in new territories. However, if a company is looking to grow with the intention of merging or acquiring, it needs working capital to do this. A way of increasing capital is to offer shares on the stock market via an Initial Public Offering (IPO).

IPOs and the rise of SPACs

Famous IPOs in recent decades include Facebook becoming a public company in 2012 and Alibaba's record-breaking IPO in 2014. Facebook's IPO was one of the most anticipated in history, with the stock price steadily increasing up to the opening day of May 18, and some investors suggesting a valuation of $40 per share in the build-up. The year before, LinkedIn's stock had doubled in value on its first day of trading, from $45 to $90. Yet on the day, numerous factors – including computer glitches on the part of the Nasdaq stock exchange – led to Facebook's share price dropping considerably. This continued for the next couple of weeks, with the stock closing at $27.72 on June 1. Other tech companies took a hit and investment firms faced considerable losses. Nasdaq offered reimbursements, which its rival, the New York Stock Exchange, called a “harmful precedent”. Despite these issues, the stock set a new record for the trading volume of an IPO at 460 million shares.

Increasingly seen in global M&A, particularly in the US, is the Special Purpose Acquisition Company (SPAC). SPACs are created specifically to raise capital through an IPO and then merge with another company. They're not new – they've been around since the 1990s – but SPACs have gained popularity with blue-chip private equity firms, investment banks like Goldman Sachs, and leading entrepreneurs. In turn, this kind of backing encourages more private companies to consider going public. 2020 saw the highest global IPO activity in a decade for the USA, as well as the largest SPAC IPO in history.

The role of private equity

Private equity is capital that is not listed on a public stock exchange. Private equity firms are major players when it comes to M&A deals because they are powerful enough to keep investing capital over an extended period of time. They have a pool of money accrued from previous M&A transactions, which then feeds into further deals, and they also raise capital from limited partners, such as pension funds, and from other companies. The firm, or fund as it is sometimes known, has good cash flow because of this.

Real estate is a form of private equity. There are private equity firms that specialise solely in the purchase of real estate, and by building a property portfolio they create further investment capital. Property is then improved and rented out, or sold on at a higher price, keeping the fund topped-up. Private investors see a return on their investment, and the money that’s left becomes working capital for the fund’s next venture.

Due diligence and pre-M&A analysis

One of the key steps in any merger or acquisition is the due diligence process. This is when investigations and audits are carried out to verify that all financial information provided by the target company is correct and that the purchase price is justified. Discounted Cash Flow (DCF) analysis is part of due diligence. It's a method used to estimate the value of an investment based on its predicted future cash flows.
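As a simple sketch of the idea rather than a full valuation model, discounting each forecast cash flow back to the present might look like this in Python; the cash flows and discount rate are invented assumptions.

```python
def discounted_cash_flow(cash_flows, discount_rate):
    """Present value of a stream of future cash flows."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

# Five years of forecast cash flows (in £m) discounted at 10%
forecast = [12, 14, 15, 17, 18]
print(f"Estimated value: £{discounted_cash_flow(forecast, 0.10):.1f}m")
```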

Another important tool is Accretion Dilution Analysis. This is a basic test carried out before an offer is even made to determine whether a merger or acquisition will increase (accretion) or decrease (dilution) the Earnings per Share (EPS) once completed.
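A minimal sketch of that test, with invented numbers: combine the two companies' earnings (plus any assumed synergies), divide by the enlarged share count, and compare the result against the acquirer's standalone EPS.

```python
def pro_forma_eps(acquirer_income, target_income, synergies, acquirer_shares, new_shares):
    """EPS of the combined company after the deal (£m of income, millions of shares)."""
    return (acquirer_income + target_income + synergies) / (acquirer_shares + new_shares)

standalone_eps = 500 / 100                          # acquirer: £500m income, 100m shares
deal_eps = pro_forma_eps(500, 120, 30, 100, 40)     # 40m new shares issued to fund the deal

print("accretive" if deal_eps > standalone_eps else "dilutive")
```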

Intellectual Property (IP) must be taken into account as well. Acquisitions with an interest in gaining IP assets can have transaction values of billions. A thorough understanding of the complexities of such high-stakes transactions is needed in order to derive precise valuation numbers when negotiating a deal.

Why work in mergers and acquisitions?

Global M&A is seeing growth in all sectors, even as the pandemic has seen some major companies fold. The way that we do business continues to be reshaped by world events, and the flux means that there are many business opportunities to take advantage of through mergers and acquisitions. Global M&A in financial services is seeing a boom, with the start of 2021 being the busiest since 1980. The prominence of SPACs is set to spread beyond North America, bringing with it a demand for experienced managers and management teams.

Diversification acquisition will see larger companies offering size and scale to smaller companies which perhaps do not have the capital or resources to adapt their offering, but which are otherwise doing well. At the other end of the spectrum, specialism may be needed, for example in healthcare, which is seeing a spike in demand for home healthcare solutions. Either way, business continues to seek a competitive advantage, and mergers and acquisitions continue to provide this.

If you’re interested in learning more about the world of mergers and acquisitions, there are numerous finance-focused podcasts which look specifically at global M&A activity.

Learn more about mergers and acquisitions 

Mergers and acquisitions are a cornerstone of international businesses. Find out how you can sharpen your expertise in international business and mergers and acquisitions with the University of York’s MSc International Business Leadership and Management.

The future of artificial intelligence

Artificial intelligence (AI) is the use of machines to carry out tasks that we associate with the human brain – things like problem-solving, perceiving, learning, reasoning, and even creativity. AI has grown exponentially in recent years. The Covid-19 pandemic, in particular, highlighted the need for AI systems and automation that could respond swiftly to reduced numbers of workers.

For organisations that had gone through a digital transformation, AI and associated emerging technologies were already being integrated into business processes. However, for many, Covid was the turning point that highlighted the need for AI solutions to be included in their business models. The AI cloud is a cutting-edge concept that will help make AI software more accessible to businesses by bringing together cloud computing and a shared infrastructure for AI use cases.

Healthcare offers many successful AI case studies, most recently for diagnosing and tracking Covid-19 using rapidly gathered big data, but also increasingly in areas like cancer diagnostics or detecting the development of psychotic disorders. Other sectors that use real-world AI applications include the military, agriculture, manufacturing, telecommunications, IT and cybersecurity, and finance. AI art, or neural network art, is a genre in its own right. Holly Herndon, who holds a PhD from Stanford's Center for Computer Research in Music and Acoustics, uses AI technology in her work.

What are the risks of AI?

Science fiction writers have long been fascinated by the idea of AI taking over. From Blade Runner to The Terminator, the fear is that the machines will start to think for themselves and rise up against humans. This moment is known as the ‘singularity’, defined as the point in time when technological growth overtakes human intelligence, creating a superintelligence developed by self-directed computers. Some people believe that this moment is nearer than we think.

In reality, AI offers many benefits, but the most obvious risks it currently poses are in relation to personal data privacy. In order for deep learning to take place, AI needs to draw on large amounts of data, which must come from people's behaviours being tracked – their personal data. The Data Protection Act 2018, which implemented the General Data Protection Regulation (GDPR) in the UK, was brought in to ensure that people have to opt in to having their data gathered and stored, rather than having to request to opt out. Previously, businesses and organisations were able to simply use their customers' data without permission.

Some of us may feel suspicious about our data being collected and yet, many of the applications we use are constantly gathering information about us, from the music we like and the books we read to the number of hours we sleep at night and the number of steps we walk in the day. When Amazon makes suggestions for what you might like to read next, it’s based on your purchasing and browsing history. A McKinsey & Company report from 2013 stated that 35% of Amazon’s revenue comes from recommendations generated by AI. AI is also instrumental in the way that LinkedIn helps both people to find jobs and companies to find people with the right skill set.

The more we allow our actions to be tracked, in theory, the more accurately our behaviours can be predicted and catered to, leading to easier decision making. New technologies like the Internet of Things (IoT) could help make this data even more interconnected and useful – a fridge that has already made a shopping order based on what you have run out of, for example.

Can AI be ethical?

There are certainly big questions around ethics and AI. For example, artificial neural networks (ANNs) are a type of AI that uses interconnected processing nodes which mimic the human brain's neurons. The algorithm for an ANN is not fully determined by human input: the machine learns and develops its own rules with which to make decisions, and these are usually not easily traceable by humans. This is known as black box AI because of its lack of transparency, which can have legal as well as ethical implications. In healthcare, for instance, who would be liable for a missed or incorrect diagnosis? If a self-driving car makes a wrong turn of the wheel and crashes, who would be liable?

When it comes to data analytics, there is also the issue of bias: because human programmers define datasets and write algorithms, the results can be prone to bias. Historically, the field of data science has not been very diverse, which can lead to some demographics being underrepresented and even inadvertently discriminated against. The more diverse the programming community, the more unbiased the algorithms – and therefore the more accurate and useful AI applications – can become.

A popular example of problematic use of AI is deepfakes, imagery that has been manipulated or animated so that it appears that someone (usually a politician) has said or done something they haven’t. Deepfakes are linked to fake news and hoaxes which spread via social media. Ironically, just as AI software can clone a human voice or recreate the characteristic facial expressions of an individual, it is also key in combating fake news because it can detect footage that is a deepfake.

What are the challenges in using artificial intelligence?

Machine learning relies on data input from humans. A machine cannot initially simply start to think for itself. Therefore, a human – or a team of humans – has to pinpoint and define the problem first before presenting it in a computable way. 

A common example of what an AI robot cannot do – but most humans can – is to enter an unfamiliar kitchen and work out where all the items needed to make a cup of tea or coffee are kept in order to make a hot drink. This kind of task requires the brain to adapt its decision-making and improvise based on previous experience of being in unfamiliar kitchens. AI cannot currently build the data processing systems to do this spontaneously, but it is a situation that the neural networks of a human brain can naturally respond to.

What problems can AI solve?

Artificial intelligence is mainly suited to deep learning, which demands the scanning and sifting of vast amounts of data in search of patterns. The algorithms developed through deep learning can, in turn, help with predictions. For instance, understanding a city's traffic flow throughout the day and synchronising traffic lights in real time can be facilitated through AI. AI can also strategise. One of the milestones in machine learning was Google DeepMind's AlphaGo beating the world's number one Go player, Ke Jie, in 2017. Go is considered to be particularly complex and much harder for machines to learn than chess.

On the practical side, AI can help reduce errors and carry out repetitive or laborious tasks that would take humans much longer. To increase the use of AI responsibly, the UK government announced plans for a National AI Strategy in March 2021 to help the economy grow through AI technologies. Among the challenges it hopes to address are tackling climate change and improving public services.

In conclusion, AI has huge potential, but ethical, safe and trustworthy AI development is reliant on direction from humans. 

If you’re interested in understanding more about artificial intelligence, our MSc Computer Science with Artificial Intelligence at the University of York is for you. Find out how to apply for the 100% online course. 

Everything you need to know about data analytics

Data analytics is a key component of most business operations, from marketing to supply chain. But what does data analytics mean, and why are so many organisations utilising it for business growth and success?

What is data analytics?

Data analytics is all about studying data – and increasingly big data – to uncover patterns and trends through analysis that leads to insight and predictability. Data analytics emerged from mathematics, statistics and computer programming before becoming a field in its own right. It’s related to data science and it’s a skill that is highly desirable and in demand. 

We live in a world full of data gleaned from our various devices, which track our habits in order to understand and predict behaviours as well as help decision-making. Algorithms are created based upon the patterns that arise from our usage. Data can be extracted from almost any activity, whether it's tracking sleep patterns or measuring traffic flow through a city. All you need are defined metrics. Although much of data extraction is automated, the role of data analysts is to define subsets, look at the data and make sense of it, thereby providing insight that can improve everyday life.

Why is data analytics important?

Data analytics is particularly important in providing business intelligence that helps with problem-solving across organisations. This is known as business analytics, and it’s become a key skill and requirement for many companies in making business decisions. Data mining, statistical modelling, and machine learning are all major elements of predictive analytics which uses historical data. Rather than simply looking at what happened in the past, businesses can get a good idea of what will happen in the future through analysis and modelling of different types of data. This can then help them assess risk and opportunity when planning ahead.

In healthcare, for example, data analytics helps streamline operations and reduce wait times, so patients are seen more quickly. During the pandemic, data analysis has been crucial in analysing figures related to the rate of infection, which then helps in identifying hotspots, and forecasting either an increase or decrease in infections.

Becoming qualified as a data analyst can lead to work in almost any sector. Data analysis is essential for managing global supply chains and for planning in banking, insurance, healthcare, retail and telecommunications.

The difference between data analytics and data analysis

Although it may seem like data analytics and data analysis are the same, they are understood slightly differently. Data analytics is an overarching term that defines the practice, while data analysis is just one section of the entire process. Once data sets have been prepared, usually using machines to speed up the sorting of unstructured data, data analysts use techniques such as data cleansing, data transforming and data modelling to build insightful statistical information. This is then used to help improve and optimise everyday processes through data analytics as a whole.
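A hedged pandas sketch of what cleansing and transforming can look like in practice; the column names and values are invented.

```python
import pandas as pd

raw = pd.DataFrame({
    "customer": ["A", "B", "B", None],
    "spend": ["100", "250", "250", "80"],
})

clean = (
    raw.dropna(subset=["customer"])                       # cleansing: drop incomplete rows
       .drop_duplicates()                                 # cleansing: remove repeated records
       .assign(spend=lambda d: d["spend"].astype(float))  # transforming: fix the data type
)

# A very simple 'model' of the cleaned data: total spend per customer
print(clean.groupby("customer")["spend"].sum())
```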

What is machine learning?

Machine learning – a form of artificial intelligence – is a method of data analysis that uses automation for analytical model building. Once the machine has learnt to identify patterns through algorithms, it can make informed decisions without the need for human input. Machine learning helps speed up data analysis considerably, but this relies on data and parameters being accurate and unbiased, something that still needs human intervention and moderation. It's a current area of interest because how well data analysis progresses and supports us relies on accurate, representative data – and on more diverse representation amongst data analysts.

Currently, most automated machine learning is applied to simple, straightforward problems. More complex problems still require people to work on them, so artificial intelligence is not going to take over any time soon. Human consciousness is still a mystery to us, but it is what makes the human brain's ability to analyse unique.

What are data analytics tools?

There are a number of tools that help with analysis and overall analytics, and many businesses use at least some of them in their day-to-day operations. Here are some of the more popular ones, which you may have heard of:

  • Microsoft Excel is one of the most well-known and useful tools for tabular data.
  • Tableau is business intelligence software that helps to make data analysis fast and easy by linking with Excel spreadsheets.
  • Python is a programming language used by data analysts and developers which makes it easy to collaborate on machine learning and data visualization amongst other things.
  • SQL is a domain-specific programming language that uses structured query language.
  • Hadoop is a distributed file system that can store and process large volumes of data.

Analysts also use databases that provide storage for data which is relational (SQL) and non-relational (NoSQL). Learning about all of these tools and becoming fluent in how to use them is necessary to become a data analyst.
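To make the SQL side a little more concrete, here is a tiny self-contained example using Python's built-in sqlite3 module; the table and figures are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")                 # an in-memory relational database
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 95.5), ("north", 88.2)])

# A structured query: total sales per region
for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"):
    print(region, total)

conn.close()
```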

How to get into data analytics

Working in data analytics requires a head for numbers and statistical techniques. But it also requires the ability to spot problems that need solving and the understanding of the criteria needed for data measurement and analysis to provide the solutions. 

You need to become familiar with the wide range of methods used by analysts such as regression analysis (investigating the relationship between variables), Monte Carlo simulation (frequently used for risk analysis) and cluster analysis (classifying relative groups). In a way, you are telling a story through statistical data so you need to be a good interpreter of data and communicator of your findings. You will also need patience because, in order to start your investigations, it’s important to have good quality data. This is where the human eye is needed to spot things like coding errors and to transform data into something meaningful.
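As one small, hedged example of these methods, a Monte Carlo simulation repeatedly samples uncertain inputs to build up a picture of the range of possible outcomes. The cost figures and distributions below are invented assumptions.

```python
import random

def monte_carlo_project_cost(n_trials=100_000):
    """Estimate the spread of a project's total cost from three uncertain
    line items, each modelled with a triangular distribution."""
    totals = []
    for _ in range(n_trials):
        labour = random.triangular(80, 140, 100)       # low, high, most likely (£k)
        materials = random.triangular(40, 90, 60)
        contingency = random.triangular(5, 30, 10)
        totals.append(labour + materials + contingency)
    totals.sort()
    return {
        "mean": sum(totals) / n_trials,
        "p90": totals[int(0.9 * n_trials)],            # cost exceeded only 10% of the time
    }

print(monte_carlo_project_cost())
```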

Studying for an MSc Computer Science with Data Analytics online

You can become a data analyst with the postgraduate course, MSc Computer Science with Data Analytics from the University of York. The course is 100% online with six starts per year so you can study anywhere, any time. 

You can also pay per module with topics covered such as Big Data Analytics, Data Mining and Text Analysis, and Artificial Intelligence and Operating Systems. Once you’ve completed the learning modules you can embark on an Individual Research Project in a field of your choice. 

Take the next step in your career by mastering the science of data analytics.

Can leaders be flawless?

One of the key traits that many people identify in leaders is confidence and an aura of strength, whether it’s in making tough choices or in guiding your team and company through a challenging business landscape. There is a perception that this confidence and strength means that leaders are unerring and never falter, either in their actions, their attitude or assuredness of their own abilities. We expect our leaders to be flawless.

Can a leader ever be perfect?

While nobody likes to make mistakes, everybody does. It's admitting to errors, taking responsibility and owning the solution that makes people seem open, honest and transparent. In a leader, this can appear more ‘human’ and therefore relatable. Employees working for a leader who is seen as a real person may find them more approachable, meaning that teams are more cohesive, can resolve problems faster and communicate and collaborate more effectively.

In contrast, leaders who are perceived as being too ‘perfect’ may find that their employees feel less able to approach them when things aren’t going well. On top of this, the weight of expectation placed on a ‘perfect’ leader may cause stress, hamper their ability to seek assistance and increase feelings of isolation and loneliness.

Should leaders aim for perfection?

While leaders should strive to exemplify the highest standards and inspire employees to do the same, an aversion to being seen as anything less than perfect can be very restrictive. At its worst, perfectionism can prevent positive actions being taken, just in case they go wrong. Fear of failure can be a key limiting factor to the success of the company.

Beyond striving for perfection, leaders who become over-confident can be a liability. Believing success to be assured, they can fail to accurately assess risks and, when things go wrong, may seek to shift the blame on to others. You can’t be perfect if you don’t recognise your own limitations.

Is there a balance between confidence and humanity?

Leaders who can balance their confidence and assuredness with approachability and humility could find that their role is easier.

Candidates who can demonstrate their leadership credentials with critical skills such as effective communication and the ability to critically analyse and solve workplace problems, including what to do when mistakes occur, often find themselves in high demand. This is where the University of York's 100% online Masters degree courses in Leadership and Management come in. Gaining a prestigious postgraduate degree from a Russell Group university could help you to differentiate yourself from other aspiring leaders. As all learning materials are delivered digitally, you can study online when it suits you. There's no need to take an extended study break: you can keep your current role and apply what you learn as you go.

As you can keep earning while you’re learning, it minimises the financial impact of study. It’s also possible to pay-per-module and you may be able to apply for a government-backed loan to assist with course fees. There are six start dates per year, meaning that you can start studying within weeks.

Find out more and begin your application.

Can we ever eliminate cyber security threats?

High profile data breaches have frequently made the headlines over the last few years, with household names and respected tech brands like Facebook and Uber falling victim to large scale attacks. The fact that some of the biggest and most profitable companies in the world have been duped by such attacks highlights just how difficult the situation is.

Cyber-attacks may not be a new concept, but they’re certainly increasing in volume. The growing sophistication of attacks means that measures which once worked to prevent or minimise damage no longer have any effect.

The problem

As the volume of attacks, alerts and threats increases, IT teams are put under increasing pressure. Each potential cyber security threat flagged by the system needs to be explored to determine its credibility and the impact it could have on the business. If a serious threat is identified then the team must take further action to prevent or minimise damage.

Across the world there's a chronic computer science skills shortage, and the picture is no different in the UK. Businesses are already stretched as the number of unfilled tech roles is set to grow from the current level of 600,000 to 1 million by 2020. Couple that with an increase in workload due to the proliferation of cyber security threats, and it is easy to see why so many businesses are struggling to fend off attacks.

The solution

There are many options when it comes to technology that can help prevent attacks, and researchers are developing new ways to fend off threats all the time. More and more frequently, companies are deploying artificial intelligence (AI) to support IT teams and free up some of the time it takes to identify legitimate threats. There is now a plethora of cost-effective products that use AI, data and machine learning to help IT teams detect breaches faster and more accurately, minimising their frequency and severity.
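As a rough sketch of the kind of machine learning involved – not any particular vendor's product – an unsupervised anomaly detector such as scikit-learn's IsolationForest can flag unusual events for an analyst to review. The feature names and numbers below are invented.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Columns: failed logins in the last hour, megabytes transferred
normal = rng.normal(loc=[1, 50], scale=[1, 15], size=(500, 2))
suspicious = np.array([[40, 900], [35, 750]])       # bursts of failures plus large transfers
events = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(events)
flags = model.predict(events)                       # -1 marks likely anomalies

print(f"{(flags == -1).sum()} events flagged for analyst review")
```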

Taking this time-consuming work away from IT departments frees up more time to shore up cyber defences: ensuring that employees are informed of how they could be used as a conduit for an attack through phishing scams; looking at the security of legacy software; and checking old code to ensure there are no weaknesses that could be taken advantage of.

It is unlikely that we’ll ever be able to entirely eliminate the threat of cyber-attacks, but with an increased use of AI, businesses are able to manage the threats more effectively. It takes skill and an in-depth understanding of cyber security issues to implement and maintain these systems. This is why the University of York has introduced the 100% online MSc Computer Science with Cyber Security, for ambitious individuals looking to move into computer science roles.

The course covers specific topics such as cyber security threats and security engineering. It also covers key areas of computer science expertise, including advanced programming and artificial intelligence, giving ambitious students the skills required to pursue a career in cyber security.

There’s no need to take a career break or juggle family commitments as the course is delivered 100% online, with all programme materials accessible from a wide variety of devices at any time. There’s also a choice of six start dates per year and a pay per module option which eliminates the need for a large upfront payment. All this means you can earn a prestigious Masters degree from a Russell Group University in a flexible way that suits you.

Find out more and begin your application.

What is an Entrepreneurial Leader?

Expectations placed upon leadership are understandably high. We expect our leaders to have perfect strategy and superhuman decision-making skills. The truth is, some of the most important attributes for a leader are curiosity, learning and a constant desire to iterate and improve. Without the ability to listen to others, to embrace new ideas and to change course in the face of new information, success will always be out of reach.

An enquiring nature and an open mind should be true of any leader, but for entrepreneurial leaders it is of paramount importance. Being able to absorb and assimilate new information gives the greatest chance of success. Entrepreneurs thrive on the new and the innovative and are highly desirable to companies of all sizes to help them generate and test out new ideas and to keep fresh and disruptive thinking at the forefront of the business.

The willingness to change

Being able to take on board new concepts and ideas isn't always enough to enable effective leadership – sometimes it can take ‘unlearning’ what you think you know. There are many examples of new, exciting things being repeatedly rejected by those who couldn't accept something radical. For example, engineer Steven Sasson of Kodak invented a digital camera in 1975, but the company was not convinced by the new technology. It couldn't understand why consumers would want to view images on a TV screen when film was so inexpensive. Kodak eventually made the move to digital 18 years later.

Fostering an open culture

Another skill that can be highly beneficial to leadership is a level of empathy that allows you to pick up on how employees are feeling. Approachable and understanding leaders that balance their skills and expertise with a forthright and open attitude may find that their employees are harder-working and more loyal.

In new markets, start-up environments and areas of business that are being created from scratch, the ability to bring together a cohesive team can make all the difference. As technologies and businesses develop, teams must communicate closely and react quickly to cope with the pace of change.

Confidence is key

Having the confidence to look at a range of options, make sound judgements and be decisive is a key business skill. Experience in business can give you a level of assuredness, but it’s also important to have confidence in the people around you. Entrepreneurial leaders rely on their colleagues to fill gaps in skills and knowledge. By using sound decision-making processes, they increase the chances of success in new fields.

An entrepreneurial mindset isn’t just something certain people have and others don’t. It’s a learned skillset which you can develop, given the right environment.

The University of York has a suite of 100% online Masters degrees in Leadership and Management to develop these skills in aspiring business leaders. As all learning materials are delivered completely online, you can study around work or family commitments whenever it suits you. This in turn means there's no need to take an extended study break, and you can apply what you learn as you go, keeping your current grade and salary. There are pay-per-module options available to reduce large, up-front costs, and a government-backed loan is available for those who are eligible, to assist with course fees. With six start dates per year, you can begin your studies and personal growth as soon as you're ready.

Find out more and begin your application.

How do company culture and stress impact on innovation?

While it’s quite commonly assumed that innovation occurs in a sudden “Eureka!” moment, businesses have known for a long time that it can be a gradual process, and that it can be fostered by company culture.

Innovation is critical for success. Not accepting the status quo, going beyond ‘good enough’ and ensuring that a company isn’t just looking to the future, but is actively trying to bring it closer can offer a significant advantage in business.

Is there such a thing as too much innovation?

Innovation is the search for new and better ways of doing things. Facebook, one of the most powerful companies on the planet, lives by the motto “Move fast and break things”, which to many embodies the ethos of never settling and always seeking to shake things up.

Many companies have looked at this ideal and tried to replicate it, but some now believe that the constant search for the ‘new’ is giving employees ‘innovation fatigue’ or ‘innovation stress’. It can quickly become counterproductive.

Recognising and creating the conditions for improvement

It’s important to recognise that innovation can come from every area and level of the business. It’s not the responsibility of one department, team or seniority level. To facilitate an innovative culture, the C-suite must communicate with all employees to ensure that they understand that they have the permission and the ability to innovate.

Employees need to feel engaged and valued. They need to understand and believe in the business goals to give direction to their input. Management that demands something new, better or more efficient doesn’t necessarily give employees the tools to innovate. Allowing everybody at every level of the company to understand what the business wants, where it’s heading and what it aims to achieve helps employees to make strategically valuable contributions.

Support and collaboration for innovation

Something truly new and inventive can be hard to conceptualise and explain. Employees may need support to communicate and deliver their vision. Supporting innovative practice closely will show employees that their suggestions and ideas are valued – in turn helping to keep them on board and working towards the company goals. Even unworkable ideas can become a learning experience, with every small failure informing the next big success.

Managers need a wide range of skills to cultivate a company culture that supports innovation without overburdening employees. The University of York has designed a 100% online Masters degree in Innovation, Leadership and Management to help innovative businesses flourish. All learning materials are delivered online, meaning that you’re free to study whenever and wherever it suits you – and you don’t need to take an extensive or costly career break in order to complete your MSc. It allows you to learn while you earn and apply what you learn to your current role.

There are six start dates per year, so you can start whenever you’re ready, and there are options to pay-per-module. You may also be eligible for a government-backed postgraduate loan to cover the cost of the course.

Company culture plays a big part in the success of innovation. Being able to demonstrate the right skills could not only play a big part in your own career, but also the future of an entire business.

Find out more and start your application.

The business benefits of the Internet of Things

The Internet of Things (IoT) is much more than just the smart fridges or intelligent heating systems that we seem to hear about so frequently. The global market for hardware, software, data and telecoms is set to expand to $520bn by 2021, an incredible doubling in size from 2017.

Businesses are already starting to make the most of the technology, utilising it to monitor processes, boost efficiency and provide insights which can guide company strategy. Here are just a few ways that adopting IoT technology can have a positive impact:

Business insights

In industries such as manufacturing, agriculture and healthcare the use of IoT tech is rapidly increasing. By collecting more data and increasing the potential for analysis, companies are gaining a greater insight into their business, its strengths, weaknesses and how their customers use their products. In piecing together this information, there’s the chance to increase efficiency, better meet customer need or increase profits – such as using artificial intelligence to inspect railways and identify faults.

Customer experience

IoT data gives businesses much deeper insights into their customers and how they interact with both the company and its products and services. Making the customer experience as frictionless as possible is something many businesses strive for, particularly in retail. Employing IoT technologies in shopping environments can not only aid the journey, but also provide data to innovate and give shoppers what they want in future, essentially creating a personalised customer experience.

Waste reduction

Enhancing green credentials and making genuine, positive environmental decisions is a key focus for modern businesses. Reducing waste and saving energy is a key part of this, and IoT tech can play a big role in identifying inefficiencies that create increased wastage. Leveraging IoT tech that monitors what inventory is on hand, automatically rotates old stock to go out first and prevents overstocking works to reduce waste and prevent capital being tied up in stock that may go out of date and spoil. These kinds of surplus-prevention strategies can cut wastage and remove the need for expensive storage facilities for unneeded stock, particularly for refrigerated produce.

New business models

The information provided by the IoT can provide valuable insight into consumers, products and business efficiencies, which in turn can highlight opportunities for innovation and growth. Smart TVs, internet-enabled cars and even coffee machines can all feed usage data back to the manufacturers, providing previously unheard-of insights which can significantly improve the design of the next model or even identify a gap in the market for new innovations.

In order for businesses to successfully use IoT they need experienced employees, with knowledge of how to develop and implement the technology. However, currently the demand for computer science skills far outstrips the supply of qualified graduates, and organisations across industries and sectors are desperate for those equipped with the relevant knowledge in this field. The University of York’s 100% online Computer Science Masters programme is designed for professionals and graduates who may not currently have a computer science background, who want to launch their career in this in-demand and lucrative field.

With six start dates a year you can study around work and home commitments. There’s a pay-per-module option available, and some students may be eligible for a government backed postgraduate loan to cover costs of the course.

Find out more and begin your application.

The big data challenges facing companies today

Big data has become a key part of doing business, but that doesn’t mean that it’s easy. In the last five years 95% of businesses have undertaken a big data project, according to a survey of Fortune 1000 business leaders. However, the same survey revealed that less than half of those initiatives actually achieved measurable results.

There are a number of common challenges businesses face when it comes to big data, many of which come from the lack of big data experts internally. Companies know they should be using the information they own to inform and progress the business, but they lack the expertise to successfully implement projects.

Understanding

Often, the successful implementation of big data projects requires a degree of change across departments – be it in capturing more data, or changing how it is managed or shared with other teams. If some employees don’t understand the need for change or simply aren’t willing to play their part in bringing it about, then project progress is impeded. In order to minimise this kind of disruption, it’s important to take a ‘top-down’ approach. Once senior management have bought in to the concept, training and workshops can bring other employees up to speed on the importance of the project’s success to the future of the company.

Cost

Regardless of the approach you take to implementation of big data projects – on-premises or cloud solutions – the costs are high. On-premises solutions require a lot of hardware, which in turn requires space, electricity and security as well as administrators and developers to maintain it. While cloud solutions may be perceived as being lower cost, there is still a significant up-front investment needed for setting-up, configuring, training, maintenance and storage.

Of course, you’ll need to allow for future growth too, which will likely incur more expense. To avoid these costs spiralling and getting out of control, understanding the business’s big data requirements and having a clear and robust vision and strategy is essential.

Integration

It’s often the case that departments within a business work in data silos. Information is not readily shared across the whole company, partly because the different platforms used by different teams don’t ‘talk’ to one another.

Big data adoption relies on this not being the case. The key purpose of big data is to give a single, 360-degree view of the business, and to do this, information from every department needs to be accessible. This could mean a change of software for some departments, or the use of APIs which overlay existing tools so that data can be shared, as sketched below. It will require a change in working processes for many employees – such as no longer using locally saved spreadsheets to make their jobs ‘easier’.

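As a rough sketch of what such an API overlay might look like, the Python below pulls records from two hypothetical departmental systems – a CRM exposed over REST and a sales team’s spreadsheet export – and merges them into one shared customer view. The endpoint URL, file path and field names are assumptions for illustration, not a prescription for any particular toolset.

    # Illustrative only: a thin integration layer that overlays two hypothetical
    # departmental systems and merges their records into one shared customer view.
    # The endpoint, file path and field names are assumptions for the sketch.
    import csv
    import requests  # third-party HTTP client, assumed installed

    CRM_API = "https://crm.example.internal/api/customers"  # hypothetical REST endpoint
    SALES_EXPORT = "sales_export.csv"                       # hypothetical CSV export

    def fetch_crm_records() -> dict[str, dict]:
        """Pull customer records from the CRM's API, keyed by customer ID."""
        response = requests.get(CRM_API, timeout=10)
        response.raise_for_status()
        return {row["customer_id"]: row for row in response.json()}

    def fetch_sales_records() -> dict[str, dict]:
        """Read the sales team's spreadsheet export, keyed by customer ID."""
        with open(SALES_EXPORT, newline="") as f:
            return {row["customer_id"]: row for row in csv.DictReader(f)}

    def unified_view() -> list[dict]:
        """Merge both sources into a single record per customer."""
        crm, sales = fetch_crm_records(), fetch_sales_records()
        merged = []
        for customer_id in crm.keys() | sales.keys():
            record = {"customer_id": customer_id}
            record.update(sales.get(customer_id, {}))
            record.update(crm.get(customer_id, {}))  # CRM fields take precedence
            merged.append(record)
        return merged

    if __name__ == "__main__":
        for customer in unified_view():
            print(customer)
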
Security

Security should be considered at the first stage of project planning – not only because it is an opportunity for the business to review its existing security and policies (or lack thereof), but fundamentally because security should be built into the business’s big data solution from the very beginning.

A great example of this comes from the medical field. Highly sensitive patient data is stored all over the world, and the ability to analyse it all could provide great insights for those looking to cure diseases such as cancer. Security is a huge concern, though, because the data has traditionally needed to be decrypted before it can be analysed. Thanks to fully homomorphic encryption, patient data can now be analysed while it remains encrypted.

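Full homomorphic encryption schemes are complex, but the underlying idea – doing arithmetic on ciphertexts so that the result decrypts to the right answer – can be illustrated with a much simpler, additively homomorphic scheme. The toy Paillier-style sketch below (not full FHE, and with deliberately tiny, insecure key sizes) sums two encrypted patient readings without ever decrypting the individual values; it is a teaching aid only, not how any real medical system is implemented.

    # Illustrative only: a toy Paillier-style additively homomorphic scheme,
    # shown to demonstrate the principle of computing on encrypted data.
    # Key sizes here are tiny and insecure; real systems use proper FHE libraries.
    import math
    import random

    def keygen(p: int = 5659, q: int = 7919):
        """Generate a toy key pair from two small primes (insecure by design)."""
        n = p * q
        lam = math.lcm(p - 1, q - 1)
        mu = pow(lam, -1, n)          # valid because we fix g = n + 1 below
        return (n,), (n, lam, mu)     # public key, private key

    def encrypt(pub, m: int) -> int:
        (n,) = pub
        n2 = n * n
        r = random.randrange(2, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(2, n)
        # c = (1 + n)^m * r^n mod n^2
        return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

    def decrypt(priv, c: int) -> int:
        n, lam, mu = priv
        n2 = n * n
        L = (pow(c, lam, n2) - 1) // n
        return (L * mu) % n

    def add_encrypted(pub, c1: int, c2: int) -> int:
        """Multiplying ciphertexts adds the underlying plaintexts."""
        (n,) = pub
        return (c1 * c2) % (n * n)

    # Example: sum two encrypted patient readings without decrypting either one.
    pub, priv = keygen()
    c_total = add_encrypted(pub, encrypt(pub, 120), encrypt(pub, 95))
    print(decrypt(priv, c_total))  # 215
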
Big data is being collected constantly, but to make it usable and to generate the valuable insights that can drive a business forward, big data projects need to be implemented with a great deal of planning and expertise. However, the skills required to execute projects of this kind are in short supply. The University of York’s 100% online MSc Computer Science with Data Analytics course is designed for working professionals and graduates who want to start a career in this lucrative field. You’ll develop specialist skills and knowledge in machine learning, data analytics, data mining and text analysis via specialised modules and an independent data analytics project. You’ll also develop core computer science skills such as computational thinking, computational problem solving and software development.

Our 100% online programmes allow you to study around work and home commitments, in your own time. Choose from six start dates per year, with a pay-per-module option available. You may be eligible for a government-backed postgraduate loan, covering the cost of the course.

Find out more and begin your application.

How global leaders manage multiple cultures

As more and more businesses operate in the global marketplace, managing teams that work across cultures is a vital skill for leaders. Seemingly simple things such as local customs and cultures can have a huge impact on work ethic and job satisfaction, so understanding them is imperative if leaders are to unite teams regardless of the miles between them. But how do you build cross-cultural trust?

The answer lies in three key areas: understanding your own thought processes, understanding the cultural differences between team members, and understanding how trust is built between colleagues through shared achievements and individual character.

Start with the right mindset

It’s essential to recognise that the process of building trust can differ vastly from one organisation and culture to another. Workplace trust is established in a huge variety of ways, through things like relationships, communication, transparency and the dissemination of information. Some businesses pride themselves on high-trust environments, where all employees are privy to business-critical information, told about new developments and communicated with as equals. Other companies have a more fixed hierarchy, with information filtered down on a need-to-know basis.

The only constant is that building trust takes patience; it’s essential not to try to force a trust-building strategy that’s a bad fit for the team. People from high-trust environments can find it frustrating when they try to bond with colleagues from low-trust environments and find it’s not part of the company culture. Likewise, the opposite can be true: those from low-trust workplaces may feel overwhelmed in a high-trust culture.

Spend time learning about your employees’ cultures

Some cultures are more inclined to trust than others, while some rely on strict social structures to guide personal relationships. Take time to observe how employees naturally behave. Allow employees who prefer a one-to-one meeting to open up in private, for example. Rigid hierarchies can prevent some employees from speaking to colleagues or supervisors, which needs to be monitored: allowing colleagues to speak in a manner they feel comfortable with is vital.

Understand the importance of results and character in building trust

In the UK, workplace trust is usually results-driven: management highlight that success comes from a team effort, and that by trusting and assisting one another, co-operative working delivers results.

There are places where character is the main trust builder, usually demonstrated by management showing that they’re willing to put time, effort and money into employees and their welfare. This kind of perceived benevolence revolves around being of good character, and in many countries it extends to knowing and caring about employees’ families.

Trust based on character needs time to build credibility, proving that it’s not a hollow gesture and that managers really do care about their employees’ personal lives and wellbeing. Generally, colleagues in the UK see their personal lives as ‘personal’ and not for work involvement, but in other cultures work is such a big part of a person’s life that there’s much less of a distinction between a ‘work family’ and a ‘home family’. British workers might find common ground in shared musical taste or hobbies; in other cultures this may seem superficial, as people search for deeper commonalities and build a bond that benefits both parties.

For managers in either style of trust culture, it’s important to show trust. Without leaders displaying trust for their employees, it’s quite difficult to instil a culture of trust across an entire department. When given trust, colleagues’ usual response is to show themselves to be trustworthy, which increases the likelihood that this attitude will spread across the team. Once trustworthiness is established, it can become a form of social currency.

The bottom line? Today’s business leaders are required to operate in an increasingly complex global environment. With the right management, cross-cultural teams have the ability to outperform teams from very similar cultures, bringing a unique perspective to new problems, which leads to increased innovation and efficiency.

The University of York’s 100% online MSc International Business, Leadership and Management places particular emphasis on the challenges associated with global trade. It’s designed to build knowledge of business practices worldwide while developing a theoretical understanding of the international business environment. As a graduate, your career will benefit from the ethical, socially responsible and international themes that underpin all programme content.

Ideally suited to working professionals, York’s 100% online programme gives you the flexibility to fit your studies around your current commitments. You can access course content and study anytime, anywhere, on a variety of desktop and mobile devices. With six start dates a year and a pay-per-module fee structure, you can begin whenever you’re ready. You may even be eligible for a government-backed postgraduate loan to cover the cost of your programme.

Find out more and begin your application.