What are mobile networks?

A mobile network, also known as a cellular network, enables wireless communication between end users across vast distances by transmitting signals using radio waves.

Most portable communication devices – including mobile phone handsets, laptops, and tablets – are equipped to connect to a mobile network, enabling wireless communication through phone calls, messaging, email, and data.

How do mobile networks work?

Mobile networks are effectively a web of base stations. Each base station covers a specific geographical area – called a cell – and is equipped with at least one fixed-location transceiver antenna that allows the cell to send and receive radio transmissions to and from devices.

When people experience poor reception or connection on their mobile devices, it is usually because they aren't within close enough range of a base station. This is also why, to provide the best possible network coverage, many network providers and operators deploy as many base station transceivers as they can and overlap their cell areas.
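
The relationship between cells, range, and reception can be sketched as a toy model in Python. The station names, coordinates, and ranges below are invented purely for illustration:

```python
import math

# Toy model: each base station covers a circular "cell"; a device connects
# to the nearest station within range. All values here are illustrative.
STATIONS = {
    "cell_a": {"pos": (0.0, 0.0), "range_km": 5.0},
    "cell_b": {"pos": (8.0, 0.0), "range_km": 5.0},
}

def nearest_station(device_pos):
    """Return the closest in-range station name, or None (poor reception)."""
    best, best_dist = None, float("inf")
    for name, station in STATIONS.items():
        dist = math.dist(device_pos, station["pos"])
        if dist <= station["range_km"] and dist < best_dist:
            best, best_dist = name, dist
    return best

print(nearest_station((1.0, 1.0)))    # close to cell_a
print(nearest_station((20.0, 20.0)))  # out of range of every cell: None
```

Overlapping cell areas, as described above, simply means a device position may fall within range of more than one station; the model then picks the closest one.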

How mobile devices connect to mobile networks

In the past, mobile phones – or portable transceivers – used an analog technology called AMPS (Advanced Mobile Phone System) to connect to cellular networks. Today, however, portable communication devices such as the Apple iPhone or Samsung Galaxy Android phone use digital cellular technologies to send and receive transmissions.

These technologies can include:

  • global system for mobile communications (GSM)
  • code division multiple access (CDMA)
  • time division multiple access (TDMA)

What is the difference between GSM and CDMA?

Devices that use the global system for mobile communications (GSM):

  • can transmit data and voice at the same time
  • do not have built-in encryption, and are typically less secure
  • store data on a subscriber identity module (SIM) card that can be transferred between devices

Devices that use code division multiple access (CDMA), on the other hand:

  • cannot send both data types at the same time
  • have built-in encryption and more security
  • store data on the mobile device itself, rather than a SIM

Another key difference is in terms of usage: GSM is the predominant technology used in Europe and other parts of the world, while CDMA is used in fewer countries.

What are the different types of mobile networks?

Mobile networks have become progressively faster and more advanced over the past few decades.

2G

2G dates back to the early 1990s and eventually enabled early SMS and MMS messaging on mobile phones. It is also noteworthy because it marked the move from the analog 1G to digital radio signals. Its use has been phased out in some areas of the world, such as Europe and North America, but 2G is still available in many developing regions.

3G

3G was introduced in the early 2000s, and is based on universal mobile telecommunication service (UMTS) standards. For the first time, mobile devices could use web browsers and stream music and videos. 3G is still widely in use around the world today. 

4G

4G was first introduced around 2010 and offered a significant step forward for mobile networks. Speeds increased significantly, enabling advanced streaming capabilities and better connectivity and performance for mobile games and other smartphone apps, even when not connected to WiFi.

5G

5G is the newest addition to the family of mobile networks, rolling out at the end of the 2010s and still being introduced in major centres around the world today. Through high-frequency radio waves, the 5G network offers significantly increased bandwidth and can be up to approximately 100 times faster than 4G.
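
As a rough sketch of what these generational speed differences mean in practice, the snippet below estimates download times for a 1 GB file. The speeds are ballpark assumptions for illustration, not figures from this article or guaranteed real-world rates:

```python
# Illustrative peak speeds in megabits per second (assumed, not measured).
SPEEDS_MBPS = {"3G": 8, "4G": 100, "5G": 10_000}

FILE_SIZE_MEGABITS = 1_000 * 8  # 1 gigabyte = 8,000 megabits

for generation, mbps in SPEEDS_MBPS.items():
    seconds = FILE_SIZE_MEGABITS / mbps
    print(f"{generation}: {seconds:,.1f} s to download 1 GB")
```

Real-world throughput depends heavily on coverage, congestion, and device capability, so actual times vary widely around these estimates.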

Different mobile networks providers in the UK

Mobile networks vary across the United Kingdom, but all are regulated by Ofcom, the regulator and competition authority for UK communications industries such as fixed-line telecoms, mobiles, and the airwaves used by wireless devices. It's worth noting that mobile networks can also fall under the jurisdiction of the Financial Conduct Authority when offering services such as phone insurance.

What are the UK’s main mobile networks?

The UK has four main mobile network providers:

  1. Vodafone
  2. EE
  3. O2
  4. Three

Between them, these four mobile operators – known as the big four – own and manage the UK's mobile network infrastructure. They're also known as host mobile phone networks, supporting all other mobile service providers – called mobile virtual network operators (MVNOs) – in the UK.

Examples of mobile virtual network operators in the UK

  • ID Mobile, which uses the Three network
  • GiffGaff, which uses the O2 network
  • Tesco Mobile, which uses the O2 network
  • Virgin Mobile from Virgin Media, which uses the Vodafone and O2 networks
  • Sky Mobile, which uses the O2 network
  • BT Mobile, which uses the EE network
  • Plusnet Mobile, which uses the EE network
  • Asda Mobile, which uses the Vodafone network
  • VOXI, which uses the Vodafone network
  • SMARTY, which uses the Three network
  • Talkmobile, which uses the Vodafone network
  • Lebara, which uses the Vodafone network

Other mobile phone businesses, such as Carphone Warehouse, work with multiple providers to offer consumers several options in one place when looking for a new phone provider.

Competition between mobile providers

Regardless of which mobile provider UK customers choose, there are just four networks supporting the provider's service. This means that having the UK's fastest or most reliable network is a huge selling point, and many customers use a dedicated coverage checker to investigate their preferred option. It also means that providers offer a number of additional perks and mobile phone deals to help secure mobile phone contracts.

These benefits might include:

  • reduced tariffs for customers who sign up for a rolling monthly contract
  • data plans such as an unlimited data allowance or data rollover, which allows customers to roll over any unused data at the end of the month into the next month
  • deals and discounts for other services offered by the providers, such as household broadband deals or mobile broadband services
  • access to affiliated entertainment services, such as Netflix, Amazon Prime, or BT Sport
  • discounted SIM-only deals and plans such as a reduced one-month rolling SIM or a 12-month SIM

Explore mobile and computer networks

Discover more about mobile networks and advance your career in computer science with the 100% online MSc Computer Science from the University of York. This flexible Master's programme has been designed for working professionals and graduates who may not currently have a computer science background and want to launch their career in this cutting-edge and lucrative field.

One of the key modules on this programme covers computer and mobile networks, so you will examine internet architecture, protocols, and technologies – as well as their real-world applications. You will also discuss networks and the internet, network architecture, communication protocols and their design principles, wireless and mobile networks, network security issues, and networking standards, as well as related social, privacy, and copyright issues.

Management strategy: a roadmap for business growth and success

Whether formalised or not, most businesses have a mission statement that captures their passions, plans, objectives and unique selling points. What businesses may not have, however, is an appropriate, considered and strategic approach in order to achieve these high-level visions.

Management strategy refers to the planned handling of a company’s resources in order to meet its set objectives and goals. It encompasses strategising, monitoring, analysis and assessment on a continuous basis, taking into account shifts in business environments, marketplaces and a variety of other internal and external factors. Business strategy of this type is universal – it can be adopted by businesses of all sizes and types who need to take stock of their progress and performance and develop strategies to increase these.

The benefits of a strategic management plan

The key to strategic planning of this nature is that it is never finished. It isn’t an exercise that chief executives and top and mid-level managers complete once before producing a roadmap that will take a business from strength to strength.

Formalising the direction for an organisation’s short-term and long-term future can bring with it a number of key benefits, both financial and non-financial. These include the following.

Attainment of goals and targets 

Organisations need to establish, implement, monitor and adapt a roadmap in order to achieve growth and performance objectives. Management strategies align realistic, measurable goals with agreed visions and overall ambitions, providing a course of action for an organisation to follow to achieve its end goals. Strategy formulation allows organisations to thrive in specific areas of interest.

Cohesive corporate culture and drive

Strategic management calls for organisation-wide commitment from team members to senior management to other stakeholders. When the whole business is united under a shared aim and shared action plan, team members can then understand their common goals and are better placed to communicate and collaborate. It also serves to increase the engagement and awareness of team leads and managers; they must consistently re-evaluate their position and performance in line with the objectives.

Structured and sustainable growth

Strategic management serves to streamline performance – and, in turn, increases efficiency. This efficiency is the driver for sustainable, long-term growth, enabling organisations to cement their position in the marketplace and build upon their success.

Increased competitive advantage

Organisations focused on their goals and management strategies are, by necessity, forward-thinking. This involves maintaining an outlook on the market, including changing customer demands, constraints, and unpredictable or unfavourable scenarios that may occur. This perspective is not only critical in strategy, but helps organisations to remain ahead of competitors by limiting damage and seizing opportunities. Additionally, competitive businesses – which iterate to be better and perform better – may be more likely to observe other benefits, such as increased staff retention and recruitment of the best talent.

As the business world changes, strategies must adapt – making this planning process an integral part of running a business effectively.

The stages of strategic management

A useful framework for the strategic management process includes the following five steps:

  • assessment and analysis of the organisation’s current strategic goals and direction
  • identification and analysis of organisational strengths and weaknesses, both internal and external
  • formation of action plans
  • implementation and execution of action plans
  • evaluation and control of action plans, including how successful they have been and whether further adjustments are required

In the analysis stage, leaders must clarify business intentions. Using the mission statement as a starting point, evaluate the overall objectives and assess whether progress and performance is in alignment. Stakeholders are a useful source of information during this initial stage, along with risk management and SWOT analyses (Strengths, Weaknesses, Opportunities, Threats). Once primary focuses have been identified, the business has a basis to build on and aims to work towards.

The formation stage is where leaders establish an action plan to meet the organisation's primary objectives. Ensure each point of the action plan is aligned with SMART guidelines (Specific, Measurable, Achievable, Realistic, Timely) to build a solid framework into the process. It may be prudent to formulate a 'Plan B' for some of the strategic steps, as this will encourage flexibility and resilience in unpredictable periods.

When an action plan is in motion, businesses have entered the execution stage. At this point, any elements necessary for smooth implementation – for example, establishment of new operational initiatives in certain business units, allocation of human resources or acquisition of funding – should be in place. To maximise efficiency, all stakeholders should be briefed on the action plan and confident of their role and responsibilities in relation to it.

The final stage is evaluation and control. Refer back to the goals set during the analysis stage and assess the process and its results against these measures for benchmarking purposes. Corrective actions and further adaptations may be required and are both part of the overall strategic management process. This knowledge management exercise enables leaders to use the insights to develop and hone operations, plans and processes, ensuring that with each correction the business moves closer to its core objectives. Internal and external issues must be evaluated, together with data and other useful observations.

The balanced scorecard

Robert S. Kaplan and David P. Norton state that traditional, financial-based performance measures do not provide a detailed-enough indication of how well modern organisations are faring. In today’s competitive business environment, metrics such as return-on-investment can be misleading grounds on which to develop innovative, continuous improvement-based activities.

Enter the balanced scorecard: a set of measures that offer business leaders a quick, comprehensive view of organisational performance. Alongside financial measures, the scorecard provides insights into customer perspective and satisfaction, innovation and improvement activities, and internal processes and perspectives. In doing so, it asks: how do we perform in the eyes of shareholders, customers and ourselves? The scorecard aims to provide a valuable, holistic look at organisational health, reducing information overload to allow leaders to speedily assess the most-critical areas for development and improvement. Strategy and vision, rather than control, become the focus.

Scorecard findings can then be used in decision-making, problem-solving and forecasting efforts, translating overarching strategy into specific and measurable steps. Kaplan and Norton advise that the scorecard “keeps companies looking – and moving – forward instead of backward.”

Implement management strategies to structure and align your business goals

If you want to gain the skills to help organisations make strategic decisions to achieve their objectives, choose the University of York’s online MBA programme.

Successful growth and expansion requires leaders with the tools and abilities to evaluate, analyse and act decisively at the highest levels, navigating ever-evolving environments and taking advantage of business opportunities. The key operational fundamentals you’ll acquire – including international business strategy and trade, marketing, financial management, leadership, and project management – will help to underpin long-term business objectives, increasing your employability and preparing you to succeed in senior strategic roles.

Internet protocols: making the world wide web possible

E-commerce, streaming platforms, work, social media, communication – whatever we’re using the internet for, we’re using it on a widespread, wide-ranging and constant basis.

DataReportal states that internet adoption across the connected, post-pandemic world continues to grow faster than it did previously. Their Digital 2022 Global Overview Report, published in partnership with Hootsuite and We Are Social, states:

  • There are 4.95 billion internet users, accounting for 62.5 per cent of the global population.
  • Internet users grew by 192 million over the past 12 months.
  • The typical global internet user spends almost seven hours per day using the internet across all devices, with 35 per cent of that total focused on social media.
  • The number of people who remain “unconnected” to the internet has dropped below 3 billion for the first time.

With faster mobile connections, more powerful devices set to become even more accessible and more of our lives playing out digitally than ever before, greater convergence across digital activities is likely. Our reliance on it? Greater still.

But what of the structures and processes behind these billions of daily interactions? And just how many of us actually know how the internet works? Individuals with the skills and specialist expertise in the computer science space are in high demand across a huge range of industries – and there’s never been a better time to get involved.

What is internet protocol?

Cloudflare defines internet protocol (IP) as a set of rules for routing and addressing packets of data so that they can travel across networks and arrive at the correct destination. Essentially, it's a communications protocol. Data packets – smaller pieces of data, divided from larger quantities, that traverse the internet – each have IP information attached to them. Routers use this IP information to ensure packets are delivered to the right places.

Each device and each domain with the ability to access the internet has a designated IP address so that internet communication can work. As packets are sent to IP addresses, the information and data arrive at their intended destination. IP is a host-to-host protocol, used to deliver a packet from a source host to a destination host.

There are two different versions of IP, providing unique IP identifiers to cover all devices: IPv4 and IPv6. IPv4 – a 32-bit addressing scheme supporting around 4.3 billion addresses – was originally thought to be sufficient to meet users' needs; however, the explosion in both the number of devices and internet usage has meant it's no longer enough. Enter IPv6 in 1998: a 128-bit addressing scheme supporting approximately 340 undecillion (340 trillion trillion trillion) addresses.
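
The two address spaces can be compared using Python's standard-library ipaddress module; the example addresses below come from the reserved documentation ranges:

```python
import ipaddress

# IPv4 is a 32-bit scheme; IPv6 is 128-bit. The ipaddress module parses
# both formats and reports which version an address belongs to.
v4 = ipaddress.ip_address("192.0.2.1")    # IPv4 documentation range
v6 = ipaddress.ip_address("2001:db8::1")  # IPv6 documentation range

print(v4.version, v6.version)  # 4 6

print(2**32)            # possible IPv4 addresses: ~4.3 billion
print(f"{2**128:.2e}")  # possible IPv6 addresses: ~3.4e38
```

2**128 is roughly 3.4 × 10^38 – the "340 trillion trillion trillion" figure quoted above.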

The OSI model

The Open Systems Interconnection (OSI) network model is a conceptual framework that divides telecommunications and networking into seven layers.

Each of the seven layers is tasked with its own function:

  • Physical – represents the electrical and physical system representation.
  • Data link – provides node-to-node data transfer and handles error correction from the physical layer.
  • Network – responsible for data packet forwarding and routing through different routers.
  • Transport – coordinates data transfer between end systems and hosts.
  • Session – a session allows two devices to communicate with each other, and involves set-up, coordination and termination.
  • Presentation – designated with the preparation, or translation, of application format into network format, or vice versa. For example, data encryption and decryption.
  • Application – closest to the end user; involves receiving information from users and displaying incoming data to them. For example, web browsers rely on layer seven.

The OSI model is valuable in understanding technical and security risks and vulnerabilities as it identifies where data resides, offers an inventory of applications, and facilitates understanding of cloud infrastructure migrations.
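
The seven layers above can be captured in a small lookup table. The protocol-to-layer mapping below is a simplified illustration of where a few well-known protocols usually sit:

```python
# The seven OSI layers, numbered from the physical layer (1) up to the
# application layer (7), as listed above.
OSI_LAYERS = {
    1: "Physical",
    2: "Data link",
    3: "Network",
    4: "Transport",
    5: "Session",
    6: "Presentation",
    7: "Application",
}

def layer_of(protocol):
    """Map a few well-known protocols to their usual OSI layer (illustrative)."""
    known = {"Ethernet": 2, "IP": 3, "TCP": 4, "UDP": 4, "HTTP": 7}
    return OSI_LAYERS[known[protocol]]

print(layer_of("IP"))    # Network
print(layer_of("HTTP"))  # Application
```

This kind of mapping is exactly what makes the OSI model useful for the security analysis mentioned above: it identifies which layer a given piece of data or application lives on.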

Transport protocols and other types of protocol

After a packet arrives at its destination, it's handled by the transport layer – the fourth layer in the OSI model – and the corresponding transport protocol used in conjunction with IP. Transport layer protocols are port-to-port protocols working on top of internet protocols, delivering the data packet from the origin port to the IP services, and then from the IP services to the destination port.

Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) represent the transport layer. TCP is a connection-oriented protocol that provides complete transport layer services to applications – often referred to as Transmission Control Protocol/Internet Protocol, TCP/IP, or the internet protocol suite. It features stream data transfer, reliability, flow control, multiplexing, logical connections, and full duplex operation. UDP, by contrast, is a connectionless protocol providing non-sequenced transport functionality. It's valuable when speed and size can be prioritised over security and reliability. The packet it produces is an IP datagram containing source port address, destination port address, total length and checksum information.
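
The UDP header layout described above – four 16-bit fields in network byte order – can be sketched with Python's struct module. The port numbers here are arbitrary examples, and the checksum is left at zero rather than computed:

```python
import struct

# A UDP header is 8 bytes: source port, destination port, total length
# and checksum, each a 16-bit big-endian ("network order") field.
def udp_header(src_port, dst_port, payload, checksum=0):
    total_length = 8 + len(payload)  # header plus payload, in bytes
    return struct.pack("!HHHH", src_port, dst_port, total_length, checksum)

header = udp_header(12345, 53, b"example query")
print(len(header))                     # 8
print(struct.unpack("!HHHH", header))  # (12345, 53, 21, 0)
```

The small, fixed header is part of why UDP favours speed and size over the reliability machinery that TCP layers on top.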

Other common types of protocol include:

  • File Transfer Protocol (FTP), where users transfer multimedia, text, programme and document files to each other.
  • Post Office Protocol (POP), for receiving incoming email communications.
  • Hypertext Transfer Protocol (HTTP) and Hypertext Transfer Protocol Secure (HTTPS), which transfers hypertext, the latter of which is encrypted.
  • Telnet, which provides remote login to connect one system with another.
  • Gopher, used for searching, retrieving and displaying documents from remote sites.

There is also the Ethernet protocol. Ethernet is a method of connecting computers and other devices in a physical space – via packet-based communication – and is the dominant technology in local area networks (LANs). The Institute of Electrical and Electronics Engineers (IEEE) maintains IEEE 802.3, a working group of standard specifications for Ethernet.
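
As a rough sketch of Ethernet's packet-based framing, the 14-byte Ethernet II header can also be built with struct. The MAC addresses and EtherType below are illustrative examples, not values from this article:

```python
import struct

# A minimal Ethernet II frame header: 6-byte destination MAC, 6-byte
# source MAC, and a 2-byte EtherType (0x0800 signals an IPv4 payload).
def ethernet_header(dst_mac, src_mac, ethertype=0x0800):
    def mac_bytes(mac):
        return bytes.fromhex(mac.replace(":", ""))
    return struct.pack("!6s6sH", mac_bytes(dst_mac), mac_bytes(src_mac), ethertype)

# Broadcast destination (all ones) with an arbitrary example source MAC.
header = ethernet_header("ff:ff:ff:ff:ff:ff", "00:11:22:33:44:55")
print(len(header))  # 14 bytes of header precede the payload
```

Real frames defined by IEEE 802.3 also carry a trailing frame check sequence and other details omitted from this sketch.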

There exist a variety of other protocols that co-function alongside other primary protocols. These include, for example: ARP; DHCP; IMAP4; SIP; SMTP; RLP; RAP; L2TP; and TFTP.

Understand the internet protocol requirements of your business

Gain sought-after skills to excel in a high-demand field with the University of York’s online MSc Computer Science programme.

Our flexible programme is designed to boost your employability, preparing you for a rewarding career in all manner of industries. You’ll gain a strong foundation in a wide range of computing areas, including programming, architecture and operating systems, AI and machine learning, big data analytics, software engineering, cybersecurity, and much more. 

What is the best way to manage across cultures?

Living and working in our interconnected, global age – where, for many businesses, employees may be spread across the world – requires leaders to be highly aware of, and skilled at, cross-cultural management. For those organisations without global teams, there is still the requirement to manage effectively across multicultural teams, where backgrounds, experiences, cultural values and communication styles may be richly diverse.

With multicultural and global teams increasingly the norm, employers are seeking business leaders with the tools and understanding to embrace cultural differences and develop high-performing teams.

What is managing across cultures?

Multicultural and cross-cultural teams are those where team members have diverse backgrounds. This diversity may include ethnicity, nationality, age, religion, gender, socioeconomic status, sexual orientation or other identifiable characteristic. As well as varied lived experiences, it also includes employees working in different countries, continents and time zones.

Leaders and managers must recognise, acknowledge and support these various differences and similarities to lead teams effectively and with minimal conflict.

Benefits and challenges of managing across cultures

There are numerous benefits that arise from having teams based in different countries and teams with unique cultural contexts. Multicultural teams bring a wider variety of talents, skills, perspectives, knowledge and experiences – which can be combined and drawn upon to work towards common goals. These elements can also bring about boosts in creativity – as fresh ideas and different ways of thinking are added to the team mix – leading to innovation, improved ways of working and potential business opportunities. Crucially, it also serves human resource management; a larger talent pool, made possible through global working, allows businesses to hire the most-skilled and capable candidates. This, in turn, can boost brand image, reputation, competitive advantage and profits.

Research statistics collated by InStride further demonstrate the tangible benefits to cross-cultural working:

  • more diverse – and inclusive – teams are 35% more likely to outperform competitors
  • diverse companies are 80% more likely to capture new markets
  • diverse teams are 87% better at making decisions
  • diversity within management leads to 19% higher revenue

However, while diversity is both vital and beneficial in the work environment, it has the potential to cause challenges for leaders who don’t possess the skills to support their teams.

Communication

Communication is one of the most-common barriers that leaders are likely to encounter. As well as language barriers – where some team members may find it harder to engage with day-to-day processes due to communication breakdown – this can also span different communication styles, phrases, colloquialisms, slang, dialect and non-verbal communication. The example given by global human resource organisation, Oyster, is communication differences between Eastern and Western cultures; the former tend to be indirect in communication, while the latter generally favour a more-direct, to-the-point style. Not addressing communication issues can mean employees face difficulties in conveying ideas and engaging with colleagues and, in the worst cases, are considered less-competent due to language barriers.

Work styles

Work styles can differ significantly in cross-cultural teams. Are some team members used to hierarchical structures, or more-collective decision-making? How is authority perceived? Or employee autonomy and independence? Each of these factors, together with numerous others, can present challenges for how teams work together and how individuals expect to be treated in the workplace.

Engagement

Engagement and motivation can often fail to take into account cultural differences within teams. Do team members want promotions, pay rises, more autonomy, different benefits or praise? A one-size-fits-all approach may not be suitable for recognition and reward initiatives and may lead to dissatisfied staff, increased attrition rates and decreased productivity.

Information sharing

Sharing and disclosing information, and business interactions relating to this, may vary between employees. Factors such as personal information, discussing emotions, examining conflicts and misunderstanding can have a significant impact on team dynamics.

Nevertheless, there are tools and approaches that leaders can take to overcome these cross-cultural challenges and reap the rewards of the benefits.

The best ways to manage across cultures

Which management practices should be examined and embedded in teams which contain different cultures?

Prioritise open communication

Company cultures that instil the value of honest, respectful two-way communication – between managers and employees, and amongst employees themselves – make room for all individuals to share their thoughts, ideas and opinions. Communicate messages clearly and transparently so that they are understood regardless of culture.

Get to know team members and listen to them

Getting to know the individuals within teams helps to build personal bonds, facilitates working relationships and demonstrates that team members are seen and appreciated on an individual level – regardless of background. Additionally, encourage team members to get to know one another better through social events and collaborative work activities. Leaders should also practise active listening, aiming to remove assumptions and biases and encourage trust and collaboration in their place.

Develop cross-cultural intelligence – and be flexible

Cultural intelligence and awareness are vital in our modern workplaces. Policies that apply to all employees, regardless of position or background, reinforce equitable treatment and help to avoid conflict. Cultural learning programmes can help to embrace and celebrate differences, and could be made mandatory for all employees. If training cannot be offered in-house, there are plenty of providers who can address this requirement; LinkedIn Learning, for example, offers a whole suite of online programmes to support cross-cultural management. Leaders who understand specific cultural differences should also be flexible, where reasonable and possible, regarding diverse needs and requirements.

Address conflict

Work conflicts will arise – and, when they do, they should be dealt with promptly and effectively. Seek to bridge the gap between conflicting parties, and pursue resolutions that respect and benefit all individuals involved, as well as the wider team.

It’s worth stating that all of the business practices and considerations related to global leadership and managing across cultures are, as a general rule, useful business practices and considerations full stop. They have the potential to positively impact organisational behaviour and ethos for everyone.

Bring cross-cultural intelligence and management to your role

Want to create productive, inclusive and effective cross-cultural work environments?

Whether you work in global organisations or not, understanding how to manage and support diverse teams is an essential skill. With the University of York’s online MBA programme, you’ll develop as a highly effective leader with an awareness of the right management practices to use with your teams. As well as gaining key leadership skills, your flexible studies will give you in-depth understanding of project management, marketing, international strategy and operations, finance, and more.

The democratisation of public services: an explainer

The democratisation (or, ‘co-production’) of public services is the process of developing and delivering public services and policies in partnership with citizens.

It is, effectively, a more democratic way of designing services within the public sector and civil service. Through co-production and participatory policy-making, citizens can help shape the services they use and that affect their lives, rather than simply being passive beneficiaries of schemes and programmes. Citizens work with public sector professionals, often from a variety of civil services and policy areas, to help create these enhanced services.

Public involvement can occur, partially or completely, in virtually all stages of the collaborative, democratic public service process, including during:

  • commissioning
  • design
  • delivery
  • service implementation, including governance and public management
  • assessment and improvement

What are the benefits of democratisation of public services?

There are a number of positive potential outcomes when democratising public services and service delivery. For example, it can:

  • reinforce the idea that every individual can have an impact within a civil society, and can help reduce citizen apathy
  • shine a brighter light on large-scale issues such as inequalities, human rights, sustainability, and other topics that are better understood through the lived experience of individuals
  • reduce corruption in public services, according to the World Bank.

Other benefits can include the following.

Better relationships

Co-production can strengthen the relationships between frontline public service providers, such as the police, educators, and healthcare professionals, with the public and local communities that they serve. This can help reignite public servants’ passion for their jobs, and can also help humanise them for the everyday people who might otherwise see civil servants as simply their impersonal job descriptions or yet another cog in the machine of central government. 

Better programmes

The democratisation of public services can help iron out any potential wrinkles in public policies and services before they’re implemented. This is because public servants who aren’t on the receiving end of the services they provide can, at times, have trouble spotting issues that would be quickly apparent to service users with first-hand experience. 

Better processes

Breathing new life into existing services through the input, experiences, and ideas of new voices is a huge benefit to the process of developing new public services and policies. The Centre for Local Economic Strategies (CLES), for example, argues that organisations, individuals, trade unions, and so on, don’t have to be part of the government to be passionate about public services – and they can offer new and different methods for realising social, economic and environmental value within new programmes and policies.

Better value

According to the Civil Service College, co-production can be more cost-effective than conventional methods of service design and delivery, because money is less likely to be wasted on ineffective services that fail to meet people’s needs. The college also argues that ensuring all services offered are effective and genuinely helpful has a knock-on, preventative benefit that further reduces costs. This is particularly noticeable in the health sector; as an example, it points to effective addiction services. If these services are readily available and effective, over time they will decrease the demand – and cost – for overdose treatments.

What role does the government play in the democratisation of public services?

Governments at all levels – from local government to the national level – can harness the power of citizen participation in their service provision.

The New Economics Foundation (NEF), for instance, worked with local authorities to develop a model for co-production, with the aim of supporting the design and delivery of social services that:

  • focus on commissioning for long-term outcomes that make a real, positive impact on people’s lives
  • promote co-production by working in partnership with service users to bring in new resources and create more effective services
  • promote social value, with the triple bottom line – social, environmental, and economic outcomes – placed at the heart of service commissioning.

NEF also provides detailed guidance to governments on its approach to co-production. This includes support with:

    1. Insight. Developing outcomes that are important and helpful to people can be supported by identifying people’s needs and aspirations, as well as the assets and resources that are required.  
    2. Planning. Creating support and activities that meet people’s needs requires an appropriate framework and process. 
    3. Delivery. This includes monitoring and evaluating the value of public services, creating service assessments with the people who use services, and gathering insight and data to improve – and adapt – public services. This can happen through coaching, peer assessment, mystery shopping, customised self-reflection tools, and so on.

What is the difference between democratisation and privatisation?

Democratisation has become an increasingly attractive option for governments in recent years. It safeguards the public interest and offers a more democratic alternative to the privatisation of public services.

When privatising public services or other assets of public ownership, governments have a number of options. For example, they might sell public assets to private owners outright, or decide that outsourcing the development and management of public services to a contracted, private third party – rather than managing the service publicly in-house – is preferable.

Privatisation is often the option of choice for governments looking to save – or make – money, but it also creates a gap between the democratic government, the public service, and the service users. 

Democratisation, on the other hand, puts people at the heart of the public service process. While privatised public services might be required to consult the public on new or amended services through forums, social media, or other methods of generating respondents, democratised public services understand their public value, and have the public embedded within the process every step of the way.

What is the difference between the public and private sector?

The public sector consists of all of the organisations that are managed by the government. In the UK, this includes institutions such as the NHS, the police and armed forces.

The private sector consists of organisations that are owned and managed by businesses or individuals. Everything from Amazon to local pubs are private sector organisations.

While co-production is more relevant to the public sector, it’s worth noting that third sector organisations – such as charities and community associations – can also democratise their services, and even private sector organisations can apply the principles and interventions of co-production within their processes and practices.

Help democratise public services

Explore the different facets of public service democratisation, including its benefits, obstacles, and problems, with the 100% online Master of Public Administration (MPA) at the University of York.

One of the core modules on this flexible master’s degree is about the co-production and democratisation of public services, so you’ll have the opportunity to learn more about how citizens can be enabled to influence public policy decision-making around public services, and how to ensure that the delivery of services meets the demands of users – both critical objectives in a democratic society.

You’ll also learn about public-private partnerships in public services, and how the delivery of public services involves a partnership between the public and the private sector, which creates both opportunities and challenges.

Cybersecurity threats and how to avoid them

Issues of cybersecurity are issues for all of us, and exist at an individual, national and international level.

Our interconnected, digital world relies on technological and computer infrastructure. Almost every element of our lives – including how we bank, travel, communicate, work, shop and socialise – intersects with information technology and digital operating systems in some way. 

While such technological advances offer untold benefits and opportunities, they also carry with them the risk of presenting vulnerabilities to individuals and organisations who seek to benefit from disrupting these systems. 

We all know the importance of creating strong passwords, avoiding suspicious links and installing appropriate security software. However, good digital etiquette only gets us so far.

Cybercrime is increasing year on year. The following cyberthreat-related statistics demonstrate the scope and scale of the issue:

  • the global cost of online crime will reach $300 billion by 2024
  • a business falls victim to a ransomware attack every 14 seconds
  • phishing emails are used to launch 91% of cyber-attacks
  • small businesses are the target of 43% of all cyber-attacks
  • on average, it takes six months to detect a data breach.

What are cybersecurity threats?

A cybersecurity threat is any malicious, deliberate activity that targets computer systems, computer networks, information technology assets, intellectual property or sensitive data. The aims of such threats vary, but they generally seek to gain some benefit from the attack, such as disrupting digital life, gaining unauthorised access, or damaging or stealing data. While many cybersecurity attacks originate from unknown individuals or organisations in remote locations, they can also originate from insiders within an organisation. All are labelled ‘threat actors’, with common types including:

  • Hostile nation-states, who engage in cyber warfare such as disruption of critical infrastructure, espionage, propaganda and website defacement.
  • Hackers, ranging from those who seek to steal data and confidential information, to those who gain access to systems as a challenge.
  • Hacktivists, who are pursuing a political agenda, generally through the sharing of propaganda.
  • Terrorist groups, who seek to damage national interests and national security.
  • Insiders and third-party vendors, who can deliberately expose sensitive data, or accidentally introduce malware that leads to a data breach.

Cybersecurity isn’t just a pressing issue for large entities such as financial institutions, national governments and tech companies; small-to-medium-sized businesses, as well as individuals, are among the most vulnerable to cyberthreats and should take steps to defend themselves.

What are the most common threats to cyber security?

Common types of cyberthreats and cyber-attacks include:

  • Malware. Computer viruses, spyware, worms and ransomware are all forms of malicious software, known as malware. They target vulnerabilities in information systems and networks, typically via malicious links and email attachments that introduce dangerous software into the system. Malware can render systems inoperable, install additional harmful software, covertly obtain information, and block access to network components.
  • Phishing. Phishing attacks are an incredibly common cyberthreat. They use fraudulent communications (generally emails), that appear to come from a known or reputable sender to steal personal and sensitive data – such as credit card information, passwords and login information – or install malware onto a system. Spear phishing refers to a phishing attack that targets a specific individual or organisation.
  • Man-in-the-middle (MitM) attack. MitM attacks – also referred to as eavesdropping attacks – involve cybercriminals inserting themselves into a two-party transaction (and so becoming the ‘man in the middle’) to interrupt traffic, and filter or steal data.
  • Denial-of-service attack. These attacks flood computer networks, servers and systems with traffic in a bid to cripple bandwidth and resources so legitimate requests cannot be fulfilled. A distributed denial-of-service (DDoS) attack stages the same kind of assault using multiple devices at once.
  • Structured Query Language (SQL) injection. Malicious code is ‘injected’ into a database through an unsanitised input in order to gain access to sensitive information or data. It’s an example of a ‘backdoor’ cyberthreat.
  • Zero-day exploit. These attacks exploit newly discovered vulnerabilities in networks and software – crucially, before solutions or patches are introduced.
  • DNS tunnelling. These attacks re-route DNS requests to a cybercriminal’s server, providing them with a command-and-control channel and a path for extracting data. They are notoriously tricky to detect.

This list is not exhaustive. Other types of cyber-attacks include Trojans, XSS attacks, drive-by attacks, brute force attacks, whale-phishing attacks, ransomware, data breaches and URL interpretation.
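
To make the SQL injection entry above concrete, here is a minimal sketch using Python’s standard sqlite3 module and a hypothetical users table. The first login function splices user input directly into the SQL string and can be subverted by a classic payload; the second uses parameterised queries, which treat input purely as data.

```python
import sqlite3

# Set up an in-memory database with a hypothetical users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, password):
    # VULNERABLE: user input is spliced into the SQL string, so crafted
    # input can change the query's meaning.
    query = f"SELECT * FROM users WHERE name = '{name}' AND password = '{password}'"
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # SAFE: '?' placeholders ensure input is treated as data, not SQL.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

# A classic injection payload bypasses the password check entirely...
print(len(login_unsafe("alice", "' OR '1'='1")))   # 1 row: authentication bypassed
# ...but fails against the parameterised version.
print(len(login_safe("alice", "' OR '1'='1")))     # 0 rows
```

The payload works because the unsafe query becomes `... AND password = '' OR '1'='1'`, which is true for every row.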

How can you protect networks from cyber security threats?

Every organisation should invest in protecting itself from cybercriminals and cyber protection should form part of any risk management plan. This can be achieved by implementing various security measures.

One is to ensure that all team members throughout the business are alert to cybersecurity dangers; they should be trained to prevent breaches and detect potential threats. 

As many issues of data security occur through accidental insider-user error, this is one of the most effective ways to combat digital vulnerability. Employees should be alert to malicious links, check sender information, maintain strong password etiquette – never share passwords and use two-factor authentication – and take care when handling sensitive information.
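
Two-factor authentication commonly relies on time-based one-time passwords (TOTP). As an illustration of the idea – not a production implementation, which should use a vetted library – the sketch below follows the RFC 6238 HMAC-SHA1 construction using only Python’s standard library; the secret shown is the RFC’s published test value.

```python
import hashlib
import hmac
import struct
import time

def totp(secret, timestep=30, digits=6, now=None):
    # Minimal time-based one-time password following RFC 6238:
    # HMAC-SHA1 over a time-derived counter, then dynamic truncation.
    counter = int((time.time() if now is None else now) // timestep)
    msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and authenticator app derive the same code from a shared secret
# and the current 30-second window; here the time is pinned for clarity.
print(totp(b"12345678901234567890", now=59))   # "287082" per the RFC 6238 test vectors
```

Because both sides compute the code independently, an intercepted password alone is not enough to log in.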

From a systems perspective, it’s critical that all hardware and software is up to date and fit for purpose. This includes:

  • supporting patch management systems
  • ensuring networks are behind firewalls
  • implementing endpoint protection
  • backing up data in a secure way
  • controlling and monitoring user access to all systems
  • securing wifi networks
  • establishing personal accounts for all employees.

Protect your systems from potential cyber-attacks

Cybersecurity risks aren’t going away, so individuals and security teams with the specialist skills and expertise to safeguard businesses from these attacks are in high demand. People with these skills can often choose from a wide range of rewarding careers.

Whether you have a computing background or not, you can develop the knowledge and skills to succeed in the industry with the University of York’s online MSc Computer Science programme. You’ll gain in-depth understanding of each of the main areas of computer science, learning how to apply your new skills to real-world environments. Through flexible modules, you’ll explore programming, software development, data analysis, computer networks and much more.

 

Leading and managing organisational change

Supporting an organisation or business through change is a critical part of any leader’s job. Change is constant, and work environments continually evolve as new technologies are adopted, and old ways of working are replaced with the development of new knowledge and skills. Despite the constant presence of change, it remains a significant challenge for organisations. According to a study conducted by McKinsey, just 26% of change projects are considered to be successful within their organisations.

This is why successful change management is such a priority for businesses – and why all managers should aim to be change leaders.

What is organisational change?

Organisational change occurs whenever an organisation or business makes a significant alteration to its identity, purpose, or procedures. This can include changes to an organisation’s:

  • brand
  • culture
  • operational systems
  • internal infrastructure
  • hierarchy or structure
  • policies or processes
  • equipment or technologies
  • values
  • stakeholders
  • leadership team
  • products
  • business models.

Drivers of organisational change

There are a number of reasons why a business or organisation will choose to undergo a period of transformation. Common aims include the following.

Cost savings

Organisations often adopt new technological devices or systems to automate processes, reduce workloads, and streamline workflows in order to save money.

Business growth

Whether introducing new products or services, or emerging into new markets, businesses will embrace change to create new opportunities for growth.

Overcoming obstacles and other challenges

In periods of economic downturn or difficult market conditions, or when facing heightened pressure from competitors, suppliers, or customers, organisations will often implement a change management strategy to help steer the organisation forward – and correct course where needed. This may also occur when an organisation is required to implement new government legislation or directives.  

Adopting high-level or large-scale change

When an organisation decides to make changes to its strategic objectives, its core purpose or values, or even its fundamental structure – such as through mergers, a culture change, or new leadership and senior managers – a dedicated change management process can be instrumental in helping ensure success. 

Types of organisational change 

Organisational change comes in many forms, such as:

  • adaptive change, which is made up of small, incremental adjustments within an organisation
  • transformational change, which includes the kind of sweeping, strategic change projects that can alter the fabric of an organisation
  • remedial change, which includes reactionary change efforts that are typically implemented in response to a problem or issue.

What are the phases of organisational change?

Organisational change management usually consists of three major phases.

Preparation

The preparation phase is vital for successful change programmes. It requires change agents to define criteria for success and desired outcomes, as well as any and all potential impacts, and the planned approach for the project. This plan should outline activities, establish roles, and address any risks.

Implementation

The implementation phase is when all of the work outlined in the preparation phase actually happens. During this phase, leaders should be tracking and monitoring the change programme, and adapting it where necessary.

Follow-through

The final phase of organisational change is about sustaining the outcomes of the change programme. This means reviewing the results of the project, and ensuring that the change outcomes are embedded as part of business-as-usual within the organisation.

What is the role of a leader in managing organisational change?

Leaders play a pivotal role in managing any organisational change process.

They’re responsible for:

  • communicating with the people they lead, answering questions, providing clarity, and taking on feedback
  • motivating people, addressing resistance where it occurs, and easing the transition from the old way of working to the new approach
  • providing accountability, making assessments and decisions, and delegating where appropriate.

What is the difference between leading and managing change?

People are what determine whether a change project is successful or not. And that’s why change leaders are so important.

While managers are needed to oversee the logistical aspects of implementing change, leaders are needed for their competency in getting buy-in from people, engaging them in the process of change, and motivating them to see change through. This is particularly important in large-scale change projects where tensions may be high and instability is inevitable. In these scenarios, leaders help people understand the vision behind the project. 

Change leaders won’t necessarily fit into any one role. While some may sit within senior leadership teams, others might work in human resources or project management. Their defining feature will be their passion and dedication for implementing a successful change programme – and getting other people on board along the way.

What are the main approaches to leading organisational change?

Organisational change has generated a number of different models, methodologies, and approaches for helping leaders to plan and implement effective change management projects.

One is the three-stage model of change developed by Kurt Lewin, a German-American psychologist. Lewin’s model suggests that change has three stages.

  1. Unfreeze, where people prepare for the coming change.
  2. Change, the transitional time where change occurs.
  3. Refreeze, when the change becomes the norm and stability is restored.

Another methodology is the eight-step process for leading change developed by Dr John Kotter, a professor of leadership at Harvard Business School. Kotter’s eight steps are:

  1. creating a sense of urgency
  2. building a guiding coalition
  3. forming a strategic vision
  4. enlisting a volunteer army
  5. enabling action by removing barriers
  6. generating short-term wins
  7. sustaining acceleration
  8. instituting change.

Lead transformational change within your organisation

Help ensure the success of change initiatives within your current or future organisations with the flexible online Master of Business Administration (MBA) programme from the University of York. Leading and Managing Organisational Change is one of the key modules on this programme, so you will have the opportunity to examine change practices within organisations, theories of change, and how change can shape effective practice.

This postgraduate degree is taught 100% online, so you can study around your current professional and personal commitments as you develop knowledge and skills in topics such as management strategy, operations management, contemporary issues in leadership, marketing in a global society, and contemporary topics in global business.

National policy and its role in a greener, more sustainable future

Infrastructure shapes both the human and natural worlds. It’s the backbone and foundation on which communities and nations are built: the energy that powers our businesses, the clean water pumped into our homes, and the road and rail networks that allow us to travel with ease. Thoughtful, ethical and sustainable infrastructure has the ability to improve our quality of life, creating new places and ways in which to live and work.

Nonetheless, infrastructure on a national scale cannot, and should not, proliferate unchecked. Its development involves critical considerations: the communities it serves; its impacts and effects, both in the short and long term; the materials and methods used; maintenance; and much more besides. Policy, regulation, expertise and comprehensive planning and risk management all play an important role, particularly in mitigating, and even reversing, the impacts of climate change.

The United Nations’ Intergovernmental Panel on Climate Change (IPCC) recognises and outlines “the interdependence of climate, ecosystems and biodiversity and human societies” in its Summary for Policymakers report. It calls for the assessment of climate change impacts and risks, as well as adaptation, set against “concurrently unfolding non-climatic trends” such as rapid urbanisation, land and ecosystem degradation, biodiversity loss and unsustainable consumption of natural resources. So, how can a nation develop responsibly and sustainably? What factors must be taken into consideration when mapping industrial strategy? Who has the power to make decisions on this scale?

What is national policy?

National policies are statements that contain principles and broad courses of action taken by national governments regarding specific objectives.

Their main purpose is to define and guide decision-making efforts to achieve a desired outcome through government policy and plans – in a bid to ensure any decisions have an aggregate positive impact on a national level. For example, a government may wish to improve biodiversity by considering future land use, or lower emissions and improve air quality by developing its public transport networks, or tightly govern waste management in a bid to support a healthier ecosystem. National policy should:

  • establish a planning system and framework for development
  • protect and preserve natural resources
  • shed light on the environmental impact assessment of planned projects
  • ensure regulatory standards are adhered to
  • avoid any adverse impacts on an environmental or social level.

In the UK, national policy must relate to climate change efforts – both in mitigation and adaptation. As well as explaining the reasoning behind their proposed outcomes, policies must explain how they:

  • factor into sustainable development
  • integrate with existing policies
  • account for safety and technology issues
  • address any adverse impacts
  • detail actual and projected capacity and demand. 

Where necessary, policies should also pinpoint specific locations affected in order to create a guideline and roadmap that can support future investment and planning decisions.

Planning Act 2008

The UK Infrastructure Planning Commission was borne out of the Planning Act 2008. As well as making provisions for the Commission’s functions, it contains guidelines and thresholds for the authorisation and development of infrastructure. The Act also governs town and country planning and is designed to “make provision about the imposition of a Community Infrastructure Levy.”

UK national policy statements (NPS)

In the UK, there are 12 NPS relating to different areas of nationally significant infrastructure projects and their future development. An NPS must be formed through a democratic process – consisting of public consultation and parliamentary scrutiny – before it can become designated policy. In this way, examining authorities can make recommendations to the Secretary of State.

The NPS, by category, are:

  • Energy: Overarching Energy; Fossil Fuels; Renewable Energy; Oil and Gas Supply and Storage (produced by the Department for Business, Energy and Industrial Strategy)
  • Transport: Ports; National Networks; Airports (produced by the Department for Transport)
  • Water, waste water and waste: Hazardous Waste; Waste Water; Water Resources (produced by the Department for Environment, Food and Rural Affairs).

Any significant development of infrastructure throughout England, Scotland, Wales and Northern Ireland must adhere to the NPS.

What is the difference between national policy and local policy?

The National Planning Policy Framework (NPPF) underpins the UK’s environmental, social and economic planning policies, and covers topics including:

  • business
  • environment
  • economic development
  • transport
  • housing.

The NPPF must inform any planning applications and decisions relating to local and neighbourhood plans; where plans and developments have taken its rules into account, developments can be approved without delay.

A local planning authority (LPA) makes the ultimate decision on whether to grant or refuse planning permission – such as for building and development work – in a given area. Local authorities also create a development plan every six years, detailing planning policies and the use of areas. For example, a plan might contain aspects such as upgrades to amenities, improvements to roads, or the regeneration and renewal of unused or obsolete areas. Planning applications in the local area are then cross-referenced with the plan, with permission generally granted only if they are in line with the development plan.

The UK government publishes planning practice guidance on the gov.uk website to support individuals and organisations with their interpretation and adherence to the NPPF. There are various categories – many of which relate to ensuring a greener, sustainable future – including:

  • air quality
  • climate change
  • effective use of land
  • environmental impact assessment
  • flood risk and coastal change
  • green belt
  • land stability
  • natural environment
  • renewable and low-carbon energy
  • strategic environmental assessment
  • tree preservation orders and trees in conservation areas
  • waste.

Shape the future of public service provision and public administration

If in-depth understanding of national and local policy could benefit your workplace, choose the University of York’s online Master of Public Administration programme.

Develop the skills to analyse and navigate complex public management issues, and lead the way in making a positive impact on both public life and public service provision. Designed for leaders across public, non-profit and third-sector organisations, your studies will draw on service development and enhancement – with a focus on sustainability – across networks and partnerships. You’ll study the public process, the organisational and human context of public service delivery, and the wider social, political and economic environment through which public service demands and constraints are shaped. 

What are the three categories of computer architecture?

Every computer, from the simple device to the most complex machine, operates according to its architecture. This architecture – the rules, methods, and procedures that tell the computer system what to do and how to work – can be broken into three main sub-categories.

Instruction set architecture

An instruction set architecture, or ISA, is a collection of instructions that a computer processor reads. It outlines how the central processing unit (CPU) is controlled by its software, and effectively acts as the interface between the machine’s hardware components and its software. In fact, the instruction set architecture provides the only means of interacting with the hardware that is visible to assembly language programmers, compilers, and application programmers.

There are two main types of instruction classifications:

  • Reduced Instruction Set Computer (RISC), which implements only a small set of simple, frequently used instructions. RISC designs include MIPS (microprocessor without interlocked pipelined stages), which was developed by John Hennessy at Stanford University in the 1980s.
  • Complex Instruction Set Computer (CISC), which can implement several specialised instructions.

The ISA also defines and supports a number of key elements within the CPU, such as:

Data types

Supported data types are defined by the instruction set architecture. This means that through the ISA, a computer will understand the type of data item, its values, the programming languages it uses, and what operations can be performed on or through it.

Registers

Registers store short-term data used while decoding and executing commands, acting as the processor’s fastest tier of storage between the CPU and the hardware’s main memory – the random access memory (RAM). They are located within processors, microprocessors, microcontrollers, and so on. Registers include:

  • the programme counter (PC), which indicates where a computer is in its programme sequence. The PC may also be referred to as the instruction pointer (IP), instruction address register (IAR), the instruction counter, or the instruction sequencer. 
  • the memory address register (MAR), which holds the address of an instruction’s related data.
  • the memory data register (MDR), which stores the data that will be sent to – or fetched from – memory.
  • the current instruction register (CIR), which stores the instructions that are currently being decoded and executed by the central processing unit.
  • the accumulator (ACC), which stores the results of calculations.
  • the interrupt control register (ICR), which generates interrupt signals to tell the central processing unit to pause its current task and start executing another.
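
The cooperation of several of these registers can be sketched with a toy fetch-decode-execute loop. The three-instruction machine below is hypothetical and purely illustrative – it is not any real ISA:

```python
# A toy CPU illustrating the fetch-decode-execute roles of the registers
# described above (hypothetical instruction set, for illustration only).
def run(program, memory):
    pc, acc = 0, 0                      # programme counter and accumulator
    while True:
        cir = program[pc]               # fetch: current instruction register
        pc += 1                         # PC now points at the next instruction
        op, mar = cir                   # decode: opcode plus memory address register
        if op == "LOAD":                # value travels from memory (via the MDR) to ACC
            acc = memory[mar]
        elif op == "ADD":
            acc += memory[mar]
        elif op == "STORE":             # value travels from ACC (via the MDR) to memory
            memory[mar] = acc
        elif op == "HALT":
            return acc

memory = {0: 2, 1: 3, 2: 0}
program = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
print(run(program, memory))   # 5, and memory[2] now holds 5
```

Each trip around the loop is one instruction cycle: fetch the instruction the PC points at, decode it, then execute it against memory or the accumulator.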

Key features

The instruction set architecture outlines how the hardware will support fundamental features, such as:

  • Memory consistency models, which essentially guarantee that if a programmer follows set rules for operations on memory, then memory will be consistent, and the results of reading, writing, or updating memory will be predictable.
  • Memory addressing modes, which are the methods used for locating data and instructions from the RAM or the cache. Mode examples include immediate memory access, direct memory access, indirect memory access, and indexed memory access.
  • Virtual memory, also known as virtual storage, which utilises both hardware and software to allow a computer to temporarily transfer data from RAM to disk.
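
The four addressing modes listed above can be contrasted with a small sketch over a toy memory model. The values and the index register are illustrative only:

```python
# Toy illustrations of the four addressing modes named above.
memory = [3, 42, 7, 9, 5]               # a tiny main memory
index_register = 3                      # used by indexed addressing

immediate = 7                           # immediate: operand encoded in the instruction itself
direct    = memory[2]                   # direct: instruction holds the data's address -> 7
indirect  = memory[memory[0]]           # indirect: address of an address -> memory[3] -> 9
indexed   = memory[1 + index_register]  # indexed: base address + index register -> memory[4] -> 5

print(immediate, direct, indirect, indexed)   # 7 7 9 5
```

The trade-off is between speed and flexibility: immediate access needs no memory lookup at all, while indirect access needs two.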

Microarchitecture

Also called computer organisation, microarchitecture is an important sub-category of computer architecture. There is an inherent interconnection between the microarchitecture and the instruction set architecture, because the microarchitecture outlines how a processor implements its ISA.

Important aspects of microarchitecture include:

  • Instruction cycles. These are the steps required to run programmes: reading and decoding an instruction; finding data associated with the instruction; processing the instruction; and then writing out the results.
  • Multicycle architecture. Multicycle architectures are typically the smallest and simplest architectures because they recycle the minimum required number of logic design elements in order to operate. 
  • Instruction pipelining. Instruction pipelining is a tool for improving processor performance because it allows several instructions to occur at the same time.
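
The benefit of pipelining can be estimated with the standard idealised timing model: an n-stage pipeline finishes k instructions in roughly n + (k − 1) cycles rather than n × k. The sketch below ignores hazards and stalls, which reduce real-world gains:

```python
def cycles(n_instructions, n_stages, pipelined):
    # Idealised cycle counts for an n-stage processor, ignoring
    # hazards and stalls (so real speedups are lower).
    if pipelined:
        # The first instruction fills the pipeline; each later one
        # completes one cycle after its predecessor.
        return n_stages + (n_instructions - 1)
    # Without pipelining, each instruction occupies all stages in turn.
    return n_stages * n_instructions

n, stages = 100, 5
print(cycles(n, stages, False))  # 500 cycles executed sequentially
print(cycles(n, stages, True))   # 104 cycles with a full pipeline
```

As the instruction count grows, the speedup approaches the number of stages – here, close to 5×.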

System design

System design incorporates all of a computer’s physical hardware elements, such as its data processors, multiprocessors, and graphic processing units (GPUs). It also defines how the machine will meet user requirements. For example, which interfaces are used, how data is managed, and so on. In fact, because of its link with meeting specified user requirements, system design is often mentioned alongside product development and marketing.

Other types of computer architecture

Von Neumann Architecture

Also known as Princeton architecture, the von Neumann model of computer architecture was developed by John von Neumann in the 1940s. It outlines a model of computer architecture with five elements:

  1. A processing unit that contains both an arithmetic logic unit (ALU) and processor registers.
  2. A control unit that can hold instructions in the programme counter or the instruction register.
  3. Memory that stores data and instructions, and communicates through connections called a data bus, address bus, and control bus.
  4. External mass storage, or secondary storage.
  5. Mechanisms for input/output devices.
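
These five elements can be sketched as a minimal stored-programme machine in Python. The three-instruction set and memory layout below are invented for illustration, but the defining feature of the model is visible: instructions and data share one memory, and a programme counter steps through it:

```python
# Minimal stored-programme machine in the von Neumann style:
# instructions and data share a single memory, and a programme
# counter steps through it. The three-instruction set is invented.

# memory layout: instructions first, data afterwards
memory = [
    ("LOAD", 6),     # 0: acc = memory[6]
    ("ADD", 7),      # 1: acc += memory[7]
    ("STORE", 8),    # 2: memory[8] = acc
    ("HALT", None),  # 3: stop
    0, 0,            # 4-5: unused
    10,              # 6: data
    32,              # 7: data
    0,               # 8: the result goes here
]

acc = 0              # accumulator register
pc = 0               # programme counter

while True:
    op, operand = memory[pc]   # fetch and decode
    pc += 1
    if op == "LOAD":
        acc = memory[operand]
    elif op == "ADD":
        acc += memory[operand]
    elif op == "STORE":
        memory[operand] = acc
    elif op == "HALT":
        break

print(memory[8])  # 42
```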

Harvard Architecture

Harvard architecture uses separate memory storage for instructions and for data. This differs from Von Neumann architecture, for example, in which programme instructions and data share the same memory and pathways.

Single instruction, multiple data (SIMD) architecture

Single instruction, multiple data processing computers can process multiple data points simultaneously. This paved the way for supercomputers and other high-performance machines, at least until developers at organisations like Intel and IBM started moving into multiple instruction, multiple data (MIMD) models.
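
The SIMD idea can be sketched in Python by contrasting scalar execution (one operation per data point) with lane-based execution (one operation per group of data points). This is a conceptual model only – real SIMD happens in hardware vector registers, and the lane width of 4 here is an arbitrary choice:

```python
# Conceptual sketch of SIMD: one instruction, many data points.
# Real SIMD uses hardware vector registers; this only models the
# contrast between scalar and vector-style execution.

def scalar_add(a, b):
    # SISD style: one add operation per pair of elements.
    out = []
    for x, y in zip(a, b):
        out.append(x + y)
    return out

def simd_add(a, b, lane_width=4):
    # SIMD style: each "instruction" operates on a whole lane of
    # lane_width elements at once.
    out = []
    for i in range(0, len(a), lane_width):
        lane_a = a[i:i + lane_width]
        lane_b = b[i:i + lane_width]
        out.extend(x + y for x, y in zip(lane_a, lane_b))  # one vector op
    return out

a = [1, 2, 3, 4, 5, 6, 7, 8]
b = [10, 20, 30, 40, 50, 60, 70, 80]
print(simd_add(a, b))  # [11, 22, 33, 44, 55, 66, 77, 88]
```

Both functions produce the same result, but the SIMD version needs only two lane-sized "instructions" for eight elements instead of eight scalar additions.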

Multicore architecture

Multicore architecture uses a single physical processor to incorporate the core logic of more than one processor, with the aim of creating systems that can complete multiple tasks at the same time for better overall system performance.
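
The multicore idea – several independent tasks completed at the same time – can be sketched with Python’s standard worker-pool tools. Whether the tasks genuinely run on separate cores depends on the interpreter and the workload, so treat this as an illustration of the programming model rather than a performance claim:

```python
# Sketch of the multicore idea: several independent tasks handed to
# a pool of workers. Python's concurrent.futures distributes the work;
# whether tasks truly run on separate cores depends on the interpreter
# and the workload.

from concurrent.futures import ThreadPoolExecutor

def task(n):
    # A stand-in for some independent unit of work.
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves input order, even if tasks finish out of order.
    results = list(pool.map(task, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```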

Explore the concepts of modern computer architecture

Deepen your understanding of computer architecture with the 100% online MSc Computer Science from the University of York. This master’s degree includes a module in computer architecture and operating systems, so you’ll delve into how computer systems execute programmes, store information, and communicate. You will also learn the principles, design, and implementation of system software such as operating systems, in addition to developing skills and knowledge in wider computer science areas, such as algorithms, data processing, and artificial intelligence.

This flexible degree has been designed for working professionals and graduates who may not currently have a computer science background but want to launch a career in this cutting-edge field. You can learn part-time around your current work and home commitments, and because the degree is taught exclusively online, you can study whenever and wherever you want.

What are the features of modern governance?

Modern governance is good governance achieved with modern tools. By utilising new technologies and methods, governing bodies can gain deeper insights and develop new processes that deliver better governance and better overall results for their organisations.

Effective use of new technology

Whether it’s through automating processes or utilising new systems and platforms, governing bodies that undergo a digital transformation can often gain a competitive advantage or establish themselves as a leader in their fields or sectors.

Making the most of information and data

Modern governance relies heavily upon good data to spot issues and challenges, and to identify emerging trends and market directions. Good data should also help inform decisions, strategies, policies, and responses in times of crisis or disruption.

Greater transparency

One of the most significant aims of modern governance is transparency. Information is easily accessible and understandable. Communication and conversations are two-way, and reporting is conducted regularly and objectively.

Robust risk and compliance management

Modern governing bodies are focused on safeguarding against a multitude of risks – no easy task in today’s complex world. Through policies, procedures, and other controls, modern governance works to mitigate cyber risk and cyber-attacks; protect strategic, financial, operational, and reputational functions; and ensure all potential issues are identified, analysed, mitigated, monitored, and reported on. There is also a strong focus on compliance, with internal audits, quality assurance practices, and strict adherence to legislation – and other relevant codes – all firmly cemented within the organisation.

Inclusive environments

Diverse, inclusive, and collaborative environments are a growing necessity in modern governance. From the governing body itself, through all levels of an organisation, good modern governance calls for diversity in race, gender, and age, but also in terms of backgrounds and skills – and research continues to show that diverse organisations are more creative and more decisive.

What is good governance?

Good governance should be an important area of focus for any organisation, whether it’s a public body, an international not-for-profit organisation, or a private business or corporation. Through good governance, an organisation maintains and safeguards its integrity and ethics, its policies and procedures, and its reputation and performance.

Good governance informs decision-making, enables strong relationships with stakeholders, and sets the organisational or corporate culture. It also ensures that funds are used appropriately, employee standards – and human rights – are respected, and operations are free from corruption and fraud.

Good governance in the United Kingdom

In the UK, corporate good governance is ensured through the Financial Reporting Council’s UK Corporate Governance Code. The code steers relationships between companies, shareholders, and stakeholders, and “promotes the importance of establishing a corporate culture that is aligned with the company purpose, business strategy, promotes integrity and values diversity.”

The Chartered Institute of Personnel and Development also offers a number of resources and guides on the topic of good corporate governance.

Finally, the Chartered Governance Institute UK & Ireland works to ensure “effective and efficient governance and administration of commerce, industry and public affairs”.

Examples of best practice in governance

Transparency

Petition and social change non-profit website Change.org has been lauded for its transparency. The organisation shares its financial information, roadmaps, and strategy in its annual impact report, and also publishes details about its diversity and inclusion.

Diversity

Inclusive Companies, an organisation that aims to drive inclusion through innovation and best practice, recently awarded Capgemini UK its top prize in the Inclusive Top 50 UK Employers List. Contributing to this win was Capgemini’s active inclusion programme, which engages all of its staff “in a conscious culture and mindset change for inclusion.”

Risk management 

CIR Magazine’s Risk Management Awards recently presented AECOM, a multinational infrastructure consulting firm, and the West Midlands Combined Authority with a Public Sector Risk Management Award, recognising the organisations’ collective and collaborative commitment to risk management.

What is a governance deficit?

Governance deficits occur when those governing – whether it’s a company, organisation, government, or another body – are not equipped to cope with the myriad challenges they face, and so cannot govern effectively. This deficit can look like a lack of:

  • information, which should inform everything from an organisation’s environmental, social, and corporate governance (ESG) to its marketing efforts
  • technology, which can facilitate transformational change in everything from everyday business processes to C-suite leadership (the C-suite being the leadership team made up of the CEO, CFO, COO, and so on)
  • adaptability, whether it’s responsiveness to a crisis or an emerging trend in the market
  • security, from cyber safety to financial health
  • connectivity or collaboration, such as consultation on white papers, governance practices, or even participation in industry events, podcasts, and so on
  • visibility, whether it’s of board members or corporate directors within the organisation, or even an online presence – for example, on social media applications such as LinkedIn.

Deficits in areas such as these can have serious consequences, such as increased risks – and increased costs.

In fact, the Diligent Institute has reported that “governance deficits across over 12 public companies have cost shareholders more than $490bn (US Dollars) in value in the year following the financial crisis. At the same time, good governance has equipped companies to surpass their peers by 15%.”

Modern tools and resources to support good governance

Diligent.com

Diligent is a digital modern governance tool. Effectively an app for boards and board members, it aims to promote good governance by managing risks, supporting audits, enabling compliance, and so on. It is the world’s largest governance, risk, and compliance (GRC) software-as-a-service (SaaS) provider, with one million users across 25,000 organisations and 130 countries. Diligent CEO Brian Stafford says that there is “a gap between people in the middle of the organisation who are data experts, and people at the top who want the right information at their level, to be able to help with pressure testing. We’re putting that all together for our clients.”

The Modern Governance Summit

The Diligent Corporation recently hosted a hybrid in-person and virtual event for governance, risk, compliance, audit, and ESG professionals.

Free data tools, such as Google Analytics

Data and information are key for organisational leaders, boardrooms, and anyone in an executive role. While some of this data may require market research or dedicated system providers, some of it is easily obtained for free. Website data, for example, is a crucial area of information for organisations and can be accessed for free in real time via Google Analytics.

Communication platforms, such as Slack or Microsoft Teams

Open communication supports transparency, inclusive environments, and information-sharing – all important elements of good governance. 

Help shape the future of good governance 

Help embed good governance within the public sector with the 100% online Master of Public Administration (MPA) programme from the University of York. Designed for professionals in public and non-profit organisations, this flexible MPA programme will equip you to make a positive impact on improving public service provision and public life.

As part of your studies, you’ll explore good governance. For example, one of your key modules is in regulatory governance – an area of modern governance that many businesses underestimate, but one that is crucial in delivering on social protection, safety, and quality assurance standards.

Leadership vs. management: what’s the difference?

Managers are an important part of any organisation. They oversee the day-to-day activities of the business, make sure work is assigned and completed, and take care of administrative tasks, such as budgeting and scheduling, along the way.

But what separates managers from leaders? The words might be used synonymously, but they’re not actually the same thing. People work for a manager – but they follow a leader. And this distinction is important, because while some jobs may require nothing more than a bit of rote direction, many roles are becoming increasingly complex, and require an environment that:

  • enables innovation and creativity
  • fosters engaged, motivated, and empowered employees
  • supports change

This is where effective leadership is required. Leaders are visionaries: they think strategically, they inspire others, and they help develop the culture – and the future – of their organisations. According to Forbes, leaders are agents of change, while managers work to maintain the status quo. In fact, it’s often said that a leader shapes the vision of their organisation, while a manager follows it. So it’s important to note that while a leader can both lead and manage people, a manager can’t always lead the people they manage. Unless, of course, they work to develop their leadership skills and a leadership style.

Why is leadership important?

The 21st century has seen significant changes in the working world. Technology plays an increasingly important role in businesses and organisations, from automated tasks to communication methods. Meanwhile, the COVID-19 pandemic created a seismic shift in working habits, with more people working flexibly and remotely than ever before. 

Within this shifting landscape, leaders have been instrumental in shaping and directing the flow of change within their organisations. They have ensured that their employees have remained engaged and motivated, and that their businesses have adapted as necessary. These leaders have not been afraid to take risks, try new ideas, implement changes, and make the occasional mistake in service to improving overall performance.

Signs of a good leader

Great leaders vary widely in their leadership styles, but they all demonstrate a number of important qualities, traits, and skill sets. 

Communication

Leaders are honest, direct, and transparent in their communications with colleagues. They share their vision, they share their priorities – aligning them to organisational goals – and they share their ideas. Crucially, they also welcome input and feedback from their colleagues and stakeholders, actively listening to all comments and taking other points of view on board – regardless of whether the comments come from a junior staff member or someone in a senior management role.

Emotional intelligence 

Leaders understand feelings – their own, and those of the people around them. They harness this knowledge to understand their employees, demonstrate empathy, offer personalised praise for successful work and initiatives, and acknowledge challenges that have been tackled and overcome. 

Knowledge 

Leaders are experts in their fields, and they share their knowledge and expertise. They encourage and enable professional development – and development programmes – within their teams, and dedicate time to mentoring and coaching people. They are also learners themselves, making time for their own personal development through learning resources such as:

  • in-person leadership development courses
  • training courses to hone their everyday management skills
  • online courses, particularly leadership programmes and management courses, through websites such as LinkedIn

Influence 

Because of their communication skills, emotional intelligence, and knowledge, a leader will have a wide circle of influence. Whether it’s within their team, the wider organisation, or external stakeholders, leaders can effectively and respectfully persuade and negotiate with people to help steer conversations, direct interventions, and influence actions.

Accountability 

Leaders are confident in their decision-making and effective in their problem-solving. However, they’re not afraid to share responsibility with their colleagues – on the contrary, they actively empower people to explore different solutions to the challenges they face. But a leader is always accountable for the decisions made under their watch.

What are the different styles of leadership?

There are a number of different leadership styles to choose from, and the most suitable for any given leader, team, or organisation can vary greatly. 

According to the Institute of Leadership & Management (ILM), a professional body focused on leadership, the most appropriate leadership style will depend on factors such as the leader’s personality and their current circumstances. With this in mind, the ILM argues that a leader’s style should reflect the situation they’re in – known as situational leadership – rather than their own personal preferences. 

Transformational leadership

Transformational leaders are focused on inspiring their employees for the benefit of organisational transformation and success. They seek improvements and innovation as part of their standard process, and trust their staff to help deliver results. While this style is ideal within organisations that value new ideas and ways of working, it isn’t always appropriate within organisations that value established processes and doing things the way they’ve always been done.

Democratic leadership

Democratic leaders regularly involve their team members in the decision-making process. While these leaders will frequently guide conversations and will make final decisions, they also openly encourage debate and new ideas. Also known as participative leadership, this leadership style is effective in engaging employees and encouraging innovation, but can be difficult when a decision needs to be made quickly.

Servant leadership

Servant leaders are focused on people first and foremost. By helping and supporting people as their primary task, servant leaders elevate and empower those around them to achieve results. This style of leadership can create inclusive and dynamic company cultures, but can also create challenges around ownership and accountability.

Become a leader within your organisation

Develop your leadership skills with the Master of Business Administration (MBA) at the University of York. This programme is studied part-time and 100% online, so you can continue to work full-time as you focus on your professional development. Through research and taught modules, you’ll learn about management strategy, contemporary issues in leadership, leading and managing organisational change, and how to manage across cultures, among other important areas of leadership.

What is computational thinking?

Computational thinking (CT) is a problem-solving technique that imitates the process computer programmers go through when writing computer programmes and algorithms. This process requires programmers to break down complex problems and scenarios into bite-size pieces that can be fully understood, in order to then develop solutions that are clear to both computers and humans. Those who apply computational thinking techniques do the same: they break problems down into smaller, simpler fragments, and then outline solutions to address each one in terms that any person can comprehend. 

Computational thinking requires:

  • exploring and analysing problems thoroughly in order to fully understand them
  • using precise and detailed language to outline both problems and solutions
  • applying clear reasoning at every stage of the process

In short, computational thinking encourages people to approach any problem in a systematic manner, and to develop and articulate solutions in terms that are simple enough to be executed by a computer – or another person. 

What are the four parts of computational thinking?

Computational thinking has four foundational characteristics or techniques. These are:

Decomposition

Decomposition is the process of breaking down a problem or challenge – even a complex one – into small, manageable parts.

Abstraction

Also known as generalisation, abstraction requires computational thinkers to focus only on the most important information and elements of the problem, and to ignore anything else, particularly irrelevant or unnecessary details.

Pattern recognition

Also known as data and information visualisation, pattern recognition involves sifting through information to find similar problems. Identifying patterns makes it easier to organise data, which in turn can help with problem solving.  

Algorithm design

Algorithm design is the culmination of all the previous stages. Like a computer programmer writing rules or a set of instructions for a computer algorithm, the computational thinker devises step-by-step solutions that can be followed in order to solve a problem.

Testing and debugging can also occur at this stage to ensure that solutions remain fit for purpose.
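
The four techniques can be seen together in a small worked example. The problem below – finding the most frequent word in a piece of text – is invented for illustration:

```python
# A small worked example of the four computational thinking techniques,
# applied to the illustrative problem "which word appears most often?".

text = "the cat sat on the mat and the cat slept"

# Decomposition: split the problem into smaller steps --
# break the text into words, count each word, find the largest count.
words = text.split()

# Abstraction: ignore irrelevant detail -- we only care about the
# words themselves, not punctuation, grammar, or word order.
counts = {}
for word in words:
    # Pattern recognition: repeated words are instances of the same
    # pattern, so they share one counter.
    counts[word] = counts.get(word, 0) + 1

# Algorithm design: a clear, step-by-step rule that always yields
# the answer -- pick the word with the highest count.
most_common = max(counts, key=counts.get)

print(most_common, counts[most_common])  # the 3
```

Testing the solution against inputs with known answers – the debugging step mentioned above – would confirm it remains fit for purpose.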

Why is computational thinking important?

For computer scientists, computational thinking is important because it enables them to better work with data, understand systems, and create workable algorithms and computation models.

In terms of real-world applications outside of computer science, computational thinking is an effective tool that can help students and learners develop problem-solving strategies they can apply to both their studies as well as everyday life. In an increasingly complicated, digital world, computational thinking concepts can help people tackle a diverse array of challenges in an effective, manageable way. Because of this, it is increasingly being taught outside of a computer science education, from the United Kingdom’s national curriculum to the United States’ K-12 education system.

How can computational thinking be used?

Computational thinking competencies are a requirement for any computer programmer working on algorithms, whether they’re for automation projects, designing virtual reality simulations, or developing robotics programmes.

But this thinking process can also be taught as a template for any kind of problem, and used by any person, particularly within high schools, colleges, and other education settings.

Dr Shuchi Grover, for example, is a computer scientist and educator who has argued that the so-called “four Cs” of 21st century learning – communication, critical thinking, collaboration, and creativity – should be joined by a fifth: computational thinking. According to Grover, it can be beneficial within STEM subjects (science, technology, engineering and mathematics), but is also applicable to the social sciences and language and linguistics.

What are some examples of computational thinking?

The most obvious examples of computational thinking are the algorithms that computer programmers write when developing a new piece of software or programme. Outside of computer programming, though, computational thinking can also be found in everything from instructional manuals for building furniture to recipes for baking a chocolate cake – solutions are broken down into simple steps and communicated clearly and precisely.  

What is the difference between computational thinking and computer science?

Computer science is a large area of study and practice, and includes an array of different computer-related disciplines, such as computing, automation, and information technology. 

Computational thinking, meanwhile, is a problem-solving method created and used by computer scientists – but it also has applications outside the field of computer science.

How can we teach computational thinking?

Teaching computational thinking was popularised following the publication of an essay on the topic in the Communications of the ACM journal. Written by Jeannette Wing, a computer science researcher, the essay suggested that computational thinking is a fundamental skill for everyone and should be integrated into other subjects and lesson plans within schools. 

This idea has been adopted in a number of different ways around the world, and a growing number of resources are now available to educators online.

Become a computational thinker

Develop computational thinking skills with the online MSc Computer Science at the University of York. Through your taught modules, you will be able to apply computational thinking in multiple programming languages, such as Python and Java, and be equipped to engage in solution generation across a broad range of fields. Some of the modules you’ll study include algorithms and data structures, advanced programming, artificial intelligence and machine learning, cyber security threats, and computer architecture and operating systems.

This master’s degree has been designed for working professionals and graduates who may not have a computer science background, but who want to launch a career in the lucrative field. And because it’s studied 100% online, you can learn remotely – at different times and locations – part-time around your full-time work and personal commitments.