How To Squeeze Hidden Value From The Hidden Data You Didn’t Know You Had

Data… it’s become such a ubiquitous word, hasn’t it?

Organisations are bombarded every day with data on their clients, data on their customers, data on their competitors, data on their donors, data on their members and data on their supporters. That’s a lot of data, and sometimes it’s hard to know what to do with it.

Even worse, sometimes it’s hard to even collate and categorise it or… worst of all, realise your organisation has it in the first place.

 

That’s not good.

It’s not good for your organisation. It’s not good for your staff and it’s definitely not good for your clients.

In fact, the only thing it is good for is your competitors!

 

In the modern world of cloud computing we now all find ourselves in, it’s vital to squeeze every last drop of value from the data your organisation has to hand and use it to derive useful, actionable business insights that maintain a competitive advantage.

 

In fact, several studies have found that nearly 80% of enterprise-level executives believe that any organisation not currently adopting Big Data will likely lose its competitive hold on its market and could even go under when faced with competitors that are.

 

To ‘discover’ the data you didn’t know you had, and then go on to derive useful business insights from it, you’ll need to undertake a programme known as smart data discovery, sometimes also called augmented intelligence.

 

What Is Smart Data Discovery?

At its core, smart data discovery describes a process through which an organisation can collect and collate data from wide-ranging and disparate sources and then apply it in such a fashion as to garner actionable business intelligence.

The end goal is for the organisation to benefit from an improved, data-driven decision-making process, capable of sharing information efficiently, fluidly and instantly across all departments in a two-way process.

 

However…

Becoming a data driven organisation means first getting to grips with and understanding all the data that is available to you.

Data Discovery is a methodology that empowers everyone within an organisation to make the most of the data available to them by allowing them to derive insights from it in an interactive way.

 

But how does data discovery work? What tools do you need? What platform? How does it go from model to real world application?

 

How Does Data Discovery Work?

Smart data discovery ‘works’, as previously mentioned, by collecting data from various sources and then looking for patterns within that data with the help of AI, ML, advanced analytics and visual navigation tools (usually Power BI), which allow it all to be consolidated in one place.

It uses AI and ML (machine learning) to improve the business intelligence collected whilst automatically searching for hidden trends, which can greatly increase the speed at which decisions can be reached.
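To make that a little more concrete, here’s a minimal sketch in Python of what ‘letting ML look for patterns in consolidated data’ can mean in practice. The file names, column names and the choice of a simple clustering algorithm are all illustrative assumptions, not a description of any particular product or project.

```python
# A minimal, illustrative sketch: consolidate data from two hypothetical sources,
# then let a simple ML algorithm (k-means clustering) surface groupings a human
# might not spot by eye. File and column names are placeholders.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical exports from two disparate systems
crm = pd.read_csv("crm_export.csv")        # e.g. customer_id, lifetime_value
web = pd.read_csv("web_analytics.csv")     # e.g. customer_id, visits, avg_session_mins

# Consolidate everything into one place, as described above
combined = crm.merge(web, on="customer_id", how="inner")

# Scale the numeric features, then look for hidden segments in the combined data
features = combined[["lifetime_value", "visits", "avg_session_mins"]]
scaled = StandardScaler().fit_transform(features)
combined["segment"] = KMeans(n_clusters=4, random_state=0).fit_predict(scaled)

# Each segment can then be explored visually in a BI tool such as Power BI
print(combined.groupby("segment")[["lifetime_value", "visits"]].mean())
```

In a real programme the pattern-finding step would be far richer than a single clustering call, but the shape is the same: consolidate, prepare, let the algorithms hunt for structure, then visualise.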

 

Imagine you’re a CEO, CIO or CTO running an organisation that needs a way to better view its data and derive in-depth value from it.

Information is key to any good organisational decision-making process… for you or a competitor. That means anyone able to better analyse patterns and discern deeper trends automatically gains a competitive edge within their sector, allowing them to better meet targets, ensure success and remain relevant.

 

As we already mentioned, data discovery isn’t a platform or a tool; it’s a process that needs to be bedded in, and that process can be broken down into two major component steps:

 

  • Augmented data preparation
  • Augmented analytics

 

Augmented Data Preparation

Augmented data preparation can grant a CEO, CIO or CTO access to data that, simply put, is more ‘purposeful’. It allows all assumptions, strategies and approaches to be tested with ease, based on intelligent decision making protocols.

 

Augmented Analytics

Augmented analytics is a form of business intelligence that uses machine learning algorithms and natural language processing to automate the insights you receive from your data.

Taking the concept to its most basic form, it simplifies the process of arriving at actionable insights by automating data preparation as well as empowering data sharing across an organisation.
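As a toy illustration of the ‘automated insights’ idea (and only that: real augmented analytics platforms use far more sophisticated ML and NLP than this), here’s how a simple statistical finding can be turned into a plain-language statement automatically. The revenue figures are made up purely for the example.

```python
# A toy example of automatically turning a data finding into plain language.
# The figures are invented sample data, used only for illustration.
import pandas as pd

sales = pd.DataFrame({
    "month": ["2021-11", "2021-12", "2022-01"],
    "revenue": [120_000, 150_000, 96_000],
})

latest = int(sales["revenue"].iloc[-1])
previous = int(sales["revenue"].iloc[-2])
change = (latest - previous) / previous * 100

# Surface the finding as a sentence anyone in the organisation can read
direction = "up" if change > 0 else "down"
print(f"Revenue is {direction} {abs(change):.0f}% month on month "
      f"({previous:,} vs {latest:,}).")
```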

 

Why Is Big Data Discovery Becoming So Important All Of A Sudden?

Quite simply put… the amount of data created by the human race is growing at an exponential rate.

Studies have shown that more data has been generated in the last year or two than in the previous entirety of human history, and that since 2012 big data, data discovery and, more recently, augmented intelligence have generated over 13 million jobs across the globe.

It’s popular for two reasons… first, no one person (or even group of people) can trawl through that much data for viable insights and, second, giving an organisation the power to predict patterns and make connections that had never been imagined before offers a huge competitive edge.

 

That phrase ‘competitive edge’ can be quite nebulous though. To be specific, augmented data discovery allows for:

 

  • Tracking business performance against specific metrics (KPIs)
  • More accurate predictions, and the solutions that grow from them, across all departments of an organisation
  • More time freed up for strategic decision making
  • Credible data made accessible to everyone within the organisation
  • Improved ROI and TCO (total cost of ownership).

 

However, with technology changing and advancing at such a rapid pace, new analytical techniques, processes and resources are constantly being added.

 

How To ‘Do’ Augmented Data Discovery

 

  • Know What You Want To Achieve: The first step in any augmented data discovery project has to be to clearly define your organisation’s business goals. Doing so keeps the project focussed on collecting the right information for the right goal. It’s important to seek input from all stakeholders and staff; not just those who will use the data, but those whose responsibility it is to collect it as well.
  • Know Your Pain Points: Once you know what you want to achieve, it’s just as important to identify the obstacles that are going to stand in your way. No two organisations will ever have the exact same pain points, but some of the most common ones cloudThing sees are: not being able to access the large amounts of information needed, or having only slow or limited access to it; struggling to collate a wealth of data from disparate sources; and too much time being spent curating the data rather than understanding it and deriving useful business insights from it.
  • The More Data The Better: Data, unfortunately, doesn’t just come from one source (wouldn’t it be nice if it did though). Your customers will have multiple touchpoints with your organisation and all that data will arrive in different formats, making it that much harder to collate into a usable form. To garner any kind of usable intelligence, as much of that data needs to be collected as possible but, and here’s the important part, it also needs to be transformed into a format the ML can derive insights from. The data might be structured, it might be unstructured… the only way to derive new insights is to make all of it readable.
  • Your Data Needs Cleaning: Once you’ve started gathering all that data, you’ll need to get into a routine of cleaning it for it to be of any use. Some of it may be wrong, some of it might be missing fields, and there may still be duplicates or incorrectly formatted records. You can go deeper too. If, like many organisations using augmented intelligence, you’re collating unstructured text data from sources like social media, then extra care will need to be taken to clean it to avoid syntax misunderstandings, invalid characters or spelling errors. The ultimate aim of this ongoing cleaning is to avoid anything that runs the risk of generating misleading results that could potentially harm the business with incorrect intelligence (a minimal sketch of what such a cleaning routine can look like follows this list).
  • Developing An Advanced Data Discovery Model: Developing an advanced data discovery model is the strategic approach to utilising an organisation’s data. As already mentioned, it will involve the collection, curation and analysis of that data, as well as the data-driven decisions an organisation makes upon the discovery of new insights. Choosing a reporting tool to highlight this information will be a big factor in the success story. Modelling that will require the aid of diagrams, symbolic references and textual information to represent how data reaches, flows through, and is used by an organisation.
  • Start To Tell A Story: All of the above might sound complicated (it is, and we absolutely recommend you seek technical know-how to implement it efficiently) but the easiest way to make use of the data once you have it is to use it to tell stories. These should be easy to follow, so that everyone in the organisation can understand and agree with them, regardless of how technical they are (or, more likely, aren’t). To help achieve that goal, data visualisation tools like Microsoft Power BI will be vital. Telling a story, or perhaps painting a picture with your data, ensures its accessibility and uptake throughout the organisation.
  • Automate The Process: Once all of the above has been planned out, it’s vital to determine how the process can be automated, as there’ll be far too much data to ever do this manually with any kind of success rate (or without incurring a ridiculous amount of human error).
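As promised above, here’s a minimal sketch, in Python, of what a repeatable cleaning routine can look like once it’s automated. The column names, file names and rules are assumptions for illustration only; every organisation’s data will need its own rules.

```python
# A minimal, hypothetical cleaning routine of the kind described in the list above.
# Column names, file names and rules are illustrative assumptions.
import pandas as pd

def clean_contacts(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply repeatable cleaning rules so the step can be automated and re-run."""
    cleaned = raw.copy()

    # Drop exact duplicates and rows missing the fields we can't work without
    cleaned = cleaned.drop_duplicates()
    cleaned = cleaned.dropna(subset=["email", "postcode"])

    # Normalise inconsistent formatting from different source systems
    cleaned["email"] = cleaned["email"].str.strip().str.lower()
    cleaned["postcode"] = cleaned["postcode"].str.replace(" ", "").str.upper()

    # Keep only rows with a plausible email rather than silently trusting everything
    return cleaned[cleaned["email"].str.contains("@", na=False)]

if __name__ == "__main__":
    # Scheduling this (e.g. nightly) is one simple way to 'automate the process'
    raw = pd.read_csv("contacts_export.csv")   # placeholder export file
    clean_contacts(raw).to_csv("contacts_clean.csv", index=False)
```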

 

In summary, a successful advanced data discovery (or augmented intelligence) program shouldn’t require anyone with advanced technical skills to understand the output (setting it up might be a bit different of course).

Anyone within the organisation should be able to derive actionable insights from the end product.

400% Increase In Microsoft Teams Usage: Can This Bridge The Gap Between Frontline Workers & Their Managers?

Microsoft recently made changes to its Teams and Viva platforms ahead of the February 1st Cloud For Retail general availability – and published a report alongside the changes which outlines the challenges frontline workers are facing in 2022.

According to Jared Spataro (Microsoft’s CVP for Modern Work), Microsoft defines a frontline worker as “folks who were not able to go home and did all their work in person” – which can cover everyone from those staffing production lines to healthcare workers, hospitality staff and those working to keep power grids running.

The changes will hopefully bridge the gap between manager and worker in building workplace culture.

Satya Nadella, Microsoft’s CEO, had already highlighted the 2 billion (give or take) workers included in that segment who, even prior to the pandemic, stood to benefit from Microsoft’s tech.

The report was formed from a survey of 9,600 employees and managers, covering eight industries across five continents.

The report shows that, while 76% of workers felt bonded to their peers, over 60% felt that communication from higher-ups wasn’t great and could be improved by their employers. On top of that, 51% of those in non-management positions on the frontline felt less valued.

What does this mean? With the pandemic prompting workers to decide whether they would benefit from a change in employment (according to the survey, 4.5 million Americans quit their jobs in November) and stresses growing, it points to a change of culture. 64% of people surveyed reckoned that a pay rise would ease some of their stresses, half felt paid time off would help and a third suggested improvements in technology tools.

Between March 2020 and November 2021, Microsoft saw a 400% rise in monthly Teams usage, with Healthcare and Financial Services leading the way at 560% and 550% rises respectively.

There is a danger of some employees being left to struggle, however, as older workers (aged 41 and up) had more problems adapting to the new way of working; conversely, younger workers (40 and under) found the workplace tech actually somewhat behind the technology they were used to, according to the survey.

There’s only so much that Microsoft can do of course, and there isn’t much it can do in terms of frontline worker holidays, pay rises and the like, but what it can do is improve the employee experience online. It’s presented Viva as a way of linking frontline workers with company culture and improving the accessibility of resources, such as HR. It’s also making the integration of Teams with Zebra Reflexis easier, to connect workforce management platforms with Teams’ Shifts application.

“Empowering frontline workers remains essential for digital transformation.”

Emma Williams, a Microsoft Corporate Vice President

The steps to teach and empower through tech are no doubt improving the lives of workers all over the world; however, according to the survey, there are still two top ways to reduce stress in the workplace: pay frontline workers more and give them paid leave.

Ukraine Target Of Sustained Hacking: Early Reports Point The Finger At Russia

Cyber Actors threatened citizens with the publication of their private data

A number of government websites in Ukraine recently came under fire from a sustained hacking attack on the 13th of January, which involved cyber attackers distributing menacing messages that appeared to be aimed at intimidating Ukrainian citizens.

A Facebook post by the Ministry of Education and Science confirmed the attack occurred late Thursday and into the early hours of Friday morning.

During the attack, websites related to the Cabinet Office and the Ministry of Foreign Affairs were inaccessible although the Ukrainian government says it’s now restored many of the affected websites.

The attackers wrote in Ukrainian, Russian and Polish and threatened to release private data to the public, according to what was left on the websites.

“All information about you has become public. Be afraid and expect worse. It’s your past, present and future,”

Media Reports on the contents of the threatening messages

Despite the cyber attackers’ threats, the Ukrainian government denied that any personal data about its citizens had been leaked.

An investigation is currently underway, according to the Energy Ministry, and is being conducted by a police unit which specialises in cyber-attacks.

There is, so far, no one officially named as the suspected perpetrator of the DoS attacks and the Ukrainian government says “it’s too early to draw conclusions”.

“…But there is a long history of Russian attacks on Ukraine.”

A Foreign Ministry Spokesperson

The EU’s head of foreign policy, Josep Borrell, has condemned the attack.

“We are going to mobilise all our resources to help Ukraine to tackle this. Sadly, we knew it could happen.”

Josep Borrell, EU Head of Foreign Policy

Unfortunately, an attack like this is nothing new for Ukraine, as hacker groups with suspected links to Russian Intelligence have been involved in previous attacks, including an attack on the country’s electricity grid in 2015, which left 200,000 people without power, with a similar attack occurring a year later. The NotPetya ransomware also hit Ukrainian servers (among others) in 2017 and was also attributed to Russia.

The country was the victim of 288,000 cyber-attacks in just the first 10 months of 2021, and 397,000 in 2020.

West Mids Transport Chiefs Have Agreed Priorities For A £1.3bn Investment In The Region

A list of preferred priorities and schemes will now be submitted by the combined authority to the Department for Transport for final approval.

 

That £1.3bn of investment will be spent on over fifty different transportation improvement projects within the West Midlands, with a full list to be published as soon as the DfT has confirmed final approval.

 

Some of the details we do know include a plan for over sixty miles of new, segregated cycle routes, thirty miles of dedicated bus lanes and priority measures, a brand-new railway station at Aldridge and a light rail line for Coventry city centre.

The money will also be used to encourage the use of electric vehicles with a network of 1,600 electric car charging hubs, ten ultra-rapid charge points to support van and lorry drivers, and continued investment in existing metro systems.

 

Funding has also been set aside to help further develop the business case and plan for a more rapid extension of the West Midlands metro along the Hagley Road in Birmingham city centre, as well as the tram depot in Wednesbury.

 

The fifty-plus schemes, to be developed over the next five years, are being funded from the City Region Sustainable Transport Settlement, which was awarded to the West Midlands last year by the Department for Transport (DfT) and will be topped up by a small amount of local funding.

 

Any scheme which helps promote the decarbonisation of transport within the West Midlands, increases targeted investment in areas with poor connectivity or empowers inclusive growth has been prioritised within the plan.

 

This unprecedented investment will allow us to deliver more than fifty exciting projects as we continue to revolutionise and decarbonise public transport across the West Midlands. From an expanded Metro network and new railway stations, to more cycle routes and better electric vehicle charging infrastructure, the schemes we have agreed will benefit every area, with improved links for communities right across the West Midlands while also tackling the climate change emergency by cutting down our carbon emissions.

Andy Street – Mayor, West Midlands

UK Government Removes Community Wealth Fund Amendment From Upcoming Bill

The UK Government have just removed an amendment from the upcoming Dormant Assets Bill that would have allowed for the creation of Community Wealth Funds… however they acknowledged there was widespread support for the concept.

 

The amendment, previously added by the House of Lords, was removed in the House of Commons during its committee stage, meaning funding will no longer be able to be used specifically to support social infrastructure.

In its place, the Government are planning a consultation on the correct future usage of such funds, to determine whether community wealth funds will be suitable.

 

I acknowledge the support expressed by many in the House for using the English portion of dormant assets funding to support, through community wealth funds, the left-behind communities, which experience high levels of deprivation and low levels of social infrastructure. However, the government wanted to protect the integrity of the consultation process, which offers the most appropriate route to make that a reality. This consultation on how funds in England are used will be launched as soon as possible after Royal Assent and will explicitly include community wealth funds as an option to consider for the English portion.

A 12-week consultation on expanding the causes to which money can go will begin as early as this summer, with community wealth funds included as a clear option. Should it be determined that the community wealth funds are the best use of some of the English portion, the bill is already designed to provide the most appropriate avenue to make that a reality.

Nigel Huddleston – Minister for Civil Society

 

At the moment, money from the dormant assets fund can only be spent on youth, financial inclusion and social investment in England, although the devolved nations have more flexibility on the spending of the fund.

 

I am surprised that the government want to remove a measure that empowers communities and surely goes to the heart of the alleged levelling up agenda. There are members on both sides of the committee who represent areas that will benefit from this kind of initiative. The most deprived areas often have the weakest third-sector capacity and infrastructure, which adds to a cycle of disadvantage. Community wealth funds aim to halt that cycle. They are aligned with the aims of the levelling up agenda and have the potential to transform communities and lives.

Jeff Smith – MP, Labour

 

I do not believe that the minister is correct in claiming that secondary legislation is the most appropriate mechanism for deciding on the distribution. We all understand that there is limited opportunity for debate on secondary legislation, and there is, of course, no opportunity to amend it. That means parliament’s role will be limited to rubber-stamping the government’s proposals. Were the fund to remain written into the bill, the Community Wealth Fund Alliance could start the process of securing match funding and planning to get money into the most left-behind communities as soon as possible after Royal Assent.

Diana Johnson – MP, Labour

Wales Has An Additional Duty To Climate Change Due To Its Coal Mining Past – Conservationist Claims

A leading Welsh NonProfit conservationist claims Wales has a ‘particular responsibility’ to help fight climate change due to its coal mining history.

 

Ru Hartwell, director of NonProfit Carbon Link, has recently been quoted as saying Wales ‘invented’ a model for industrial development based almost exclusively on exploiting fossil fuels.

 

Carbon Link runs a tree-planting programme in Africa that is funded by the Welsh Government and has, to date, planted around four million trees in the Boré community of Kenya.

The programme has expanded a lot since it started back in 2012, from an initial planting of 1,000 cashew trees to a target of over a million trees planted in 2022, designed to provide food and sustainable lumber to the community, as well as creating vital wildlife habitats and improving biodiversity.

 

All of Carbon Link’s funding comes from either the Welsh Government, the NonProfit charity Size of Wales or takings from its climate change charity shops.

 

It’s all about helping the local people protect their existing forest and plant new trees to suck down carbon from the atmosphere and improve the climate for everyone. One of the tragic ironies of climate change was poorer nations that had contributed least to the carbon emissions problem were being worst hit by the impacts of rising temperatures and extreme weather.

Wales has a very long history of releasing carbon.

We’ve got one of the longest legacy footprints of any country in the world because of the industrialisation that came with the south Wales coalfield. The model of industrial development based on the exploitation of fossil fuels was invented in south Wales and every other country in the world has gone on to kind of emulate that.

So, because we were the first industrialised nation, we have a particular responsibility to draw back some of that ancient, historical carbon.

Ru Hartwell – Director, Carbon Link

 

Since its inception, the tree planting project has led to the establishment of the largest tree nursery in Coast Province, Kenya, and now involves over 3,000 farmers as well as 200 local schools.

 

The people here have got very small carbon footprints, they don’t drive or fly around all the time like we do in the West but for them climate change is happening right now with crop failures driven by changing weather patterns.

Anna Douglas – Ecologist, Volunteer at Carbon Link and Ru Hartwell’s daughter

 

The available Welsh funding has also led to the planting of a further fifteen million trees in Uganda over the past decade by other NonProfits, with the Welsh Government aiming for twenty-five million trees planted by 2025.

Question Asked: How Safe For Children is the Oculus VR Headset?

Talks have been prompted due to concerns over multiple instances of child harassment on VRChat

Meta could face a fine of up to four per cent of its annual global turnover if the Oculus headset is breaking child safety rules.

The UK’s Information Commissioner’s Office (ICO) will be seeking clarification from Meta directly on whether the device is compliant with the Children’s Code, asking about the parental control features on the Oculus Quest 2 VR headset.

The clarification comes about due to warnings from child safety campaigners who have pointed out how the £300 device lacks parental controls, which puts it in violation of the new protection code.

The campaign group Center for Countering Digital Hate (CCDH) found numerous cases of abuse on VRChat, Oculus’ popular social tool.

CCDH found that one of these cases involved two heavily breathing men following around a young person’s avatar. Another case had a man joking in front of an under-18 that he was a “convicted sex offender.”

The creator of the Children’s Code herself, Baroness Beeban Kidron, states her concern over the safety of children on the Oculus platform, as it has been made too easy for children to be exposed to abuse, harassment, and sexual content.

Meta has its own age barriers in place – VR users must use their Facebook account, which has a minimum age requirement of 13, but this doesn’t mean Meta is implementing the Code’s age checks, according to Kidron. All a child has to do, hypothetically, is simply tick a check box to say they’re old enough and then they’re granted access to potentially harmful VR chatrooms.

The ICO wants answers from Meta about whether their VR headsets and services have enough measures in place to protect children’s privacy and data.

Online services and products that use personal data and are likely to be accessed by children are required to comply with the standards of our children’s code.

We are planning further discussions with Meta on its children’s privacy and data-protection-by-design approaches to Oculus products and virtual reality services. Parents and children who have concerns about how their data is being handled can complain to us at the ICO.

ICO spokesperson

 

What is the Children’s Code?

The UK’s Age Appropriate Design Code (aka the ‘Children’s Code’) is a set of regulations written into law as part of the 2018 Data Protection Act. Although it came into force in September 2020, organisations were given a 12-month grace period in order to audit themselves for compliance.

The Children’s Code contains 15 standards that companies must implement in any digital services used by children: from social media sites and apps to online games, connected toys, and even educational and news websites.

If Meta has violated the code then it could face a range of penalties.

The penalties given by officials could range from a warning being issued to a fixed financial penalty of £17.5 million, or a fine of up to four per cent of global turnover.

A Meta representative issued a statement and told The Guardian that the company is confident the VR technology complies with the Code’s requirements, and they are committed to honouring the rules established by the ICO.

The spokesperson then went on to emphasise that children under the age of 13 are precluded from its products under its terms of service; however, the statement does not address the concerns over how easy it is for minors to circumvent the policy.

Meta has committed to a $50 million (about £37 million) initiative, which it says is to ensure the metaverse is developed in compliance with all applicable laws and regulations.

UK Invests £2bn In A Future Combat Air System With Japan

UK and Japan to develop fighter jet engine demonstrator

 

The UK Government, as part of the UK’s Combat Air strategy, has just announced that they’ll be developing a joint jet engine demonstrator with Japan.

A memorandum of Cooperation has also been signed to enable future development opportunities.

 

Work on the jet engine demonstrator will start early next year, with an initial investment by the UK of £30m to be used in planning, digital designs, and innovative manufacturing developments.

A further £200m will then be spent by the UK Government on developing a full scale demonstrator power system which will support hundreds of highly skilled jobs here in the UK, including many at Rolls-Royce’s plant in Bristol.

Then, over the next four years, over £4bn will be invested into major national and international National Security endeavours to design a world-leading, bleeding-edge, Future Combat Air System.

At the same time, Japan will be developing a future fighter aircraft through its F-X programme to replace the F-2 aircraft.

 

Strengthening our partnerships in the Indo-Pacific is a strategic priority and this commitment with Japan, one of our closest security partners in Asia, is a clear example of that. Designing a brand-new combat air system with a fighter aircraft at its heart is a highly ambitious project so working with like-minded nations is vital. Building on the technological and industrial strengths of our two countries, we will be exploring a wide-ranging partnership across next-generation combat air technologies.

Ben Wallace – UK Defence Secretary

 

This all came about last summer when UK Defence Secretary Ben Wallace met with Japan’s Defence Minister Nobuo Kishi in Tokyo to discuss the future of air combat systems.

 

As I have seen at first hand our partners in Japan have made enormous progress on technologies that can complement our own advanced skills and could help ensure both our Armed Forces remain at the forefront of military innovation.

We look forward to the continued partnership with a formidable power and close ally.

Jeremy Quin – UK Defence Procurement Minister

Free Public Transport To Reduce Air Pollution?

A motion has been floated by Belfast’s Green Party Councillors to give all young people free public transport to help reduce the city’s air pollution.

 

A motion has been tabled at Belfast City Council that could see all young people within the city receive free public transport, in a move designed to combat rising levels of air pollution in the city.

Current research suggests one in twenty-four deaths in Belfast is linked to air pollution.

 

If successful, the council motion would call on Infrastructure Minister Nichola Mallon and Translink to create a free public transport pilot scheme for all young people in Belfast.

 

The motion has been tabled by Green councillor Brian Smyth and reads…

 

This council supports the promotion and expansion of sustainable transport in Belfast as a critical step to averting climate change, address the significant levels of air pollution, reduce congestion and improve public health. Extending and improving uptake of sustainable transport is key to our city playing its part in averting climate breakdown. In order to create a societal shift in how people access public transport, this council therefore calls upon the Minister for Infrastructure and Translink to introduce a pilot of free public transport for young people in Belfast.

 

Last November Nichola Mallon was quoted as saying she was open to exploring the idea of congestion charges within the city of Belfast to help reduce emissions and move people away from a reliance on cars for transport.

 

I think we are going to have to look at incentivising people out of their cars and I am mindful that, in other places, they are looking at issues like parking charges and congestion charges. That would be an issue that would have to be brought to the Executive, but certainly I am open to considering a range of things. We’re in the middle of a climate crisis and we need to be bold in our thinking.

 

Stormont’s Department of Agriculture, Environment & Rural Affairs has also been consulting on a Clean Air Strategy, with two Climate Change Bills currently making their way through the Assembly.

Heads Up: Your Dynamics NAV 2017 Is Being Removed From Mainstream Support

Hopefully you’ll already be aware, but if your organisation is using Dynamics NAV 2017 (what was Navision, now widely replaced by Business Central) then it’s about to be removed from mainstream support by Microsoft… on the 11th of January this year.

 

Although Dynamics NAV isn’t available to new customers anymore, there are still some older versions out there that Microsoft is withdrawing support for one by one, and the 11th of January 2022 is when Dynamics NAV 2017 leaves mainstream support.

 

That does raise the question however… what should users be doing with their NAV 2017 systems?

Extended support… for security updates only… will still be available for another five years, but it means your NAV 2017 won’t be receiving any more of those awesome and regular feature updates, which, in a constantly changing world, is quickly going to turn your systems into legacy systems.

 

There are easy options however, and if you’re worried about the path your organisation needs to take now then cloudThing will be happy to discuss how Business Central can step into the gap and help.

 

Not thought about Digital Transformation before? Maybe now’s the time?

Get in touch below to speak with one of our experts to discuss replacing legacy systems that might prove to be an anchor on your progress through 2022.

Rail Commuters Face Long Delays With 25% Staff Shortages… Thanks To COVID

With rising cases of Covid across the UK, passengers are facing more rail disruptions due to high volumes of Covid-related staff absences.

The Rail Delivery Group said almost one in 10 rail workers were off.

 

Reduced timetables have been announced by rail companies ScotRail, CrossCountry and LNER, and passengers have been made aware of potential cancellations.

Hundreds of staff are off work due to Covid or having to isolate, according to ScotRail’s Alex Hynes.

He went on to say that, while ScotRail usually operates 2,000 services a day, it would be reducing this by 160 (8%) from Tuesday.

 

Over the last few weeks because of record numbers of Covid cases we have been cancelling too many trains so we have decided to proactively put this revised timetable in to give our customers greater certainty on the service we can offer

We have said we will offer this timetable until the end of the month but of course one of the lessons we have learnt from Covid is that we would be foolish to predict the future,

There will be a few twists and turns in this Covid tale until it’s over.

Alex Hynes, Managing Director of ScotRail

 

The reduced timetable will “prove to be a robust service for customers in the coming weeks”, according to Mr Hynes who said the operator was “pretty confident” in the changes.

Industries where people are unable to work from home are greatly impacted by self-isolation, and the rising cases of Covid have left swathes of the rail work-force self-isolating.

The operator CrossCountry has said more than one in ten staff are absent, and it’s “worsening each day.”

The Rail Delivery Group estimates that more than 6,000 staff, including crew and drivers, are currently absent.

The industry body advises anyone travelling to check timetables ahead of time as there would be “some short notice cancellations.”

 

We are working hard to provide the most reliable service possible and so that passengers can travel with confidence when fewer rail staff can work, a number of operators are introducing amended timetables

The Rail Delivery Group

 

Transport for Wales showed a staggering 49 cancellations on Monday, TransPennine Express cancelled 37 and Avanti West Coast cancelled 25.

Even major commuter stations, like London Victoria, have been negatively affected, with Southern stating it will not be putting on any direct services there until next Monday.

Ministers have been tasked with developing “robust contingency plans” in order to cope with the rising cases of worker absences – as Covid case numbers rise, we could see up to a quarter of staff off work.

Keep in mind, the transport sector is not the only sector hit by worker shortages, as the retail and hospitality industries are also affected by surging cases and self-isolation.

The Cabinet Office has asked public sector leaders to prepare for worst-case scenarios of 10%, 20% and 25% absence rates.

Comic Relief Are Making Red Nose Day An Annual Event

Sport Relief is set to become an all-year-round campaign after years of being held alternately with Red Nose Day.

Red Nose Day will now be held every 12 months and the details of how Sport Relief will be operated are to be announced in the following months.

Sport Relief has alternated with Red Nose Day since its debut campaign in 2002, but has always raised less money than Red Nose Day.

And, after a “Difficult Sport Relief 2020” (which, according to Comic Relief’s latest annual accounts, still raised £2.3m more than the previous event in 2018 despite being severely affected by the pandemic), the charity plans to evolve the once-biennial event into an all-seasons format.

Making Red Nose Day an annual event would create income stability for the charity, and also open the door to new opportunities.

 

For the first time in 20 years we’re changing from alternating Sport Relief and Red Nose Day campaigns to Red Nose Day becoming annual and returning every March, and Sport Relief evolving into a year-round brand from 2022.

Sport Relief is set to partner with major events, sports projects and sports stars, with more details set to be announced in the new year.

This is happening at a time where Comic Relief is focusing on fundraising and using pop culture and sport for social change all year round.

Alex Botha, Chief Operating Officer at Comic Relief

 

The accounts, for the year to the end of March, also show that its annual income was down by almost £4m year on year to £74.1m, partly because its income in 2019/20 was boosted by the one-off Big Night In Campaign held to raise funds during the coronavirus pandemic.

Despite this, the charity recorded a deficit of £12.1m, as it spent £86.2m – down from £105.6m the previous year.

This was achieved by a “continued acceleration of the allocation of funds raised in prior years to ensure that we have delivered the maximum impact in challenging times”, according to the charity.

To adapt to the growing pressures of the pandemic, Comic Relief reduced staff costs by more than £3m to £10.8m in 2021.

Following criticism of the charity in 2013, when it was discovered it held investments in arms, alcohol and tobacco, Comic Relief now states all its investments are ethical and that it has not invested in fossil fuel extraction companies since 2017.

It also reiterates its commitment to modernise and update appeal films by producing films from Kenya, India and South Africa using local crews, with local people leading the films in front of the camera.

Scotland & Wales’ Vaccine Roll Out Gets A Boost From Armed Forces

In Scotland and Wales, the vaccine rollout has been accelerated in response to new waves of Omicron infections across the country. The number of personnel from the Armed Forces supporting the effort in Scotland is now 221, and in Wales it’s 98.

The effort has been ongoing since early October 2021, with 121 Armed Forces members leading the charge; the increased number of those on task is due to last until the end of February 2022.

The Armed Forces are on hand to provide more services alongside the vaccination roll-out, with 114 personnel driving ambulances to support the Scottish Ambulance Service – a supporting service which will run until the end of March, however 96 personnel will remain on task to provide ongoing support to the life-changing service.

Defence Secretary Ben Wallace said:

Our Armed Forces continue to tirelessly support the Covid-19 vaccination programme in Scotland to give people and communities vital protection against this virus.

This uplift in support will help to get more vaccines into arms faster, working shoulder to shoulder with the dedicated health services.

 

Who are the 221 people supporting the Scottish vaccine programme?

The Armed Forces personnel are made up of healthcare professionals and general duties personnel who will work in support of NHS Scotland and NHS Wales staff and volunteers, with tasks involving administering vaccines and providing planning expertise. Personnel will deploy to all seven Health Boards in Wales, with two teams assigned to each board, and have been assisting with tasks in Wales since the pandemic began, including community testing and PPE delivery.

They come from units across the three services – Royal Navy, British Army and Royal Air Force.

Secretary of State for Scotland Alister Jack said:

Once again our fantastic British Armed Forces are stepping up in times of need to help tackle Covid-19 in Scotland and across the UK and I pay tribute to them.

The deployment of a further 100 personnel will make a significant contribution to getting people vaccinated in Scotland. As the festive season approaches, when we want to spend time with loved ones, it’s more important than ever to be protected. I urge everyone to book their jabs as soon as they are eligible.

Defence’s work to support the UK’s pandemic response has been given the operational name “Operation Rescript” and has involved making 398 personnel available for tasks in Scotland.

All support that is being provided goes through the Military Aid to the Civil Authorities (MACA) process, and since March 2020, there have been over 430 MACA requests across the UK.

Brigadier Ben Wrench, Commander Joint Military Command Scotland said:

Whether it be responding to the impacts of storms or national health crises, the members of our Armed Forces are always prepared to deploy at short notice to support the nation and our communities.

I commend the dedication of all those serving and supporting this effort, many of whom will find themselves away from their families and loved ones this Christmas and Hogmanay.

Defence Secretary Ben Wallace said:

We are now supporting this national priority Covid-19 vaccination programme in Wales, Scotland and England.

Our Armed Forces are supporting our world class health services to accelerate the vaccine rollout and provide essential protection for people and communities. I urge anyone eligible to take up the offer of a vaccine.

Secretary of State for Wales Simon Hart said:

It is critical that as many people as possible receive the vaccine in our fight against Covid-19 and I’m hugely grateful to the UK’s Armed Forces for supporting this effort in Wales as well as continuing to support the work of the Welsh Ambulance Service.

Since the beginning of the pandemic, the military has stepped up to support health services across Wales with the distribution of PPE, construction of a temporary hospital in Cardiff and assisting community testing in the South Wales valleys, demonstrating the UK Government’s commitment to meet the needs of the whole of the United Kingdom.

The Armed Forces are on hand to support communities, devolved nations and civil authorities as requested, if they meet MACA principles. This includes short notice support to places like Aberdeenshire Council in order to conduct welfare checks on vulnerable people and isolated communities impacted by Storm Arwen.

Leaked Internal Facebook Documents Reveal Their Plan To Ignore EU Privacy Laws

Political news outlet Politico has released leaked exchanges between Facebook’s lawyers which they claim show Facebook’s intent to ignore judgements by the European Court of Justice that state US privacy laws don’t offer enough protection to allow the free transfer of personal data from the EU to the US.

 

Facebook’s lawyers (or Meta’s lawyers, as we should now call them) intend to argue that because Meta uses ‘standard contractual clauses’ (SCCs) as the legal mechanism for the transfer of personal information and data, the ECJ’s judgement about US privacy laws won’t apply to them.

They’re basing this argument on the Schrems II case, in which the ECJ ruled that the EU-US data sharing agreement Privacy Shield was no longer fit for purpose but that SCCs still were, IF additional safeguards (where necessary) were implemented to prevent excessive access to the transferred personal data by the recipient third country.

 

Meta’s lawyers also referred to the fact (in the leaked documents) that the UK was granted data adequacy by the EU last June, despite the ECJ having found mass surveillance activities by the UK Government to be illegal.

In the exchange between Meta’s lawyers the idea was also floated that they could argue that the US Federal Trade Commission was “carrying out its role as a data protection agency with unprecedented force and vigour,” meaning they could make the case that the US, in terms of data protection, was not that different from the UK.

 

It is clear that in some important respects, the UK regime, which the Commission has assessed to be adequate under Article 45 GDPR, takes a similar approach to the US in relation to limitations on data protection rights in the context of interception of communications.

Leaked Meta Communications

 

That being said, however, the adequacy granted to the UK for data transfers between the UK and EU countries is limited to just four years, with many MEPs having objected to it at all and negotiations still ongoing, meaning any argument based purely on that could face a lot of pushback.

UK’s National Crime Agency Has Discovered 225m Unexposed Passwords

It’s come to light that the UK’s National Crime Agency (NCA) and National Cyber Crime Unit have recently uncovered a whole host of stolen passwords.

This was after Troy Hunt, of ‘Have I Been Pwned’ (HIBP) fame, announced he’d been handed them to add to his service, which allows anyone to check if any of their credentials have been exposed.

Apparently, 585,570,857 passwords were shared by the NCA, with 225,665,425 of them being passwords that HIBP had never seen before.

That takes the number of credentials that people can now check with HIBP to over 840 million (847,223,402 to be exact).
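For anyone curious how a check like that can work without your actual password ever being sent anywhere, here’s a small Python sketch against the publicly documented Pwned Passwords ‘range’ API, which uses a k-anonymity model: only the first five characters of the password’s SHA-1 hash leave your machine. Treat it as an illustration rather than official client code.

```python
# Illustrative check against the Pwned Passwords k-anonymity 'range' API.
# Only the first five characters of the SHA-1 hash are sent to the service.
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]

    # Fetch every known hash suffix that shares our 5-character prefix
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as response:
        body = response.read().decode("utf-8")

    # Each line is "<hash suffix>:<number of times seen in breaches>"
    for line in body.splitlines():
        candidate, count = line.split(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    # A notoriously weak password; expect a very large count
    print(pwned_count("password123"))
```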

 

During recent NCA operational activity, the NCCU’s Mitigation@Scale team were able to identify a huge amount of potentially compromised credentials (emails and associated passwords) in a compromised cloud storage facility. Through analysis, it became clear that these credentials were an accumulation of breached datasets known and unknown. The fact that they had been placed on a UK business’s cloud storage facility by unknown criminal actors meant the credentials now existed in the public domain and could be accessed by other 3rd parties to commit further fraud or cyber offences.

National Crime Agency statement

 

The NCA haven’t revealed where these passwords came from or how they came to light (outside of their above statement).

 

Before today’s announcement, there were already 613 million passwords in the live Pwned Passwords service… so the NCA’s corpus represents a significant increase in size. Working in collaboration with the NCA, I imported and parsed out the data set against the existing passwords, I found 225,665,425 completely new instances out of a total set of 585,570,857. As such, this whole set (along with other sources I’d been accumulating since November last year) has all been rolled into a final version of the manually released Pwned Passwords data.

Troy Hunt – HIBP Founder

 

HIBP have also confirmed they’ve added a new ingestion pipeline which allows law enforcement agencies around the globe to mass upload compromised passwords, with agencies such as the FBI already availing themselves of the service.

Over 66% Of Welsh Councils Will Be Enhancing Their Telecare Services Soon

A report, commissioned by TEC Cymru and conducted by FarrPoint, has been released with in-depth information as to the telecare landscape of Wales.

Funded by the Welsh Government, TEC Cymru are responsible for supporting the shift to tech led care in Wales, with this report being a ‘state of current play’ for them.

 

A telecare service is a remote care service for citizens who are less physically mobile, providing reassurance, help and guidance via the phone or other assistive technologies like pendants or lifeline alarm units.

 

The research was conducted through a variety of data analysis and direct consultations and estimates there are currently over 77,000 telecare users in Wales, with over 91% of that figure aged over 65 and over 33% aged over 85.

However, the take-up of telecare varies across Wales, with services delivered by Wales’ twenty-two councils alongside numerous Housing Associations, and only 67% of respondents using 2025’s move from analogue to digital lines as an opportunity for digital transformation.

 

The report did highlight that the main driver for transformation across all Councils was a desire to improve and extend their services to a wider demographic, such as introducing proactive care or a full telehealth service with better interoperability between health and social care bodies.

 

As the switchover to digital approaches though, a lot of support and investment will be needed to move telecare services onto the new infrastructure.

Currently, only three out of twenty-two Welsh councils are using digital telecare tech, and only 19% have a plan for how to manage the transition.

90% of councils have stated they feel unsupported during the transition, with over 66% stating they have concerns around the migration.

 

Technology advances mean individuals can be supported in every stage of life. Telecare is an extremely positive way in which we can support the most vulnerable in society, allowing them to continue to live independent lives, often from the comfort of their own home. The digital switchover will be a hurdle for many of the telecare providers across Wales but it is one that they absolutely must overcome. As a result, more residents will be able to benefit from these types of services, providing the support and peace of mind to allow them to live independently in their own homes.

Richard Parkinson – Director, FarrPoint

 

This report provides the first analysis into the current state of the sector in Wales and will be crucial to reshaping services with the citizen in mind. It also highlights the disparity between the access, cost and type of services that citizens have depending upon where they live.  FarrPoint has previously done similar work in Scotland and England, which brought a valuable perspective on the rollout of digital telecare from across the UK. TEC has the potential to help huge numbers of people continue to live independently, and we’re committed to improving services across Wales, helping as many people as possible access the care they need.

Aaron Edwards – TEC Cymru

Life @ cloudThing As An Apprentice Power Platform Consultant

I’d had an eye on cloudThing for a while.

My background is in theatre and that’s where I became friends with cloudThing’s own Mr Performing Arts – Mike Chappell. He’d always said great things about cloudThing, so it was always on the radar.

 

When the pandemic hit, my job was at risk. To make light of a dark situation, I jokingly asked him if there were any jobs for people wanting to go into tech. Well, semi-jokingly. I just said that “if you have any places feel free to send them my way”.

I hadn’t really expected Mike to come back to me a couple of months later saying they were thinking of getting an apprentice.

 

This whole thing had started off as a joke, but I was committed to the bit now. It’s fine though; I’d done website design in the past, so I was relatively techie, just not to this extent. And I’m not one to back down from a bit, so I thought I’d give it a go if something came up… and then it just did!

So that’s how I found myself doing a Level 4 award in Business Analysis, and I have been brought on as a PowerPlatform Functional Consultant.

 

Why cloudThing, you ask? One of the many praises Mike would sing about it, having worked many previous jobs in similar roles, was that it’s one of the most down-to-earth and chill companies, so much so that it felt weird coming to cloudThing and finding the atmosphere so relaxed.

I’m happy to say that it’s held true so far; it’s been social and welcoming, in ways that some companies just wouldn’t trust you to be, and that’s what makes cloudThing so different.

For example, I was dragged into social events over Teams before I’d even started. I signed the contract and Mike was like, “Great you’re on the team now get in the call we’re playing games.”

I’ve been instructed that I must answer ‘yes’ when asked if I’m enjoying it here.

Kidding!

 

But it is going really well.

I have a fantastic support network around me which is made up of my mentor and my colleagues. I’ve just been added to a BA (Business Analyst) group chat which goes without saying is a font of advice and information I can draw from when needed.

I’m not going to sit here and tell you this is the best company because I don’t need to; they’ve got the awards to prove it!

I’m here on the cusp of a new career and doing an apprenticeship, so that’s the angle I’m looking at here. When this opportunity from cloudThing presented itself to me and I started seriously considering it, I was worried that I wouldn’t have any knowledge of Dynamics or PowerPlatform. The week that Mike said, “can you do it?” I sat there like “I don’t know IT, I’m so confused, I don’t get it…” but within a week I was fine.

 

I guess this is down to the very culture of cloudThing; it meant that just by being here and present within the environment you just pick up so much of this information by osmosis alone. I’ve learned more about the PowerPlatform in the 4 months of being here than I would ever have done otherwise, in terms of being given the space to learn things or asking someone for help and it not being frowned upon. There’s encouragement to learn new things and, as an apprentice, it’s always good to know that the people around you are happy to impart their knowledge. You’re a padawan.

And as a padawan, you can say without judgement “I don’t know this”, and you never get “Why don’t you know this?” in return. It’s always, “that’s fine. Go and learn it.” And I love that – it’s friendly, welcoming, open – all those lovely buzzwords.

Plus, this encouragement of learning new things has to be embedded into everything we do because the software’s always changing.

 

You’re not locked into anything once you do start either – I could easily pivot into any other role within the business if I truly desired. The mobility within the company is attractive for sure. However, from what I’ve picked up, PowerPlatform is the way forward, with Dynamics being used throughout.

Everything is moving towards digital, and so there’s no better time to get in at the ground level.

The thing about cloudThing as well is, they’re always looking for the next big thing – Build Future. It’s in the ethos in the company to look at what’s up and coming and to improve on the things that are already there.

 

This last bit should be read in a radio presenter voice; “If you’re going to pick a company for your IT apprenticeship, cloudThing is a spot-on choice.”

Life @ cloudThing As A Dynamics Functional Consultant

Jeewan Bahra thinks the social life of cloudThing makes us stand out from the rest

 

I was told how awesome cloudThing was by a (now) colleague, a sentence most cloudThingers can probably say.

I understood what cloudThing was doing with Dynamics and the fact they were (and are) pushing it to its limits and not using it as a typical CRM system. I wanted a place which wasn’t afraid to push boundaries and that’s what I found in cloudThing.

And the scope of the company was a massive temptation; the career growth here is phenomenal, and the technologies used are varied, there’s a massive range here. And no on-prem servers, I cannot stress this enough, there’s ZERO ON-PREM SERVERS.

 

My day-to-day can vary based on what stage of a project we’re in. Pre-project stuff is not really something I’m involved in unless specifically asked.

In the discovery phase of a project, I’ll work with key stakeholders to identify the business requirements that they want to migrate into dynamics, what their digital transformation plan is. The discovery phase also includes identifying how business requirements can map to Dynamics, which also involves identifying the most suitable technologies to use for the application.

It’s the place where we can see any limitations we might have and look at the existing technology that won’t be getting moved so the task is to find out where the integration points are. An example of this is that one of our current projects has an on-premises database that won’t be moving so we need to find a way to shift it to the cloud.

Next, we agree on scope, namely what can be delivered and whether it’s fixed price or Teams-as-a-Service. Then it’s composing the roadmap: agreeing all stories within the scope, confirming which technologies we’ll be using, the technological approach and acceptance criteria, setting up Azure DevOps, and leading the planning sessions.

 

My favourite bit – SWAGS! Or ‘Scientific Wild Ass Guesses’, which is the effort you think is going to be involved in the delivery of a story. We also make up an Entity Relationship Diagram (ERD) around this time.

That’s when we move on to the really juicy bit of the project – the delivery.

 

I work closely with the UX team to map out user journeys and how everything will be laid out. This includes screen mock-ups and presentations to the client to ensure that they’re happy with it, that it meets the acceptance criteria and, of course, any marks that they have to hit too.

In the project I’m on right now, it’s up to me to order those stories and make sure we don’t encounter any roadblocks.

 

So yeah, what can I say?

It’s awesome here – we use that word a lot, but I think it’s the most accurate. It’s the best organisation I’ve ever worked at because you’re treated like a grown-up. Not only that, but you’re respected as an individual and your views count – you could have a heated discussion with the directors, not that it happens, but you could. They know you’ll get the job done, so there’s no micromanagement and you can just crack on with no one breathing down your neck. It’s up to you how you want to work, basically. You’re trusted to do your hours and there are no seat warmers.

 

There are two facets to what makes this company so uniquely awesome, in my opinion. Industry-wise, we’re doing great stuff. We donate so much code to NonProfits, along with add-ons that are free of charge too – the amount of goodwill here is ridiculous. When those organisations thrive, it brings a real positive impact to the world, and knowing cloudThing was a part of that makes you so proud to be here.

 

We don’t just work with NonProfits though, in fact the areas we work in are all so different from each other it’s created a huge variety in the projects you could be working on. For example, we’re working in the Membership sector, and Central Government – I mean, who else would have been able to take a Dynamics CRM system and then apply it to a project with the DVSA that meets all of the policies and guidelines required for a Government portal? I think it just speaks of the innovation and the creativity of the people here.

The other facet is that, on a personal level, this is just a good and healthy environment to be in.

You’re around people who want the best, not just for themselves but for the environment around them. It’s promoted a culture of honesty and openness, where you kinda wanna be better because you’re surrounded by all these knowledgeable people. It’s not so much that you’re in competition with each other, but rather that you take inspiration from these people who are at the top of their field, and you want to keep working towards that for yourself.

I think that’s the main takeaway here when it comes to ‘Why cloudThing?’ Well, it’s the ethos and the ethics. You can go to anybody and they’re willing to help you; if you’re ever stuck for ideas or whatever it may be, there’s always at least one person who has been there, done that, and got the t-shirt to prove it.

As you can imagine, the day to day here is varied in all the best ways. There’s no time to sit and twiddle your thumbs which is good, I like it.

£116m Fund Announced By UK Gov. For Green Tech

The UK Government has just announced a £116m green ‘tech fund’ that will be distributed between direct air capture, greenhouse gas removal, SME tech innovations and business development support services, with the aim to boost job creation whilst also delivering on carbon net zero goals.

 

This £116m government investment will support businesses across the nation to turn their green ideas into reality and to develop ground-breaking projects that save energy, slash utility bills and tackle pollution. British businesses and entrepreneurs are already leading the world with innovative solutions to tackling climate change. This is not only good for the planet but will bring new jobs and investment across the UK.

Greg Hands – Energy and Climate Change Minister

 

The biggest part of the fund – £64m – will be spent on Direct Air Capture and Greenhouse Gas Removal technologies.

It’s hoped this funding will help attract further private investment, with interest in direct air capture and other forms of negative emissions tech growing.

A further £30m will be awarded through the Energy Entrepreneurs Fund to 58 SMEs to help deliver better energy efficiency, storage and clean power.

 

The final share of the £116m will see £22.8m go to business development support services for SMEs developing green tech. Technical support will also be offered through the Technical Third Party Support project, which offers expertise in tech coordination, social research, carbon control and storage, generation and distribution to key projects.

 

Virtual Events Generated Over £39m In Donations During 2021

Virtual fundraising events continued to grow during 2021 according to an updated report released by mass participation agency Massive, in conjunction with JustGiving.

The report continues the work done last year, collating information about virtual fundraising during a pandemic. Both reports combined information from JustGiving with data from 150 other virtual events that raised over £39m in 2021, whilst comparing the amounts to 2020 figures.

 

The report focused on peer-2-peer virtual fundraising activities, including campaigns from Diabetes UK, Breast Cancer Now, Dementia UK and Alzheimer’s Society.

 

  • 54% of the events for which data was collated were new for 2021
  • 29% of the events were a repeat from 2020
  • 17% were pivots of existing events that went virtual for 2021 (down from 38% during the pandemic).

 

Overall, the report concludes that virtual events raised more in 2021 than the previous year, with over half of those surveyed raising over £100k and the number of virtual events to raise over £1m doubling YoY.

 

“We’re seeing more people taking part in virtual events but not seeing any significant growth in levels of fundraising, so the growth we’re seeing appears to be driven by volume as opposed to value and we’ve seen a corresponding increase in marketing spend to drive that volume. In 2022 we expect to see continued success for both virtual and physical events. As ever the most successful campaigns will be the ones that adapt their offer to the changing expectations and attitudes of their audience to offer something new, regardless of whether that’s online, in real life or a mix of both.”

John Tasker, partner at Massive

 

“The report echoes what we’ve seen on JustGiving this year – virtual events are no longer simply the understudy for physical events and have unique value in attracting new and diverse audiences to good causes. This time next year we expect to see a continued growth in charities blending their events portfolios on JustGiving, with a balance of virtual and physical experiences developed to suit the needs of their supporters.”

Sally Falvey – Head of Retention Marketing, JustGiving

cloudThing’s powerUP For The Membership Sector Coming SOON

Membership organisations that are already using payment schedules and adjustment managers completely in-house by way of their CRM have reduced third-party processing costs, increased data input and processing efficiency and generally just streamlined all their processes.

They’ve achieved this by following a model of reliable and repeatable methods of data classification; that is, automated processing and cloud migration have made the data something that can be accessed and analysed by everyone with the right permissions.

Overall, their people have a reduced manual effort, less room for human error and increased collection of revenue on time, leaving more room for their members to get the most out of the organisation!

As this technology is already being implemented in membership organisations the world over – automated processing is nothing new – it means you’re not experimenting. You’re ensuring business continuity by #buildingFuture and creating a culture of resilience within your organisation.

The good news is that cloudThing is making this technology available for membership organisations to provide easy to implement solutions to all the common problems that the membership sector faces!

What can you do with the powerUps?

Well, basically everything! The automated processing schedule checks billing profiles for payments due, the direct integration with BACS means you can process payments, deliver responses and send automatic notifications of payments received, and it updates the billing profiles – all of which allows you to maintain that customer relationship.
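
To make that concrete, here’s a minimal sketch (in Python) of the kind of loop such an automated schedule runs. It’s illustrative only – the billing-profile records and the submit_to_bacs and notify_member hooks are hypothetical stand-ins, not cloudThing’s actual powerUp implementation.

```python
from datetime import date

# Hypothetical records standing in for CRM billing profiles.
billing_profiles = [
    {"member": "M-001", "amount_due": 120.00, "due_date": date(2022, 3, 1), "paid": False},
    {"member": "M-002", "amount_due": 95.50, "due_date": date(2022, 4, 1), "paid": False},
]

def run_payment_schedule(profiles, today, submit_to_bacs, notify_member):
    """Check each billing profile, collect any payment that's due and update the profile."""
    for profile in profiles:
        if not profile["paid"] and profile["due_date"] <= today:
            # Hand collection off to the payment provider (BACS in the article).
            reference = submit_to_bacs(profile["member"], profile["amount_due"])
            # Record the outcome and notify the member that payment was received.
            profile["paid"] = True
            profile["payment_reference"] = reference
            notify_member(profile["member"], profile["amount_due"], reference)

# Example wiring with stub integrations.
run_payment_schedule(
    billing_profiles,
    today=date(2022, 3, 15),
    submit_to_bacs=lambda member, amount: f"BACS-{member}",
    notify_member=lambda member, amount, ref: print(f"Notified {member}: £{amount} collected ({ref})"),
)
```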

Data Mill is part of cloudThing’s wider membership powerUp, which has been designed to provide immediate solutions to common problems that the membership sector faces. Sign up here to receive more information about our powerUps and their benefits.

NonProfit Sector Still Needs To Move Past ‘Tokenism’

Fundraisers have heard that much of the NonProfit sector is struggling to move beyond tokenistic gestures when it comes to highlighting the voices of marginalised groups and the communities it works with.

Jaden Osei-Bonsu, programme manager at the leadership development community interest company the Centre for Knowledge Equity, told delegates at the Chartered Institute of Fundraising’s annual convention that the sector needed to shift power to the communities it supported, rather than telling them how to solve their problems.

Speaking at the online convention during an event focusing on how to be an ally to marginalised groups, Osei-Bonsu called for larger charities to think about how they could work in genuine partnership with grassroots organisations which allowed them to lead programmes, rather than simply advising.

 

Historically with the charity sector, fundraising usually puts communities in a position where they are being researched or people are trying to tell them what is going to solve their problems –

Jaden Osei-Bonsu – Programme Manager, Centre for Knowledge Equity

 

Osei-Bonsu added that the conversation should be about shifting power to communities with direct experience of the issues being addressed, as the majority of the sector is struggling to move past tokenistic gestures.

Drawing on her experience in youth work, fellow panellist Yolanda Copes-Stepney, founder of Speak & Do, said that when engaging with marginalised communities, organisations needed to make a conscious effort to ask what results the communities wanted to see from the engagement.

She also said that, too often the young people she spoke to believed nothing would come from their involvement and that they would not be listened to.

She added that organisations and individuals need to remember that allyship and supporting marginalised groups is ‘going to be a constant process of learning’.

It’s about asking lots of questions, and never assuming anything for them.

 

Business Architecture Optimisation

Understand the value your organisation can gain through improving your digital Business Architecture with a Microsoft Gold accredited Partner

 

Business Architecture Optimisation Description

Contract out your Business Architecture to cloudThing.

Leverage our expertise and experience to identify opportunities to improve your digital architecture using best practice methodology following Lean 6-Sigma principles.

We’re a Microsoft Gold accredited Partner with demonstrable experience in the delivery of Digital Transformation programmes for enterprise organisations in multiple business sectors.


 

WHAT’S INCLUDED…

  • Business Architecture service for enterprise digital transformation in all sectors
  • Map your digital landscape, including staff, customer and supplier processes
  • Define your target business architecture, from ‘As Is’ to ‘To Be’
  • Objectively challenge the status quo, to refine & streamline processes
  • Engage through proven, interactive, collaborative discovery workshops with business stakeholders
  • Delivered using best practice Six Sigma principles & methodology
  • Technology agnostic – any current digital architecture can be assessed
  • Produce recommendations report to share with wider business stakeholders
  • Flexible engagement choices with easy access to additional related services
  • Gain knowledge transfer of SIPOC and Six Sigma techniques

Application Research & Development

Work with cloudThing’s business and design teams to identify opportunities in new and existing markets to disrupt and offer a new digital product. Clearly define a problem and solve it through a new digital service, validating the solution with real users before investment.

 

Design Thinking

Start-ups and existing organisations alike are always looking for new revenue streams by offering something unique to users through digital technology.

This is, of course, much easier said than done, and it requires a great deal of work prior to the first line of code even being written to do everything possible to ensure that your investment in a new service will be a success.

cloudThing’s design approach is to view any digital service as a ‘product’.

While products are often thought of as physical items, every service – even a SaaS digital product – should be approached in this manner.

The reason for this is to approach the ‘product’ as a user would: understand who would use this product, what they would expect from it, how they would use it with no prior experience of it, and what they would be likely to pay to take advantage of it.

 

This is the first stage of the cloudThing design thinking process which, although not totally linear, revolves around building clear personas for all users of the product and validating these stories with real users in workshops.

Defining this early helps ensure all parties, from management and developers to potential investors, have the same clearly defined objectives for the product build and a shared vision for the reason for the product to exist.

Extensive research into the potential product, the existing landscape and the competition is then undertaken by our design team. A combination of user interviews, contextual inquiries and wire-framing means that the design for your product is built with the user at its heart and there is no disconnect between the vision for what the product ‘should be’ and what it is.

In parallel with the design process runs the Business Architecture work, which is there to ensure the long-term success of the investment in the project.

We’re experienced at building products as a platform, which can allow for future integration with third parties, a roadmap for future features and a clear definition of what the MVP looks like.

We then interpret both the business architecture requirements and the wireframe designs into functional User Stories, user personas, scenarios, user journeys and a coherent information architecture that allows us to produce working prototypes we can iterate and refine with you.

The final step is to produce the Hi-Fidelity designs that bring it all to life.

This helps build a business case to investors and ensures the overall solution architecture for the product is tied closely to the initial design research and built for the future of the product.

Business Transformation

True transformation does not happen overnight, and neither should it. We are able to cut through the buzzwords and show how incremental improvements to processes and programmes will empower your people; putting all the pieces in place, allowing you to fail fast and early but win big, future-proofing your transformation project from the beginning!

 

Getting Started

Digital Transformation is a hugely overused term in the tech industry.

It’s been used to describe a huge variety of projects, with 80% of CIOs feeling under pressure to deliver ‘transformation’.

Having a common definition across the organisation is more important than ever.

Transformation is a fundamental change in something.

It is not something done faster or smarter but completely and utterly differently.

For a business, this all comes down to processes. If you wish to use technology to do what you do today, but faster, then you are not looking to transform – and that is absolutely fine.

Paramount to any successful project are clearly defined goals and outcomes, making sure the entire organisation understands the journey, the ambition and what is required to get there. Whether it’s transformation or just getting more efficient, spelling this out early is key.

From here, you can work backwards from your desired ‘target state’ to where you are today.

Map out your current processes, stakeholders, suppliers and technology. Only then can you identify what needs to change to reach your desired state.

You can then work incrementally to deliver this change in stages, meaning value is realised earlier and stakeholders see the benefits of the project.

 

Mitigate Risk By Having A Future Proof Strategy

Transformation is achieved through a mindset of Continuous Improvement across the organisation.

Without making changes and daring to innovate, organisations in any sector can get left behind.

Risk will always arise when any change is proposed and will be prevalent within a transformation project. However, we have discovered that by taking a modular approach an organisation can keep risk to a bare minimum yet maximise reward whilst continuously improving their IT services.

THE MODULAR (LEGO) APPROACH

Business Architecture allows us to take a modular approach to change by improving processes and methods before introducing software. This improves and streamlines the way we can engage, and by locking each improvement in as you go, you see the rewards early in the project.

Process Automation

Saving time, money and staff resource through well implemented Process Automation

 

Shrink Time, Close Distance

Fundamentally, cloudThing are an organisation who specialise in business and technical consulting to help an organisation shrink time and shorten distances through digital technology.

The technology changes; it could be a bespoke application, it could be by customising something off the shelf or it could be a data science process to surface new features from existing data.

What does not change however is the hard work that goes in prior to the delivery of the technology… A detailed business workshop to map out all processes across all users that make up the organisation, from suppliers, to staff, to customers.

 

From this, we can work out the ‘ideal state’ of your organisation and the ‘true state’ of how things are now – based on what management perceives the process to be and what the process is in practice.

We can then workshop with users to identify opportunities to reduce manual data entry, erase duplication of processes and spot areas to enhance processes by integrating data insights from other areas of the organisation.

A good way to imagine the change from manual to digital processes is the effort required to replicate sheet music: either reproduce it completely digitally, with minimal effort, via an automated recording, or enhance the performers with technology to make a live performance more compelling.

With business know-how and expertise we can advise your organisation on which approach makes the most sense and how achievable it is based on your budget.

Often, the technology to deliver unique experiences is already available off the shelf, but just requires a modern, cloud platform to integrate and deliver it.

Selecting The Right Technology

Once we have mapped processes, we can work with you to choose the approach to optimise and automate them.

Through exploratory analysis we will analyse the data relevant to the process and build a hypothesis about how it could be automated.

Our Solution Architects will then suggest the least resource intensive way of achieving this based on cost to integrate or develop the solution, time to train users and the cost associated with maintaining the solution.

We can then test the viability of the technology and the automation hypothesis through observation of users interacting with a prototype or wireframes.

 

The Effect Fintech Is Having On Our Everyday Lives

We may think of Fintech as a new concept but it’s been part of our lives for decades

 

Fintech, a portmanteau of Financial Technology, is a catch-all term for any tech software (or hardware) that’s been created with the goal of augmenting, streamlining, digitising, disrupting or generally making better any ‘traditional’ financial service.

Think of the level of fintech that layers our lives already, that didn’t exist just a few years ago.

Monzo completely disrupted the banking app market, with more and more users making the switch every month; nights out can easily be split by transferring money via Venmo; you can pay for groceries in a shop by scanning your phone with Apple Pay; and Ubers can be ordered with pre-stored debit cards paying for you automatically.

 

Fintech is becoming more and more ubiquitous every day, happening so subtly that many don’t stop to consider how wide ranging it’s even become.

However, as with many emerging technologies, the breath-taking scope of what fintech can make possible makes it an ambiguous subject to discuss in any kind of depth.

Fintech: A Definition

Fintech refers to any software, hardware, algorithm or application for computers or mobile devices that’s used to deliver or improve financial services. Fintech platforms can empower the most basic of functions, from moving money between accounts, paying bills or applying for a loan, right through to technically intricate concepts like the blockchains behind cryptocurrencies or peer-to-peer investment platforms.

From there, fintech can branch off into several ‘subcategories’ including, but not limited to:

 

  • Wealthtech – Wealthtech refers to the use of innovative tech such as AI or Big Data to empower investment firms.
  • Investtech – Similar to wealthtech, investtech helps users better invest their day-to-day funds and make the most out of the funds that are invested.
  • Insuretech – Insuretech is a commitment within the insurance sector to new, better and more innovative developments within its product offerings and back-office processes.

When Did Fintech ‘Start’?

Just because something has become a catchy buzzword recently, doesn’t mean it hasn’t also been around for a long time.

Whilst it might seem like a relatively new phrase – the term fintech was only added to the dictionary in 2018 – as a concept it’s actually been around for decades… think of the introduction of cash machines in the 70s. At the time, being able to withdraw your own money from a ‘hole-in-the-wall’, rather than a bank, was seen as bleeding-edge financial technology, no matter how mundane it seems today.

 

Fintech may once have had a bit of a ‘start-up’ reputation.

If you’d heard the word at all ten or fifteen years ago, it’s likely you’d have thought of Silicon Valley taking on established banking institutions; today, however, it’s become an integral part of many digital transformation processes.

How Does Fintech Affect Our Everyday Lives?

Without denigrating anyone, large financial institutions don’t exactly have a reputation for agile methods of working or continuous improvement.

Fintech is changing that though, as younger, more tech-savvy audiences demand more from the financial sector.

Immediacy, mobility, personalisation… these are just a few of the buzzwords that a modern audience expects in their everyday tech.

Fintech is helping to fulfil those goals by making technology instantaneous… it wasn’t so long ago that you’d have to go into a bank to apply for a loan or speak to someone on the phone; that decision can now be reached instantly on an app… thanks to fintech.

 

One of the main reasons for the growing popularity of fintech is its ability to bypass previously clunky processes, and it often does this by eliminating the need for human interaction and the possibility of human error. We’ve already mentioned you can apply for, be considered for and be approved/declined for a loan and then have the funds issued to your account… all without the need to interact with a fellow human, through the use of some clever RPA and AI. But you can also do the same for a mortgage (technically the same thing, albeit with a much bigger commitment).

It doesn’t end there though.

Want to invest in the stock market? Great, well you don’t need a broker anymore… just automate the process by downloading an app to your phone, invest in what you want, when you want, where you want.

In fact, investment platforms are a great example of how far fintech has come. In previous times, even if it were possible to buy or sell stocks directly, you’d still likely have relied on the advice of a qualified broker. Many modern investment apps, however, come with a ‘robo-advisor’, an AI-backed chatbot capable of analysing the market and offering advice as to where your money is best invested.

Some of these things may seem very simple, but there’s a lot of fintech working very hard in the background to make them all so accessible and streamlined.

How Secure Is Fintech?

One of the surprising things about fintech is the level of trust it engenders in end-users. A recent EY report highlighted that 68% of respondents showed a willingness to use financial tools developed by a non-traditional (i.e., non-financial) organisation over those developed by more traditional institutions, with 89% of SMEs happy to share their own and their clients’ data with fintech organisations.

What that shows is that financial apps, processes and tech don’t have to carry a hallmark of authority from Wall Street, The Bank of England or any other large, prestigious financial institution… they just have to perform their function well and make life easier for someone.

 

Whilst all that is true, it should also be pointed out that many fintech companies are almost entirely unregulated. A lot of people are currently investing in cryptocurrencies and the blockchain technology that backs them, but were anything to go wrong, investors would have very little comeback.

 

It may be that there is no one answer as to how safe fintech is; it has to be taken very much on a case-by-case basis. In certain instances, fintech can immeasurably improve the security of financial transactions, but newer – dare we say it, flashier – advances are still to be proven. That proof may be difficult to get given the immense proliferation of fintech in recent years.

What Does The Future Hold For Fintech?

One thing’s for sure… fintech isn’t going anywhere anytime soon.

Deloitte did note last year (2020) that the effect the pandemic has had on global economies has left industry analysts and insiders unsure about the immediate future of fintech. Many fintech organisations have suffered setbacks, whilst others have thrived and expanded… with demand for mature fintech solutions at an all-time high.

That means though that many, if not most organisations will be increasingly counting on fintech to help them navigate digital transformations in the future.

Longer term, it’s likely we’ll see more and more collaboration, consolidation and even acquisitions between legacy financial institutions and younger fintechs, with end-users continuing to see fintech penetrate further into their everyday lives.

Dealing With Ethical Walls In Tech… Ethically

How can technology solve modern ethical dilemmas faced by organisations?

 

cloudThing faces several ethical dilemmas/ethical walls regularly as a software and Power Platform developer, many of which will be familiar to other organisations, both in terms of procurement and in working with competing clients.

 

As cloudThing has grown over the years and acquired more expertise in particular areas of digital transformation, we’ve attracted more and more clients; so once we reached a certain size it was only to be expected that some of those clients might be in direct competition with each other.

It’s also meant we’ve bid on, and won, a lot of Public Sector procurement processes. However, as many of you are likely aware, public procurement processes come with their own unique ethical wall issues, which also need addressing.

 

What that all means, in software development or pretty much any other profession, is that an organisation needs to put mechanisms in place to protect the interests of all parties involved, be it the organisation itself, its members or its clients, whether that’s during a procurement process or after work has begun on a project.

 

As an example of a project that’s already underway, let’s say cloudThing are company A.

They’re doing work for companies B and C.

For cloudThing (company A) the ethical dilemma could be that the projects being worked on have similar goals, meaning the questions that will need answering are…

 

  • How can both companies’ data and intellectual property (IP) be protected effectively?
  • How can both companies be reassured to a sufficient level where they’re both prepared to remain clients?
  • Given the above two requirements, how can all ethical issues be resolved neatly?

 

Any organisation’s most pressing concern when working with a third party will always be the protection of their intellectual property, be it confidential data, internal and unique knowledge or even a code base.

Many of these things will be what gives an organisation its competitive edge, and if they’re going to be comfortable working with a third party, especially one that also works with a competitor, then they’ll want to feel secure in the knowledge that their IP is protected.

 

The approach used by most tech companies (cloudThing included) is a technique called a Chinese Wall (sometimes also referred to as an Ethical Wall), and it could easily be adapted and adopted by other organisations and sectors to improve their ethical and security-first reputation.

What Is An Ethical Wall?

An Ethical Wall is a concept employed by most tech companies when dealing with clients who do, or might, compete with each other. That ‘wall’ could also be considered an ‘IP firewall’, making sure that client data and IP is kept secure whilst held by your organisation.

It works by ensuring that different departments or teams that work on opposing client projects don’t have access to any sensitive data available to the other team… there’s a metaphorical ‘wall’ between them protecting each client’s data.

That approach of digitally (and sometimes even physically) walling off teams means an organisation can confidently have disparate clients share data with them without any risk of trade secrets or other sensitive data being accidentally revealed to a competitor.

In essence there is a complete moratorium on communication between teams relating to anything even tangentially connected to the projects they’re working on to ensure there’s never an inappropriate exchange of sensitive information, IP or even ideas.

Those ethical walls (both metaphorical and where possible, physical) also reduce the risk of any accidental leakage of sensitive data and IP from one client to another.

 

Making things more complicated, however, is Public Sector procurement, as there could well be another form of Ethical Wall you need to deal with… procurement ethical walls.

An ethical wall in procurement is a process to ensure absolute parity during the bidding process, ensuring all organisations bidding do so from an equal position and that no one supplier has an unfair advantage.

They most often occur when one supplier has been working with an organisation for a while when a new procurement opportunity arises. In those instances, an ethical wall needs to be set up between the existing service delivery teams and the people working on the new bid, to ensure the rules of competition and fair trade are maintained.

 

Fortunately, the solution for both is the same…

How To Create An Ethical Wall

When cloudThing set up an ethical wall we typically follow several standard practices:

 

  • Physical separation first: Whilst this may not always be possible in a single office environment, the first step in setting up an ethical wall should always be to create physical space between teams. Different sites are the ideal solution to put a client’s mind at ease but, if that isn’t feasible, then certainly different rooms or, in the ‘new normal’, having the various teams work remotely from home.
  • Dedicated Resources: When working on a high-security project or procurement process (one that requires an ethical wall) cloudThing will always deploy additional data-protection governance such as dedicated virtual machines and environments only accessible to those working on the project. Those privacy measures ensure other teams can’t access data they shouldn’t, either deliberately or accidentally.
  • Good Governance: cloudThing have discussed the importance of good governance before; it’s at the heart of everything we do, but it’s also at the heart of how our staff conduct themselves. You can take all the physical and digital steps you want to ensure strong ethical walls, but you need to bring the hearts and minds of your staff along at the same time or all of those efforts will be for nothing. Staff need to know that if they are assigned to competing projects they must avoid all communication about those projects… in that respect, good governance actually needs to come first in priority. It covers more than that, though. HR staff and resource controllers need to be aware of the importance of ethical walls as well, with appropriate governance in place to ensure staff aren’t pulled between projects due to staff or knowledge shortages.

 

The biggest priority to any ethical wall is maintaining complete openness and transparency with all stakeholders, reassuring them they have nothing to worry about.

Using Tech To Solve Ethical Walls

Fortunately, technology makes the creation, maintenance, and monitoring of ethical walls a lot easier than it used to be.

 

With just a little bit of work simple communication tools like Microsoft Teams can become powerful tools for data protection.

Administrators can set up who can message who, blocking staff on competing teams from communicating even if they wanted to. Policies can be implemented that limit the types of files being shared, or even be extended to block screen sharing and video if an organisation felt a full communication blackout was unnecessary.

Policies could also be created within Microsoft Teams so that members of one team couldn’t share files or data relating to certain clients with non-team members. You could even go so far as limiting staff to only receiving data, unable to upload sensitive information to OneDrive, SharePoint or Teams.
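
To make the idea concrete, here’s a minimal illustrative sketch in plain Python (deliberately not tied to the Teams or Microsoft Graph APIs) of the rule such policies enforce: users assigned to projects on opposite sides of an ethical wall simply can’t exchange messages or files. All names and team assignments below are hypothetical.

```python
# Illustrative ethical wall check; every name here is made up.
TEAM_ASSIGNMENTS = {
    "alice": "client_b_project",
    "bob": "client_c_project",
    "carol": "internal_ops",
}

# Pairs of teams separated by an ethical wall (i.e. competing clients).
WALLED_PAIRS = {frozenset({"client_b_project", "client_c_project"})}

def can_communicate(sender: str, recipient: str) -> bool:
    """Return False if the two users sit on opposite sides of an ethical wall."""
    teams = frozenset({TEAM_ASSIGNMENTS.get(sender), TEAM_ASSIGNMENTS.get(recipient)})
    return teams not in WALLED_PAIRS

def share_file(sender: str, recipient: str, filename: str) -> str:
    if not can_communicate(sender, recipient):
        return f"Blocked: {sender} and {recipient} are separated by an ethical wall."
    return f"{filename} shared from {sender} to {recipient}."

print(share_file("alice", "bob", "design-spec.docx"))    # Blocked
print(share_file("alice", "carol", "design-spec.docx"))  # Shared
```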

The list goes on…

 

Life @ cloudThing As A DevSecOps Engineer

Lynne Read-Langan reveals all about the awesomeness of cloudThing

 

Why did I pick cloudThing to work at as a DevSecOps Engineer?

Well, my story is probably a little different to other cloudThingers.

I had an IT career before joining, but for multiple reasons needed a change of career so I had to leave it for a while.

However, like all things we’re passionate about, I wanted to come back, and when I did start looking for places, I wanted to join a company that would allow me to re-skill.

The thing that I was up against was that I had the knowledge, but the skill set was out of date. It just wasn’t happening!

Basically, what I was looking for was a company that would allow me to reskill, but most organisations wouldn’t look at me because my skills were too out of date. There were a lot of doors closed in my face. Like the old tale of Mary and Joseph, there wasn’t any room at the inn for an IT specialist with skills that were ten years out of date.

I’d already been told how great cloudThing was, its reputation is well known. This was further solidified by the fact that after my interview I was given an opportunity to re-skill. It was a win-win; I get to work at an awesome company, I have an up-to-date skillset, and I have peace of mind that I am re-joining the industry I’d been in love with since I was a young girl.

The learning opportunities don’t stop after you join, either. You’re learning something new every day. We work with so many different people from so many different organisations that each day comes with a new set of tasks and obstacles that I’m more than ready to blast through.

Yes, you’re given independence as a worker and there isn’t anyone breathing down your neck to meet your deadlines; we’ve been hired to get the job done and we’re trusted to do that. But that doesn’t mean you’re left on your own with anything. The culture here is supportive and if you need help, you’ll get it. If there’s something you want to learn more about, there’s someone more than willing to teach you. They love what they do here, and when someone shows an interest in it you see that passion spill out of them.

If there’s something you’ve never done before then cloudThing are prepared to let you experiment and see if you like it. The thing that matters is that culture of learning and trying things out, it’s very much a company built on the ethos of, “if you’re not failing then you’re not innovating enough.”

Working in DevSecOps means I’m not just limited to working with specific customers or projects. I could be helping anybody. There isn’t a normal working day at all. Of course there are specific projects I’m working on with the relevant teams, but DevSecOps also provides the support for cloudThing itself. If someone is having an IT issue and can’t complete their task, then being here and fixing it means they’ve been able to get on with their day. This morning, for example, I’ve been sorting out some issues for various staff alongside working on some of my project tasks.

So, you never know what you’re going to get, really. Which I like, because it’s a rewarding challenge where, when you log off at 5pm, you can say, “I’ve achieved something.” You were vital. For me, the rewards of doing this job are phenomenal. You get so much out of it. If I’m doing my job well, I’m contributing to cloudThing and helping them do their job well.

But there’s no egos here.

We’re not just bodies who sit in a chair and crank out deadlines; we matter. We’re all important and it doesn’t feel glib when you hear that from management. I think that’s what allows us to work hard and have a giggle at the same time. We take the work seriously but it’s also something we love, so it’s not like you have to separate the two. You get your work done but you have fun doing it, and so there’s always someone you can reach out to and have a conversation with.

I touched on it earlier, but this is always what I’ve wanted to do right from when I was young.

I was encouraged that I could do what I wanted to do. I had a brother who worked in IT and I made him drag me along during summer holidays as a youngster. That drive for technology, it’s always been there.

So, I would love for there to be more female DevSecOps engineers. And at cloudThing, they don’t put you in a box, which I highly value. It’s never a source of contention that I’m a woman, as it would’ve been in less enlightened times.

Working as a woman in tech I can see that attitude is changing. When I first started there were very few women – on my university course I was the only woman. It was heavily male oriented, but it IS changing even though it is still male dominated.

We need to be getting more women interested in STEM, and at supportive organisations such as cloudThing that push comes from the top. The whole ethos of the organisation is like this, not just pockets of it.

IOT: Dragging The Future Of Healthcare Into Today

IoT has time and time again proven the most efficient solution to specific pain points facing healthcare institutions

 

The last few years have seen a fundamental shift in how technology is used within the healthcare sector.

Technology makes healthcare better. No one needs convincing of that. It makes it more efficient, it makes it more cost-effective and it empowers better patient outcomes – and the IoT (Internet of Things) has played, and is playing, a huge role in that trend.

What Is The Internet of Things?

The Internet of Things, often referred to as IoT, is any network of internet-connected devices that are capable of sending and receiving data to each other.

IoT tech can be literally anything, from hospital wristbands monitoring a patient’s pulse right through to a doctor’s mobile device receiving the data. IoT tech empowers much faster, much more accurate and much more efficient data collection in healthcare (and other sectors), collecting and collating live data and allowing healthcare professionals to view it in real time.

Implemented correctly IoT has the potential to quickly elevate healthcare software to the next level.

IoT In Healthcare

Before IoT became available in healthcare, patient interactions with healthcare professionals were limited to face-to-face meetings (with occasional telephone or text contact). There was literally no way for a doctor in a hospital to access live data about a patient without physically standing over them.

IoT enabled devices have changed that, revolutionised that in fact.

They make the remote monitoring of patients not only possible but easy, be it in a hospital setting or in the patient’s own home. That kind of access massively improves patient satisfaction levels, reduces the length of hospital stays, frees up beds and also helps reduce re-admission numbers.

Deploying & Optimising IoT In Healthcare

IoT opens up a swathe of opportunities for the healthcare sector but… it will produce a lot of data. That data needs collating, storing and analysing in a low-cost but efficient manner… which is why cloudThing always recommends a four-step implementation process (illustrated with a short sketch after the list)…

 

  • Deployment of Devices: The first step needs to be the actual deployment of devices that will make up the IoT network… the sensors, actuators, cameras, monitors etc. Anything capable of capturing data and feeding it back to a central source.
  • Converting the Data: Once collected that data will need to be converted into a useable digital format so that healthcare professionals can access it on their own IoT enabled devices but also so that BI software can recognise it to analyse.
  • Storing the Data: Once it’s been collated and digitised, that data needs to be stored securely somewhere like the cloud (or, to get specific, Microsoft’s Dataverse).
  • Analyse the Data: The final step is to use advanced AI, ML and other analytics to better understand the data, draw connections and parallels and use it to improve overall patient care with actionable insights.
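
As a rough illustration of those four steps – a sketch only, with a made-up device name, reading and thresholds rather than a real deployment against physical sensors or the Dataverse – the flow from raw reading to actionable insight might look something like this:

```python
import json
import statistics
from datetime import datetime, timezone

# 1. Deployment of Devices: in reality this is a sensor on the ward;
#    here a stub stands in for a pulse reading arriving from a wristband.
def read_sensor(device_id):
    return {"device_id": device_id, "pulse_bpm": 88, "ts": datetime.now(timezone.utc).isoformat()}

# 2. Converting the Data: normalise the raw reading into a common digital format (JSON here).
def to_record(raw):
    return json.dumps({"patient_device": raw["device_id"], "metric": "pulse_bpm",
                       "value": raw["pulse_bpm"], "recorded_at": raw["ts"]})

# 3. Storing the Data: append to a store (a plain list here; the article suggests the cloud/Dataverse).
store = []

# 4. Analyse the Data: a trivial check that flags readings outside a normal range.
def analyse(records, low=50, high=120):
    readings = [json.loads(r) for r in records]
    print(f"mean pulse: {statistics.mean(r['value'] for r in readings):.1f} bpm")
    return [r for r in readings if not low <= r["value"] <= high]

store.append(to_record(read_sensor("wristband-42")))
print(analyse(store))  # [] – no alerts for an in-range reading
```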

Different Uses Of IoT In Healthcare

It’s not just the patient experience that can be massively improved by the adoption of the Internet of Things within healthcare; there literally isn’t an aspect of the sector that can’t be empowered and transformed by it.

IoT For Improved Patient Monitoring: We’ve already discussed this so won’t belabour the point but wearable tech is going to revolutionise the healthcare sector in the coming years in ways we can’t even dream of at the moment. Fitness bands, wireless blood pressure monitors, heart monitoring cuffs, glucometer measuring devices… these are all going to give patients an unparalleled level of personalised care. It’s not a one-way process either. Devices don’t just have to collect and send data, they can also receive instructions and notifications for patients, reminding a diabetic to check insulin levels, sending appointment reminders, notifications of blood pressure variations… the list is almost endless.

 

IoT can Improve Intelligence for Doctors: Once patients start adopting wearable tech as the standard, the level of intelligence as to their welfare available to doctors is limitless.

A patient’s adherence to a drug regime or care plan can be monitored remotely, with real-time notifications being sent to both parties if milestones are missed. IoT allows a healthcare professional to be a lot more watchful and responsive to a patient’s needs, allowing them to be proactive rather than reactive.

 

IoT for the Management of Equipment: Putting aside the monitoring of patients’ health, IoT can be deployed a lot more extensively in hospital settings. Any IoT-enabled medical device can be tracked in real time, both to hunt down lost equipment and to complete time and motion studies that improve the allocation of resources. The same techniques can be applied to medical staff to analyse and improve their efficiency around the hospital.

Another major use for IoT devices could be to stop the spread of infections within hospital settings. Track & Trace has already been demonstrated as successful in reducing the spread of COVID so it isn’t a huge jump to IoT enabled hygiene monitoring devices or even all devices being monitored by an AI to pinpoint the sources of infectious outbreaks.

Benefits To IoT In Healthcare

A lot of very specific benefits to IoT in healthcare have been mentioned above but the Internet of Things also has a wide-ranging effect across the entire sector, namely:

INCREASED EFFICIENCY

Even the most state of the art hospitals, and the most expert of doctors can be improved through the adoption of IoT.

Long waiting times for appointments, long waiting times in GP surgeries or hospital waiting rooms (increasing the spread of airborne illnesses), inadequate or non-existent data collection (or, worse, data that is collected not being used for improvement)… all of these things can be improved with the Internet of Things.

REDUCTION IN MEDICAL ERRORS

Unfortunately, every year there are thousands of deaths worldwide caused by preventable medical mistakes.

That’s not to insult the hardworking medical professionals working in the healthcare sector, but sometimes human error does occur. Thanks to its real-time application, IoT can help to prevent some of those errors or mitigate them when they do occur. A healthcare worker can immediately identify any patient wearing an IoT-connected wristband, preventing mix-ups in drug administration. Allergies can be checked before drugs are administered or food is served. Machines are also far less prone to errors when recording data, so mislabelling and incorrect collection become far less common.

EFFECTIVE SUPPLY CHAIN MANAGEMENT

A secure supply chain is essential for the vital medicines and equipment hospitals and surgeries use on a daily basis. Counterfeits entering that supply chain can cause major issues down the line for effective patient care and in some instances even risk lives.

IoT sensors can be added to drugs and medical equipment directly at the point of manufacture to provide live feedback on their progress to hospitals, surgeries and pharmacies.

Even if counterfeiting isn’t seen as an issue, IoT can still improve supply chain management by allowing for consistent location tracking, allowing the entire process to be streamlined with careful analysis of the data.

REDUCING OPERATIONAL COSTS

The healthcare sector is expensive, with a lot of expense that just can’t be avoided. That means Ops Directors and others need to be more creative to reduce operational costs without impacting the level of care they can provide. An easy way to achieve that is through an increase in efficiency with IoT.
IoT automation is orders of magnitude faster at collating, compiling and disseminating data than a human could ever be, and less manual work for staff means lower overhead costs.

EFFECTIVE DISEASE MANAGEMENT

Chronic health conditions like heart disease, cancer and arthritis affect millions every year.

However, for those suffering with a chronic condition, medical care isn’t an occasional trip to the doctors but a daily struggle. IoT monitoring devices can keep track of a patient going about their regular routine, allowing for a much more ‘normal’ everyday life. That real time monitoring also allows for immediate action from a doctor should it become necessary.

IMPROVED MEDICAL CARE IN REMOTE LOCATIONS

Many remote locations struggle with access to doctors, nurses, hospitals, pharmacies and other resources that many may take for granted.

Whilst IoT can’t solve all the associated problems related to that, wearable tech and improved communication between doctors and patients can solve a lot of problems where face to face meetings are impractical.

INCREASED ACCURACY IN RESEARCH

No advance can be made in healthcare without accurate research, but collecting data from a statistically relevant test pool has always presented difficulties.

IoT can streamline the collection of data in medical research projects whilst also allowing researchers to focus on more important tasks.

IoT is capable of collecting huge swathes of data in a fraction of the time previously possible.

 

IoT can’t solve every problem facing the healthcare sector and there are still some obstacles that need to be overcome before IoT can be adopted by all hospitals (for instance the cost of making all devices IoT enabled).

However, IoT has time and time again proven the most efficient solution to specific pain points facing healthcare institutions and will continue to do so as the tech powering it becomes more mainstream.

Life @ cloudThing As A Functional Consultant

MVP Sheryl Netley discusses the social and business aspects of why she loves working at cloudThing

 

I chose cloudThing because it has a reputation for technical excellence and having come across a few of my new colleagues during community events, I knew they were exactly the sort of people I’d want to spend my days with. For example, I met Mike Chappell, Jason Gardner and Rob Meehan during a number of hackathons, and I’ve known Will Dorrington for a while, having first met him at the D365 Saturday Glasgow almost three years ago.

The personality of cloudThing really drew me in, and now I can’t see me ever being anywhere else.

As soon as I joined, I saw that supporting each other was built into the DNA of the company. We genuinely look after each other and we’ve put things in place to foster that behaviour.

The culture of cloudThing is totally authentic, and at the heart of it all is the principle of ‘doing the right thing’, so I never feel like my values are going to be compromised. It’s my safe space as well as a place of positive social impact.

That ethos has been entrenched in the foundations of the company right from the start, and the founders are fully invested in making sure that we do the right thing at the right time. And for this reason, we have a really solid grounding.

Because of its strong foundation, the company is not afraid of a challenge, and we are actively encouraged to speak up if we feel something isn’t right.

Their sociability and commitment to the wellbeing of their people is not just box-ticking, and this is shown in so many ways. When you join, you’re given a buddy and you’re introduced to teams made up of people who you don’t work with every day; we have a number of different clubs and a chatbot that encourages us to get to know colleagues we’ve not met yet. Our working relationships are nurtured to the point where I really look forward to opening my laptop every morning because I’m genuinely excited to see my colleagues.

But, okay.

You might not be considering cloudThing for the social aspect.

There’s huge emphasis on development and growth steeped into the ethos, too. There are lots of knowledge and learning opportunities which are built into the working week, in a way that I’ve never experienced before.

Friday is set aside as learning day. We get together to go through new technologies and all the cool stuff people have done which is awesome, and it’s also a chance to meet people outside of immediate projects and build stronger relationships with other people in the company.

On Thursdays the chatbot kicks in, and we get given an icebreaker in Teams that pairs you up with someone and gives you prompts to start a conversation. So, despite being remote, you still get a chance to meet lots of new people in the company.

And talking of Teams, there’s always something going on in there, so I always make time to check it, particularly the ‘riddle’ channel which often throws out some strangely philosophical questions.

I can’t stress the social aspect enough; it’s something I highly value, and I enjoy spending my day chatting to people I like and doing stuff I love doing. It’s the dream.

Okay, for REAL I’ll mention what my day to day is like.

We work in an Agile way, so there are normally stand-ups or workshops thrown into the mix. Depending on where in the project cycle we are, a day could also involve documentation, designing solutions, team catch-ups and learning opportunities…

cloudThing are very supportive of my personal goals within the Microsoft Community as well, and I was recently awarded my first Microsoft MVP Award.

If you don’t know, Microsoft MVP stands for “Microsoft Most Valuable Professional” and it’s all about contributing to the Microsoft Community.

I do a lot of things quietly within the community. I’m quite reserved as I like to be the support, I live by the motto “#werisebyliftingothers”, as I love to see people truly come into their own and accomplish all they can.

Saying that, I’m not some silent and shadowy figure lurking around in the background. I’ve spoken at a few conferences and presented and volunteered at a few different events. Scottish Summit is the main one for me and is a first love as far as the community is concerned.

Scottish Summit was the first thing I ever attended, and it had a lot to do with me becoming involved with the community.

I’m speaking at the South Coast Summit next week which is the largest in person Microsoft event in the UK this year. My session is called ‘One Small Step’ and its purpose is to encourage more people to take that first leap of faith and join in!

I’m a mentor for Wentors, an organisation which provides a platform geared towards giving young women in the Technology fields access to ‘wentors’ (female mentors) from all walks of the industry, and I’m involved in a community group called TechStylers which supports women in Tech and am starting up a new UG called 3 Shires 365 to serve rural communities.

On top of all this I have a blog and a website.

So, I’m fairly busy within the community!

I was asked recently how it feels to be a woman working in tech, and my main answer to that would be that it’s a lot easier than it used to be.

I’ve worked in tech for nearly 30 years, and there are more women in tech now than there have ever been, and I think it’s a really exciting time to be entering the STEM world as a woman.

cloudThing are hopeful about closing the disparity and are making a conscious effort to address the gender skew in the industry. In fact, I haven’t really thought about it at cloudThing; it’s never been an issue.

If you’re a woman and you’re thinking of going into tech – cloudThing is a great place to be, regardless of where you are in your career. Actually, another one of my mantras is #NeverTooLate.

The environment here that the founders and the entire team have created is awesome. For me, it’s a welcoming place full of smart, supportive people, doing the right thing. Basically, I like working for cloudThing so much because it’s full of cloudThingers.

Gone are the days when I don’t want to log on in the morning – since coming to cloudThing I look forward to the exciting challenge of each new day. And I also have no problem closing my machine down when that day is done. You feel so satisfied with the value you’ve given, and you can walk away from the machine.

cloudThing keeps the magic alive.

How Will Tech Revolutionise Health Care Over The Next Half Century?

Many of these medical technologies are in their infancy but all hold the power to revolutionise the health sector

 

Well… we say half century but a lot of the health technologies we’ll be discussing today are happening right now.

What may surprise you is that, over the coming years, these technologies are going to become even more advanced but also utterly commonplace, to the point where they’ll be familiar to even the youngest of children.

 

In years past the tech sector may have looked to science fiction shows for inspiration but the revolution that’s occurring in health technology today, right now in fact, is often beyond the dreams of even the most ‘out there’ of sci-fi writers.

Digital healthcare technologies such as AI, VR & AR, 3D printing, robotics, genome sequencing and nanotech are improving the lives of millions around the globe already so it’s important that people in both the health and tech sectors familiarise themselves with the possibilities of the coming years so that they can play an active part in the revolution currently reforging the health sector.

 

There’s no doubt the future of the health sector lies in working ever closer with technology, meaning health workers need to allow themselves to be empowered by the possible to remain relevant. Robots aren’t going to replace doctors; AI won’t dictate patient care; genetic testing isn’t creepy.

All the fake news that floats around about technology in the health sector is just that, either fake, misunderstood or misleading.

 

Technology isn’t scary, at least, it doesn’t have to be.

It’s there to make patients’ lives easier by supporting healthcare professionals, giving them better tools to care for people.

Within the healthcare system, digital technologies could help transform unsustainable healthcare models into sustainable ones and provide cheaper, faster and more efficient solutions to problems afflicting millions. But for that to happen, healthcare professionals need to understand the science of what’s possible and what’s coming next, and be confident enough in that knowledge to feel empowered rather than threatened by it.

Cloud Computing For HealthCare

Cloud computing through companies like Microsoft is becoming more and more common in healthcare as it facilitates interoperability so well.

It reduces overheads and operational costs, provides demonstrably better services and improves general processes, making them faster and more efficient.

Patient records can be stored safely within the cloud and, if something like the Dataverse is used across all departments, all healthcare professionals (with suitable access levels set) would be able to access those records as needed – no matter what department (think health and social care organisations) or even device, meaning they could access them on the move if needed.

It’s also one of the safest ways to store data, with a tremendous amount of security and automatic backups to prevent any losses.

 

Over the next few years, as department after department goes through a digital transformation, it will become commonplace for all data to be stored in the same format (think Microsoft’s Dataverse), meaning disparate departments and organisations can all easily access it to improve interoperability.
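
To make the interoperability point a little more concrete, here’s a minimal sketch of how one department’s app might read a shared record set from Dataverse over its Web API. The environment URL, table name and columns are hypothetical, and we assume an OAuth token has already been acquired (for example via Azure AD/MSAL); Dataverse’s own security roles then decide which rows come back.

```python
# Minimal sketch: reading a shared record set from Dataverse over its Web API.
# Assumes an already-acquired OAuth bearer token and a hypothetical custom
# table called 'cr123_patientrecords' (illustrative only, not a real schema).
import requests

ORG_URL = "https://yourorg.api.crm.dynamics.com"   # hypothetical environment URL
TOKEN = "<oauth-access-token>"                     # acquired via Azure AD / MSAL

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# Dataverse enforces security roles and access levels server-side, so this call
# only returns rows the signed-in user (or app registration) is allowed to see.
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/cr123_patientrecords",
    headers=headers,
    params={"$select": "cr123_name,cr123_lastreviewed", "$top": "10"},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json().get("value", []):
    print(record["cr123_name"], record.get("cr123_lastreviewed"))
```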

Artificial Intelligence

AI has the potential, if adopted in the right way, to completely transform the healthcare sector.

A well-designed AI algorithm would be capable of data-mining patient records, looking for common trends and past behaviours and then creating a treatment plan that would just need approving by a medical professional, all in a fraction of the time it would usually take.
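
As a rough illustration of that pattern (and nothing more), the sketch below suggests a draft treatment plan by finding the most similar historical case and then holds it for clinician sign-off. The features, data and plan names are entirely made up; a real system would use far richer records and proper clinical validation.

```python
# Illustrative sketch only: suggest a draft treatment plan from similar past
# cases, then hold it for clinician approval. Data and features are made up.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Synthetic historical records: [age, systolic_bp, hba1c] plus the plan used.
history = np.array([[54, 145, 7.9], [61, 150, 8.4], [47, 130, 6.1], [70, 160, 9.0]])
plans = ["Plan A", "Plan B", "Plan C", "Plan B"]

model = NearestNeighbors(n_neighbors=1).fit(history)

def suggest_plan(patient_features):
    """Return a draft plan based on the most similar historical case."""
    _, idx = model.kneighbors([patient_features])
    return plans[idx[0][0]]

draft = suggest_plan([58, 148, 8.1])
print(f"Draft plan: {draft} (requires sign-off by a clinician before use)")
```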

It goes much further than that though.

Google’s DeepMind created an AI for breast cancer analysis that outperformed human radiologists at detecting breast cancer by 11.5%.

Just think what might be possible in the years to come!

TeleHealth

The healthcare sector was forced to adopt new remote strategies during the COVID pandemic – technologies that allowed healthcare professionals to provide essential services, patient care and diagnoses when not directly present with the patient.

One huge example of this would be Microsoft Teams.

Rather than going back to the way things were however, these remote technologies are becoming the ‘new normal’.

Going forward it will become more and more commonplace to see doctors remotely in the comfort of your own home. No more long queues at the hospital, no more picking up colds in a GP’s waiting room. Instead, for most minor symptoms (and some larger ones) you’ll be able to speak to a healthcare professional through convenient apps, either on a mobile device or your laptop.

Wearable Health Tech

Alongside telehealth, another big trend that’s likely to come to the forefront in coming years will be wearable tech capable of monitoring an individual’s health.

Wearable technology for healthcare goes hand in hand with the goal of empowering patients, allowing a person to truly take control of their own health and treatment.

This tech already exists in the form of pacemakers remotely sending back feedback through a Wi-Fi connection, but imagine if that process was taken to its logical conclusion and technologies such as Fitbit watches could inform doctors (or, more likely, an AI) of a patient’s live health, alerting both them and, if necessary, the emergency services should something go wrong.
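
For the curious, a toy version of that alerting loop might look something like the sketch below: check each incoming wearable reading against thresholds and notify the right people. The thresholds are arbitrary examples and the notify functions are placeholders, not a real device feed or NHS API.

```python
# Minimal sketch of the alerting loop described above: check incoming wearable
# readings against example thresholds and notify a care team. The notify
# functions are placeholders, not a real integration.
from dataclasses import dataclass

@dataclass
class Reading:
    patient_id: str
    heart_rate: int      # beats per minute
    spo2: float          # blood oxygen saturation, %

def notify_care_team(reading, message):
    print(f"[care team] {reading.patient_id}: {message}")

def notify_emergency_services(reading, message):
    print(f"[999] {reading.patient_id}: {message}")

def triage(reading: Reading):
    # Purely illustrative thresholds, not clinical guidance.
    if reading.spo2 < 88 or reading.heart_rate > 180:
        notify_emergency_services(reading, "critical reading, dispatch requested")
    elif reading.spo2 < 92 or not 40 <= reading.heart_rate <= 130:
        notify_care_team(reading, "abnormal reading, please review")

triage(Reading("patient-042", heart_rate=135, spo2=93.5))
```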

Star Trek’s Medical Tricorders

That’s the dream of every medical professional, isn’t it?

An all knowing, all encompassing medical device that can just be waved over a patient, measure their vitals and instantly diagnose what’s wrong with them.

Although we’re not quite there yet, we’re also not as far away as you might think…

Multiple companies are working on handheld devices capable of monitoring ECG, respiratory rate, heart rate, oxygen saturation, temperature, blood pressure and more. Many of these devices even come fitted with cameras to enable telemedical usage, allowing patients to use them themselves in their own home and relay the results back to their caregivers.

It won’t be long before we start seeing health professionals out in the field (and patients in their own home) with powerful, microscope like devices attached to smart phones, capable of performing all the above functions whilst also analysing swab samples, skin lesions and even abnormalities in a person’s DNA or anomalies in their antibodies.

With the use of an electronic nose or ultrasonic probe connected to a smart phone, physicians will be able to diagnose patients more accurately than ever or patients will be able to do it themselves with the results automatically being uploaded to their medical files.

Virtual Reality

Virtual Reality (VR) isn’t new – it’s been around since the ’80s, almost exclusively for the gaming community. It’s only very recently, however, that it’s actually started to become any ‘good’.

What does VR have to do with the health sector you ask?

Imagine if surgeons didn’t have to wait for a patient to fall ill before getting to practise complex procedures, but instead could learn how to perform them, over and over again, in a lifelike virtual simulation.

That’s happening right now.

A study conducted by Harvard Business Review concluded a VR-trained surgeon showed a 230% increase in overall performance when compared to a surgeon who had only been ‘classically’ trained. Not only did their performance go up, they were also measurably faster and more accurate.

It’s not just doctors benefiting from the adoption of this new wave of VR though. VR headsets showing soothing landscapes to women in labour have been shown to be effective in pain management. Patients suffering with gastrointestinal, cardiac, neurological or post-surgical pain have also reported a decline in their pain levels when using VR to distract them from painful stimuli.

Augmented Reality

Hot on the heels of Virtual reality comes Augmented Reality (AR). It differs from VR in a number of small but important ways. Users of AR don’t ‘lose touch’ with reality as they do with VR, instead information, data or graphics are put in front of someone’s eyes as an overlay over the real world.

Whilst AR is still in its infancy with the health sector, its possibilities are enormous.

Much like VR, AR allows medical professionals to better prepare for real life situations. Many medical students now make use of Microsoft’s HoloLens to use the HoloAnatomy app. This allows students to study incredibly detailed depictions of the human body, complete with labels floating mid-air over the relevant parts.

 

As these technologies become more refined, they’ll become much more ubiquitous within the health sector as a whole.

3-D Printing

From the virtual to the real… of everything discussed in this article, 3-D printing is most likely to turn the healthcare sector upside down.

Technicians, rather than waiting for donor parts, can already 3-D print bio tissue, artificial limbs, blood vessels and more… and that list will only get longer as the technology becomes more advanced and mainstream.

Nanotech

From the macro, we move on to the micro. Whilst many of the other technologies mentioned in this article could be considered already well established, the health sector really is in the early infancy of the nanomedicine age.

That being said, it won’t be long before nanodevices and nanoparticles – even nano-scale doctors and surgeons – are operating on patients.

Alongside wearable tech, experimentation is already happening with ‘smart patches’ that use nanotech to continuously monitor wounds, stimulate faster healing and keep healthcare professionals updated on progress.

The health sector has barely scratched the surface of what’s possible with nanotech, and the future of this discipline is filled with advances that will improve people’s health immeasurably.

Robotics

Robotics is already an established and fast moving field within healthcare, with surgeons already using robotic tools to help them in the theatre. In fact, it’s one of the fastest growing medical fields at the moment, with advances being made in surgical robots, pharmbotics, hospital disinfectant robots as well as exoskeletons.

Advances in exoskeleton technology are currently helping many people with physical impairments live completely normal lives, allowing paralysed people to walk again and speeding up the rehabilitation of stroke victims and those who have suffered spinal injuries. There was even a case in 2019 of a tetraplegic man able to control his exoskeleton directly with his thoughts through various sensors!

AI Being Used To Develop New Drugs

Developing new drugs is currently a long and manual process, requiring testing first on animals, then on humans.

However, the way the pharmaceutical sector is currently heading means AI could completely revolutionise that. Whilst AI is already being used, as the technology grows, its use in developing new and more efficient drugs will only increase.

Whilst our current level of technology doesn’t allow for completely simulated clinical trials, that time might not be so far away.

The goal is to be able to test millions of potential drugs and drug combinations on billions of virtual patients in minutes in the very near future, revolutionising the sector with a wave of better, more efficient pharmaceutical options.

Empowering Public Transport With Big Data

Harnessing big data is the key to revolutionising the public transportation sector

 

It should come as no surprise that a technological revolution has been occurring within the transportation sector over the last few years, with the likes of AI and automation leading the charge.

And it’s happening on all levels, from the infrastructure of our highways and service links to the granular data analytics capable of monitoring passenger levels right down to individuals’ favoured routes and stations.

Just around the corner are self-driving cars, hyperloop trains and drones capable of delivering a package the same day, with zero human involvement.

It’s important here to state that when we say public transportation, we’re including highways as well as trains and buses… anything that’s publicly accessible for, well, the public, can be improved with modern automation technology.

 

With the right application of tech, passengers can be empowered to plan their journeys and tailor them to their needs, using apps like Uber, Lyft and Bolt as well as e-tickets/e-railcards, which have launched the transportation sector into a seamless, contactless environment for many passengers to enjoy – whether their reasons for travel are leisure or business.

 

However, there are still challenges to be answered when discussing public transportation, from user experience right through to infrastructure.

Most public transportation organisations are looking at ways to optimise their operations, cut costs and increase revenues whilst ensuring vital (and non-vital) services aren’t disrupted.

By optimising their operations, and making even small improvements, organisations are able to expand service capabilities by centralising the planning processes required to analyse resource requirements, which in turn ensures the most efficient use of services.

 

That standardisation of the planning process is still open to modification, so users can explore alternative routing options if, for example, there’s traffic congestion or bad weather.

 

So how should a public transportation organisation aim to meet these challenges and ensure its future success whilst maintaining that services are not disrupted, and users are not inconvenienced? Well, with the implementation of new technologies; technologies like Big Data…

Big Data Analytics in Public Transportation

Big Data is at the cutting edge of transportation.

It’s what allows passengers to receive real-time updates on the arrivals and departures of trams and trains, and the easiest route from point A to point B and beyond. Within the inner cogs of a transportation organisation, it keeps teams up to date with communications between workers, signalling and maintenance works.

It’s all vital to keeping the system chugging along.

Optimising operating procedures and cutting costs:

One of the most important questions to plague the public transportation sector is:

“How many passengers use which routes and when do they use them?”

An intimate knowledge of how routes are being used means transportation organisations can deploy their staff and trains in the most cost-effective way. Big data analytics is used to predict passenger volumes as precisely as possible.
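
To show roughly what that prediction step could look like, here’s a minimal sketch that fits a regression model to historical passenger counts and forecasts a route’s peak-hour volume. The data is synthetic and the feature set is deliberately tiny; a real operator would feed this from ticketing and sensor data.

```python
# Illustrative sketch: predict passenger volume for a route and hour from
# historical counts, so staffing and train lengths can be planned.
# The data here is synthetic; a real system would use ticketing/sensor feeds.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

history = pd.DataFrame({
    "day_of_week": [0, 0, 1, 4, 4, 5, 6],
    "hour":        [8, 17, 8, 8, 17, 11, 11],
    "route_id":    [1, 1, 1, 2, 2, 1, 2],
    "passengers":  [950, 1100, 900, 700, 820, 300, 260],
})

features = ["day_of_week", "hour", "route_id"]
model = GradientBoostingRegressor().fit(history[features], history["passengers"])

# Forecast Monday 08:00 on route 1 to decide how many carriages to run.
monday_peak = pd.DataFrame([{"day_of_week": 0, "hour": 8, "route_id": 1}])
print(f"Expected passengers: {model.predict(monday_peak)[0]:.0f}")
```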

Picture this: certain service halting events such as bad weather, holidays, malfunctions, crashes or leaves on the line can be analysed and processed in real-time.

Even live customer feedback can be fed through and analysed in real time, as passengers log whether a carriage is busy, for example, by self-reporting it on apps like Trainline.

A transportation organisation can take all this knowledge and plan more efficient services in the future, at a lower cost – by doing away with short trains during times of peak passenger volume and increasing the frequency of trains in a service-oriented way, for example.

Targeted Increases In Transit Passes

You’d have needed to be living in a cave in the woods to miss the switch from paper tickets to e-tickets over the last few years. That streamlined process has been a staple of getting about the place for a while now.

A challenge still facing all public transportation organisations however is the pressure to increase revenues – especially in light of growing cost pressures.

Enter: Big data analytics.

Having an overview of the way passengers use routes plays a huge role in supporting the revenue streams that keep a public transportation company operating.

And with that big data, companies can create system-based sales forecasts based on the analysis of customer behaviour, from a top-down view right down to a granular look.

It empowers public transportation companies to develop more sophisticated sales strategies, leading to optimisation, increased revenue, and improved customer satisfaction.

This puts companies in a position where they can lay out more comprehensive road maps to start campaigns that’ll win back customers at the right point in time, for instance, to increase sales of season tickets.

Big data isn’t, strictly speaking, just a way for public transportation companies to increase revenue (although that’s always good to hear). The analytics also allow transportation companies to improve timetables using season ticket sales figures, acting in a more customer-focused way.

AI-Augmented Mobility

The words data and analytics have been thrown around a lot here.

It’ll be prudent to clarify that the kind of data that is being shared among the public transportation sector is operational data, data that shows what’s happening across a network; where trains are, when they’re going to depart and arrive, security alerts, even how many people are coming and going through the stations.

Add to that queue levels at ticket offices, the status of connecting transport services, maintenance and service activity, weather alerts and the impact of too many leaves on the track.

When you’re sat on your 8am train thinking about the meaning of life, the universe, and everything, you just don’t think about the high level of communication being zipped about the stations as you trundle along the track.

And what public transportation organisations could do with all that data could completely transform the way people are moved from point A to point B.

 

Public transport, armed with artificial intelligence (AI), will have the power of data analytics and the cloud to reduce travel times, manage congestion, ensure compliance and enable dynamic policymaking – improving public transport across all levels, from passenger to worker.

People don’t want to have to plan the wait times between their trains. They just want to get home, or to work, or to a wedding or a job interview or to the shops… to have to plan your route and timetable on top of that is just… well it’s just annoying in this day and age really.

Applications have sprung up to take this headache away from urban movement. Transportation planners see the growing need to create a more seamless journey, with minimal stoppages or checkpoints.

This trend has manifested the rise of…

Mobility-as-a-Service (MaaS)

What is Mobility-as-a-Service? Apps like City Planner show how you can get around, seamlessly integrating trains and buses into the journey. The trend also includes mobility hubs that enable multimodal transportation, platforms for ticketless travel, and innovations in micromobility and last-mile connections.

Digital Technology and Transportation

Apart from data analysis, how is digital technology improving public transportation?

The available technology increases throughput, improves security, and gives users a better experience.

The digital transformation within the public transport sector will see digital driving licences enhance security, and a little experimentation with biometrics and facial recognition improve the efficiency of airports.

It’s important the digital revolution emphasises the customer experience and what digital transformation can do for them. For starters, that means putting users’ needs front and centre to make it easier to use digital transportation tools. Simplifying transactions and improving the infrastructure for passengers and pedestrians will also offer more inclusive travel options in urban areas.

AI in Autonomous Vehicles

Okay, just for a bit of fun since there’s been a huge focus on trains and buses; let’s look at how our highways are being revolutionised.

It seems like every big name auto-company out there is vying to be the leader of the mass-produced wave of self-driving vehicles.

It used to be a faraway dream that one day you’d be able to jump in your car, punch in an address, and sit back and let the car get you to point B and onwards without you having to touch the steering wheel or navigate the traffic. However, it’s swiftly becoming a reality.

Google, Uber, Tesla and Ford are all developing machine learning, AI and deep learning platforms that help cars interpret their surroundings in real time and then react accordingly.

The sheer scope of the data the machines are taking in in real-time is incredible. The vehicles take in millions of data points each second through numerous sensors, software and GPS.

The point of these sensors is to constantly monitor the surroundings, looking out for people crossing roads, surrounding vehicles and animals darting out into traffic.  They then make split-second calculations on how to respond safely and efficiently.

GPS monitors also find quicker routes to a destination, accounting for bottlenecks, accidents, diversions, etc. The dream: just start the car and get to where you wanna get.

The thing stopping self-driving cars from being the standard mode of transport right now is the unpredictability of the road and the need to maintain safety. There’s an infinite number of scenarios that can appear when on the road – how can a car respond to each one like a rational human? Behind every program is a programmer feeding it enough information to begin teaching itself.

Each automaker and start-up is training the AI inside these cars to drive with the rationale of a human, with safety as the key priority. It’s a burgeoning technology making massive strides within transportation, and it’s exciting to see what its impact will be.

Putting Patients First Vs. Cost Concerns

Does Putting Patients First have to cost money?

 

‘Putting Patients First’ feels like a statement that should be so self-evident it doesn’t even need stating. After all, who else would you put first…?

However, in the real world, we all know that, unfortunately, cost considerations can sometimes get in the way of that lofty goal.

There are many things that can be done, though, to head those concerns off, mitigate them where they do occur and enshrine the principle of ‘Putting Patients First’ into every aspect of not just a health organisation’s overarching governance but also its normal ‘day to day’ activities.

 

A recent study has shown that more and more people are living longer with lifelong – and often several – physical and mental chronic conditions.

In those circumstances it’s vital that a patient has easy access to all their data, made available to them in a method that’s as transparent as possible.

Anyone with a long-term condition, or their carers (carer burn-out is also a factor that always needs to be considered in Putting Patients First), will be looking at an increasing burden. They’re forced to monitor their own symptoms as much as possible at home, update their healthcare professionals with that clinical data in an efficient and accurate manner, collect and deliver samples and adhere to often complex drug regimens, all whilst having to deal with a health and social care system that is, to put it politely, often uncoordinated.

 

Anyone in the above situation or one similar to it is likely going to need care that will cross multiple ‘traditional’ health service borders, including primary and secondary care, health and social care and other ancillary services such as employment, benefits, welfare and housing.

Dealing with all those disparate departments on an ongoing basis, likely in multiple different locations, will be time consuming, frustrating and stressful for a patient, with the additional stress levels likely exacerbating the original illness or disabilities, potentially placing a further burden on the health care industry.

Although the widespread adoption of electronic patient records has somewhat helped this situation in recent years, there are still huge obstacles to overcome in terms of interoperability between different health and social bodies.

 

Given all of that then, what does ‘Putting The Patient First’ even mean these days?

 

The first step in ‘Putting The Patient First’ has to be ensuring continuity of care, ensuring wherever possible that the patient has access to the same healthcare professional, someone who can access all their records and whom the patient knows and trusts.

From there, the process should be as collaborative as possible, with a shared decision-making process in agreeing a care plan. Once the care plan is in place (and even beforehand) the patient should always have easy access to the best care possible.

 

‘Putting Patients First’ should also involve stepping back on occasion and looking at the wider picture. It’s always worth remembering that a patient won’t just have to live with their illness(es) but also any treatment plan(s) that are agreed… often having as much of an impact, if not more, than the original illness does… especially if additional treatments, travel, consultants and specialists might be required.

‘Putting The Patient First’ means considering those impacts and how they could be mitigated. Can a care plan be adapted so a patient can deal directly with different services? If they’re collecting samples at home, can they upload the results themselves? If not, is there a central drop-off point they could access, or would they prefer to hand in specimens directly to a lab (giving them back a sense of control over their treatment)?

Continuing with that theme, when the results are back could they be sent directly back to the patient (and/or their carer) as well as their GP?

That means, if needed, the patient could take immediate action rather than having to await another appointment. In fact, could the results be automated to, if needed, create an appointment with a specialist without the patient needing to make another trip back to the GP, as is often the case with simple referrals?

Once we get into the world of interoperability and automation a lot more becomes possible.

 

However, that’s where the issue of ‘Putting The Patient First’ vs cost concerns once again rears its ugly head.

The UK’s NHS and its wider network of GPs around the country are a globally recognised service, providing free treatment at the point of delivery to any and all, paid for by a centralised tax system.

Given how widely recognised they are however, it can be surprising how little the general public know or understand about their inner workings (bar the odd attention grabbing headline) or the pressure they’re under to provide a world class service for all… for free, day in, day out.

The general public will likely never have heard of the number of guidelines and protocols health practitioners need to adhere to, the implications they can (and do) have, or the targets that need to be met, such as QOF – let alone how a GP practice is funded or what different treatments can cost.

Some of that information is in the public domain, some of it isn’t. An argument can also be made that whilst patients should be able to easily access some of that information if they choose, other aspects (such as cost) they should never have to worry about or even consider.

 

And that’s an important point.

Quality indicators like QOF have a tendency to focus on medical processes and outcomes, without adequately recognising the concerns that are most pressing to a patient.

Any initiative that ‘Puts Patient First’ should always try to take those concerns into account, folding them into quality measurements where possible.

 

‘Putting Patients First’ can be massively empowered through the application of technologies; technologies that focus on patients’ needs but also act as cost-saving measures.

The first step is clearer communication channels between health care practitioners and patients, allowing them to access NHS and Private services with ease and convenience.

Many other sectors have already embraced this ‘on the go’ ethos, such as Uber or Purplebricks, allowing users to access their accounts anywhere through a mobile device, and there are lessons there the health sector could learn from. That easy access to their records, combined with increased communication, will allow patients to make informed decisions about their own care.

 

Once those issues are resolved, the implementation of new disruptive technologies (tech that is either ground-breaking or exponential) will start to sweep the health sector, changing patient care in ways that haven’t been seen since the foundation of the NHS.

The kind of tech we’re talking about here – hardware, software and AI – may seem like science fiction, but it’s already here in many other sectors.

They’re technologies (think sensors, scanners and wearable tech) that don’t just make things ‘a little better for a bit’ but that dramatically change the face of an industry for the good of all, improving services and reducing costs all at once.

These technologies will put the patient at the centre of the health sector, able to measure their own symptoms from the comfort of their own home and have the readings automatically transmitted to the relevant practitioners.

 

Digital Transformation in the healthcare sector isn’t about deciding between Putting The Patient First and cost savings – it’s about empowering the former whilst achieving the latter, all in a single stroke.

Life @ cloudThing As A Power Platform Solution Architect

Meet John Upton, a Power Platform Solution Architect at cloudThing who loves our academic culture

 

If you’re reading this, considering taking the next step in your career maybe, then it may be that cloudThing has been in your periphery for some time; a sort of blinking light, like some mysterious green blip on a submarine radar.

A year and a half ago I had cloudThing on my radar.

It had been blinking at me for a while and this office definitely stood out from other opportunities but was it the right place for me?

It helped when I got to speak to some of the people who had already chosen cloudThing, learning they could use their skills for good, anywhere in the country.

So, indulge me while I tell you why I chose cloudThing…

 

In a nutshell, my job is to be ‘loaned’ out to clients, to find out what they need and then support them in streamlining their business.

Sometimes, cloudThing might want to do it one way but if the client is set on doing it another, then it’s my job to offer the best advice (of course) but ultimately, I’ll be advocating for what the client wants. I must make it clear in interviews that my loyalties lie with the client… trust is the real commodity of a Solution Architect.

 

I got the feeling that the cloudThingers (as we call ourselves) I spoke to understood where I was coming from.

I like that we make lots of components and quick-starts open source with a particular emphasis on the NonProfit sector.

Fran (our esteemed Chief Technical Officer) emphasised in my interview that we want to be a partner with a business and not just someone they’ve hired to get the job done. The nerf guns and assortment of stormtroopers chaotically thrown about the place also helped ease my worries!

 

Something I really respect about cloudThing is that cT India is a core part of the company and strategy. It isn’t just another branch or a short-sighted outsourcing project, instead it’s one of our greatest assets, that we continue to invest in.

cloudThing UK and cloudThing India are one fully integrated organisation and they make me look really good to clients.

 

The whole experience of working here has reminded me of being back at university. Everyone’s filled with optimism, keen to learn and improve our processes. To that end, we’re very strict about following Dev-Sec-Ops and agile principles but… anyone can suggest a change or improvement. If it makes things better it doesn’t matter if the suggestion came from an apprentice or a director.

 

As I went through my career at smaller companies, I tended to end up getting involved in every stage and every part of the pipeline. It was great for my personal development and I loved the variety but there weren’t enough hours in the day as projects got close to the final release.

At cloudThing I get to focus on my favourite bits, mentoring, solution design and championing best practice and technologies in the Azure Cloud.

The nice thing about cloudThing is that there are many, many, many people who can do my job a lot better than me, so I’m always surrounded by people who I can learn from. (Editors note: we’re sure this isn’t true… John’s awesome at what he does!)

 

Really, it all comes down to what you’re looking for from a company. What you want to add to it, what you want to take away from it, and the kind of people that are already there making up the core culture.

I can’t speak for every digital transformation company out there, but I can say that the blend of corporate and big kids just having fun is what you get at cloudThing.

 

Interested in working for cloudThing? Drop us a note at [email protected]

What’s The Best CRM For The Membership Sector?

Engage new, existing and ex-members, build relationships, run online CPDs and manage analytics quickly, efficiently and cheaply

 

There isn’t a membership organisation out there that couldn’t benefit from a better CRM system.

There’s no question that, to manage the lifecycle of a membership effectively, a membership organisation needs a cutting-edge CRM solution.

 

The accelerated adoption of cloud-based technology during the pandemic, with its ever-increasing numbers of apps, features, integrations, and bespoke solutions means there’s almost nothing a modern membership organisation can’t do when backed up by the right CRM.

However… with so much choice out there, it begs the question…  ‘what’s the best CRM system for a membership organisation?’

 

A modern CRM system needs to work hard in the membership sector, leveraging both new and existing relationships all whilst keeping compliant and yet also offering up to date compliance advice.

That’s a lot of different moving pieces, many of which could need a high level of AI and/or robotic process automation to be truly successful… A big ask for any CRM!

What Should A Membership CRM Be Able To Do?

It’s not easy operating in the membership sector these days.

The days of just having a place for sign up and receiving updates on an organisation are long gone as the digital world has grown up around us.

These days, data protection is the number one concern, and your members will want to ensure any details they hand over are, and remain, secure.

But the culture of members has shifted too. They have been in the digital sphere for years now and their expectations for a decent user experience are higher than ever. If it takes too long to get the information they want, or the page they’re looking for, then they’ll simply lose interest and leave.

They expect the same customer experience from your website as they do from industry giants like Netflix or Monzo.

A modern membership organisation will struggle to acquire and retain new members, so it’s a priority to place member engagement at number one. A clear idea of your target audience and their story is what will help boost numbers and keep them there. Your members need to see the value in your organisation, and being a place with a frictionless user experience – one that grants them the instant gratification of easily found knowledge and intuitive website building – is why it really does pay dividends to select the right CRM system for you.

It also needs to provide, in a single view, the ability to liaise with third-party partners and persuade media outlets to run (positive) stories about your organisation… all of this in a time of tightening budgets, stifling regulatory oversight, decreased membership levels and a greater need than ever for your membership’s engagement.

That means building lasting, long-term relationships with all of the above individuals, businesses and organisations isn’t just a ‘would-be-nice’ anymore; it’s absolutely essential to achieve your goals.

Whilst the day-to-day business of a membership organisation might differ (gyms, banks, trade associations, professional associations, clubs, etc) all have one thing in common… They’re all dependent on members resubscribing for their operating budgets.

That’s why the core functions of a Membership CRM system must focus on members’ engagement and retention. That doesn’t mean however, that it won’t be called upon to do a lot of other functions.

The good news however is that thanks to data driven technology, layering automation tools over CRM tasks has never been cheaper or easier.

But what exactly should a Membership CRM be able to do?

Key Features Of A Membership CRM System

CRM software has changed a lot recently and the right CRM can give a membership organisation a real competitive edge in a tough market.

If you’re on the lookout for a new CRM, looking to upgrade/replace an existing system or worried about vendor lock-in, then below are just a few of the tools you should be on the lookout for…

ROBUST MEMBER PROFILES

Member profiles should, universally, store necessary information such as contact details; more sophisticated solutions will auto-populate specific information from other integrated features (see the sketch after this list), such as:

  • Event attendance: Your CRM system should have the capability to record the type of ticket bought and any add-ons they opted into.
  • Connections: The CRM system should be able to show the relationships between members, whether they’re colleagues or peers in the same chapter etc. That level of connectivity will allow for reflection on how your members relate to each other.
  • Membership level: If you apply tiers or different membership packages to your organisation then your CRM should apply the appropriate badges, special permissions, or discounts based on their membership level.
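
To make those features a bit more tangible, here’s a minimal sketch of the kind of member profile a membership CRM might hold, pulling together the items listed above. The field names are illustrative only, not any particular vendor’s schema.

```python
# Minimal sketch of a member profile combining contact details, event
# attendance, connections and membership level. Illustrative field names only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EventAttendance:
    event_name: str
    ticket_type: str
    add_ons: List[str] = field(default_factory=list)

@dataclass
class MemberProfile:
    member_id: str
    name: str
    email: str
    membership_level: str                                  # drives badges, permissions, discounts
    connections: List[str] = field(default_factory=list)   # IDs of related members
    events: List[EventAttendance] = field(default_factory=list)

profile = MemberProfile(
    member_id="M-1001",
    name="Sam Example",
    email="sam@example.org",
    membership_level="Gold",
    connections=["M-1002", "M-1044"],
    events=[EventAttendance("Annual Conference", "Early bird", ["Workshop day"])],
)
print(profile.membership_level, len(profile.events))
```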

AUTOMATIC MEMBER RENEWALS

Member retention is paramount to maintaining the strength of the networks that professional and trade associations create.

The simpler the renewal process, the more likely you are to hold onto a recurring relationship with that customer. They don’t want it to take any time out of their day at all.

To keep this process streamlined you should find membership software that recognises when a member’s subscription is up for renewal and sends them automatic reminders (a minimal sketch of this logic follows the list below). These reminders should be:

  • Timely: Allow your members plenty of time to renew before their subscription expires.
  • Recurring: The reminders should be like a gentle tap on their shoulder, not a screeching car horn in their face.
  • Personalised: It goes without saying, just adding their name and a direct link to the renewal form will personalise the experience and encourage retention.
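
As promised, here’s a minimal sketch of that reminder logic: timely (starting well before expiry), recurring (a gentle cadence rather than spam) and personalised. The send_email function and member fields are placeholders, not a real mail or CRM API.

```python
# Minimal sketch of the renewal-reminder logic described above.
# send_email is a placeholder, not a real mail API.
from datetime import date, timedelta

REMINDER_OFFSETS = [60, 30, 7]   # days before expiry to nudge the member

def send_email(to, subject, body):
    print(f"To: {to}\nSubject: {subject}\n{body}\n")

def send_renewal_reminders(members, today=None):
    today = today or date.today()
    for m in members:
        days_left = (m["expires"] - today).days
        if days_left in REMINDER_OFFSETS:
            send_email(
                m["email"],
                f"{m['name']}, your membership renews in {days_left} days",
                f"Hi {m['name']}, renew in one click: {m['renewal_link']}",
            )

members = [{"name": "Sam", "email": "sam@example.org",
            "expires": date.today() + timedelta(days=30),
            "renewal_link": "https://example.org/renew/M-1001"}]
send_renewal_reminders(members)   # run daily, e.g. from a scheduled job
```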

 

EVENT MANAGEMENT

You might find the thought of planning real world events digitally slightly incongruous but if 2020 has taught us anything it’s the power of online events!

Either way, online or in real life, good CRM systems should include event management tools that will let your membership organisation schedule events, invite and manage attendees, create seating plans, track who’s coming and who needs following up and, most importantly, allow your team to access this information on the go at events through mobile devices.

PAYMENT PROCESSING

This incorporates more than just your membership sign-up; online courses and event tickets will also be an important funding source for your association and as such your membership software should be able to provide analytics on where these different streams are coming from.

Which leads seamlessly on to…

ANALYTICS AND REPORTING

There’s no way to know if what you’re doing is working if there’s no way to monitor it. You can’t grow and move forward if you can’t see what did and didn’t work, so a CRM system capable of both granular detail and a high-level overview is vital to your organisation.

ONLINE SELF-SERVICE MEMBER PORTALS

Your members should be empowered to take control of their engagement. They should have full control over their membership tiers and status, easy online event registration and check-in, online communication and networking outlets, and continuing education, certification, and accreditation options.

A smart membership CRM system will create authentic engagement that your members will want to use to interact with your organisation, because they know you will provide them with everything they need to further their careers.

FULL DATABASE CUSTOMISATION

As mentioned in the ‘Analytics and Reporting’ section, you need to use your database to the best of your abilities otherwise you can’t monitor how the changes you’ve made are impacting your organisation.

Customising your data allows you to control the information you want to see, analyse your members at a more focused level with segmentation options and conduct targeted communication efforts. Finally, customisable views mean your staffers will always see the most relevant information for their daily tasks, instead of sifting through the mud for what they need.

Basically, a good membership CRM system is a good business intelligence tool.

BUSINESS INTELLIGENCE TOOLS

Traditional CRMs have always been great at collecting large amounts of data and business intelligence. What they’ve not always been great at is being able to do anything with that information.

When you’re looking at which CRM suits your Membership organisation best, make sure you settle on one that can segment, analyse and report on information with ease.

It also helps if the CRM lets you layer AI integrations over those intelligence reports, allowing you to automate reporting, plan ahead, garner in-depth business insights and draw connections you might otherwise not have been aware of.
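
For a flavour of that segment-analyse-report loop, here’s a minimal sketch run over an exported CRM table. The column names and engagement bands are illustrative assumptions, not a particular product’s data model.

```python
# Minimal sketch of the segment-analyse-report loop described above,
# run over an exported CRM table. Column names are illustrative.
import pandas as pd

members = pd.DataFrame({
    "member_id":       ["M1", "M2", "M3", "M4"],
    "tier":            ["Gold", "Standard", "Standard", "Gold"],
    "events_attended": [5, 0, 2, 7],
    "renewed":         [True, False, True, True],
})

# Segment: simple engagement banding by events attended.
members["segment"] = pd.cut(members["events_attended"],
                            bins=[-1, 0, 3, 100],
                            labels=["inactive", "casual", "engaged"])

# Analyse & report: renewal rate per segment, ready to drop into a dashboard.
report = members.groupby("segment", observed=True)["renewed"].mean().rename("renewal_rate")
print(report)
```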

The Top 8 Membership CRMs Available Today

MICROSOFT DYNAMICS 365

We’ll start with Microsoft Dynamics 365 as the majority of the membership sector consider it the best membership CRM available on the market (with good reason).

It’s simple to learn, easy to download and great as an out-of-the-box solution, but it can also be customised to do almost anything you might require, including integrating with third-party applications and services.

It also integrates seamlessly with Microsoft Office and Outlook which the vast majority of both the private and commercial sector operate on. That seamless integration with Office (in this case Excel) also makes Dynamics reporting second to none. Its Business Intelligence tools allow you to drill down and parse data better than any other platform currently on the market, which is easily automatable to create regular reports with charts and graphs that can detect new trends at a glance.

As it’s Microsoft, your staff will be inherently familiar with it, and with the drive towards citizen developers and low-code/no-code software they’ll easily be able to create their own applications.

WILD APRICOT

Wild Apricot is a membership management software package that allows membership associations to ‘keep tabs’ on their members.

It sounds meaner than it is; all ‘keeping tabs’ really means is billing members for dues and events and engaging with them in meaningful ways.

It features the hallmarks of a sophisticated membership CRM software.

JOIN IT SOFTWARE

Join It allows users to build a centralised membership database, search contacts, create web pages and more.

At a quick glance, Join It empowers organisations to create customisable membership plans offering different benefits at different tiers and prices.

SILKSTART

SilkStart is an intuitive system so whether you’re an administrator or a member you will be able to navigate the interface with no problem. The dashboard provides a “health at a glance” type of overview that allows you to observe the general health of your chapter in one easy to view place, and it provides reports on tracking membership churn.

With SilkStart you can automate your association’s processes, send and schedule emails, set up events, communicate with certain membership levels/groups and clone everything for future use.

GLUE UP

Like most modern systems, Glue Up is intuitive and doesn’t alienate administrators or members. There’s little upskilling required, so your organisation can set up the system and be on its merry way in no time at all. It’s an out-of-the-box solution that combines events and membership management.

HIVEBRITE

Hivebrite is an all-in-one community management platform. It empowers organisations of all sizes and sectors to launch, manage and grow fully branded private communities.

It’s customisable and focused on community engagement, keeps member data up to date, manages memberships and events, collects payments and centralises communication. You can simply turn features on and off as demands require.

CHARGEBEE

Chargebee is a CRM system that captures, retains and maximises revenue opportunities via automated, recurring billing, subscription management and revenue analytics; basically, it’s best for recurring subscriptions and billing.

It integrates seamlessly with any other solutions you may have in place.

MEMBERCLICKS

Memberclicks is, once again, an all-in-one solution to save time and engage with members.

Manage finances, marketing, events, and membership in one place.

It’s user friendly, customisable and with continual product improvement you know you’re always kept up to date.

Memberclicks’ reports provide real-time information and offer integration between membership data and your website, so drilling down to the granular level of your data is no trouble.

Bespoke vs Shrink-wrapped CRM systems:

It’s actually a bad idea to go totally bespoke.

You might be thinking that’s counter-intuitive, and that something ‘out-of-the-box’ won’t be able to cover all the processes your organisation requires. However, it’s precisely because the application has been rolled out to many organisations before yours that the solution has already encountered it all.

Not to sound too much like a friendly nurse at the GP office but they’ve seen it all before.

Going totally bespoke means your organisation is entirely dependent on one app that has been built by one person… If that person leaves, your whole organisation’s processes are in jeopardy because no one else knows how to operate it!

That then begets stagnation, as no one can add features to the app, and you’re in danger of stalling any growth you’ve made.

A straight shrink-wrapped solution isn’t the answer either; what you need is something in the middle. Something that can adapt to your organisation’s needs and whose features you can switch on and off as the gods deem desirable.

A component-led design offers ‘out-of-the-box’ features that can fit together like Lego pieces; the Lego was pre-manufactured but the opportunities for creation are limitless.

How Remote Patient Monitoring Data Can Drive Health Efficiencies

Using data and automation to empower the patient experience

 

It’s not even close to an understatement to say that the COVID pandemic disrupted a whole host of health services… from routine GP and health provider visits, right through to delayed and cancelled surgeries; especially any that had been labelled as ‘non-urgent care’.

Even now, health care professionals are struggling to catch up.

In England alone (discounting Scotland, Wales and Northern Ireland entirely) the number of patients awaiting treatment is the highest since records began and with limited resources it could take overwhelmed health services years to get back on schedule.

On the assumption the health sector isn’t about to see a massive influx of cash then, the question becomes… “how do we create more efficiency with the tools we already have?”

 

The answer is, of course, technology. Specifically, big data and automation technologies.

Hospital-level health technology has been slowly entering patients’ homes for a while now, with that pace only increasing during the pandemic and its understandable drive to free up as many beds as possible during the crisis.

The tech patients used to only be able to access in a ward or GP’s surgery has become commonplace in their homes.

 

Lifting equipment in the case of frequent fallers, blood pressure monitoring kits, ECGs and oxygen saturation readers, blood glucose readers for diabetics… these can be, and are, all now prescribed to patients to take home, or even bought outright should they wish.

As well as reducing bed occupancy however, that’s also had the unexpected side effect of empowering people, putting them in charge of their own health which has led to more reassured, better educated, calmer and ultimately safer and healthier patients.

 

This is still an emerging trend in the health sector, so it goes by many names… remote patient monitoring (RPM), patient-generated health data (PGHD) or even just simple wellness tracking.

 

What that all instantly says to anyone familiar with the technology sector is that there’s an untapped resource here, instantly available for the improvement of health care efficiency.

Data.

The more access we have into patients’ lives (suitably and thoroughly anonymised, of course), the more data we have, and as anyone in the tech industry will tell you… the more data you have, the more insights you can draw.

 

What that means is that there’s a current opportunity within the health sector as a whole to use automation and artificial intelligence to collate and analyse that data in new, exciting and cutting-edge ways.

Ways that will be infinitely familiar to the tech sector but are still in their infancy in the public health industries.

 

There are hundreds, if not thousands, of ‘low hanging fruit’ opportunities, where something as simple as the routine collection and logging of analytical data from chronic condition management tools through automation can be correlated against other workflows and data sets with some simple AI to create huge increases in the efficiency of patient care.
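
As a small example of that ‘low hanging fruit’, the sketch below routinely logs home readings and flags out-of-range values for clinician review. The metrics, thresholds and data are illustrative only and not clinical guidance; a real deployment would sit behind proper governance and validated ranges.

```python
# Illustrative sketch: routinely log home readings and flag out-of-range
# values for a clinician's review queue. Thresholds and data are examples.
import pandas as pd

readings = pd.DataFrame({
    "patient_id": ["p1", "p1", "p2", "p2"],
    "metric":     ["systolic_bp", "systolic_bp", "blood_glucose", "blood_glucose"],
    "value":      [128, 162, 5.8, 11.4],
})

# Acceptable ranges per metric (illustrative only, not clinical guidance).
ranges = {"systolic_bp": (90, 140), "blood_glucose": (4.0, 10.0)}

def out_of_range(row):
    low, high = ranges[row["metric"]]
    return not low <= row["value"] <= high

flagged = readings[readings.apply(out_of_range, axis=1)]
print(flagged)   # hand these to a clinician's review queue, not an automatic action
```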

 

The data from such workflows, collected from home reading machines, could be used to shape, empower and improve the accuracy of diagnostics and the implementation of patient treatment plans.

With the right infrastructure and tools at both ends (in the home as well as GP surgeries and hospitals) these RPM or PGHD technologies would be capable of generating complex treatment plans based on current or future best practice guidelines.

 

Although the tech sector will be well used to this concern, it’s worth pointing out here that these tools aren’t about replacing anyone. They’re designed to empower both the patient and practitioner, removing repetitive tasks and freeing people up for more important jobs, whilst offering deeper insights all at the same time.

Any generated treatment plan would be seen as a suggestion only, with an expert looking over and approving it… except instead of having to draw it all up themselves it can be created at the touch of a button and then just modified as they see fit based on their years of experience.

That means faster, more accurate and more personal care… with the health provider needing to spend less time on paperwork and more time on the patient.

 

Health data is a resource that is only going to grow in the coming years but third-party technology providers like cloudThing are happy to demonstrate what’s possible today.

Is A Career In NonProfit Digital Transformation Rewarding?

What’s it really like working on NonProfit digital transformation projects?

 

cloudThing’s founders set the company up to be different. They wanted cloudThing to be both a fun place to work and an ethical one (as well as being awesome at what we do, of course).

One of the ways they sought to be different was by helping the NonProfit sector navigate painful Digital Transformations.

cloudThing have loved supporting dozens of NonProfits over the years, delivering cutting-edge digital transformations to help them better achieve their goals of providing critical services to the vulnerable, whilst also providing a world-class service to their donors and volunteers.

 

One of the biggest blockers to digital transformation for NonProfits is that they often feel a tremendous burden to invest as much money as possible in achieving their stated goals, rather than in things like organisational improvements.

Working in the tech sector, we’ve seen all too well where that path leads… outdated processes, legacy systems that have been patched, updated and patched again long after they should have been retired, huge data siloes between departments, volunteers, staff and supporters… all of which cause massive slowdowns to efficiency, resulting in much less time and money going towards those who need help in the end anyway.

 

Anyone working towards the digital transformation of a NonProfit needs to be incredibly aware of those unique concerns, ensuring every solution, every step of a solution, takes that into account and can bring immediate ROI to the charity.

The first step towards that is in truly understanding the goals of the organisation, empathising with their chosen cause so you can help them see the benefits of various technological solutions.

 

It can have its challenges, but it’s an incredibly rewarding field. Unlike other sectors, every investment in a new piece of tech will need to be weighed and justified against spending the money on achieving the charity’s goals instead.

That being said, with the implementation of every successful solution, you’ll know you’re making a real difference to the world.

 

Why do I love working with NonProfit’s? It’s about being an integral element of a partnership that’s committed to making a difference for the benefit of society. Better quality than they could have had, better value than they might have had and more forward thinking than their other options. You can be confident in making a difference. – Mike Chappell – Principal Architect, cloudThing

 

Working to digitally transform NonProfits, you’ll find both challenges and rewards; fortunately, the rewards (we feel, anyway) far outweigh the challenges.

Many of the rewards you might expect… fulfilment on an emotional, personal and spiritual level, working with some incredibly dedicated, passionate and kind people and making real differences to causes.

However, some of the challenges can take people by surprise. There’s the constant justification of spending we’ve mentioned, and there’s also an increased amount of bureaucracy and governance when compared to other sectors, especially when it comes to financial matters.

 

We obviously think working with NonProfits is worth it… in fact we find it immensely rewarding. But if you’re considering a career in NonProfit digital transformation, we’ll try to be fair and cover as many of the pros and cons as possible (though you’ll find far more pros than cons!).

 

Why do I love working with NonProfits? Because I love being able to experience the genuine joy and excitement of volunteers when you are able to save their valuable time through the use of the Power Platform! – Nathan Hawkins – Power Platform Solution Architect, cloudThing

 

Cons Of A Career in NonProfit Digital Transformation

Working to transform a NonProfit with tech will fulfil you and drive you crazy all at the same time, so to be fair to the title of this article we’ll start with the cons (as there are far fewer of them anyway).

NonProfits will always have their own way of doing things, with only insider veterans really knowing how everything works. Getting to understand that, navigate it and then transform it can be both rewarding and challenging in equal measure.

After asking around, here are some of the most common cons we’ve heard…

 

Seeing Rewarding Results – Digital transformation projects tend to focus on internal systems. The work you’re doing will absolutely save the NonProfit time, money and resources, but you’ll be working on the ‘outside’ so to speak. The fact that your work will help isn’t in question, but unless you’re working directly for a NonProfit you may never see the ‘end benefit’ of the money saved going on to help that charity’s chosen cause.

 

It Can Be Challenging – You can skip past this one if you’ve got the kind of personality that likes a real challenge. For everyone else: working with NonProfits, who for years have spent all available money on their chosen cause, won’t mean coming in and just plugging in a shiny out-of-the-box solution. You’ll be dealing with legacy systems you won’t have seen in years that have been patched to within an inch of their life, and employees who are always frantically busy with other, more pressing priorities (such as saving the world). All of that means the pace may be slower than those new to this sector are used to.

 

Higher Stakes – Not to lay the pressure on, but in most other sectors if you get part of a solution wrong or cause delays, you’ll likely cost an organisation a bit of profit. When that happens in the NonProfit sector, lives are affected, and that can carry a heavy weight for some developers if you’re not prepared for it.

 

Why do I love working with NonProfit’s? Pride. Pride in enabling an organisation to make a difference with technology which makes others’ lives easier. The indirect impact of our work is immense. – Piyush Bhatnagar – Principal Architect, cloudThing

 

Pros Of A Career in NonProfit Digital Transformation

A large portion of your life will be spent at work so feeling fulfilled during that time will always be a huge perk. There are many reasons our developers love working on NonProfit projects, from the goal aligning with their personal ethos right through to that feeling of making the world a slightly better place.

So… here’s the good bit… all the benefits to working towards NonProfit digital transformation.

 

Why do I love working with NonProfit’s? We are making the world a better place by automating and reducing their mundane tasks so that they have more time and energy to make someone else’s life better and warmer. – Srinivas Rao – Senior Software Engineer, cloudThing

 

Driving Change – One of the best things about digitally transforming a NonProfit has to be the sense you get of driving real change in the world. The solutions you’re helping to manifest will empower a NonProfit in the work they do for years to come.

 

You’ll Meet Some Fascinating People – NonProfit’s are full of some of the most dedicated and inspirational people you’ll ever meet, all from varied and diverse backgrounds, and they’ll all have a story to tell as to why they do what they do. Getting to engage and interact with these people is a huge part of the job (as well as being a huge perk) and is great for both personal and career development. You really will make contacts that last a lifetime.

 

Job Stability – The world is changing, COVID saw to that, and more and more organisations are realising they need to digitally transform to stay ahead of the curve. What that means for you is there’ll be a constant stream of NonProfit’s needing your skills to automate their processes and design cutting edge solutions to help them achieve their goals… all of which means a high level of job stability.

 

Why do I love working with NonProfit’s? Being a cog in the wheel of an NFP organisation which helps to make a difference to people’s lives. In other words, positively impacting people’s lives indirectly. – Marc Rowley – Head of Customer Service, cloudThing

 

No Two Days Will Ever Be The Same – When working on digital transformation it’s quite common to be assigned to just one team or project. However, there can be a lot of cross-pollination on NonProfit projects, so it’s likely you’ll wear a lot of different hats and gain a lot of varied experience working on them. All of which leads to faster career development, more varied job responsibilities and a much more enticing CV (not that you’ll want to leave!).

 

Your World Will Get a Lot Smaller – Working for big corporate organisations, it’s quite likely you’ll just be another cog in the machine… but NonProfit’s don’t operate like that. When creating beautifully crafted solutions it’s more than likely you’ll be pitching your ideas to the top brass… and then working with them on a daily or weekly basis as the project progresses. That closeness is great for building a sense of community and will also make you some great contacts as you progress in your career.

 

Flex Those Creative Muscles – The definition of insanity is doing the same thing over and over and expecting different results. If you want to be better (and the ethos of Continuous Improvement is baked in at cloudThing) then each solution you work on needs to be bigger and better than the last. That need for ongoing creativity is even more prevalent on NonProfit projects, where the organisations have small budgets but huge goals… it’s up to you to make the most of your ideas for them.

 

Job Satisfaction – Do we even need to point this one out? When you log off at 5:00 every night you’ll do so in the full knowledge that the world became a slightly better place because of what you did… and you’ll get that feeling every day.

 

Why do I love working with NonProfit’s? It’s just that feeling of immense happiness knowing that even a small part of our implementation makes such a great difference for an NFP, enabling them to reach more people and serve humanity with a smile. – Manjunath P R – Software Architect, cloudThing

 

Have we caught your imagination? Think you might be interested in a career digitally transforming NonProfit’s and making the world a better place for us all? Reach out to us at [email protected] to learn more.

Six Back-Office Functions NonProfit’s Should Be Using Robotic Process Automation For

RPA is helping the NonProfit sector achieve far more for far less!

 

All the old ways of managing and supporting the back-end processes (and to be fair the front end as well) of a NonProfit organisation are currently going through a huge paradigm shift.

In all honesty they have been for years, but the last year or two has massively accelerated that process and NonProfit’s that are yet to adopt disruptive technologies like robotic process automation and AI are finding themselves falling further and further behind in achieving their goals.

 

We all know the back-office processes of a NonProfit are the backbone of any successful charity, supporting IT, HR, Accounts, Volunteering, Engagement and all the other functions necessary for a successful organisation.

More and more though, the CFOs, CIOs and CXOs (Chief Experience Officers) of these NonProfit’s are realising that to stay relevant in a modern world they need to embrace digital transformation, and a huge part of that will be fully integrated back-office processes that support all the organisation’s goals, making them as efficient and cost-effective as possible.

What Is Robotic Process Automation?

Those back-end goals are all being met through the adoption of RPA (robotic process automation).

RPA is a type of software that makes it possible for a NonProfit to automate a slew of previously manual and repetitive tasks and workflows, reducing the time spent on them and freeing up staffing resources to focus on more mission critical tasks… doing much more for much less, by automating and replicating the steps a user would normally make when completing a specific task.

Examples of tasks that can be taken over by RPA bots include:

 

  • Vendor management
  • Most low-level finance tasks (certainly anything involving Excel)
  • Maintaining accounts payable
  • Updating accounts payable
  • Donation management
  • Gift Aid submissions
  • Financial reporting
  • Volunteer onboarding
  • Volunteer management
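
To make that ‘replicating the steps a user would make’ idea concrete, here’s a minimal, hypothetical sketch in Python of the kind of repetitive task a bot takes over: reading rows from a spreadsheet export and keying each one into a finance system. The endpoint, field names and file are made up for illustration; in practice a dedicated RPA platform wraps this sort of logic in low-code tooling rather than hand-written scripts.

```python
# A minimal sketch of the kind of task an RPA bot replaces: reading rows from a
# spreadsheet export and keying them into a finance system, here via a
# hypothetical REST endpoint (FINANCE_API_URL is an assumption, not a real product API).
import csv
import requests

FINANCE_API_URL = "https://finance.example.org/api/invoices"  # hypothetical endpoint

def submit_invoices(csv_path: str) -> int:
    """Read each invoice row and submit it, just as a person would re-key it."""
    submitted = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            payload = {
                "supplier": row["supplier"],
                "invoice_number": row["invoice_number"],
                "amount": float(row["amount"]),
            }
            response = requests.post(FINANCE_API_URL, json=payload, timeout=30)
            response.raise_for_status()  # stop on errors rather than silently skipping
            submitted += 1
    return submitted

if __name__ == "__main__":
    print(f"Submitted {submit_invoices('donations_batch.csv')} invoices")
```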

 

Many NonProfit’s we speak to are reluctant to spend time and money on RPA (or digital transformation at all), feeling the lion’s share of the money they raise should be spent supporting their chosen goal(s).

However, over time, the implementation of RPA onto back-end processes saves on cost, staffing resources and time. Time in which staff can focus on the more important aspects of their roles, aiding the organisation more directly in fundraising activities.

RPA also helps reduce human error, resulting in more costs and time saved through not having to detect and correct errors.

Blockers To NonProfit’s Adopting RPA

Many of the things people don’t like about NonProfit’s, such as lack of engagement and perceptions of wasted money, can be fixed in the back-office.

Yes, there’s an initial cost to setting up RPA (that pays for itself quite quickly) but it also frees up staff to deal more directly with donors, engaging with them and highlighting all the good their donations are doing for a chosen cause.

 

Over the last twenty years or so, most NonProfit’s will have dabbled with digital transformation, most likely in the form of a large, costly and unwieldy ERP system that promised to transform all core operations.

Whilst a good idea, with probably a few scattered positive results, most led to subpar user experiences, multiple systems that couldn’t talk to each other and, worst of all, expensive vendor lock-ins that left the NonProfit trapped in a contract for far too long.

Those legacy systems, siloed data and manual, disconnected (often paper-based) processes all have a huge impact on a NonProfit’s ability to be cost effective and efficient, but without knowing what’s possible the ‘cure’ may often seem too intimidating and far worse than the ‘disease’ itself.

There can also be a reluctance to try anything new after past digital transformation failures, leaving many NonProfit’s stuck with not-fit-for-purpose legacy systems that do nothing to advance a charity’s fundraising goals.

How RPA Can Benefit The Back-Office Processes Of A NonProfit

RPA offers many benefits to a NonProfit, many of which we’ve already touched upon:

 

  • Reliably consistent results, free of human error
  • Much quicker and more reliable delivery timings
  • Staff and volunteers can focus their time on work with much higher ROIs
  • Workflows are all documented, which for many sectors (including NonProfit’s) is vital for auditing and accounting purposes
  • Anomalies, red flags or trends in the data and processes can be identified much more quickly
  • Most staff, when asked, will list manual, dull, repetitive tasks as their biggest source of dissatisfaction with their job. Automating those tasks so the same staff can move on to more goal-orientated projects gives a huge lift to staff satisfaction levels, leading to big increases in productivity and retention rates.

 

Back-Office Functions That Should Be Automated

 

Accounts Payable: Most departments handling accounts payable will be woefully understaffed… in fact at many smaller NonProfit’s the task likely falls on just one person, who will often be under a lot of pressure to do more with the limited resource they do have.

It’s estimated that at NonProfit’s that have already adopted automation (where at least 70% of invoices are received electronically), a single member of staff could process over 22,500 invoices a year through their workflows. At NonProfit’s yet to adopt automation that figure drops to less than a tenth of that!

Automation in an Accounts Payable department can eliminate all invoice data entry, all manual invoice handling and all the routing normally required in manual or even semi-automated workflows.

Once the basic RPA is in place, AI-powered bots can also be utilised to pull out and validate all of a charity’s invoice data, match those invoices to purchase orders and/or proof-of-delivery receipts and submit invoices straight to the ERP platform.
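
As a rough illustration of that matching step (a sketch, not any particular vendor’s implementation), the core logic might look something like this, where the field names and tolerance rule are assumptions:

```python
# A simplified sketch of the invoice-to-purchase-order matching step described
# above. Field names and the tolerance rule are illustrative assumptions; a real
# AP bot would pull these records from the ERP rather than from in-memory lists.
from dataclasses import dataclass

@dataclass
class Invoice:
    supplier: str
    po_number: str
    amount: float

@dataclass
class PurchaseOrder:
    supplier: str
    po_number: str
    amount: float

def match_invoice(invoice: Invoice, orders: list[PurchaseOrder],
                  tolerance: float = 0.01) -> PurchaseOrder | None:
    """Return the PO this invoice matches, or None so a human can review it."""
    for po in orders:
        if (po.po_number == invoice.po_number
                and po.supplier == invoice.supplier
                and abs(po.amount - invoice.amount) <= tolerance * po.amount):
            return po
    return None
```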

 

Accounts Receivable: Continuing the financial theme, we get to Accounts Receivable. Possibly one of the most vital departments within any organisation, it’s often also one of the most manual and time-consuming… perfect, then, for robotic process automation.

Even today, many NonProfit’s will have their staff manually adding data from incoming purchase orders into one system, only to then also have to manually transfer that data to a different system.

It’s a slow process that, because of the number of steps, is prone to a lot of human error.

Every NonProfit organisation will have its own way of handling Accounts receivable, but if we’re discussing broad strokes, the disparate functions can normally be broken down to order processing, order fulfilment, invoicing and cash allocation.

RPA can automate all those processes, de-risking them through the elimination of human error. Cash flow will also see a massive improvement as incoming orders are automatically captured and downstream documents are produced without human intervention, then added straight into whatever ERP solution the NonProfit is using. That level of automation reduces an organisation’s days-to-pay time, which speeds up the overall collection process and increases cash flow.
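
To illustrate just one of those steps, here’s a hedged sketch of automated cash allocation: applying an incoming payment to a customer’s open invoices, oldest first. The data model is an assumption; a real bot would be reading and writing these records in the ERP itself.

```python
# An illustrative sketch of automated cash allocation: applying an incoming
# payment to open invoices, oldest first. The dict structure is an assumption.
def allocate_payment(payment: float, open_invoices: list[dict]) -> list[tuple[str, float]]:
    """Return (invoice_id, amount_applied) pairs, oldest invoice first."""
    allocations = []
    remaining = payment
    for inv in sorted(open_invoices, key=lambda i: i["due_date"]):
        if remaining <= 0:
            break
        applied = min(remaining, inv["balance"])
        allocations.append((inv["id"], applied))
        remaining -= applied
    return allocations

# Example: a £500 payment against two open invoices
print(allocate_payment(500.0, [
    {"id": "INV-002", "balance": 300.0, "due_date": "2021-09-01"},
    {"id": "INV-001", "balance": 400.0, "due_date": "2021-08-01"},
]))
# -> [('INV-001', 400.0), ('INV-002', 100.0)]
```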

 

New Employee Onboarding: A back-office function that often gets overlooked in terms of the amount of time it takes up is new employee onboarding.

The days of a job for life are long gone. Employees today expect a high degree of flexibility, accommodation and openness in any organisation they join… especially if that organisation is a NonProfit.

Whilst all that brings its own challenges to a recruitment process, it also means retention rates are a lot lower than they used to be… meaning onboarding processes are a prime candidate for automation, allowing HR teams or Talent Acquisition executives to focus on more important tasks.

Filling out employment forms, internal HR forms, the sending out of standard notifications, the chasing of signed documents, populating databases, running background checks… these are all relatively simple tasks to automate with a layer of RPA, letting staff once again focus on activities beyond routine paperwork with the added benefit of new employees receiving a seamless, hassle-free onboarding experience.
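
As a hedged illustration of how little ‘intelligence’ most of this needs, a single new-starter record can drive the whole checklist. The task list and dates below are placeholders, not a real HR workflow:

```python
# An illustrative sketch of automating the routine onboarding steps listed
# above: one new-starter record drives the standard tasks and their due dates.
from datetime import date, timedelta

ONBOARDING_TASKS = [
    "Send employment contract for e-signature",
    "Create HR system record",
    "Request background check",
    "Notify IT to provision laptop and accounts",
]

def build_onboarding_schedule(name: str, start_date: date) -> list[tuple[date, str]]:
    """Spread the standard tasks over the week before the start date."""
    return [
        (start_date - timedelta(days=7 - i), f"{task} for {name}")
        for i, task in enumerate(ONBOARDING_TASKS)
    ]

for due, task in build_onboarding_schedule("A. Volunteer", date(2022, 1, 10)):
    print(due, "-", task)
```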

 

Existing Employee Data Management: Keeping staffing records up to date has to be one of the most manual and labour-intensive tasks faced by any HR team.

Data changes and data requests, especially in a larger organisation, will come flying in from multiple departments and multiple systems, likely with various stakeholders all expecting immediate actions.

Any time an immediate action is required then RPA should be a lead consideration. In terms of HR processes it can help bridge the gap between siloed systems, triggering notifications after certain pre-set actions and creating additional workflows to include all stakeholders as required.

IT Service Desks: The IT service desk(s) at a NonProfit will likely operate very similarly to those at other large organisations.

Service requests, incident management, problem management, change management, asset management and the accurate ticketing and reporting of these incidents… all of these are likely to occur on a daily basis. However, many of these problems will be resolved in a similar fashion and anything that’s repeatable can be automated.

With RPA and a touch of AI, problems like the ones listed above can be tackled proactively, using intelligent automation (the combined use of RPA and AI) to foresee problems and create triggers that will auto-deploy fixes for them as they arise.

That move from a reactive to a proactive model is at the heart of robotic process automation.

RPA can be used to triage incoming requests, routing them to the correct queue or service desk.

Simple issues (like password requests) can trigger automated responses that follow the approved IT workflow. And dare we say it, a triggered response can even be set up to check anyone raising a ticket has tried ‘turning it off and on again’ first.
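
As a simple illustration (the keywords, queue names and auto-responses below are invented for the example), rule-based triage of this kind boils down to something like:

```python
# A minimal, rule-based sketch of the ticket triage described above. The
# keywords, queues and auto-responses are illustrative assumptions; in practice
# these rules live inside an RPA or ITSM tool rather than hand-written code.
ROUTING_RULES = [
    ("password", "ServiceDesk-Identity", "Automated reply: a reset link has been sent."),
    ("printer", "ServiceDesk-Hardware", None),
    ("dynamics", "ServiceDesk-Applications", None),
]

def triage(ticket_text: str) -> tuple[str, str | None]:
    """Return (queue, auto_response) for a raised ticket."""
    text = ticket_text.lower()
    for keyword, queue, auto_response in ROUTING_RULES:
        if keyword in text:
            return queue, auto_response
    return "ServiceDesk-General", None  # anything unmatched goes to a human

print(triage("I've forgotten my password again"))
```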

That hands off approach to triage, dealing with many low-level issues before a human ever sees a ticket, means IT teams can concentrate on issues that are generally business critical, saving the NonProfit both time and money and increasing the employee satisfaction levels of your IT teams.

 

Managing Fundraising Campaigns: Many of the above functions may seem self-evident, but there are plenty of other ways a NonProfit can use RPA bots, one of those being the set-up and management of a new fundraising campaign.

Before a new fundraising campaign can even begin, someone at the charity will need to pull past donor records, generate new marketing materials, contact both past and new donors, collate expected donor payment info and enter it into the accounting system, update old, possibly out of date financial data and ensure the existing donor database is current.

Most, if not all of those tasks, can be accomplished with the push of a button once a suitable layer of RPA has been applied over existing systems and processes.

How Data Storage Management Will Change In 2022

How will the WFH trend affect data storage management and how will RPA and the cloud benefit that trend?

 

As the world slowly returns to a new normal and more and more workplaces adopt a hybrid working model, we explore how the working world (specifically data storage management) might change as we head towards 2022.

During the first lockdown almost all office work became an off-premises operation. Suddenly, it was okay to work in a shirt and pyjama bottoms, with fluffy slippers hiding under the desk.

However, that meant cloud data storage management and full digital transformation became, and continue to be, top priorities for organisations shocked into the reality of having to futureproof themselves. The pandemic unearthed an urgent need for businesses… the need to pivot and turn as external circumstances demand.

Once the dust of getting everyone set up with their work laptops and some hastily bought IKEA furniture had settled, the challenge of keeping communication flowing arose.

Organisations had to adapt, swapping water-cooler chats for a collaborative server functioning as both a message board and a social forum.
You also just couldn’t leave a stack of paperwork on someone’s desk anymore; you had to physically (or virtually, if you prefer) *shudder* message them for a follow-up on deadlines and updates.

The pandemic showed that a business needed to be both flexible and lightweight, with no more clunky server rooms; otherwise they ran the risk of being left behind by the more agile companies who were able to adapt better and more quickly to this new way of working.

No one knows with any certainty what the future might bring, but you can plan for uncertainties and being able to allow your staff to access their work from anywhere in the country (or world) is a huge step towards that goal.

With that said, let’s explore how cloud data storage will transform the data storage management landscape come 2022.

To kick us off…

What Is Data Storage Management?

There are several processes which go into data storage management, such as volume migration, storage virtualisation and process automation.

These are all designed to help store and manage your data, and those tasked with tackling the post-pandemic landscape for data storage management may feel like they have been given the job of taming a wild but prosperous jungle.

Remote Working

This new normal of corporate working is obviously going to have some bugbears come rushing out of the woodwork. Mostly because moving an entire workforce from an office environment to a working from home environment is like lifting up a rock and exposing all the ants living underneath it. Problems just seem to appear everywhere.

 

Working from home requires a lot of extra storage as everything has to be moved to a virtual landscape; meetings, casual conversations, project deadlines and all the social and professional interaction in between are hoisted into the cloud, which then needs to find a place to house it all.

Whenever an organisation introduces a new way of accessing data it weakens the security structure for a little while, until all the nuts and bolts are firmly secured.

When everyone is in the office, activity can be monitored; when everyone is working remotely it’s much harder to notice when data has been breached. Remote working also means staff are potentially working over limited, unreliable and unsecured broadband connections, with less security in place for information.

However, a move to the cloud was a necessity during the pandemic. Those who had digital transformation low on the list of priorities suddenly found themselves with it thrust to the forefront of their needs.

One of the main reasons for this is that consumers moved even further into reliance on online commerce and the majority of workers started working from home.

Now, it is essential to ensure you don’t get left behind.

What makes an organisation fall behind?

It’s often down to how quickly they can (or can’t) pivot. An already digitally transformed company is able to ride the waves of unprecedented events. It can buy more cloud storage or downsize existing storage to suit demand and its staff can access data securely from anywhere, for example they can communicate via Teams.

When you have an entire workforce passing information, processing data and collaborating on projects while spread far and wide, it won’t make sense in 2022 to have an organisation dependent on on-premises storage.

Cloud Storage-as-a-Service (STaaS)

Gone are the days of having to predict how much data storage you’ll need. 2022 will be the year of STaaS.

STaaS, or ‘Storage-as-a-Service’, is how you avoid locking yourself into a plan that leaves you either paying for excess storage or scrambling around because you didn’t buy enough.

STaaS means you pay for only the capacity you use.

That’s it; no lock-in, no clunky storage machine lumbering in the back office like an elderly grandfather. You simply scale up or down depending on demand. This storage flexibility is the key to survival.

It works by your organisation ‘renting’ space in a cloud network (pick your favourite) kind of like a subscription. If you no longer need the storage space you simply cancel it, or if you need more you adjust your service package.
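
A toy example makes the pricing model clear. The per-GB rate below is a made-up figure rather than any provider’s actual tariff; the point is simply that the bill tracks usage up and down:

```python
# A toy illustration of the 'pay for what you use' model: the monthly bill
# tracks actual capacity rather than a fixed, pre-purchased block. The price
# per GB is a made-up figure, not any provider's real tariff.
PRICE_PER_GB_MONTH = 0.02  # hypothetical £/GB/month

def monthly_staas_cost(gb_used_each_month: list[float]) -> list[float]:
    """Bill scales up and down with usage; no lock-in to a fixed capacity."""
    return [round(gb * PRICE_PER_GB_MONTH, 2) for gb in gb_used_each_month]

# Usage grows, then drops after an archive clean-up - and so does the bill
print(monthly_staas_cost([5_000, 8_000, 12_000, 7_000]))
# -> [100.0, 160.0, 240.0, 140.0]
```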

Flexi-Working

Talking of flexibility: it’s what 2022 will be all about. Not only have organisations had to scramble to tread water during the pandemic, but the post-pandemic world will require just as much agility and adaptability. What we’ve learned is, no one can predict the future and this uncertainty makes “future-proofing” an abstract concept.

However, with cloud storage it doesn’t have to be.

No one could have predicted a global pandemic to come along and completely up-end the way the office works.

Flexible working means you can employ people based on their skills and not necessarily on their commuting distance. Living further out and being able to commute in a couple of times a week is a much more alluring feature of office work than the rigid structure of the days of yore.

Automation, Automation, Automation!

2022 will see even more change in how data is processed and stored. It isn’t just a matter of putting it onto the cloud; it’s about using automated AI, machine learning and predictive analytics to identify and resolve storage issues faster. Taking data storage management off-premises will help future growth without the hardware limitations of the past.

Robotising your data storage management frees up your real workers for more important tasks that drive the real ROI. It’s more like setting up an autonomous robotic vacuum cleaner so it can do the hoovering while you’re out walking the dog/ferret/companion of choice.

Should Everyone Go Full ‘Cloud’?

Not all organisations will choose to go full off-prem. In fact, there are some strategies where a full migration into the cloud would not be viable and you’re better off hybridising the ‘as-is’ with the ‘to-be’.

Companies already in the throes of a digital transformation or cloud migration pre-pandemic will likely continue that journey in a post-pandemic world.

That’s the beauty of cloud data storage management; it’s flexible and allows third party applications.

Virtual Desktops

Desktop as a Service (DaaS) will be more commonplace in the way organisations function by 2022. The best thing about a virtual desktop is the collaboration and simplified desktop deployment. Let’s quickly go over the advantages of a virtual desktop if you’re not familiar with them:

  • Security: Security needs to be entrenched in the makeup from the get-go, not as a layer to paint over at the end near deployment, like a filter on Instagram just before sending your selfie off to be judged. Virtual desktops are superior to physical desktop machines because data is stored in the cloud and not on the physical machine. If the end-user’s device is stolen or lost, it doesn’t contain data for the prying eyes to gain access to.
  • Flexibility: There is a clear advantage to a virtual desktop because… well if you have a flexible workforce with a flexible desktop environment then IT administrators can allocate new desktops without thinking of the logistics of the physical hardware. Imagine setting someone up with all those wires and log-in information if they’re only going to be there for a short period of time? Blegh!
  • Cost: Obviously, less hardware = less cost. Organisations may need to factor in licensing but usually, the more you buy the cheaper it is anyway.
  • Easy Management: Think of your IT department’s backs! Being able to manage a number of virtual desktops means software updates can all be done at once without having to go from machine to machine.
  • Computing Power: A ‘thin client’ gets all its computing power from a powerful data centre, requiring less hardware and thus being a better friend to the environment.

 

So, you can see why many organisations are going to stick to desktop virtualisation after the pandemic. The rub, of course, is that administrators will have to ensure adequate storage resources to meet capacity and performance requirements and that governance adherence is stringently monitored.

The truth is the COVID-19 pandemic force-started a jump to the cloud for many companies, who have been adjusting to the change over the last year and a half.

2022 won’t see so much of a shift to cloud data storage management, but it will solidify that the shift is permanent.

Virtual desktops are going to become the new normal (are you sick of that phrase yet?) and from here it is basically going to be like wearing that pair of shoes. You know the ones; they’re perfectly formed to the soles of your feet and never pinch. The pandemic has essentially forced the workforce into a new pair of shoes, and we’ve spent the last year and a half breaking them in.

In some ways you can say the terraforming has already occurred and now the workforce must go in and reap the rewards.

cloudThing Named Launch Partner For Microsoft Cloud For NonProfits

cloudThing are honoured to have been named a launch partner for Microsoft Cloud for NonProfits.

 

For years, cloudThing have provided bleeding-edge software solutions to NonProfit’s in both the UK and globally, helping them digitally transform whilst donating IP to take them to the ‘next level’.

That’s why we’re so proud to have been named Launch Partner by Microsoft for their Cloud for NonProfit.

 

Microsoft Cloud For NonProfit takes Microsoft’s already awesome technology and aligns it to the NonProfit sector, considering the challenges they face daily and adapting their solutions to answer them.

Microsoft Cloud for NonProfits has been specially designed for fundraisers, Volunteer Managers & volunteer management systems, programme managers and many other unique roles and concerns specific to NonProfits.

 

Microsoft Cloud For NonProfit utilises Dynamics 365, the entire Power Platform, Azure and LinkedIn to help create a whole suite of NonProfit solutions, all underpinned by the Dataverse.

However, Microsoft Cloud for NonProfit will also come with advanced training for users as well as tech support for the various solutions and software.

 

As part of the announcement, Microsoft’s Tech for Social Impact team has been highlighting where Cloud for NonProfits can have the most benefit:

 

  • Know your donors and supporters
  • Deliver effective programming
  • Accelerate mission outcomes
  • Secure donor and programme participant data

 

cloudThing are honoured to have been named a launch partner for Microsoft Cloud for NonProfits.

The NonProfit sector is central to cloudThing’s core identity and empowering NonProfit’s with tech for social impact was one of cloudThing’s founding principles.

cloudThing have worked with dozens of NonProfit’s over the years and can’t wait to introduce them to Microsoft’s Cloud for NonProfits. – Robert Meehan – CMO, cloudThing

 

The NonProfit sector faces many challenges beyond just achieving their stated mission, and often feels a burden of responsibility to invest any money directly into their cause, rather than on organisational improvements.

This can lead to outdated processes and technology creeping into daily use, causing a slowdown in efficiency, as well as large data and skill silos between departments, volunteers, staff and supporters.

cloudThing can assist by first understanding a NonProfit’s ambitions as an organisation, then using technology to address challenges such as supporter engagement and personalisation of services, as well as ensuring business support staff and volunteers are productive and make full use of the technology available.

Through a focus on continuous improvement, we can help supercharge your processes, and deliver digital experiences across your NonProfit, even integrating into current systems.

 

 

Chinese Quantum Computer Beats Out Google’s 54-Qubit Sycamore

Google’s Sycamore quantum computer falls short of Chinese research team’s Zuchongzhi 2.1 quantum computer.

 

A team of Chinese researchers have claimed to have created two different types of quantum computers that can perform calculations that would be completely impossible for non-quantum computers, and that can outperform other competitors in terms of speed.

You might be thinking, “what the heck is a quantum computer?”

Fundamentally, they work differently to what we’d consider ‘classic’ computers. They use ‘qubits’ (quantum bits) which, through superposition, can hold a combination of the binary digits 0 and 1 at the same time rather than a single definite value.
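
As a purely explanatory sketch (a simulation of the maths, not how real quantum hardware is programmed), a single qubit can be described by two amplitudes whose squared magnitudes give the measurement probabilities:

```python
# A toy illustration of superposition: a single qubit is described by two
# complex amplitudes, and measurement probabilities come from their squared
# magnitudes. This simulates the idea; it is not real quantum programming.
import numpy as np

# Equal superposition of |0> and |1>
qubit = np.array([1 / np.sqrt(2), 1 / np.sqrt(2)], dtype=complex)

probabilities = np.abs(qubit) ** 2
print(f"P(0) = {probabilities[0]:.2f}, P(1) = {probabilities[1]:.2f}")  # 0.50 each

# Simulate measuring the qubit 1,000 times: roughly half 0s, half 1s
samples = np.random.choice([0, 1], size=1000, p=probabilities)
print(np.bincount(samples))
```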

The team, helmed by Pan Jianwei, a quantum physicist from the University of Science and Technology of China (USTC), state they’ve designed a 66-qubit superconducting quantum computer, making it 10 million times faster than the world’s fastest digital supercomputer and a million times more powerful than Google’s 54-qubit Sycamore quantum processor.

It has been named ‘Zuchongzhi 2.1’ after the noted 5th century Chinese mathematician and astronomer.

The 62-qubit superconducting prototype, Zuchongzhi, was unveiled by Chinese researchers in May, making the 2.1 an upgraded version.

Zuchongzhi 2.1 isn’t the only quantum computer floating around, however. In 2019, Google’s Sycamore processor achieved ‘quantum supremacy’ for the first time, far exceeding the performance of classical computer systems. Google says that Sycamore performed a specific task in 200 seconds – a task that would apparently take the world’s best supercomputer nearly 10,000 years to complete.

The Zuchongzhi 2.1 research team also lays claim to having built a novel light-based photonic quantum computer, named  ‘Jiuzhang 2.0’ which is said to perform tasks and calculations up to 100 trillion times faster than the world’s fastest existing supercomputer, and large-scale Gaussian boson sampling (GBS) 1 septillion times faster.

The experiments that the researchers ran on Zuchongzhi 2.1 and Jiuzhang 2.0 involved things like calculating the probability that a specific input configuration may lead to a particular output configuration.

These are simply not possible for conventional devices.

Jiuzhang 2.0 can, according to the researchers, sample the output of 10^43 possible outcomes 10^24 times faster than a ‘standard’ supercomputer.

They also noted that the sampling calculation performed by Zuchongzhi 2.1 would be about 1,000 times more difficult for a classical computer than the task Google’s Sycamore completed.

This indicates that our research has entered its second stage, to start realising fault-tolerant quantum computing and near-term applications such as quantum machine learning and quantum chemistry. – Zhu Xiaobo – Study Co-Author

 

 

Green Routes & Network Expansions – What Liverpool’s £710M Funding Will Do For Transport Infrastructure

Despite vows of invigoration and interoperability, fears remain that thousands of city residents will be excluded from new transport initiatives.

 

£710m of funding could be the answer to connectivity for the east of Liverpool, which has long been cut off from the rail network.

It comes after Liverpool City Region Metro Mayor Steve Rotheram vowed at the Labour party conference in September to create a transport system that was ‘better than they’ve got in London.’

An announcement was made earlier this week that the funding had been secured from the government, and it seems the mayor’s vow to invest in new transport infrastructure was not a hollow promise.

The funding will help launch the ‘transport revolution’, which will include new train stations, green bus routes and improved walking and cycling facilities.

The lofty vision of besting London’s transport system may still be in its infancy, but the massive cash injection sets a precedent for achieving those goals, acting as the first track being laid down.

The Merseyrail network will be extended to meet the needs of previously hard-to-reach communities, such as Skelmersdale, and it is the people of places with limited network services who will have been keeping the closest eye on the details.

Merseyrail is one of the best performing rail franchises in the country.

Down to the geography of the city, amongst other reasons, it has the luxury of full access to the rail network it operates on which results in fewer delays, since unlike competing franchises it doesn’t need to vie for precious platform space.

However, much of the east of the city is basically untouched – beyond the northern line it skips places like Ormskirk, Southport and Kirkby to as far as Hunts Cross.

The areas of West Derby, Knotty Ash, Croxteth, Norris Green, Tuebrook and Stoneycroft remain without a direct line to the city’s rail network which means a population of up to 70,000 are unable to enjoy interoperability of travel and instead have to rely on car use and bus travel.

The scale of the challenge of creating an infrastructure to rival London’s must be taken seriously into consideration and the focus must be clear, in order to create the level of connectivity throughout the entire city that London enjoys.

West Derby MP Ian Byrne feels as though the network for his constituency has devolved.

For me, Steve Rotherham and his team have done a magnificent job securing the funding. It’s a fantastic advancement. I fully support the plans for an integrated London style transport system. But it’s not London yet. London has fantastic connectivity. At the moment we’re far far from that. For me, it’s a case of looking at the gaps. 60 years ago we had far better connectivity. We’ve actually gone back. Hopefully now there is an opportunity with the Metro Mayor to revisit all areas of Liverpool which it desperately needs. West Derby has huge gaps. Train connectivity is something that is unbelievably lacking. It would make a huge difference to the infrastructure in east Liverpool if we had connectivity. – Ian Byrne – West Derby MP

 

City Council cabinet member and Cllr for the ward, Harry Doyle, a vocal supporter of improving the city region’s interconnectivity, has pointed out that while road connections are good, travel times have not been given enough consideration and need to be directly addressed and improved upon.

Knotty Ash is slightly further south than West Derby and it faces a whole slew of connection issues.

Firstly, it doesn’t have a train station.

It takes around the same time to get the bus from the Greyhound pub in Knotty Ash into town as it would to get the train from Chester into town. And that’s just not acceptable. For added context, Knotty Ash is six miles from Liverpool city centre. A bus journey could be between 45 minutes and an hour depending on traffic. Formby is 12 miles from Liverpool city centre. A train journey takes 30 minutes. It would take me less than half the time with a car. We have a good bus network across the city, but the travel time is not acceptable. Not when we’re trying to encourage more people to use public transport. – Harry Doyle – City Council Cabinet Member

 

‘Green corridors’ will be a core focus for the funding secured for transport reinvigoration.

It includes zero-emission hydrogen-powered double-decker buses and ‘green bus routes’, which are designed to prioritise travel and journey times through a combination of priority lanes, traffic signal upgrades, remodelled junctions and upgraded, accessible passenger facilities.

The busiest and most frequented bus route in the region, the 10A, will be the first of the planned green routes to serve the area – it runs from St Helens to Liverpool city centre through areas like Knotty Ash and Stoneycroft.

So who are the residents in danger of being left out?

In the north-east of the city, only those people of Croxteth who are old enough will remember West Derby, before it closed, as the area’s nearest working train station.

The centre of Croxteth is two miles away from Fazakerley, which makes that area its nearest connection to the Merseyrail line.

Because of this, areas like Croxteth and Norris Green are more reliant than most on the bus services – 80% of journeys in the City Region are in fact taken by bus.

Business Central Vs Sage – Which Does Your Organisation Need?

There’s a lot of overlap between D365 Business Central, a cloud ERP solution, and Sage 50 / Intacct, cloud accounting solutions… so which is the best fit for your organisation?

 

In this ‘new-normal’ a modern organisation needs their financial/ERP solution to be capable of being deployed instantly but also be flexible enough to pivot, adapt and grow with shifting market realities.

A platform capable of that, though, that’s also innovative, intuitive, adaptive and secure will always be a big investment for an organisation in terms of cost, time and resource, so it’s important the decision as to which to invest in isn’t taken lightly.

If you’re an organisation looking to purchase a new (or upgrade from an old) CRM, ERP or accounting solution then it’s likely your research will lead to either Microsoft Dynamics 365 Business Central, Sage 50 or the larger Sage Intacct.

All three are SaaS solutions built in the cloud, offering a range of features: some similar, some overlapping and some completely different.

 

Sage and Microsoft Dynamics 365 Business Central (formerly Dynamics NAV) are business/ERP solutions with varying degrees of accounting capabilities. All are go-to options for small, mid-size and enterprise level organisations (with Sage 50 being the favourite for very small companies), but each offers something different depending on the size, use case and number of end-users.

 

Sage 50 is the go-to for small, new businesses looking for their first CRM or accounting software; Sage Intacct is aimed more at midsize to large organisations requiring a greater emphasis on accountancy software; whilst Business Central is more of a ‘total’ ERP solution, touching on many points in an organisation (including accounts) and capable of seamlessly integrating with the rest of the Microsoft stack.

 

Sage offers its users features such as accounting, cash management, purchasing, vendor management, financial consolidation, revenue recognition, subscription billing, contract management, project accounting, inventory management and several financial reporting tools.

D365 BC offers a much wider set of features like financial management, relationship management, supply chain management, project management and HR management, plus, as already mentioned, full integration with other Microsoft applications.

 

How then, to decide which will be more useful for the organisation?

Which Is The Most Flexible Solution?

If COVID has taught organisations anything, it’s that, for any solution that gets adopted, flexibility has to be key.

If the platform you settle on is so inflexible it can’t instantly be adapted to meet changing needs then it’s dead in the water before it’s even begun.

 

Sage Intacct comes with a lot of changeable options baked right in, such as location, department, vendor, customer, employee, project or product etc. However, you can also add user-defined dimensions if needed.

The Sage Marketplace will also make dozens of other business applications available to you, which you can install to extend Sage’s core functionality as and when they’re needed.

Its level of customisability, however, often presents a steep learning curve to users not used to it.

 

D365 Business Central will let you do the same thing, except you’ll have the ability to build unlimited dimensions to any financial ledger to help facilitate any advanced accounting needs your organisation might have. It’s also available on mobile, allowing you to access many aspects of the business on the move.

Business Central can also be extended out through the AppSource Marketplace, and, thanks to the huge Microsoft community, there will be an app, many free, for almost any conceivable business situation or outcome that you can envision.

The flipside is that many organisations will come to D365 BC as part of a digital transformation and, whilst it won’t have the steep learning curve often attached to Sage Intacct, the integration process can be longer as old systems and data are moved over to Microsoft.

How Scalable Is The Solution?

Businesses grow and become more successful over time… or at least they should. If you don’t want to be paying for new platforms every time your organisation gets bigger then it’s important any solution you pick is scalable.

 

Sage 50 is what many new businesses will pick for entry level accountancy software and whilst it can handle up to around 50 users it will lack many of the functions both Business Central and Sage Intacct will offer.

At that point, migrating to Intacct is very possible as it’s capable of handling 50 – 1,000 users and comes with a range of additional accountancy and finance features. Migrating to Business Central is also possible but will likely require a third-party transformation partner to help.

 

Microsoft D365 Business Central can easily handle 1 – 200 users and even past that with the right development team, meaning it’s perfect for small, medium and even enterprise level organisations.

Whilst it might lack some of the more in-depth finance features Sage Intacct has, it offers a much wider breadth of functionality across the entire organisation.

That wider range of functions means, no matter which part of the organisation is experiencing a growth phase, BC can grow and scale in conjunction.

Is Accounting Or ERP More Important To Your Organisation?

This question, more than any other, will decide whether your organisation opts for Sage or Dynamics 365 Business Central.

If the business has complex and varied accounting needs, then it’s likely Sage Intacct will be the choice for you. Both Sage 50 and Sage Intacct specialise in accountancy functions, with Sage Intacct being the much more comprehensive of the two.

Sage Intacct does offer ‘some’ operational functions but they’ll be limited at best.

 

Business Central on the other hand was designed by Microsoft as a complete ERP solution, ideal for small, medium and enterprise level organisations and capable of managing not just accountancy functions but also manufacturing, warehouse, project, supply chains, customer relationships and customer service.

Whilst it can’t offer the same depth of accounting functionality that Sage Intacct does, its much broader capabilities make it the ‘go to’ for many.

How Often Will It Be Updated?

One of the great benefits of SaaS platforms is that they’ll be regularly updated for you (for free) with new features, improved security and bug fixes automatically, with no additional downloads or installation needed.

Microsoft and Sage both offer extensive upgrade cycles, with (bar important security patches) Business Central receiving two big updates a year plus multiple smaller monthly updates and Sage Intacct receiving four slightly smaller, but slightly more regular updates a year.

Which Is Cheaper, Business Central Or Sage?

Of the three choices, Sage 50, Sage Intacct or Business Central, Sage 50, as the entry level option, will be the cheapest, but all you’ll receive in return for that is entry level accounting functionality and a basic CRM.

 

Business Central will be much cheaper than Sage Intacct (much), but the caveat is that BC works best in tandem with other Microsoft products (from Office right through to other Dynamics 365 products). It will integrate seamlessly, for instance allowing a user to create and send POs directly from their Outlook email, but those additional products can, depending on an organisation’s needs, sometimes drive the price up a lot.

 

Finally, Sage Intacct will be the most expensive of the three, offering a deep (but narrow) focus into accountancy software.

The ability of Sage Intacct to drill down into a business’s core financials is what attracts a lot of people, but it does limit the scope of what can be achieved by the solution.

Sage Intacct does have an open API, so it can be integrated with other platforms, systems and apps but this will almost always require a third-party developer, which will again ramp up the cost of the solution.

So Which Is Better? Sage Or Business Central?

As you can probably tell, the answer to that question will be… it depends.

 

Sage Intacct is a powerhouse when it comes to helping accounts teams modernise, automate and improve their processes. However, an accounts team is just one aspect of an organisation.

If you’re looking for a much wider solution with a breadth of functionality to span the entire organisation then Business Central is likely to be the solution you require.

Cancer Diagnosing & Decision Making AI Approved For Use In UK

A massive game-changer for cancer diagnosis and decision making is said to be as accurate as lab-testing – with results found in minutes.

 

Diagnosis has been sped up due to a ground-breaking AI-based test that predicts the most effective form of treatment from images of routine cancer samples, which also cuts costs and saves time on lab-testing – and it’s now approved for use in the UK and EU.

It’s called the PANProfiler, developed by Cambridge-based company Panakeia, and it works by analysing digital images of routine breast tumour samples that normally require being observed under a microscope by a trained pathologist to judge the best course of action.

After the pathologist has checked the sample, a further sample is sent off to discover the next steps that need to be taken, with the wait time being days or weeks for results, and costing hundreds or even thousands of pounds.

But the PANProfiler Breast Test removes all that waiting and cost. It can scan and analyse the digital image of the sample and predict whether it contains ER or PR receptors, which categorises patients as needing hormone therapy, or HER2, which is treated with the drug Herceptin.

The test far exceeds existing tests in terms of time and cost efficiency and the accuracy is comparable to lab testing, and it’s able to do all of this in mere minutes just from a digital image. Time, in these instances, is the most precious resource for both patients and doctors. Having the patient journey significantly reduced means the time from diagnosis to treatment is cut tremendously, but also the burden on busy laboratory services is reduced – COVID-19 has undeniably created a backlog on cancer diagnosis during the pandemic and the new AI-based testing method will go a long way to freeing up those services.

As of 13th October, the test now has UKCA and CE approval for clinical use by health services in the UK and EU. It seamlessly integrates into current digital procedures being employed in cancer pathology, and is being trialled in hospitals around the UK, with expansion plans for Europe, North America and Asia.

So how did Panakeia’s innovative technology come to be?

Co-founders Pahini Pandya, a former cancer scientist at the University of Cambridge, and AI researcher Pandu Raharja-Liu found in their research that there were almost imperceptibly small differences in the appearance of cancer cells – so small in fact, that they require a computer to see – and these differences reveal the best treatment options due to the information gleaned from their molecular state.

The Panakeia team is now developing similar tests for other tumour types, in the wake of the PANProfiler Breast test’s release.

The company’s mission is to speed up decision-making in cancer diagnosis and treatment, spawned by Pandya’s lived experience of waiting and waiting for the results of tests for blood cancer – a disease she sadly lost her childhood best friend to – though fortunately the results of Pandya’s own tests came back negative.

I know first-hand the anxiety of waiting for your test results. Due to the pressure on labs, even in the best healthcare systems, diagnosis and treatment decisions can take weeks – an unacceptable and stressful delay when dealing with a fast-growing cancer. We’re excited to be rolling out PANProfiler to hospitals here in the UK and around the world to speed up access to treatment and help save lives. – Pahini Pandya – Panakeia Co-Founder

 

Raharja-Liu, who has unfortunately lost family members to the disease, adds:

This is a golden opportunity to transform cancer diagnosis. We can now do something that nobody has achieved before – to see more from every tumour sample, gathering rich information about what these cells are like and how best to treat them. – Raharja-Liu – Panakeia Co-Founder

This exciting technology has the potential to save laboratory resources and also to improve turnaround time for biomarker results for patients with invasive breast cancer. – Professor Sarah Pinder – Chair of Breast Pathology at King’s College London & Lead Breast Pathologist at Guy’s & St Thomas’ Hospitals

 

 

How To Debug Something With A Rubber Duck

What does a computer programmer and a rubber duck have in common?

 

First things first, we haven’t lost our minds.

Solving a software debugging problem with a rubber duck really is a technique used by a lot of developers.

It doesn’t have to be a rubber duck though. There are plenty of other names for this technique, often involving either an inanimate object, a puppy or kitten, a small child or, at a pinch, a fellow coder… but our favourite will always be the humble rubber duck.

What Is Rubber Duck Debugging?

So how do you go about solving complex problems with nothing but your wits and a rubber duck?

 

There’s a big difference between how a human thinks and how a computer thinks.

When a programmer runs into a bit of a problem with their code, it’s more than likely because they’re looking at something differently to how a computer will.

Computers are precise, logical, and, seen from a certain perspective… quite inflexible. The counterpoint to that is the human brain which will be quite forgiving of someone explaining something to them, filling in the blanks where required from their own imagination or experience.

That’s why so many bugs a programmer has to deal with will come down to their instructions (code) not being quite precise enough.

 

But what about the rubber duck we hear you cry?

 

Rubber ducking (as it’s sometimes called – “Have you tried rubber ducking it?”) helps a programmer get round that problem.

The programmer knows the intent of the program, the computer doesn’t. Your rubber duck then, will help bridge that gap.

In essence, if a programmer hits a roadblock, they’ll explain what they’re trying to do, and the code they’re doing it with, line by line, to the rubber duck.

The process of having to slow down and explain what’s happening, line by line, to an object that knows nothing about programming, often helps the coder identify the problem pretty quickly by simultaneously explaining what it’s supposed to do and what it’s actually doing.

It also helps to change a programmer’s perspective somewhat, as what they’re doing is actually ‘teaching’ the rubber duck, forcing them to view their code from a different angle and thereby offering a deeper understanding.

The use of the rubber duck also means this can be achieved without having to disturb anyone else… always a benefit when you’re trying to hunt down an embarrassing mistake or glitch in your code.

How To Rubber Duck Debug

Now you know the theory, applying it to a practical method couldn’t be easier!

 

  • Step One – Beg, borrow, build or steal a rubber duck from somewhere – The default yellow variety is fine but let’s be honest, if you’re going to be talking to a rubber duck, then the more outlandish the better.
  • Step Two – Place said rubber duck somewhere prominent on your desk – Feel free to put it in a glass case you can break in case of emergencies (depending on how often you have to check your code anyway).
  • Step Three – Upon encountering a problem, explain to your duck what it is you’re trying to achieve, then walk it through your code a line at a time – don’t skip any details, rubber ducks love details.
  • Step Four – Enjoy your moment of revelation as you explain to the rubber duck what you’re doing and realise that isn’t what you’re doing after all.
  • Step Five – Quickly explain to all your colleagues you haven’t gone insane.

 

The reason rubber ducking works so well is that by following this process you’ll almost always find your problem, normally a small typo like a misplaced div – The duck always reveals something!
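
Here’s a made-up example of the kind of bug the duck tends to surface. Explaining the function line by line (“then, for every item, I… wait, why am I resetting the total inside the loop?”) is usually all it takes:

```python
# A hedged, made-up example of the kind of bug the duck tends to surface.
# Explaining it line by line ("then I add up every item's price...") exposes
# that the total is being reset inside the loop rather than before it.
def basket_total(prices):
    for price in prices:
        total = 0          # bug: explaining this line aloud reveals the reset
        total += price
    return total

def basket_total_fixed(prices):
    total = 0              # initialise once, before the loop
    for price in prices:
        total += price
    return total

print(basket_total([1.50, 2.00, 3.25]))        # 3.25 - only the last item
print(basket_total_fixed([1.50, 2.00, 3.25]))  # 6.75 - what we meant
```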

The Psychology Behind Rubber Duck Debugging

The reason rubber ducking works so well is that explaining something to someone else (in this case your faithful rubber duck) causes an actual shift in your thinking process.

The first thing to happen is that you’ll be forced to slow down and pay a lot more attention to what you’ve typed as most of us will think a lot faster than we can type (or code). Having to verbally explain what you’re doing will likely make you a lot more accurate.

 

The second benefit to rubber ducking is that you’re forced to work to your rubber duck’s level of coding knowledge (and rubber ducks make awful coders… trust us).

Your rubber duck won’t know nearly as much about the problem as you do. Simplifying the problem to a level where a rubber duck can understand it will almost always reveal the solution.

Where Can I Get My Own Rubber Duck?

Are you telling us your bathroom doesn’t already have a rubber duck you can ‘borrow’ to talk code with?

Never fear, they’re incredibly easy to buy online… but we still think there should’ve been one in your bathroom already!

Using Design Thinking To Empower Digital Transformations

Using Design Thinking to make sure your next Digital Transformation project is a success

Design Thinking.

It sounds like one of those fads that get spouted around by tech companies every so often but ultimately don’t mean anything… doesn’t it?

Well it might sound like that but Design Thinking actually has some ‘legs’ (so to speak) and has been making a lot of waves in recent years, in industries and sectors completely unrelated to technology (as much as any sector can be divorced from technology these days anyway).

Thousands of organisations, across a wide range of sectors, have already seen the benefits of adopting Design Thinking principles, especially when it comes to complex and nebulous projects such as Digital Transformation.

 

But what is Design Thinking, how can it be successfully implemented into current processes and, most importantly, can it/how will it benefit your organisation?

What Is Design Thinking?

Not sure Design Thinking is right for your organisation? Not convinced you’re ‘techy’ or creative enough for it to apply?

Well worry not as none of those are essential for the implementation of a Design Thinking led culture.

 

At its most simple, Design Thinking is about adopting an end-user approach to all your policies and procedures and then creating tools or solutions that best benefit said end-user.

Design Thinking has to be both a set of policies that enable it and a cultural norm within the organisation that empowers it and ensures it delivers results, both practical and creative.

 

Design Thinking is heavily based on its origins within the tech sector, relying on the methodology and processes that are the norm there, but it has evolved in recent years to embrace other sectors and other ways of thinking and working, meaning it can now be applied to any problem, in any sector (although for obvious reasons we’ll mostly be focussing on Digital Transformation).

 

Design Thinking is incredibly user-centric, always putting the people at the heart of any solution by seeking to understand what it is they actually need.

A Design Thinking led approach encourages designers (and their wider organisation) to consider what the end solution will be used for and more importantly… how.

Its aim is to understand real-world situations as opposed to ideas that might just look good on paper, which in turn leads to a much deeper and more successful user experience.

True Digital Transformation Is Tough

According to reports from 2018, around 86% of digital transformations fail. If those numbers still hold true, then only 14% of digital transformation projects will achieve their desired goals.

14%!

That’s ridiculously low.

Why is that though… and can Design Thinking help?

 

Common problems cloudThing often see from organisations attempting to digitally transform normally originate from fragmented approaches and a lack of communication between departments within an organisation.

The key to making a success of any digital transformation is bringing along the end-users’ hearts and minds.

An example we’re sure many will relate to: there’s no point spending half a million on a new, cutting edge finance suite if Deborah in accounts payable is going to keep using her Excel spreadsheet because she doesn’t trust or understand these ‘newfangled ideas’.

 

Design Thinking approaches instead ask the question… ‘who will be using this software and what do they need to better do their jobs?’

Instead of just focussing on the tech (which is still important) more attention is paid to the end-user.

After all, the people using the technology in a Digital Transformation are the ones who will ultimately decide its success, which actually makes them more important than the tech itself in many ways.

Making Digital Transformation A Success Through Design Thinking

Design Thinking is a way of looking at problems in new ways to come up with new, user-centric solutions. To help facilitate that it’s been broken down and codified into four principles and then a further five phases.

THE FOUR PRINCIPLES OF DESIGN THINKING

  • The Human Rule: No matter the project (or goal) all design is basically social in nature so any solution should always be based on a user-centric point-of-view.
  • The Ambiguity Rule: Ambiguity is one of those things you just can’t avoid. That being said, experimenting at the outer edges of your knowledge and ability will always be crucial to seeing things differently.
  • The Re-design Rule: All design is re-design. The technology or environment may change but at their core, basic human needs will always remain the same. That means any designer, working on any solution, is only redesigning the method to fulfil those needs in a different (and hopefully more efficient) way.
  • The Tangibility Rule: Design ideas should always be made tangible as soon as possible to allow their creators to explain them more effectively.

THE FIVE PHASES OF DESIGN THINKING

Using those four principles, Design Thinking can then be further broken down into five discrete phases… Empathise, Define, Ideate, Prototype & Test. Though discrete, these phases aren’t necessarily linear and whilst stages should never be skipped entirely, it’s quite common to move back and forth and loop around again as you get closer to the perfect solution.

 

  • Empathise: Empathy is the starting point for all Design Thinking (or should be). It’s the first phase that should be considered in the Design Thinking process and is where the designer(s) should spend time getting to really know and understand the people who’ll be using their solution. Understanding their wants, their needs, their desires. Understanding them on as deep a psychological and emotional level as possible. It’s vital during this phase that designers put aside all pre-conceived assumptions and focus on gathering real insights about the end-user.
  • Define: The second phase of a Design Thinking led approach will always focus on defining the problem. All the previous findings from the empathise phase will be gathered and the designers will start to sift through and understand them. Where are the pain points? Are there any identifiable patterns? Can the issues be triaged?
    Come the end of the define phase, there should be a clearly defined problem statement that everyone can agree on. From here, the key will be to frame that problem in a user-centred way, for instance, rather than statements that start “we need to…” the designers might say, “Accountants at XXX organisation need…”
    Once the problem has been formally scoped out, you can start to work on specific solutions… Which is phase three.

 

Before we move on to phase three though, something cloudThing always recommends to help with the first two phases is the creation of something we call user personas.

A user persona is a made-up ‘character’ that will represent the end-user of the solution during the design process.

These personas (and it’s rare when there aren’t multiple personas for even a single solution) are created based on the data collected in the empathise and define phases. It’s vital that these personas are as factual as possible, with personalities/job roles/needs based off the data collected, rather than just ‘guessing’.

The difference these personas can make to a digital transformation is truly astonishing.

Once created, any solutions can then be designed to fulfil a much wider (and detailed) set of specific wants and needs whilst considering past user experiences, attitude and bias to create a tailored solution.
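If it helps to picture what a persona might look like once it’s written down, here’s a purely illustrative Python sketch (the field names are just our suggestion, not a formal template) showing a persona captured as structured data, so it stays tied to the evidence gathered in the empathise and define phases…

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UserPersona:
    """Illustrative structure for a user persona built from empathise/define findings."""
    name: str                                              # fictional name for the persona
    job_role: str                                          # e.g. "Accounts Payable Clerk"
    goals: List[str] = field(default_factory=list)         # what they need to do their job better
    pain_points: List[str] = field(default_factory=list)   # frustrations surfaced during empathise
    evidence: List[str] = field(default_factory=list)      # interviews/observations the persona is based on

# Entirely made-up example data
deborah = UserPersona(
    name="Deborah",
    job_role="Accounts Payable Clerk",
    goals=["Reconcile supplier invoices quickly"],
    pain_points=["Doesn't trust tools she can't export to Excel"],
    evidence=["Interview notes, 04/03/2021", "Job-shadowing session"],
)
```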

 

  • Ideate: Once a thorough and detailed understanding of the users has been reached and the problem sufficiently defined based on user-personas then it’s time to start thinking about solutions. The third phase of Design Thinking, ideate, is the fun bit, where no idea is too outlandish or crazy. In fact, that’s the entire point; this phase has to be completely judgement free. Designers need to be able to conduct ideation sessions where as many new ideas, angles and approaches are thought of as possible.
    As the ideate phase comes to a close it’s important to narrow things down to just a few, solid ideas to move forward with.
  • Prototype: The penultimate phase in a Design Thinking approach focuses on experimentation; on turning the wild ideas of the previous phase into a practical reality. To do so, prototypes are used.
    The prototype should be a scaled-down version of the end-solution that can be interacted with by all stakeholders to scout out any potential flaws or deficiencies.
    Depending on how that goes, it’s completely normal within the prototype stage for solutions to be accepted, rejected, improved or redesigned based on stakeholder feedback.
  • Test: The final phase, after prototype, is testing. However, just because it’s the final stage doesn’t mean it’s the end of the Design Thinking process.
    It would be nice if it were but real life doesn’t often work that way and it’s quite common for the results of the testing phase to lead back to one of the previous phases (hopefully not empathise… but it does happen).

The Benefits Of Design Thinking

As we said at the beginning of this article, Design Thinking can have a huge influence on the success of a Digital Transformation project, ensuring that the end solution is not only fit for purpose but also interesting/attractive enough to sway the hearts and minds of its end users.

No Digital Transformation can succeed without that.

Design Thinking comes with a lot of other benefits though, most notably:

 

Reduces The Time To Market: One of the biggest benefits to a Design Thinking led approach is how much more quickly an organisation can arrive at an MVP (minimum viable product).

 

Increased ROI/Reduced Costs: The faster you can reach a state of MVP, the quicker you can get a solution to market. The quicker you can get a product to market, the lower your costs and the higher your ROI will be.

Design Thinking has a proven track record of empowering and accelerating Digital Transformations, to the point where a study by Forrester estimated that it can increase ROI by 85% or more.

 

Design Thinking Isn’t Just For Designers: One of the best things about Design Thinking is that it isn’t just for designers (or even the design industry as a whole).

As a process, Design Thinking encourages inter-departmental collaboration, group thinking and the cross pollination of ideas and processes across an entire organisation.

 

Design Thinking Empowers Innovation: During the empathise stage, cloudThing always seek to understand why a company has a particular process or way of doing things. The most common answer we get to why something is done a certain way is “because we’ve always done it like that”.

Design Thinking challenges pre-held assumptions and established beliefs. It’s about encouraging all the stakeholders in a solution to think outside of the box.

It should go without saying that an organisation that fosters a culture of innovation will always outperform one that doesn’t.

 

Design Thinking Increases Retention: It doesn’t matter if you’re a NonProfit looking to increase donor retention, a Membership Organisation looking to increase member retention or you just want to improve customer loyalty; at its heart, Design Thinking is a user-centric approach and putting your users at the heart of everything you do will always increase their loyalty over the long term.

How To Implement Design Thinking, Company Wide

Hopefully by now you’ll have an inkling of the awesome power Design Thinking can have in ensuring a Digital Transformation’s success but… implementing it company wide can be a little daunting.

Fortunately, there are some very easy steps that can be taken to foster a culture of collaborative Design Thinking.

 

Invite Everyone: No one likes a long meeting but it’s important within a Design Thinking led approach that everyone gets their say. This may lead to longer meetings in the short term but in the long term will offer a wider perspective and larger idea pool.

There’s no way to tell who’ll have the best idea unless you invite everyone!

 

Accept That Everyone Is Different: Not everyone in the meetings will be as creative as others. Not everyone in the meetings will be as strategic. Or technical. Or confident.

And that’s absolutely fine. In fact, it’s the whole point. Everyone will have a slightly different perspective  and reflecting those different perspectives in the end solution is what Design Thinking is all about.

 

Make It A Judgement Free Zone: Not everyone will be as confident as others in voicing an opinion or putting forwards an idea. To help in overcoming that barrier it’s important to establish a safety net as early as possible so that participants in the process feel confident in highlighting their ideas.

Even the wackiest, off the cuff idea can sometimes be run with and turned into something awesome.

 

Remove All Barriers To Collaboration: Design Thinking can’t take place without collaboration. Whilst the blockers stopping that will vary from organisation to organisation, removing them to allow the free flow of ideas always needs to be one of the first steps taken.

 

Allow Others To Shine: If you’re running one of the Design Thinking phases it’s important to give others time to share their opinions. It’s too easy to start talking and not leave space for others to add their input. Making room for that in the meeting schedules is vital.

 

It’s OK Not To Be Perfect: During each of the Design Thinking stages, ideas, thoughts and even prototypes will be presented, rejected and re-designed, multiple times.

Those redesigns don’t have to be perfect, just better. Fix the issues raised and then send it back for everyone’s opinions.

How Business Central Can Keep You GDPR Compliant

Learn how to stay compliant with the awesome tools provided by D365 Business Central

 

We recently wrote an article on the importance of classifying your data and the benefits it can bring to an organisation.

However, depending on the amount of data you have, classifying it may seem a bit of a daunting task! Fortunately, Microsoft’s Dynamics 365 Business Central is here to help.

Many different territories operate different data protection regulations. One of the best known is the EU’s GDPR, or General Data Protection Regulation.

 

GDPR states there are different reasons for holding data and an organisation needs to classify why they hold each piece…

 

  • Consent – Under consent, an organisation can process an individual’s data if that person has consented to it.
  • Contractual Necessity – An individual doesn’t need to consent to their data being processed by an organisation if that data is needed for a contractual necessity. This also affects the Right To Be Forgotten – some information may need to be retained if it’s required as part of a pre-existing contract.
  • Compliance With Legal Obligations – As with contractual necessity, it’s entirely in line with GDPR requirements to process an individual’s data if the organisation is required to do so to fulfil a separate legal obligation.
  • Vital Interest – This is one of the rarest reasons to process an individual’s data but in life and death scenarios (and life and death doesn’t mean they just have to get your latest sales email) it’s entirely within the remit of GDPR to do so.
  • Public Interest – Another form of data processing that’s compliant with GDPR but that most organisations won’t see (it’s more common in news outlets, for example) is the processing of an individual’s data when acting in the public interest.
  • Legitimate Interest – Legitimate Interest is by far the broadest category of classification for processing data, applying wherever an organisation can demonstrate a legitimate interest in the processing (see the short sketch after this list).
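As a quick, purely illustrative aside (this isn’t a feature of any particular product), the record-keeping GDPR expects here boils down to tagging every piece of personal data you hold with one of those six lawful bases… something like this Python sketch:

```python
from enum import Enum
from dataclasses import dataclass

class LawfulBasis(Enum):
    """The six lawful bases for processing personal data under GDPR."""
    CONSENT = "consent"
    CONTRACTUAL_NECESSITY = "contractual necessity"
    LEGAL_OBLIGATION = "compliance with legal obligations"
    VITAL_INTEREST = "vital interest"
    PUBLIC_INTEREST = "public interest"
    LEGITIMATE_INTEREST = "legitimate interest"

@dataclass
class PersonalDataRecord:
    subject: str          # who the data relates to
    description: str      # what the data actually is
    basis: LawfulBasis    # why the organisation holds it

# Hypothetical example: a billing address held because a contract requires it
record = PersonalDataRecord(
    subject="jane.doe@example.com",
    description="Billing address on an active support contract",
    basis=LawfulBasis.CONTRACTUAL_NECESSITY,
)
print(record.basis.value)  # "contractual necessity"
```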

 

As you can see, with so many different ways to classify data, it’s important to have a reliable way to do it… like D365 Business Central.

 

1. The first thing you’ll need to do if you’re hoping to classify data in your system for GDPR is to make sure you’re signed in correctly. If you don’t sign in as an Administrator of Users in the User Groups and Permissions role centre, you won’t be able to access any of the awesome GDPR tools D365 BC has as standard.

It’s been set up that way as it’s a legal requirement that only authorised users (such as a Data Protection Officer) can access the privacy features within.

 

 

2. After you’ve logged in with the correct profile you’ll find Business Central has added a Data Privacy activity pane that lists all of the handy GDPR features you can use.

3. Clicking on Data Privacy will show you these options…

 

 

4. Data Classification will, as you’d expect, open up a Data Classification worksheet that will enable you to set the correct level of data sensitivity for all of your tables (both standard and custom).

 

 

5. If you click the Set Up Data Classification button you’ll be presented with a wizard (a Data Classification Assisted Setup… not a graduate of Hogwarts). From here BC will let you import and export data to and from Excel, which will massively help if you ever need to change classifications.

 

 

6. Next you can go back to the Data Subjects Page. You’ll now see all the physical entities with their assigned classification attached. Once that’s done you can create a Data Privacy Utility so that, going forward, you’ll be able to see logs for every Data Privacy Activity.

 

 

7. Clicking on Data Privacy Utility will open up another wizard; this one will let you either export all of the data you hold on an individual in your systems (incredibly handy for Subject Access Requests) or create a complete data privacy configuration package.

 

 

8. Exporting data for a subject access request will export either all the data you hold or just the data you request based on a sensitivity level. You’ll be able to preview the export before it generates to make sure it all looks right and then generate an Excel spreadsheet which will be added to your role centre’s report inbox. If you instead create a data privacy configuration package, a data package for the subject will be created which you can then view and edit.

 

 

 

9. Once you’re done, you’ll be able to see a log entry in the Data Privacy Activity area, as GDPR requires a record of all activities related to data manipulation.

 

With these features in Dynamics 365 Business Central, any organisation should easily be able to handle the vast majority of GDPR issues that come their way.

Government Answers Calls To Teach More About Black History, Cultural Change & Migration

Government plans to widen conversations of diversity with publication of new model history curriculum.

Plans for a model history curriculum have been confirmed by schools minister Robin Walker, with the aim of enriching learning about “migration, cultural change and the contributions made by different communities”.

Subject expert Christine Counsell has helped develop the curriculum, after Nick Gibb – Walker’s predecessor – made plans to build a curriculum around diversity.

There have been calls for schools to teach more about black history, and the government’s plans to develop such a curriculum have followed.

Walker told a debate organised to mark Black History Month:

We will work with history curriculum experts, historians and school leaders to develop a model history curriculum that will stand as an exemplar of a knowledge-rich, coherent approach to teaching history. – Robin Walker – Schools Minister

This isn’t the first of such curriculums to be published, as earlier this year the government released a similar non-statutory model curriculum for music.

Diversity would be an important aspect of the model history curriculum, as we demonstrate how the content, themes and eras of the national curriculum can be brought to life by teaching them in an interconnected form throughout key stages. A diverse history can be taught because history is diverse. As so many members have said today: black history is British history. The curriculum would equip teachers and leaders to teach migration, cultural change and the contributions made by different communities to science, art, culture and society. We will announce further details in due course, but I am pleased to show our commitment to high quality teaching in this debate. – Robin Walker – Schools Minister

 

 

NHS To Prescribe E-Cigarettes In A World First – UK To Be Smoke Free By 2030!

The NHS hope to prescribe E-cigarettes to reduce smoking rates in the UK.

The NHS could soon be prescribing e-cigarettes to help people quit smoking.

The medical regulator is currently working with e-cigarette manufacturers in a move fully supported by the Government with the goal of making the UK smoke-free by 2030.

 

The MHRA (the Medicines and Healthcare products Regulatory Agency) will be publishing updated guidelines to pave the way to a smoke-free UK by 2030 by making medically licensed e-cigarettes available to prescribe to smokers.

Manufacturers of e-cigarettes are now being encouraged to contact the MHRA and submit their products for evaluation and approval through the same regulatory process that other medicines and equipment must go through before being used by the NHS.

If any e-cigarettes pass the process, it will make England the first country in the world to prescribe e-cigarettes to smokers.

 

Once an e-cigarette receives approval from the MHRA, it will be down to individual clinicians and health practitioners as to whether they’re prescribed to a particular patient or not to help them quit smoking.

NHS advice hasn’t changed: non-smokers and children are strongly advised against the use of e-cigarettes as they do still contain nicotine and aren’t risk free… though expert clinicians in both the US and UK have stated they are much safer than smoking.

 

Smoking remains the leading cause of premature death and, whilst rates are at a record low in the UK, there are still over six million smokers in England alone.

There are also huge variations across the country, with smoking rates over 20% in some areas but below 10% in others.

This country continues to be a global leader on healthcare, whether it’s our COVID-19 vaccine roll-out saving lives or our innovative public health measures reducing people’s risk of serious illness. Opening the door to a licensed e-cigarette prescribed on the NHS has the potential to tackle the stark disparities in smoking rates across the country, helping people stop smoking wherever they live and whatever their background. – Sajid Javid – Health and Social Care Secretary

2019 figures show that almost 64,000 people died in England from smoking or smoking-related issues, which is why the OHID (Office for Health Improvement and Disparities) is also throwing its weight behind this project.

Reducing health disparities across different regions in the UK, including smoking rates, and keeping people generally healthy has wide-ranging benefits for the individual, their wider family, society and the economy as a whole.

To help empower goals that support that, the OHID will be working at a national, regional and local level with the NHS, academia, NonProfits, scientists and researchers to ensure they receive the help they need.

 

 

What Can Business Central Actually Do?

D365 BC is a single, end-to-end solution to manage all an organisation’s operational needs

 

What Is Dynamics 365 Business Central?

Microsoft Dynamics 365 Business Central, formerly known as Dynamics NAV, formerly known as Navision (yes really… it’s that old!) is a standalone, end-to-end solution, capable of managing all an organisation’s business processes in one place as part of Microsoft’s range of awesome D365 business applications.

Its digital transformational power is immeasurable, allowing organisations to streamline and automate a vast array of business functions, processes and roles, meaning valuable staff can get on with much more important (and ROI heavy) duties.

It’s capable of handling an organisation’s finances, customer engagement, customer service, operations, sales, reporting and much, much (much) more.

It’s also great value for money as it can integrate with the full Microsoft stack (including Office 365) out-of-the-box, but it’s flexible enough to be customised for specific sector or industry needs that might be unique to a particular organisation, using Power Apps, Power Automate and Power BI for additional low code/no code solutions, massively reducing the cost of ongoing development.

What Makes D365 Business Central Stand Out?

What makes Dynamics 365 Business Central stand out against so many other ERP solutions has to be its flexibility.

We’ve already mentioned how it can be customised and extended with numerous low code/no code solutions through Power Apps and Power Automate… what we didn’t point out though was the months of development time that can save, without needing years of development experience to even attempt it.

It also means each user can be set up with their own homepage/dashboard, featuring different metrics or apps relevant to their role or themselves.

 

The fact that Microsoft invest in their stack so much also doesn’t hurt.

It means organisations gain access to an entire eco-sphere of amazing products, extensions, plug-ins and bolt-ons, from good ol’ Office 365, right through to the entire range of D365 products and finally to any apps you create yourselves on the Power Platform.

You’ll be able to create Business Central purchase orders directly within Outlook, a Power BI graph could appear right on your BC dashboard or you can manage and automate all of your warehouse’s operations… all at the click of a button!

 

Now one thing we sometimes hear from people researching ERP solutions is that Business Central’s sheer scope can feel a little overwhelming.

Fortunately, prospective users of BC can treat it a little like a pick & mix (we say a little… aside from the fact it’s not edible, you can treat it exactly like a pick & mix), just choosing the modules that will be of benefit to the organisation at the time.

It’s then very easy to add all the extras as the scope of the business grows, letting Business Central grow with you.

Enough about what Dynamics 365 Business Central is though… and on to what it can do!

Financial Management With Business Central

At its very heart, D365 Business Central is, and always will be, a financial software management system.

That’s not to say it can’t do other things, because it absolutely can (and will do so awesomely), but we’re starting with Financial Management as that’s what BC does best, is what it’s best known for and is its most commonly used function.

BC will provide the core functionality needed for an organisation’s basic financial management requirements, whilst also being flexible enough to address more complex issues such as multi-currency, multi-site and multi-company scenarios, all whilst making it child’s play to connect an organisation’s financial transactions and data across the business, across affiliate companies and even for multi-site or international firms (there’s a common misconception that if you’re an international firm Business Central isn’t suitable and you’ll need to pay for D365 Finance & Operations but that simply isn’t true… as we explain here >>).

In fact, the multi-currency functionality includes automated revaluation and recording of realised and unrealised gains/losses straight out-of-the-box.
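To put a (made up) number on what that revaluation actually means, here’s a simplified worked example… the figures and exchange rates below are invented purely for illustration:

```python
# Simplified unrealised FX gain calculation on a EUR 1,000 receivable (illustrative figures only)
invoice_eur = 1_000.00
rate_at_invoice = 0.85       # GBP per EUR when the invoice was posted
rate_at_month_end = 0.88     # GBP per EUR at revaluation

booked_gbp = invoice_eur * rate_at_invoice       # 850.00 originally posted
revalued_gbp = invoice_eur * rate_at_month_end   # 880.00 at month end
unrealised_gain = revalued_gbp - booked_gbp      # 30.00 recorded as an unrealised gain
print(f"Unrealised gain: £{unrealised_gain:.2f}")
```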

 

It also allows you to streamline financial processes whilst getting a real-time view of them in a single dashboard. The sharing of financial data becomes a lot more secure when done through Business Central thanks to Microsoft’s baked-in security-by-design principles, meaning budgets and cashflows, if required, can be a lot more collaborative, rather than siloed away.

 

In total, the functions provided by Microsoft Dynamics 365 Business Central are:

 

  • Cash Book
  • Consolidation
  • Credit Control
  • Fixed Assets
  • General Ledger
  • Inter-company Trading
  • Liquidity
  • Multi-Dimensional Analysis
  • Purchase Ledger
  • Sales Ledger

 

Getting more specific, a lot of automation becomes possible with BC. This is just a short list of some of the cool things it can do…

 

  • Incoming payments can be automated to be applied to the related customer, with the invoice being marked as paid at the same time, making the reconciliation of accounts a matter of seconds rather than the hours or days it used to take.
  • Data can be connected in to provide recommendations on the best time to pay vendors to take advantage of vendor discounts or avoid late penalties/fees.
  • The sales and purchase ledgers are all fully integrated with stock and order processing features but… If required, both sales and purchase ledgers can operate independently of the stock and order processing functions.
  • Sales invoices (credits) can be raised against the General Ledger to invoice clients for non-stock activities such as commission, expenses or other ad hoc items.
  • Purchase items can also be entered directly for non-stock activities (expenses, utility bills etc). That non-stock process is also capable of being started as a non-stock purchase order, which, as you’ve probably guessed, can then be later matched up to the purchase invoice.
  • It’s possible to spread purchase invoice lines over multiple and varied cost centres that can then be posted to prepayment accounts for later release. Once done, both cost spreading and those future period releases can then be set up as templates to be later replicated at the click of a button.
  • The costing of goods will automatically be worked through to the finance system. That’s achieved through valuing stock purchases at the time of receipt where it’s first accrued as goods received, but not invoiced, before being finalised or invoiced.
  • The costs of goods sold will be automatically released when a sales shipment is posted. There’s also a wide variety of costing methods that can be chosen, from FIFO and LIFO through to standard and average depending on organisational preference.
  • It goes without saying that the general ledger has a general journal feature but… in Business Central, it’s possible to journal between subledgers, for example between sales ledgers, the general ledger, cash book or fixed asset ledgers. Journals can then be saved as templates to allow for repeated use. It’s even possible to set up a recurring journal to take care of tasks such as monthly accruals or prepayment activities (even reconciliations)!
  • Whenever a transaction that could have a financial impact on the organisation, no matter how small, is recorded, the general ledger will be automatically updated (so if stock is bought or sold, adjusted on or off, money gets paid in or out etc). At the same time BC will also post any relevant VAT info into the VAT sub ledger to be used for later reporting.
  • The General Ledger is structured in such a way in Business Central that nominal accounts can be supported with unlimited dimensions for sub analysis of the business. Whilst, as we’ve said, those dimensions are unlimited and can be customised, some examples of the standard out-of-the-box options include cost centre, project, employee, sales channel and product type.
  • Whilst BC is capable of so much more, the final aspect of its financial management we wanted to highlight is fixed assets. Business Central’s fixed asset module means an organisation can create multiple depreciation books, each, if necessary, with their own depreciation method. Those assets can then be tracked by barcoding products, and a handheld device (such as a mobile or tablet) can be used to update their location and condition.

Business Central’s Reporting & Analytics Capabilities

Every organisation that’s ever existed has needed a reliable way to measure, report on and draw insights from their KPIs.

How mature that process is, however, has always varied from company to company. Understanding an organisation’s data and organising it in a way that will garner useful business intelligence can be hard.

Finding the right process or tool for that may have been difficult before but, fortunately, Business Central, combined with Microsoft Power BI, has the answer (according to us anyway).

 

Power BI For Business Central

Power BI is part of Power Platform and is an awesome, out-of-the-box visualisation tool (lots of amazing looking graphs for your data in other words) that was designed to integrate seamlessly with Business Central to create and automate reports and dashboards (visual report displays).

 

Power BI will pull information directly from Business Central (or other sources like Dynamics or even third-party applications like Google Analytics) and display it in a visual format that makes it easier to understand complex data, patterns and trends at a glance. Once created, those Power BI dashboards can then be embedded straight into BC, letting a user see what they need at a glance.
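Under the hood, Business Central publishes its data through standard OData/REST API endpoints, which is the kind of feed Power BI (or any other reporting tool) consumes. As a rough illustration only… the tenant, environment and entity names below are placeholders, and in practice you’d authenticate via Azure AD rather than handling tokens by hand… pulling the same sort of data programmatically might look something like this:

```python
import requests

# Hypothetical Business Central API endpoint - tenant and environment names are placeholders
BASE_URL = (
    "https://api.businesscentral.dynamics.com/v2.0/"
    "<tenant-id>/<environment>/api/v2.0"
)

def get_customers(access_token: str) -> list:
    """Fetch customer records from Business Central's standard API (illustrative sketch only)."""
    headers = {"Authorization": f"Bearer {access_token}"}

    # List the companies in the environment and take the first one for simplicity
    resp = requests.get(f"{BASE_URL}/companies", headers=headers, timeout=30)
    resp.raise_for_status()
    company_id = resp.json()["value"][0]["id"]

    # Pull the customer entity for that company
    resp = requests.get(f"{BASE_URL}/companies({company_id})/customers", headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json()["value"]
```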

Managing Purchasing Processes With Business Central

Monitoring your sales-orders and purchasing processes couldn’t be easier with Dynamics 365 Business Central as you’re provided, straight out-of-the-box, with a fully integrated order-processing suite that will manage both purchase and sales orders.

Once integrated they’ll work through automated workflows and dynamically updated inventory levels.

Those levels of automation will also help prevent unnecessary (or fraudulent) purchases as, with the right levels of approval in place, mistakes (and fraud) find it a lot harder to go unnoticed.

You can also modify invoices that have already been posted in the financial management system and document the correction without any of the issues you’d normally have. You can create purchase invoices and orders that will record the costs of purchases whilst tracking accounts payable, and even make the process of staying on top of expense claims easier through the implementation of advanced workflow and approval structures.

Another nice feature is the ability to automate tasks related to vendors, including the tracking of agreements (discounts, prices, payment terms etc) so nothing is ever missed again when items are ordered.

 

The best thing about the purchase processes within D365 BC however is that all that lovely functionality is linked right across the system, giving you total visibility, end-to-end, of the entire solution across the lifecycle of an order.

 

For the really nit-picky amongst our readers who are interested in exactly what’s possible, the full list of what Business Central can do is:

SALES ORDER PROCESSING

  • Alert Management
  • Catalogues
  • Credit Control
  • Document Format Management
  • eCommerce Integration
  • Mobile Sales
  • Order Promising
  • Price/Discount Management
  • Quote-to-Invoice Management
  • Retrospective Discount
  • Returns Management

PURCHASING

  • Alert Management
  • Catalogues
  • Consignment Management
  • Price/Discount Management
  • PO Budget Control
  • Requisitions-to-Invoice Management
  • Retrospective Discount
  • Returns Order Management
  • Vendor Management

 

The purchasing functionality baked into BC is also more than capable of giving easy visibility to users of stock levels, sales history and even future forecasts to create recommended purchase lists.

POs (purchase orders) are automatically generated and, with a clever bit of AI, all vendor information can be used to provide cost prices and a delivery expectation.

Without any input from a real person (post the initial creation) integrated workflows are capable of generating authorisation requests based on previously created business rules meaning, amongst other things, purchase orders will be able to be sent out automatically via whatever format you’ve designated (post, email etc). Goods can then be received into the stock management system and when a purchase invoice is received, it will automatically be matched with any outstanding receipts.

That process will, again, allow an organisation to automate the approval of invoices, sending them via workflows, and, once they’re posted in the ledger, invoice aging can be made visible to any purchase ledger user who requires it.

 

It doesn’t end there though (gasp).

BC also offers integrated document management features that allow organisations to do really cool stuff (depending on your definition of cool), like barcoding documents.

Why you ask?

A barcoded document can be scanned with a mobile device and automatically linked to the relevant file in Business Central.

When invoices then become due, they’ll automatically be added onto a payment suggestion. Once they’re approved, a BACS file can be produced or a cheque printed automatically. Business Central can even work with SEPA payment mechanisms!

 

Basically, what we’re trying to say is that the Purchase to Pay and Sales to Cash processes in Microsoft’s Dynamics 365 Business Central application can do a LOT of stuff. A lot of really complicated stuff that would previously have been many hours of manual work. Except now, after it’s been configured correctly, it can all be done at the push of a button (and oft times the button won’t even be needed).

Sales & Marketing Functionality Within D365 Business Central

Although Microsoft offer separate D365 Sales & Marketing solutions, Business Central actually comes with some Sales & Marketing features already built in.

Depending on the size and scope of the organisation, Business Central will integrate and extend seamlessly with these but if your organisation only needs limited functionality in those areas then D365 BC is a great, all in one, solution.

BC’s sales & marketing functionality allows teams to drastically cut the time from quote to sale by linking up an organisation’s sales process to its accounting processes.

Sales & Marketing departments can also work closer together by flagging sales enquiries right from Outlook, with multiple ship-to and bill-to addresses per customer easily automated including direct shipment and invoicing addresses.

 

The full functionality of Business Central’s Sales & Marketing includes:

 

  • Campaign Management
  • Contact Management
  • Document Management and Generation
  • Email Integration
  • Mobile Sales
  • Opportunity Management
  • Sales Order and Process Management
  • Tasks/Interaction Management

Warehouse Management Made Easy With D365 Business Central

Business Central’s Warehouse Management feature is one of its best known, just behind its financial management solution (in fact the two work really well together).

It offers a comprehensive organisational information solution capable of automating a whole host of manual tasks (obviously), whilst connecting processes and workflows across the organisation that enable fact-based inventory management, providing all relevant stakeholders with easy-to-read visibility into business performance, margins, ROI, profitability and potential growth opportunities.

It can be set up with a very simple layer of control, where the stock is stored in a warehouse and BC manages high level tasks like quantity. Purchase Orders get booked in and sales orders get sent to the customer.

 

Now BC is great… awesome, in fact, but there’s obviously a limit to how many manual processes it can automate. Warehouse staff will still be needed to help with the picking of orders.

That being said, BC can still help speed up even those types of manual functions.

 

Business Central can be installed on phones, tablets or any other kind of warehouse-specific handheld terminal (HHT) to speed up operations.

Put-away instructions can be shown on the device as the worker moves about the warehouse, highlighting via a map specific bin locations and the exact position of stock based on pre-set capacity and pick up zone configurations.

Moving away from handheld devices (although still possible on them of course) pick face replenishment, quality inspection and cycle counting can all be supported/automated through BC.

Again, listing the full capabilities of Warehouse management within Business Central:

 

  • Automated Data-Capture
  • Consignment Management
  • Courier Systems Integration
  • Cycle Counting
  • eCommerce Integration
  • EDI
  • Inventory Optimisation
  • Multiple Locations
  • Serial Number and Batch Tracking
  • Stock-Unit Management

 

And that’s all just straight out of the box, before the clever cloudThing  Business Central team start doing really awesome customisations!

Resource & Project Management Within Business Central

No one enjoys Project Management (project managers excepted).

Keeping track of tasks, statuses, people and goals is by its very definition, laborious.

That’s why Microsoft created the Resource and Project Management solution for Business Central.

With it, a user can complete manual resource and project management tasks easily, whilst still collating all the data needed to help manage budgets and monitor the ongoing progress of tasks.

It’s most commonly used for time and expense billing, as it comes with time and expense input, approval and the generation of sales invoices to customers. Organisations can also use it to track both machine and staff project hours using the built-in timesheets, and access real-time data on available resources whilst managing usage and profitability levels.

Customer projects can all be created, managed and tracked from within BC, seamlessly integrating with all of BC’s other functionality if required, to provide detailed job costing and reporting capabilities.

 

Dynamics 365 Business Central is a great end-to-end solution for project management on both simple and complex tasks, with the connectivity to BC’s other functions, plus the wider Microsoft stack, making it a great choice. The integration with the purchase ledger allows for other costs incurred against a job to be posted and used for the subsequent creation of sales invoices where relevant. WIP (work in progress) can be calculated and costs or revenues released to the G/L accordingly.

Other features include:

 

  • Resource Management
  • Capacity Management
  • Cost and Price Management
  • Job Lifecycle Management (Opportunity to Completion)
  • Timesheet and Expense Recording
  • Cost and Work Breakdown
  • Budgeting

Business Central’s Service Management Functions

It doesn’t matter what you’re looking for here (contracts, warranties, SLAs or NDAs), Business Central can optimise operational efficiency by providing up to date (and constantly updated), efficient contract management whilst enforcing business critical processes.

As well as contract and SLA management, the service management functionality within D365 Business Central also offers:

 

  • Item and component history
  • Mobile service
  • Work and material planning
  • Scheduling and Dispatch
  • Service Item Tracking
  • Service Order Management
  • Service Price Management

Empowering An Organisation’s Manufacturing Processes With D365 Business Central

Business Central, when it comes to manufacturing processes, is there to streamline and manage operations, from factories doing just a basic form of assembly right through to giant multi-site MRP.

BC, as with all its other functions, integrates, streamlines and manages every aspect of an organisation’s manufacturing process, including planning, scheduling, inventory levels, distribution networks and financial management, often using other BC functions to do so.

 

Manufacturing has come under a lot of pressure in the last few years (before we even start on the impact COVID has had). Many have found their market growth slowing or, worse, going into decline or reverse. The emergence of markets like China, offering low cost offshore solutions, has had a huge impact… at a time when customers are demanding lower priced, higher quality products with massively reduced lead times.

That’s where Business Central steps in, lowering operational costs through automation and the flawless integration of processes across the organisation.

The full manufacturing process suite includes:

 

  • Agile Manufacturing
  • Capacity Planning
  • Demand Forecasting
  • Finite Loading
  • Machine Centre Management
  • Production Bill of Materials
  • Production Order Management
  • Production Scheduling
  • Supply Planning
  • Version Management

 

All manufacturing firms have to manage their core sales, with stock management to buy and sell finished items, but BC’s manufacturing module provides additional controls when it comes to the consumption of raw materials, managing differing machine or human activities before finished items are output.

BC’s bill of materials and routing mechanisms support an MRP process and production orders (forecasted, planned and firm) are created to consume components.

Back or forward flushing is available as well as the discrete issuing of materials.

Features such as scrap management and sub-contractor manufacture are also integrated parts of the manufacturing module.

Plus, just in case you needed to create finished items from components but find that the manufacturing modules are too complex, Dynamics 365 Business Central provides a neat kitting module which allows for the simple creation of an assembly order, which removes the components from stock and creates finished items in one easy process!

UK Justice System Gets Largest Funding Increase In A Decade

It will be victims, courts and prisons that benefit from the MOJ’s new funding increase.

The MOJ (Ministry of Justice) will be pumping an additional £11.5bn into their budget by the end of the current parliament, a 12% boost year on year to help drive recovery across the UK.

An additional £2.2bn will be spent on the courts, prisons and probation service; £550 million will go to cutting reoffending rates and an additional £185 million will go as a boost to victim support services.

The new spending comes with a commitment to cut crime and restore the public’s trust in the UK’s justice system.

 

Another £1bn has been earmarked to boost capacity post COVID and accelerate the recovery after the pandemic. That includes £447 million to deliver swifter access to justice by improving waiting times and reducing court backlogs.

 

The additional £185 million to be spent on victim support services will fund an additional 1,000 independent sexual and domestic violence advisors by the end of 2023/25 and a 24/7 crisis helpline for victims.

 

18,000 additional prison places will use up a further £3.5bn, the biggest prison-building program for more than a century, with an additional £250m being spent on 2,000 temporary prison places in the meantime.

The pandemic created unprecedented challenges but this settlement is the largest increase in more than a decade for the justice system. That means we can focus on building a better, more efficient, justice system for all. The extra investment will help us protect the public by bringing criminals to justice quicker, reducing stubborn reoffending rates and supporting victims better than ever before. – Dominic Raab – MP, Justice Secretary & Deputy Prime Minister

Finally, £324 million will be spent over a three-year period to increase efficiency and timeliness in civil and family court and the threshold and eligibility of legal aid will be increased in civil cases and family court, benefitting millions annually.

Crackdown On Multinational Firms’ Tax Avoidance

World leaders in agreement over global tax reform for tech giants.

A new global tax deal has been agreed by the likes of the UK, US, France, Italy, Austria and Spain, in a decision to move away from national DSTs.

A new DST-credit system will ease the transition between the UK’s current DST and the beginning of the global tax system coming into effect in 2023.

An agreement of a global minimum corporate tax rate of 15 per cent on multinational firms has been met this month by 136 countries.

So who do the rules apply to?

The new global tax deal will apply to global companies with at least a 10 per cent profit margin and will see 25 per cent of any profit above the 10 per cent margin reallocated and then subjected to tax in the countries they operate.

The aim of the changes is to crack down on multinational firms who operate mostly digitally, who circumvent taxes by only paying where they have headquarters and not where they operate. The agreement also seeks to deter tax avoidance by filing profits in low-tax-rate countries like Ireland and Luxembourg.

It was found in June, according to the Fair Tax Foundation, that the biggest US tech firms paid almost $100 billion less in taxes over the past decade than stated in their annual reports.

They also state that the ‘Silicon Six’ (Amazon, Apple, Facebook, Microsoft, Netflix and Google’s owner Alphabet) have a severe discrepancy in their income taxes and their revenue. For example, they paid nearly $219 billion in income taxes from 2011 to 2020: about 3.6 per cent of their more than $6 trillion in combined revenue.

You might be wondering who the biggest tax avoiders are.

According to the report, Amazon and Facebook are the biggest avoiders: the researchers claimed that Amazon paid $5.9 billion in taxes between 2011 and 2020, on reported profit of $60.5 billion and revenues of $1.6 trillion.

 

Since its introduction in April 2020, the DST has netted the Treasury £300 million in the 2020/21 financial year. It charged a 2 per cent levy on the gross revenues of social media companies, search engines and online marketplaces.

The revenue that the DST raised will be kept until the new system, ‘Pillar One’, comes into effect. At that point, companies will be allowed to claim back, as credit against future bills, the difference between what they paid in tax under the DST and what would have been paid under ‘Pillar One’, effective from January 2022.

Talks on how the new tax system will be implemented are ongoing in the coming months, with world leaders working out how it’ll be done.

 

Why Data Classification Is Vital To Your Organisation (And How To Easily Implement It)

Data classification is used by organisations to adhere to security, privacy and regulatory requirements when collecting, storing, and processing data

 

No modern organisation can exist without data but… as important as data collection is, being able to effectively classify and then use that data is just as, if not more, important.

 

Data classification is vital for Business Intelligence, security, and most of all, regulatory compliance.

Whether you store your data on-prem (but why would you?) or in the cloud, understanding and classifying it will provide the bedrock for your data security and make compliance with all applicable regulations manifestly simpler.

However, if you prefer a more tangible ROI, then practical and efficient data classification also adds a deeper and richer level to all business intelligence, allowing for more concise and trustworthy business critical decisions.

What Actually Is Data Classification?

Data classification is the term used when a business, institution or individual organises their data (both structured and unstructured) into discrete categories that show the differences between them in a useful way.

Some of the standard classifications commonly used include:

 

  • Public data
  • Confidential data
  • Sensitive data
  • Personal data

What’s The Point Of Data Classification?

Breaking it down to its simplest definition, effective data classification allows an organisation to understand the types of data they’re collecting, retaining and storing and where in their systems they’re doing so, based on its value and sensitivity.

 

Having modern processes and tools to aid in this allows for:

 

  • More effective prioritisation of security protocols
  • Better risk management through improved regulatory compliance procedures
  • Improved productivity and business critical decision making by having relevant, real-time, accurate data that’s easily discoverable/searchable
  • Huge reductions in the cost to maintain an organisation’s data through the removal of duplicate or old, no longer used/needed records.

Different Ways To Classify Data

Confusingly, there are many different ways to both categorise and then classify your data, although they all have a similar basis.

The first step is to collate all your data into broad categories such as…

 

  • Content Based – A content-based classification system will look to inspect and then ‘interpret’ your data, looking for issues you highlight such as sensitive information.
  • Context Based – A context-based classification method will look at where the data was originally created, where it’s currently stored, any creator tags that may be affixed to it and numerous other variables that act as indirect indicators as to the nature of the data.
  • User-Based – Finally, a user-based classification methodology will rely on a manual selection by an individual as to what the data is, e.g. public, sensitive, restricted etc.

 

From there you can look to further classify it. This will often be sector or use specific.

The simplest method would be a three-level classification of your data: Public, Internal and Restricted.

 

  • Public Data – An organisation’s public data will be, as it sounds, freely shareable with the public.
  • Internal Data – Internal data will be data with a low security threshold. It’s likely all staff within an organisation can see this, but it’s still something that might not be appropriate for the public to see.
  • Restricted Data – Finally there’s restricted data. This will be proprietary, highly sensitive or both. It’s likely the sharing of this type of data could put an organisation at serious legal or financial risk, so additional steps need to be taken to secure its integrity/security (we’ve sketched a toy example of classifying content against these levels just below).
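Pulling those two ideas together, here’s a toy, purely illustrative Python sketch of a content-based rule that suggests one of the three levels above by inspecting a piece of text. Real classification tools use far richer detection than a couple of regexes, so treat this as a sketch of the approach rather than anything production ready:

```python
import re
from enum import Enum

class Classification(Enum):
    PUBLIC = 1
    INTERNAL = 2
    RESTRICTED = 3

# Toy content-based rules - real tools use far richer detection than this
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{2}-\d{2}-\d{2}\b"),    # looks like a UK bank sort code
    re.compile(r"\b[A-Z]{2}\d{6}[A-Z]\b"),   # looks like a National Insurance number
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),  # looks like an email address
]

def classify_content(text: str) -> Classification:
    """Suggest a classification level by inspecting the content of a piece of text."""
    if any(pattern.search(text) for pattern in SENSITIVE_PATTERNS):
        return Classification.RESTRICTED
    if "internal use only" in text.lower():
        return Classification.INTERNAL
    return Classification.PUBLIC

print(classify_content("Contact jane.doe@example.com about invoice 1042"))
# Classification.RESTRICTED
```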

 

Once an organisation has mastered a three-level classification system they can then consider taking the next step to a more complicated version, should it be needed.

Many organisations will use a four or even five level classification system with public being the ‘top’ or most open level.

 

  • Public – As already mentioned, this is data that could be shared with anyone
  • Proprietary – Any information specific to an organisation that, whilst not public, isn’t sensitive, such as internal processes and the like
  • Private – From here the data starts to need better security for items like individuals’ names or account information etc.
  • Confidential – As it sounds, confidential data is just that; data that through contractual obligations (NDA’s for example) or other processes, can’t be disclosed; such as contract information or employee reviews.
  • Sensitive – Finally we get to sensitive information again; data that could hurt the organisation financially or put it at risk in some other way if it became public such as losing control of its intellectual property.

Benefits Of Classifying Data

As we’ve already mentioned, there are a whole host of reasons to classify data within an organisation, most of them focussing around security, regulatory compliance or improved business intelligence.

 

Data classification will always be the first step to protecting valuable data. If you don’t first classify data that’s sensitive/confidential/proprietary, then it means you need to protect all your data to the same degree… something which will obviously incur additional costs in both time and resource.

It also means there’s no way of knowing who in an organisation should have access to what, which in and of itself can raise a lot of security (and regulatory) issues.

 

The other major benefit to data classification is one of regulatory requirements.

Many local and international regulatory requirements require an organisation to protect specific types of data, such as personal or sensitive data (think GDPR or UK GDPR requirements), in a specific manner.

Classifying data correctly makes the job of determining what data needs what security a lot easier.

How To Set Up Data Classification As A Process

By now we should’ve (hopefully) convinced you that classifying your data is a good idea… but you may now be wondering how to go about it.

Don’t worry, we’ll show you how and it’s actually quite simple.

 

The first thing to do is to actually create a data classification policy for your organisation.

That should include a description of the different types of data you might hold, how they should be classified within a framework, what you hope to achieve from it, who the data ‘owners’ are, who regularly (or ever) handles the data, who is responsible for the data and what regulatory legislation needs to be adhered to in storing and processing it.

The classification of the data should be simple enough to remove all ambiguity as to its appropriate level whilst rich enough to provide context as to why it’s been classified thus.

Once that’s done the data needs to be tagged appropriately, with all sensitive or personal data an organisation holds being sorted into the right category.

 

Finally, once it’s been established where the data is stored and its level(s) of sensitivity, appropriate security can be implemented that ensures it’s compliant with all relevant regulatory legislation.

After that, it’s just a case of regularly reviewing the data and the processes that control it to ensure it’s still adhering to current best practices and applicable regulatory requirements (as these both have a way of shifting over time).

How To Inspect Items When Using The Execute Pipeline Activity In ADF/Synapse Pipelines

Greg Roberts – cloudThing, Data Scientist

A workaround for ADF/Synapse pipelines when using the Execute Pipeline activity to inspect anything about the run from the calling pipeline

 

A longstanding issue I’ve found with ADF/Synapse pipelines is that if you try to use the Execute Pipeline activity, then it won’t really let you inspect anything about the run from the calling pipeline.

 

This can be generalised to state that within any individual pipeline, you’ll never really be able to understand much about other pipeline runs.

There’s actually a big Azure Feedback post about this, but unfortunately no one seems to have come up with any kind of workaround… So, here’s my workaround…

 

Just call the Synapse/Data Factory API from within the pipeline!

Set up the REST API as a Linked Service:

 

 

  • Auth type = Managed Identity. It’s worth noting that AAD Resource is not the usual endpoint used for managed identity authentication (see this note)
  • For the base URL, just have workspace_name as a parameter for the linked service. This will be used later and will make this linked service agnostic to the environment you’re in.
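For reference, a linked service set up that way ends up looking roughly like the following (shown here as a Python dict mirroring the JSON you’d see in the Synapse ‘Code’ view… the property names are from memory rather than gospel, and this assumes a Synapse workspace rather than vanilla Data Factory, so double check against your own environment):

```python
# Rough shape of the REST linked service JSON (Synapse flavour) - verify against your own workspace
rest_linked_service = {
    "name": "LS_SynapseRestApi",
    "properties": {
        "type": "RestService",
        "parameters": {
            # workspace_name is supplied later, keeping the linked service environment-agnostic
            "workspace_name": {"type": "String"},
        },
        "typeProperties": {
            "url": "https://@{linkedService().workspace_name}.dev.azuresynapse.net",
            "enableServerCertificateValidation": True,
            "authenticationType": "ManagedServiceIdentity",
            # Not the usual management endpoint; this is Synapse's data plane audience
            "aadResourceId": "https://dev.azuresynapse.net",
        },
    },
}
```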

 

Then create a dataset from this:

 

 

As you can tell, I’m being really generic here and just passing the relative URL through for now.

Again, it’s worth noting that workspace_name is a parameter here… we only define it in the actual pipeline.
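The dataset side looks roughly like this in code view (again a Python dict mirroring the JSON, with names that are purely illustrative and worth verifying in your own workspace):

```python
# Rough shape of the REST dataset JSON - relative_url and workspace_name are passed in at run time
rest_dataset = {
    "name": "DS_SynapseRestApi",
    "properties": {
        "type": "RestResource",
        "linkedServiceName": {
            "referenceName": "LS_SynapseRestApi",
            "type": "LinkedServiceReference",
            "parameters": {
                "workspace_name": {"value": "@dataset().workspace_name", "type": "Expression"},
            },
        },
        "parameters": {
            "workspace_name": {"type": "String"},
            "relative_url": {"type": "String"},
        },
        "typeProperties": {
            "relativeUrl": {"value": "@dataset().relative_url", "type": "Expression"},
        },
    },
}
```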

 

Et voila!

Now we have a dataset we can use to get data from the Data Factory API. e.g.:

 

 

As you can now see, we can pass in the workspace name, so this linked service can be moved to new environments with no issue!

 

I actually had to do a double take when I first hit preview data…

 

 

We can now easily get the top level output of a run.

 

“But what about details of the inner activities” you ask…?

Glad you did! Just use this method and hey presto, we now have all the details of our pipeline run…
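
For reference, the call behind this is the ‘query activity runs’ operation on the same endpoint. Here’s a rough Python sketch of the equivalent request (workspace, pipeline name, run ID and the filter window are all placeholders):

    # Sketch of the 'query activity runs' call, which returns per-activity details
    # for a pipeline run. Workspace, pipeline name, run ID and dates are placeholders.
    import requests
    from azure.identity import DefaultAzureCredential

    workspace_name = "my-synapse-workspace"
    pipeline_name = "MyChildPipeline"
    run_id = "00000000-0000-0000-0000-000000000000"

    token = DefaultAzureCredential().get_token("https://dev.azuresynapse.net/.default").token

    resp = requests.post(
        f"https://{workspace_name}.dev.azuresynapse.net"
        f"/pipelines/{pipeline_name}/pipelineruns/{run_id}/queryActivityruns",
        params={"api-version": "2020-12-01"},
        headers={"Authorization": f"Bearer {token}"},
        # The filter window is required; it just needs to bracket the run.
        json={"lastUpdatedAfter": "2021-01-01T00:00:00Z",
              "lastUpdatedBefore": "2030-01-01T00:00:00Z"},
        timeout=30,
    )
    resp.raise_for_status()
    for activity in resp.json()["value"]:
        print(activity["activityName"], activity["status"])

Inside the pipeline itself, the same request is simply made through the REST dataset we created above, with the relative URL and body passed in as parameters.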

 

 

 

 

Since this is a REST dataset, we unfortunately can’t use it immediately in a Lookup or Get Metadata activity… but, we can easily persist this to a storage account, then look it up and understand anything about how our pipeline ran, from within a pipeline!
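
If you wanted to do that persistence step outside the pipeline, it would look something like the sketch below. Inside the pipeline, a Copy activity from the REST dataset to a storage account dataset does the equivalent job; the connection string, container, blob name and payload here are all placeholders:

    # Sketch of the persistence step done outside the pipeline: write the activity-run
    # JSON to a blob so it can be read back later with a Lookup activity.
    # The connection string, container, blob name and payload are all placeholders.
    import json
    from azure.storage.blob import BlobClient  # pip install azure-storage-blob

    run_id = "00000000-0000-0000-0000-000000000000"
    activity_runs = {"value": []}  # the JSON returned by the queryActivityruns call

    blob = BlobClient.from_connection_string(
        conn_str="<storage-account-connection-string>",
        container_name="pipeline-runs",
        blob_name=f"{run_id}.json",
    )
    blob.upload_blob(json.dumps(activity_runs), overwrite=True)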

 

This was a fun little workaround to create and it saved me from a jam.

As I couldn’t find anyone else talking about having done this, I thought I’d leave it here in case you found it useful.

New Cultural Change Tool Unveiled For NHS

The NonProfit organisation ‘Skills for Health’ has unveiled its new Custom Rostering System, designed with the NHS workforce in mind

A new eRostering solution provided to the NHS by Skills for Health to aid workforce planning and development processes has been fully designed with the complexity of the NHS in mind; customisable, flexible and integrated.

The Custom Rostering System (CRS), developed by the not-for-profit organisation Skills for Health, has been designed as a tool for culture change and will transform planning roles within the NHS from data entry to data intelligence.

The CRS provides a top-down view of the workforce, allowing for seamless management of any staff group and creating more efficient rosters for patient care.

Skills for Health have a long history of working in the health sector supporting the workforce, including providing rostering systems. This unique expertise has helped them build CRS, designed to support healthcare organisations in maintaining compliant staff schedules and improving patient care delivery.
The team have truly understood the challenges of the sector, and the complexities of managing shifts across different and complex workforce needs and contracts. Subsequently, CRS will readily create and manage short and long-term staff rosters for any staff group, maintaining compliant staffing levels, and making sure teams are crucially in the right place, at the right time. – Niamh McKenna – Skills for Health Trustee & CIO, NHS Resolution

The CRS will allow integration with multiple management systems, meaning its interoperability will allow more people to make informed decisions and reduce reliance on administrative planning work, locum staff and agencies.

CRS can integrate with numerous other systems, which means it’s a perfect tool to integrate seamlessly into any organisation, supporting digital transformation. I have been struck by the level of care and attention that Skills for Health has put into this development, building a product that exactly meets the specific needs of the health sector, and I am looking forward to seeing it in action in a wide range of settings. – Niamh McKenna – Skills for Health Trustee & CIO, NHS Resolution

Its ‘your system, your way’ ethos means it’s built on two core principles: every staff member deserves a system that works in line with their own individual contract and terms of employment, and every organisation needs to be able to manage staff in line with their unique ways of working. CRS fully customises workforce schedules for the specific needs of each NHS staff role, in any organisation.

Every staff member on the CRS has a contract associated with them, and it is through these contracts that the workforce can then be effectively and safely deployed. With its robust reporting, and by meeting the evolving and ever-changing contractual needs for all staff, proven to be so critical throughout COVID-19, CRS will enable trusts, departments and teams to continue to deliver effective rosters that allow staff to have more time to focus on patient care. – Dr Sara Munro – Skills for Health Trustee & CEO, Leeds & York Partnership NHS Trust

The CRS isn’t there just to make the administrative work easier, though. A consistently managed rota ensures the improvement of the workforce’s work-life balance and improves care for patients in turn.

I’m so excited for this much-awaited launch of CRS. Combining the deep market knowledge and insight Skills for Health brings across the sector with an innovative application of technology, this solution helps organisations and teams of all sizes to manage the most important asset of all, their people.

The focus on security, compliance and customisation for a range of unique settings make this a leading offering for any provider. The fact that it has been built with the NHS workforce in mind means trusts can feel confident when they make an investment in CRS.

Reap the benefits of all that comes from Skills for Health’s deep local knowledge, proven capability and continued engagement, while also supporting their not-for-profit mission, by finding out more about CRS today, and vitally, enable the continued development deserved by your healthcare workforce. – Daniel Langton – Skills for Health Trustee & Strategy Director and Chief of Staff, Microsoft UK.

New Unit Launched As Part of MoD’s Ongoing Data Reform

DASA launches a new unit to discover Emerging Innovations and Rapid Impact technologies within the MoD.

There’s an information reform going on in the UK military, which aims to use data and technology to provide an advantage to UK defence forces.

Which is why it’s exciting that the UK Defence and Security Accelerator (DASA) has launched a new unit named Military Systems Information Assurance (MSIA), as part of its new Innovation Focus Area (IFA).

This follows the publication of the government’s integrated review of security, defence, development and foreign policy in March, under which the unit will be funded. Highlighted within the policy was the need for resilience and a heightened capability to tackle cyber threats.

The new unit will cover a number of duties under the proposal; from accounting for the distinct and hostile demands placed on military information systems, to seeking approaches and technologies that may offer an alternative to cryptography, the most common way of ensuring the safety of information.

The different ways this can be assured include novel methods of authentication and the different ways you can store information securely in a cloud environment and at rest.

The unit will also have to seek solutions to ensure that information flow is not interrupted by limited bandwidth and intermittent communications.

Funding for the MSIA will be based on technical readiness level (TRL).

Lower-level proposals will be considered an Emerging Innovation. Applicants will be asked to provide a proof of concept within a six-month contract and can secure funding of up to £150,000, on DASA’s advice.

Higher-level TRL proposals will be considered a Rapid Impact. Within a 12-month contract, a concept demonstrator will be required, and a bid could win up to £350,000 in funding.

Applications will close on 5 January 2022.

This announcement by DASA follows the launch of another IFA in August which aimed to reduce the MoD’s systems exposure to cyber-attacks.

The technologies to be developed included a programme in which the MoD pays bounties to white hat hackers for discovering security bugs in its computer networks, raising security across its networks and devices.

 

How To Create UI Flows In Power Automate

Bring the power of RPA to your desktop

 

Everyone loves a good bit of RPA (Robotic Process Automation), especially all the really cool people here at cloudThing.

That’s because RPA is awesome.

Imagine sitting at your desk, day in, day out, inputting the same data over and over, updating the same spreadsheet, over and over… forever. Now imagine that could be automated by a clever piece of software.

Congratulations, you just imagined Robotic Process Automation!

RPA is an amazing tool for organisations looking to accelerate their digital transformation without having to replace older, legacy systems that might require expensive integrations to work with newer solutions.

 

As you’d expect, Microsoft support RPA through Power Automate, and more specifically, through something called UI Flows, more and more frequently called Desktop Flows.

Whilst that all may sound terribly complicated, it’s actually really easy to set up a UI Flow… as we’ll show you here.

How To Create A UI Flow In Power Automate

THE UI FLOW SETUP STAGE

Creating a UI Flow couldn’t be easier.

Once you’ve decided to create a UI Flow you just need to head on over to Power Automate and record the steps you’d like your flow to execute (it’s important you remember to swap out all your normal values for parameters though).

You can also capture text from windows or web apps and return that data from the flow you’re setting up.

Before starting though, don’t forget that you’ll need to download all the tools required to record and test your UI Flow. Don’t worry however, Power Automate will prompt you to do this if you haven’t already.

 

 

As we’ve already mentioned, UI Flows in Power Automate support both Windows and web apps, but they use a different framework for each.

When creating a new UI Flow, the first step you’ll need to take is to decide which type of application you’ll be automating.

Using a desktop app as an example we’ll automate adding ledger credits into an accounts system.

 

 

 

From here you need to define the input parameters you’ll require before you start to record your steps.

Once you’ve defined your input parameters for your UI Flow you can begin to record your steps with what’s known as the launch recorder.

Select the Launch Recorder option and for our purposes here you’d want to select the Use Input option, select the parameter and just click on the data entry tab you want populated.

 

 

Once you’ve populated the steps you want to capture, you’ll need to define your output. You do this by grabbing some of the data you’ve just filled in and returning it to your Flow.

To accomplish this, just click the Get Output option and highlight the information you want returned by your parameters, giving it a name.

 

 

From here, you can edit any steps you’ve recorded or add manual actions as may be required.

 

 

Whilst this may seem slightly counter-intuitive, at this point you need to remember that UI Flows can’t be deployed directly. They need to be called by another Flow, using the action ‘Run a UI Flow’.

To do that you’ll need to create another Flow.

 

 

Although Power Automate UI Flows are triggered in the Cloud, they also need to be able to run locally.

That means you’ll need an on-prem data gateway installed on the machine the Flows are running on.

That will act as a connector between your cloud services and your local machine’s requirements.

Doing that however is just a simple case of configuring the connection between your UI Flow and the machine the RPA will run on by specifying the gateway, username and password.

 

 

 

 

Hopefully this has given you a taster as to how easy it is to set up a UI Flow using Power Automate but, as always, if you have any questions feel free to reach out.

Schools Pushing Back Against Virtual Heads Over Vulnerable Children

Experts warn councils are failing their duty of In Loco Parentis.

Virtual School Heads are failing in their duty as they lack the power to get children in care the education they need because schools have been ‘pushing back’, MPs have been warned.

Patrick Ward, chair of the National Association of Virtual School Heads, told the education committee last week that local councils were “failing as a corporate parent” because they far too often weren’t using their statutory powers to secure more school places for vulnerable and at-risk children.

What Are Virtual School Heads?

Virtual School Heads (VSHs) have a Central Government-mandated duty to promote and empower the educational achievement of children in care within the Local Authority (LA) they’re employed by.

They’re also responsible for managing their pupil premium funding and for its allocation to schools and alternative provision (AP) settings.

VSHs also manage the early years pupil premium (EYPP), allocating it to early years providers that take care of ‘looked-after children’ (children in the care of the LA) who are entitled to free early education.

 

Legally, schools should be prioritising looked-after children in their admissions, with some councils resorting to ‘forcing’ schools to take these admissions. However, whilst a council can force this issue, not all councils have delegated that power to their VSHs.

 

A survey conducted in March of this year showed that councils had been forced to initiate the process directly, with around 30% being forced to do so once or more every year.

When you try and place a young person in a mainstream school, a vulnerable young person, you get a lot of pushback. Schools essentially do not want to take these young people because they believe there will be a negative impact then on their outcomes. – Patrick Ward – Chair, National Association of Virtual School Heads

Ward also went on to warn the Government that there wasn’t enough data being collected on looked-after-children missing out on education, which fed back into a lack of accountability for both local councils and Virtual School Heads.

No one holds a director of children’s services or a local authority to account, or a virtual school for that matter, over how many of their children are missing in education or in unregulated provision. The department doesn’t know. The stats aren’t held anywhere. – Calvin Kipling, Virtual School Head, Darlington

A Department for Education spokesperson said they were “taking steps to build on the success” of virtual heads and “further raise educational standards” which included a new £3 million pilot for better support in post-16 education.

They added that statutory guidance is “clear” on virtual heads’ duties, with councils “held to account” through Ofsted.

 

However, Paul Whiteman, leader of the National Association of Head Teachers, pushed back to say:

Sufficient support needs to be available to support each child’s needs. If children are placed in schools that can’t meet their needs effectively and support them fully to be able to learn, then that is not a good place for the child. – Paul Whiteman – NAHT

 

 

D365 BC Vs D365 FO: Let’s Settle This Once And For All!

Which is better? Microsoft Dynamics 365 Business Central or Microsoft Dynamics Finance & Operations?

 

Microsoft Dynamics 365 is awesome.

We know that and you know that, so we won’t belabour the point (too much). It offers ERP, CRM and BI solutions, all in one system, which means you, your staff, your processes, applications and systems can all be unified in one place, offering an unparalleled visibility to, well… everything!

 

The issue we’re tackling in today’s article however is that within D365 there are actually two ERP solutions… D365 Business Central and D365 Finance & Operations.

As a Microsoft Gold Partner, we kind of have to say both are cutting-edge ERP solutions and let’s face it… they are.

The question is though, why does Microsoft have two ERP solutions in the first place and which is better?

Microsoft Dynamics 365 Business Central

You may or may not know but Business Central is a revamped and vastly improved version of an older system called Navision.

Navision, back in the day, did have a web client but only offered limited functionality (it’s come a long way since then of course).

 

It still comes with all of the old Nav. features, but… users can also access their ERP solution either on-prem (boooo) or in the cloud (yaaaay!).

Business Central is a mid-market tier ERP solution that’s typically recommended for small-to-midsize organisations as well as larger organisations that need a solution for just a few regional offices.

It’s there to help organisations manage their finances, whilst also providing a better level of customer support. It can also analyse and increase sales revenue by automating a lot of manual processes, freeing up staff time to focus on more important (profitable) issues.

 

The Finance section of Business Central has been designed specifically for SMEs to provide things other solutions don’t, such as better accessibility and clarity of the data being stored. The UI is also incredibly simple to grasp and use, featuring items like:

 

  • General Ledger: Capable of charting all your accounts and presenting information in different formats (such as budgets in different currencies).
  • Financial Transactions: This section of Business Central will offer access to fixed assets, consolidations, bank reconciliations and more.
  • Accounts Payable: Accounts payable will give you access to supplier management, purchase orders management, purchase invoice management and can even automate and run your electronic payments.
  • Accounts Receivable: The accounts receivable section of Business Central covers things like customer management, invoicing, orders and BACS.

 

BC’s Finance functions will also cover things such as documentation, payroll, Business Intelligence and project management for you as well.

Microsoft Dynamics 365 Finance & Operations

Much like Business Central, Microsoft’s Finance & Operations ERP can be hosted either in the cloud or on-prem.

Again, much like Business Central, it can easily be connected to other Microsoft Apps such as CRM, Customer Service, Sales, Marketing or Talent.

As you can see, D365 F&O shares a lot of functionality with Business Central but differs in that, as well as offering some exclusive features, it was specifically designed for much bigger, more international organisations to address the specific needs those organisations might have.

Those needs will often include, but aren’t limited to:

 

  • Inter-company trading between international companies
  • Financial Compliance when country or territory specific localisation is required
  • International Supply Chain functionality such as product or vendor management, quality control, intercompany stock transfer and cross-border transport management.

Which Is Better, D365 Business Central or D365 Finance & Operations?

Actually, neither wins out on that question, as both have been designed with very different organisations and needs in mind.

That being said… there are quite a few misconceptions floating around about the two that are worth addressing.

D365 BUSINESS CENTRAL CAN’T HANDLE MORE THAN 200 USERS

This one is completely false… D365BC is fine with far more than 200 users.

It’s true that D365FO is typically perceived as the ‘right’ or ‘only’ choice for larger, more complex organisations but in making the choice between the two it’s the functionality that’s most important… what it’ll be used for rather than how many people will be using it.

D365 BUSINESS CENTRAL CAN’T HANDLE LARGE AMOUNTS OF DATA

We hear this a lot and it’s technically true in the same way that eventually any system will need a certain level of customisation depending on how much data is fed into it.

D365BC is more than capable of handling a huge amount of data, however, at the point where you’re processing so much transactional data that D365BC starts to struggle, it’s very likely the organisation will need much of the functionality D365 Finance & Operations offers anyway, making it a bit of a moot point.

 

The licensing for D365 BC is a lot cheaper than it is for D365FO, making Business Central even more attractive to SMEs. It’s also a lot more flexible.

If you’re the kind of company investing in Finance & Operations then the software and all its functionality will very much be ‘dropped’ onto you to work as is. Business Central on the other hand, whilst it can’t handle as much data, is a lot more flexible, with the capability to extend as needed.

Although custom fields can be added to D365FO, you can’t add any kind of business logic to the field and it can’t support lookup fields.

With D365BC you can add extensions to add new fields and functionality as required.

 

In summary then, it very much depends on an organisation’s requirements as to whether D365 Business Central or Finance & Operations will be the more suitable solution.

The vast majority of the time, D365BC will be most suited to SMEs, whilst D365F&O will be the solution of choice for larger, international or more complex organisations.

 

However, it’s worth stressing again that the size of an organisation or the number of D365 users isn’t what’s important in making that decision, rather, it will come down to an individual organisation’s specific needs.

Using a real-world example; if you’re an international conglomerate with multiple companies all under one banner (no matter the size) and inter-company trading, stock transfers, consolidations etc will be important then that instantly puts you in D365FO territory.

However, if those companies largely trade as separate entities and never the twain shall meet, then it would be better to assess requirements on an individual basis, with the likely result being only D365BC being needed with limited licenses.

Grow And Transform Your Business With The New Azure Commerce Experience In CSP

Grow your business with the new commerce experience for Azure in the CSP program

 

#BuildFuture is at the heart of every decision cloudThing makes. We strive to grow our partners’ digital capabilities and make our solutions agile enough to adapt to anything the future might bring.

Which is exactly why we’re all so excited to introduce you to the New Azure Modern Experience in CSP.

 

The new Modern Azure commerce experience has been fully designed around the concept of added value.

Microsoft will be encouraging partners like cloudThing to focus on driving customer (your) success via value-added services that generate sustainable profitability… or to put it into cloudThing’s language… create solutions that #BuildFuture.

What Does The New Azure Modern Experience Mean For You?

Your new Azure commerce experience has been created to both streamline and consolidate the way you buy and consume Azure services.

It will massively simplify the purchase experience, as you’ll be able to create as many Azure subscriptions as you need, all under one plan.

To simplify that further, that overarching ‘Azure Plan’ will basically be an ‘empty shell’ that is used to contain all your Azure subscriptions, but it will give you access to a multitude of additional reporting capabilities and dashboards through the Azure portal and ACM (Azure Cost Management).

 

It’s been created as a commercial structure that will house all your Azure ‘pay-as-you-go’ services and resources that were previously only available in other sales motions but are now available in CSP for greater flexibility and security.

You’ll have all the tools you need to, at a glance, be able to plan for, analyse and reduce unnecessary spending, allowing you to effectively maximise your cloud investment.

That’s been achieved by giving end users access to a new set of tools to monitor, allocate and optimise ongoing cloud costs with in-depth analytics.

You’ll also only be invoiced for the consumption of Azure resources that have been specifically mapped to your Azure plan, simplifying the partner purchase experience as a partner (like cloudThing) will be able to provision multiple Azure subscriptions, all under one plan.

What Is Azure Cost Management?

Azure Cost Management (ACM) is a free solution that’s offered to any and all Azure customers and can be accessed via their Azure portal.

It’s there to provide information about overall costs across all of an end-user’s Azure services and marketplace products, as well as providing insights and reports, and it can even provide data about your organisation’s use of other cloud providers.

 

Once it’s been enabled, Azure Cost Management will continuously monitor all of an organisation’s resources and provide ongoing reports. It can even be integrated with Azure Advisor to garner cost recommendations that are specifically tailored to your usage.

 

Perhaps the biggest benefit to ACM is its ability to map cloud costs to specific departments or projects.

ACM is capable of classifying your resources into multiple buckets using something Microsoft refer to as cost entities.

Cost entities are departments or projects within a single organisation that pay for Azure services. It can also create a cost model that will structure your Azure resourcing according to ‘tags’ that different teams have applied to their required Azure resources.
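
To illustrate the idea (a conceptual sketch only, not the ACM interface itself), tag-based cost entities essentially boil down to grouping your consumption lines by a tag key. The tag name and figures below are made up:

    from collections import defaultdict

    # Made-up cost lines: in ACM these would come from your Azure consumption data,
    # with each resource carrying whatever tags its team applied.
    cost_lines = [
        {"resource": "sql-db-01",   "tags": {"project": "donor-portal"},  "cost": 212.40},
        {"resource": "app-svc-01",  "tags": {"project": "donor-portal"},  "cost": 98.10},
        {"resource": "vm-batch-3",  "tags": {"project": "data-platform"}, "cost": 540.00},
        {"resource": "storage-old", "tags": {},                           "cost": 12.75},  # untagged spend
    ]

    # Group spend into 'cost entities' by the value of the 'project' tag.
    by_entity = defaultdict(float)
    for line in cost_lines:
        by_entity[line["tags"].get("project", "untagged")] += line["cost"]

    for entity, total in sorted(by_entity.items(), key=lambda kv: -kv[1]):
        print(f"{entity}: £{total:,.2f}")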

 

Once those cost entities and models have been accurately defined (something cloudThing is always happy to help with) the various teams within an organisation can use ACM to both view and investigate unexpected costs associated with their specific budgets. Alerts can also be set to warn or limit overuse for projects, teams or even specific users.

How Do I Qualify For Modern Azure?

To be able to move to Modern Azure, Microsoft have set up a series of simple pre-requisites, making you eligible if:

 

  • An indirect reseller has signed the Microsoft Partner Agreement
  • The end-user has accepted the Microsoft Customer Agreement
  • The subscription is in active status.

However, it’s worth noting that once an end-user is moved to Azure Modern, they won’t be able to move back to Legacy.

How To Drive Donor Engagement With cloudThing’s Free Powerups

Donor Engagement.

It’s such an easy thing to say, isn’t it? “We need to engage with our donors more”.

 

It’s an easy thing to say but for the NonProfit sector it’s increasingly becoming a much harder thing to do.

During lockdown people had less time and/or money to give but, even before that, the number of charities operating in the UK alone was (and is) causing a type of burnout in which people aren’t sure who (if anyone) to support anymore.

Beyond even that though, the demographic of the ‘typical’ donor is changing, with NonProfits needing and often struggling to adapt to all of these rapid changes.

 

Rage donations are a ‘thing’ now, with many charities seeing surges in donations after a well-publicised event, only to dry up again a few days later, making any kind of long-term donation planning incredibly difficult.

Millennials are getting older, with many reaching a well-paying level in their career or inheriting the wealth of previous generations, making them an untapped demographic to target.

How though, do NonProfits engage with these new types of donors, converting them into loyal and consistent supporters?

 

A lot of these issues, in cloudThing’s experience, have core, similar causes in which solving one, often helps solve all the others.

 

A modern donor wants a personalised experience.

They want to know that the money or time they donate to your cause will have a direct, measurable and tangible effect instead of paying for a faceless director’s salary.

A modern NonProfit on the other hand needs to save costs, maximise donor revenue and try to avoid vendor lock-in, often having to spend money to save money. Convincing donors of that though isn’t always easy.

 

That’s where Component Led Design and transformation, Microsoft’s Power Platform and cloudThing’s powerUps come in.

 

Component Led Design is a process or methodology in which, rather than undertaking a huge, multiyear, multi-goal project, an organisation will scope out small, easily achievable projects that bring almost instant ROI but which, much like Lego blocks, can be built up into that larger project.

The benefit of this approach, apart from seeing a much quicker return on investment, is that should circumstances change, it’s much easier to pivot direction without the entire project becoming obsolete.

Microsoft’s Power Platform empowers these projects through tools such as Power BI, Power Apps, Power Automate and Power Virtual Agents, allowing citizen developers with little or no coding experience, to build low-code/no-code applications for their organisation in little to no time.

However…

Due to the nature of the Power Platform, organisations can also benefit from pre-built solutions from third party partners. Leading us nicely onto… cloudThing’s powerUps.

 

cloudThing have created a range of powerUps for the NonProfit sector (all of them free). All your organisation will need is the appropriate Microsoft license (which we’re also happy to help on).

These powerUps offer proven capabilities, fulfilling a range of functions that instantly add a layer of tech and automation to many NonProfit processes and business outcomes, including, in this case… increasing Donor Engagement.

 

Microsoft are also supporting these goals for the NonProfit sector through the Microsoft Tech For Social Impact initiative.

Established back in 2017, its aim is to empower NonProfits and humanitarian organisations to advance their mission using the latest, cutting-edge tech. Microsoft’s and cloudThing’s goal is to help digitally transform the NonProfit sector using a sustainable, enterprise driven model, in which profit gets re-invested and new capabilities are nurtured and developed to further help delivery.

 

And that’s what our NonProfit powerUps have been designed to do.

We launched these to the charitable sector a while back (you can watch the launch video here >>) and since then many NonProfits have benefited immensely from their use.

Circling back to the start of this article then, how can these powerUps be used to increase donor engagement in a quick and easy way?

 

The cloudThing powerUps focussed on donor engagement have been designed to attract, retain and grow donorship lists through personalised engagement and automation.

The three powerUps most relevant to solving that problem are Donor Influence Network, Donor Engagement Scores and Communication Preferences.

What Does The Donor Influence Network powerUp Do?

The first step in increasing donor engagement is having the correct data to hand. For that you need to understand your donor network intimately and how it’s connected, both to yourself and amongst the donors themselves (particularly useful for discovering new donors!).

The free Donor Influence Network powerUp does all the legwork for you, helping a NonProfit to discover and understand the relationships between their donors, volunteers and the organisation itself.

All you need to do is connect it to your data and it will instantly start providing insights that allow your NonProfit to increase its reach, influence all of your donors’ activity (whether that be through increased donations or activism) and grow its network through the wider network of connections your current donors know.

In practical terms, the powerUp will provide you with a visual interface that displays your donor networks in a graphical manner, plotting out circles of influence, highlighting key donors and allowing you to instantly connect with 2nd or 3rd level connections through external sites such as LinkedIn.

What Does The Donor Engagement Score powerUp do?

Once you’ve located existing and potential donors and connected with them, you need to be able to track all their engagements with your organisation so you can see exactly where you’re doing well and what needs improving.

Our free Donor Engagement Score powerUp does exactly that, capturing all of an individual’s engagements with an organisation, tracking all their touchpoints across all channels, from social media, to email to webchat.

Once you have that data you can easily adjust strategies to increase engagement and more importantly, increase donation revenue!

The tool also works in real-time, allowing you instant feedback from new and ongoing campaigns, allowing you to see what changes are and aren’t working instantly.

As well as all that though, as the name implies, the powerUp will assign a ‘score’ to all your donors, with KPIs featuring rules that can be configured to your specific needs, allowing a NonProfit to prioritise where it spends its time most effectively.

What Does The Communication Preferences powerUp Do?

The Communications Preference powerUp is a NonProfit’s best friend for dealing with GDPR compliance when communicating with and driving donor engagement.

It’s been designed to simplify and automate all GDPR, DPA management and internal information policies by capturing all donor communication preferences across multiple channels and campaigns.

It allows all stakeholders to easily view, manage and amend all donor and volunteer preferences which will provide for much better customer facing experiences that can be pieced together with any solution.

 

If you’re wondering what else cloudThing’s (FREE) NonProfit powerUps can offer your organisation, aside from Donor Engagement, then you can see a more extensive list here…

 

  • Flexible Data Entry
  • Rapid Customisation
  • Automated Processing
  • Gift Aid Management
  • Compliance Solutions
  • Cloud Based Processing
  • Data Quality Management
  • Data Import & Export
  • Automated Release Management
  • Personalised Communications
  • Donor Preference Management
  • Network Discovery
  • Partner Integration
  • Donor Management
  • Salutation Management
  • Case Management
  • Income Processing
  • Payment Schedules
  • Gift Batches

 

Interested?

Reach out today to discover how your NonProfit can benefit from these free powerUps with zero risk of vendor lock-in.

9 Awesome Benefits To The Microsoft Dataverse

How can the Microsoft Dataverse benefit your organisation?

 

Data.

It’s such a small word but it’s literally at the heart of everything we do isn’t it?

It doesn’t matter if you’re a Nonprofit, run a Membership Organisation, are a titan of industry or head up a department in the Civil Service… you won’t be able to do what you do without reliable data.

More important than just ‘having’ data though is the ability to use it. To capture and collect it in a timely, secure fashion and to be able to analyse it in real time to inform business critical decision making.

Having data always has to be a means to an end, not the end itself.

 

However…

Building a modern data infrastructure from scratch, that’s capable of performing all the above functions, whilst still being agile enough to adapt to unexpected future developments isn’t easy.

That’s because data in the modern world bombards an organisation from a multitude of different directions, in a plethora of different formats, from so many different sources that it’s not even worth trying to count… and yet, from all that, we still need to derive useful Business Intelligence somehow.

 

Don’t misunderstand us.

It can be done… but it’ll be difficult to source the developers and expensive to complete, as whoever you find will need an in-depth understanding of deploying, configuring, managing and integrating all those disparate data technologies to make use of the structured and unstructured data you hold.

 

As depressing as that all sounds though, there is an easy solution and, as you’ve already seen the title of this article, we’re betting you’ve already guessed it…

The Microsoft Dataverse.

What Is Microsoft Dataverse?

The Dataverse was created by Microsoft to address all the above issues with a simple, end-to-end, compliant, secure, scalable and globally available SaaS data service.

It’s there to empower organisations to work with any and all types of data, no matter what the format or source, to garner real time, and most importantly, useful, business intelligence.

The Microsoft Dataverse is part of the Power Platform, which means an organisation needs to write almost no code (in many cases no code at all) to install it and begin benefiting from it.

What Does The Microsoft Dataverse Do?

Microsoft have specifically designed the Dataverse to be able to work with any and all types of data, incorporating all the major data technologies (relational, non-relational, file, image, search, data lake etc).

To make using these integrations as easy as possible, Dataverse has a set of inbuilt visual designers that allow a user to create, edit and interact with their data, making it a simple process to scope out and define the tables, relationships, business rules and processes, forms and workflows that all come together to represent your business.

The Dataverse’s integration features have been ‘built in’ to such an extent that it’s child’s play to connect it to other cloud-based services such as Azure, Microsoft 365 and D365, as well as the Power Platform, plus numerous other connectors through Power Automate and Azure Logic Apps.

 

Due to those integrations, a wide-ranging list of enterprise level scenarios become ridiculously simple to achieve, from retrieving sensitive data stored on spreadsheets in an email attachment right through to creating blockchain networks… all with almost no code required (or often no code at all).

Implementing them also becomes a doddle. Projects that might previously have required days or weeks can now be done in hours or even minutes, and often by staff with no previous coding experience.

 

As well as all that though, Microsoft’s Dataverse also supports Virtual Tables.

A virtual table will allow an organisation to map their data in an external source so that it looks like it exists in the Dataverse.

That means, once configured, the Dataverse is capable of executing real-time data operations against an external data source.
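
Whether a table is native or virtual, it’s consumed in the same way, through the standard Dataverse Web API. As a minimal sketch (the environment URL and table are placeholders, and it assumes the signed-in identity has been granted access to the environment):

    # Minimal sketch of reading rows through the Dataverse Web API (OData).
    # The environment URL is a placeholder; 'accounts' is the standard account table's entity set.
    import requests
    from azure.identity import DefaultAzureCredential  # pip install azure-identity requests

    ENV_URL = "https://yourorg.crm.dynamics.com"  # hypothetical Dataverse environment

    # Acquire an AAD token scoped to the Dataverse environment.
    token = DefaultAzureCredential().get_token(f"{ENV_URL}/.default").token

    resp = requests.get(
        f"{ENV_URL}/api/data/v9.2/accounts",
        params={"$select": "name", "$top": "5"},  # OData query options
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
        },
        timeout=30,
    )
    resp.raise_for_status()
    for row in resp.json()["value"]:
        print(row["name"])

If the table being queried were a virtual table, the request would look exactly the same; the Dataverse brokers the call out to the external source for you.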

Benefits Of The Microsoft Dataverse

So… on to the benefits of the Dataverse we promised you in the title. How can the Dataverse benefit your organisation?

THE MICROSOFT DATAVERSE MAKES EVERYTHING EASY

Anyone that’s ever had to try and create a common data storage solution with data flowing in from a multitude of different systems and applications will know how complicated things can get, extremely quickly!

Getting that solution is vital for the streamlining of development and empowering faster analytics… but it isn’t always fun.

If the data you’re working with isn’t capable of being easily shared then every development project, every improvement to a system, every application you create will need its own custom integrations to be implemented. That eats up a huge amount of time, costs money and will need to be repeated time and time again.

 

Using The Microsoft Dataverse, alongside the Common Data Model, just simplifies all that.

The Dataverse will give anyone using it a shared data store and language that all their business and analytical apps can use.

That sharing means your data, and your applications’ interoperability, can sit across multiple channels, service implementations and vendors… making your life (or your poor old developers’ lives) infinitely easier.

STRUCTURAL AND SEMANTIC CONSISTENCY OF DATA

 

 

 

With the Dataverse comes the Common Data Model, providing a common language for business tables that covers the full range of business processes and systems.

By using the Common Data Model, all of an organisation’s data will share a structural and semantic consistency, making it much easier to work with.

THE DISSEMINATION OF DATA

Once your data all shares a common and consistent form, disseminating it out to where it’s needed becomes much easier, allowing for deeper, more useful, and most importantly, real time, Business Intelligence.

 

That means simplified integration and disambiguation of data collected from processes, digital interactions, product telemetry, people interactions and so on.

IT’S EASY TO MANAGE

We’ve already discussed extensively how easy it is to integrate the Dataverse, with little or no coding ability required, but it’s also worth pointing out how easy it is to manage once it’s been set up. Too often an expensive solution is set up by a third party for an organisation, only for them to find out they’re now locked-in to using them to keep it running… that’s not the case with the Dataverse.

THE DATAVERSE MAKES YOUR DATA SECURE

The Dataverse, by default, stores your data securely, only allowing users to see it if they’re granted access.

That role-based security allows admins within the organisation to set permissions at whatever level they like for individual users at the click of a button.

IT INTEGRATES WITH DYNAMICS 365

Again, as a default, all of an organisation’s D365 data gets stored in the Microsoft Dataverse, which will allow your users to build apps that benefit from your Dynamics data and/or extend them using Power Apps.

IT ALLOWS FOR DEEPER INSIGHTS

Once integrated, Power BI can pull data from the Dataverse to create and publish reports that will provide insights and touchpoints you may not have previously been aware of.

IT’S INFINITELY EXTENSIBLE

The schema behind the Common Data Model can easily be adapted and/or extended so that it can be used alongside the standard schema to fulfil any business function that might be unique to your organisation.

IT EMPOWERS INTEROPERABILITY

It’s a rare app that won’t pull data from multiple sources.

As we’ve already mentioned, a developer can integrate an app’s data per project, customising it every time at the application level but it’s much easier integrating it into a common store with a single set of logic to look after.

The Dataverse does that for you automatically.

You can use the store in Power Apps, Power Automate and Power BI along with all of your other D365 data with ease.

Another benefit of that is if you want to use data that’s currently in the wrong format. If you import the data via Power Query it’ll automatically transform it into a useable format for you.

THE DATAVERSE PROVIDES CONSISTENCY

Finally, and perhaps most importantly, the Dataverse will provide consistency to your data, your apps and your business process.

Data within the Dataverse is stored in a set of tables. By defining a set of over two hundred different, standard business tables that cover typical business scenarios, the Common Data Model has made it incredibly easy to build apps for any and all business requirements.

By using that model, all apps you create or third party apps you use will be consistent in how they operate, reliable and secure.

Next Gen AI That Can ‘Think Of Itself’ Being Developed By Facebook

Current AIs are terrible at understanding a first-person point of view… Facebook’s next gen AI will change all of that.

 

Despite popular opinion, AI software has no sense of ‘self’ and is unlikely to do so any time in the near future. However…

Facebook have just announced that they are hoping to build an AI system capable of taking an ‘egocentric’ view of the world.

 

The plan is for future AI programs to be capable of taking a more ‘human’ view of the world by learning from more first-person imagery and video that places the camera’s view firmly in, well, the centre.

As simple as that may sound, Facebook hope it will unlock unthought of potential to enhance AR (augmented reality) tools alongside wearable tech… For instance, your glasses could scan a room and find your car keys if you can’t see them.

 

Facebook are working on exactly that in conjunction with thirteen other universities and labs across nine countries on Ego4D, a long-term project they are funding without any outside help. So far Project Ego4D has collated over 2,200 hours of first-person video from over 700 participants just going about their daily lives.

Project Ego4D has been tasked with five developmental benchmarks for developing the next gen of AI:

 

  • Episodic memory: What happened when? (e.g., “Where did I leave my keys?“)
  • Forecasting: What am I likely to do next? (e.g., “Wait, you’ve already got your keys“)
  • Hand and object manipulation: What am I doing? (e.g., “Putting the keys in the car“)
  • Audio-visual diarisation: Who said what when? (e.g., “What was the main topic during class?”)
  • Social interaction: Who is interacting with whom? (e.g., “Help me better hear the person talking to me at this noisy restaurant”)

 

All of the above play right into the rumours that Facebook, owners of Oculus VR, are planning on launching their own branded smart glasses soon.

We expect that to the extent companies use this dataset and benchmark to develop commercial applications, they will develop safeguards for such applications. For example, before AR glasses can enhance someone’s voice, there could be a protocol in place that they follow to ask someone else’s glasses for permission, or they could limit the range of the device so it can only pick up sounds from the people with whom I am already having a conversation or who are in my immediate vicinity. – Facebook Spokesperson

 

Canvas Apps Vs Model-Driven Apps

Helping choose between Canvas and Model Driven apps on the PowerPlatform

 

PowerPlatform… PowerApps… Model-Driven Apps… Canvas Apps. Every industry likes its jargon and Microsoft PowerApp developers are no different.

If you’ve come across one or any of these phrases recently, either through your own research or from hearing them as part of your Digital Transformation with a third-party partner then you may be scratching your head and wondering what they all are…

What Are PowerApps?

Put simply, Microsoft PowerApps were developed to democratise software development, reducing both the difficulty and complexity of launching a new application for use by an organisation.

Using PowerApps, a citizen developer (someone with little or no coding/design experience) can rapidly create a custom app for use by their organisation, either internally or externally.

PowerApps do this by using a low code/no code approach that lets users create apps using a simple click, drag and drop system.

Low code/No code development has been massively gaining in popularity over the last few years and the COVID pandemic has only seen that interest increase, with many organisations needing to pivot to a new way of working, thus speeding up their digital transformation plans ahead of original schedules.

 

Using PowerApps, a citizen developer can create either a canvas app or a model driven app… which obviously leads on nicely to the next questions, being… what are they, which is best and when should you use them?

What Are Model-Driven Apps?

Again, putting it as simply as we can, Model-Driven Apps aren’t ‘stand-alone’.

They’re based on sets of underlying data, or to be more precise, underlying data held in the Microsoft Dataverse (apologies for throwing more jargon at you! The Dataverse, formerly the Common Data Service, is a cloud-based storage environment that organisations can use to store their business application data safely and securely).

All Model-Driven Apps will be integrated in some way with the Microsoft Dataverse. In fact, most Microsoft Apps are Model Driven Apps themselves as, on some level, they’ll be integrated with the Dataverse (up to and including the entire Dynamics 365 platform).

 

That level of integration with the Dataverse means Model-Driven Apps can be described as ‘data-first’.

They’re far more rigid in their functionality than a Canvas App will be, with the UI (User Interface) components likely being selected from pre-made choices (although some customisation is still possible). That premade element makes Model-Driven Apps incredibly simple and easy to design and build, with absolutely no coding ability needed.

The flip side to all that rigidity however is that a Model-Driven App will always be a lot more sophisticated than a Canvas App, making them much better solutions for anything requiring complex business logic to function efficiently.

The other plus to those premade User Interfaces is that they’re Responsive-by-Design, so will always look great, no matter what kind of device they’re accessed from.

What Are Canvas-Apps?

A Canvas App does exactly what it says on the tin… it provides you with a completely blank canvas to create with.

You’re not left completely on your own though, as Canvas Apps are still designed for citizen developers, so there’s still a click, drag and drop format powering the interface.

Once you’ve got a setup you’re happy with though, you can make further adjustments to a Canvas App to change the size, shape and formatting of all the disparate elements.

Then, once you’re happy with that, you can connect it up to either a single or multiple data source(s) using nothing more than simple Excel-style formulas.

 

And therein lies the main appeal of a Canvas App over a Model-Driven App; since there’s no need to worry about how your data is structured as there is with a Model-Driven App, you get a much more intuitive design experience.

Sounds complicated?

 

Don’t worry. If you can use PowerPoint and have an entry-level understanding of Excel formulas, then you’ll be able to whip up a Canvas App with no problems.

Building a Canvas App is a lot of fun as it really lets you flex your creative side as no two Canvas Apps will ever be exactly the same, plus, as a bit of an added bonus, Canvas Apps are capable of pulling data from over two hundred unique data integration sources as an Out-of-the-Box feature, making them infinitely more flexible than a Model-Driven App could ever hope to be.

 

The downside to that however is that Canvas Apps aren’t responsive by design. The two basic layout functions are portrait and landscape but depending on the amount of customisation that’s happened, some work is often required to make them fit across a range of devices by adjusting the size and positions of the various elements in relation to the screen size.

To do that you’ll need hard-coded values that will have to be repeated for every individual control that’s been created within your app… so whilst a Canvas App is endlessly customisable, making it then responsive can be a long-winded process.

When To Use A Canvas App Vs When To Use A Model-Driven App

In case you hadn’t guessed already, the title of this article is a little misleading as it isn’t really a case of Canvas Apps vs Model-Driven Apps or which is better.

Instead, there’s appropriate use cases for both; complementing rather than competing with each other.

 

A Canvas App is perfect for creating a task/roles-based application such as a ticket system for an IT team. As this task is focussed solely on one issue, it really doesn’t require the full use of the entire Microsoft Suite… that would literally be the definition of over-engineered (or in simpler terms overkill), potentially impacting on the functionality of the app by making it too complicated.

 

Model-Driven Apps then, are much more suited to creating a complete end-to-end solution.

Going back to the IT ticket system use case, a Model-Driven App would be useful after the ticket has been created: to route it to the right place with the right location address, allow it to be updated so both sides can track its progress, and mark it as complete.

In the above use case, it’s almost certain there’ll be multiple stakeholders wishing to view, update and document the ticket so for a well-rounded app capable of tracking the whole cycle, a Model-Driven app would be best suited.

How To Set Up Field Monitoring In Business Central

Gurdeep Bahra – Business Central Consultant, cloudThing

One of cloudThing’s Business Central experts explains how to set up field monitoring…

 

Field monitoring can be set up in Business Central (BC) to notify users of changes to specific fields. The functionality adds a layer of security above permissions and sends automatic emails to users when a monitored field’s value is changed.

Please note: automated emailing requires the email feature in BC to be set up.

The easiest way to set up field monitoring in BC is by using the assisted setup guide: use the Tell Me function to search for Assisted Setup and hit Monitor Field Change Setup:

 

 

Microsoft are speeding up the setup process for loads of areas in BC using the assisted setup page. When you click Monitor Field Change Setup you will be presented with a setup wizard, much like those you used to install Microsoft Office with many years ago! Hit Next on this page:

 

 

On the next page we get to decide which fields we’d like to setup field monitoring for:

Here’s a brief explanation on the different options:

  • Sensitive: Information about a person’s racial or ethnic origin, political opinions, religious beliefs, involvement with trade unions, physical or mental health, sexuality, or details about criminal offenses.
  • Personal: Information that can be used to identify a person, either directly or in combination with other data or information.
  • Company Confidential: Business data that we use for accounting or other business purposes, and do not want to expose to other entities. For example, this might include ledger entries or bank account information.

In this example we will go with the Company Confidential option – hit Next again.

The next step is to assign the user who will receive notifications, and the email address that will receive notifications. Click on the ellipsis (or three dots!) and select the appropriate user and email address for this:

 

 

Nearly there now! Click Next on this page and you’ll be presented with the below page:

 

 

 

We just need to click Finish here. Notice we have left the “View Monitored Fields” slider set to Yes, which means that when we click Finish we are presented with this page:

 

 

The fields added to this page are based on the selection we made during the setup wizard (personal, sensitive or company confidential) but we can add more fields to the page by simply adding a new record on this table. For fields we wish to be notified about changes, simply select the Notify tick-box as we have done below for the Vendor Bank Account (Table 288) and Bank Account No (Field 14):

 

 

Note – users currently logged in to BC will need to log out and back in for the field monitoring to begin, but from this point forward any changes to the specified fields will result in the specified user and email address being notified!

Below is a sample email which results from the changing of a vendor bank account number:

 

 

 

 

We found this a very useful out-of-the-box tool which is easy to set up and very effective for ensuring data integrity.

 

 

LinkedIn Finally Says That’s Enough To China

LinkedIn is withdrawing from China due to mounting challenges.

LinkedIn, the last of the social media giants still operating in China has finally pulled out, citing ‘significantly more challenging operating environment and greater compliance requirements’.

LinkedIn now joins Twitter, YouTube, Facebook and others in either leaving or being banned from the Chinese market.

 

Microsoft, the owners of LinkedIn, have faced huge criticism over their continued presence in China over the last few years from both campaigners and US politicians over what was seen as their continued ‘appeasement’ of the country, agreeing to censor certain groups and block others… including activists and journalists.

The platform will be replaced by a stripped-down version called InJobs, a jobs-only site with zero social feeds or interactions.

 

Before the closure, Microsoft claimed 54 million Chinese users used their platform but in recent years they’d been ‘walking a tightrope’ to stay compliant with increasingly stringent demands from the Chinese government.

While we’ve found success in helping Chinese members find jobs and economic opportunity, we have not found that same level of success in the more social aspects of sharing and staying informed. We’re also facing a significantly more challenging operating environment and greater compliance requirements in China. – Mohak Shroff – Senior Vice-President, LinkedIn

This exit from the Chinese market has been a long time coming for LinkedIn:

 

  • March ’21 – LinkedIn were forced to restrict new signups to their platform after Chinese authorities insisted they censor sensitive content.
  • September ’21 – LinkedIn had to stop Chinese users of their platform viewing content published by several US journalists, academics and activists who were highly critical of Beijing.

 

Finally, someone at LinkedIn said enough was enough and the platform cut its losses, withdrawing from China completely.

LinkedIn’s replacement, InJobs, is unlikely to be able to compete with more local competition but it’s thought Microsoft will want to keep a foothold in China should things change in the future.

 

That just leaves Bing, as the only major foreign-owned search engine currently operating in China… and Bing is only hanging in there by censoring its results, blocking certain searches completely (for example Tiananmen Square).

 

 

Reshuffle Causes Confusion Over Future Of Charities Minister Position

After Baroness Barran was moved to the Department for Education, there’s now an empty spot for the charities minister.

 

Baroness Barran had been minister for civil society and loneliness since July 2020, and will now serve as minister for the school system under Nadhim Zahawi, the new education secretary.

The Department for Digital, Culture, Media and Sport (DCMS) has been asked to confirm whether the department will continue to host the Office for Civil Society and when the minister responsible will be named.

Great honour to be appointed to @educationgovuk ministerial team. Excited to get to work but first….HUGE THANKS to @DCMS Civil Society and Youth team for all your support. Also to all the charities, social enterprises for all the work you do – especially during the past 18 months.

It has been the most difficult time and you have stepped up and delivered for our communities. Supported of course by brilliant volunteers whose generosity has been extraordinary. Thanks too to all the funders and philanthropists who have partnered with us in the past year. – Baroness Barran, Minister for the School System

 

There has been an outpouring of gratitude and best wishes from senior leaders, as Barran was well respected in the charity sector.

Who is at DCMS?

On Wednesday it was announced that Nadine Dorries would be culture secretary, and only one DCMS minister has kept their job – Nigel Huddleston who is responsible for sport and tourism.

Julia Lopez and Chris Philp have been appointed to the department as minister of state and parliamentary undersecretary respectively. Lopez joins from the Cabinet Office and Philp from the Home Office.

There is not yet a DCMS representative in the House of Lords.

The confusion over who is charities minister arises because, during reshuffles, department heads hand out ministerial portfolios only after the prime minister has appointed individuals to departments.

Other Appointments

The voluntary sector has seen a number of key appointments confirmed in the Treasury, Foreign Office and newly rebranded Department for Levelling Up.

Treasury

Helen Whately became the exchequer secretary to the Treasury, which will give her responsibility for charity tax issues.

She has been an MP since 2015, and previously held roles such as deputy chair of the Conservative party, roles within DCMS, and was minister for social care until last week.

On her website it says: “Helen has worked with several charities as a volunteer and adviser and has also been a school governor.”

Department for Levelling Up, Housing and Communities

The Department for Levelling Up, Housing and Communities is the rebrand of the Ministry of Housing, Communities and Local Government, which has been headed by Michael Gove since last week.

The department also sees the prime minister’s previous levelling up adviser, Neil O’Brien, join its ranks. O’Brien is the co-founder of the think tank Onward, which set out a series of reforms for volunteering and Gift Aid last year.

Elsewhere, Andy Haldane has been appointed to head up a levelling up taskforce.

Kemi Badenoch, who had been exchequer secretary at the Treasury, has also joined this department and will continue as minister for equalities.

Foreign Office

The foreign secretary role now includes oversight of international aid programmes and is now held by Liz Truss.

This year has hit home the need for global cooperation to deal with climate change, the rise in extreme poverty, and ongoing humanitarian crises. But this work, undoubtedly, has been made harder by the cuts to UK aid.

It is critical that the new foreign secretary uses the upcoming international development strategy to ensure UK aid remains poverty-focused, and that the whole portfolio of the Foreign, Commonwealth, and Development Office, delivers long-term, sustainable development for the most marginalised communities, whilst protecting human rights and civil society space globally. – Stephanie Draper, CEO of Bond

Climate-Change Deniers To Be Barred From Google Ads

Climate-change deniers will be restricted from monetisation or displaying ads, according to Google’s new policy.

The new policy prohibits the advertisement and monetisation of content that argues against the scientific consensus on the existence of climate change.

The prohibition will cover all their platforms, most notably YouTube.

The policy announcement comes in a support document, where it is also stated that content or videos that appear to promote false or inaccurate information about climate change have sparked concern among Google’s advertising partners.

Advertisers simply don’t want their ads to appear next to this content. That’s why today, we’re announcing a new monetisation policy for Google advertisers, publishers and YouTube creators that will prohibit ads for, and monetisation of, content that contradicts well-established scientific consensus around the existence and causes of climate change. – Google’s Ads Team

It goes the other way too, with content creators refusing to have ads promoting unscientific claims appear next to their videos or webpages.

The restrictions cover any references to climate change being a hoax or scam; denial of long-term observations of climate-change; and any attempts to refute that greenhouse gas emissions from fossil fuel consumption and other Earth-damaging human activities have and continue to contribute to climate-change.

The review process to enforce the new policy will include both automated tools and human reviews.

Context will play an important part in the review process to discern if the misinformation is being presented as fact, or if it is simply being discussed or disputed, and Google will ensure that the automated tools and human reviews will look closely at what is being stated within the content.

Ads and monetisation won’t be restricted on the various other climate-related topics, as the aim isn’t to stifle discussion. Debates on climate policy, impacts of climate-change, and new or burgeoning research, etc, will still be allowed to have ads or be monetised.

The guideline of the policy includes consultation from experts who contributed to the United Nations Intergovernmental Panel on Climate Change (IPCC) assessment reports.

It is also Google’s second big misinformation policy change in less than a month, part of a wider crackdown on ‘fake news’ and the misinformation that spreads so easily on the internet.

Last month, anti-vaccine misinformation was blocked by YouTube. Claims that flu shots cause infertility, or the persistent claim that vaccines cause autism, are examples of the content now blocked on Google’s platforms. The updated policy will also cover any misinformation surrounding the substances that make up a vaccine.

There is mounting pressure on tech giants like Google to address the spread of misinformation on their platforms; even Facebook has invested a $1 million grant into fact-checking false climate claims.

There have also been a number of climate-awareness products launched by Google to help tackle climate change, such as a Google Maps setting which finds the most eco-friendly route for users.

Currently, the Biden administration is attempting to pass the Build Back Better Act, which includes a $3.5 trillion spending package, with the intention of tackling climate change. The legislation includes a tax on methane gas, expanding tax credits for renewables and electric vehicles, and pushing utilities to use more clean energy.

These measures could reduce the US’ greenhouse gas emissions by up to 936 million tonnes by 2030, according to research firm Rhodium Group.

NC3Rs Seeks To End Animal Testing With £2.7M Prize Fund

NC3Rs announces challenge which aims to end the bioscience sector’s dependence on using animals for testing.

The National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) has announced a ‘Virtual Second Species’ challenge with a £2.7m prize for anyone who can devise a solution to the use of animals, particularly drug testing dogs, in R&D.

The challenge is part of the CRACK IT Challenges competition, which seeks solutions that will replace the need for animal testing within the science community.

Participants are encouraged to create virtual models of dogs based on existing dog study data from previous experiments, with the aim of improving animal welfare, increasing efficiency and, of course, lowering costs.

Several years ago, the National Centre for Universities and Business (NCUB) created a platform called konfer, which the NC3Rs will use to reach over 150,000 academics from universities around the world.

The challenge set out by NC3Rs is for tech innovators to create digital dogs to predict the negative effects of drugs before they go to human trials.

The collaborative effort to reduce the need for animal testing in the bioscience sector forms part of the NC3Rs’ wider ‘CRACK IT Challenge’, with the laudable goal of eliminating animal testing completely.

This competition has been put together by NC3Rs, in collaboration with eTransafe and Simomics, with sponsorship from Bayer AG, Eli Lilly and Company, Genentech Inc., Gilead Sciences Inc., GSK, Merck Healthcare KGaA and Roche.

It’s hoped anyone entering the competition will develop models using advanced mathematical modelling and machine learning combined with years of previous testing data to help predict unexpected side effects of drugs without the need to test on animals.

As it stands, two species are required to test for negative reactions: a rodent and a non-rodent. Last year, 2,082 experiments used dogs, and NC3Rs hopes to reduce that figure.

Even worse, the current method of using two animal species doesn’t always reveal adverse effects that could affect a human, leading to an unnecessary waste of animal life.

It’s hoped the competition will reduce the cost of drug testing and increase the speed at which new drugs can reach the market safely, and of course save thousands of animals’ lives.

We are excited to be using konfer to connect with the UK’s leading innovators and academics. As we seek to shift the paradigm of the use of dogs in drug testing, we’re keen to make use of the large amounts of dog study data within pharmaceutical companies to create a virtual dog to determine drug toxicities. Konfer’s smart-matching technology offers a speedy, cost-effective route to collaboration, and we look forward to working alongside the UK’s brightest minds to tackle an acute issue in drug development. – Dr Anthony Holmes, Director of Science and Technology at the NC3Rs

It is hoped that by making these solution opportunities more accessible, the UK will become a global hub for innovation by 2035:

As the government pursue their vision of making the UK a global hub for innovation by 2035, it is important that we simplify collaboration between universities and businesses. The NC3Rs ‘Virtual Second Species’ Challenge is a prime example of meaningful and game-changing innovation, addressing the major matter of animal testing that continues to persist even today.

It is our hope that by making it easier for organisations such as the NC3Rs and academics to find each other, we can facilitate the creation of productive, innovative and of course, meaningful coalitions that will improve lives. – Dr Joe Marshall, Chief Executive Officer at NCUB

 

 

Intel Says No Due To Brexit

Intel CEO, Pat Gelsinger, says the new European chip factory will only be considered for EU member states.

He goes on to say that, before Brexit, the UK would have been considered, but due to the decision to leave it is no longer part of Intel’s £70 billion expansion plans.

There are currently 70 proposals across 10 different countries, and a decision will hopefully be made by the new year on which of the EU member states wins the investment.

Intel plans to boost its exports from the US and has a 10-year plan of investing £70 billion into opening and upgrading semiconductor plants across Europe.

Intel is considering multiple factors for the proposals. The European site must be able to support up to eight fabs on 1,000 acres of land, and a decent talent pool must be accessible.

Belgium, France, Germany and the Netherlands are in the running, among others, and it’ll be announced by the new year which sites will be host to the new factories. There are plans for at least one manufacturing and one advanced packaging factory in Europe.

This follows other direct consequences of the Brexit vote, such as mass shortages of healthcare staff, care workers, and HGV drivers, and now a high-skill manufacturing investment being taken out of the running completely.

Pre-Brexit Britain was a prime location for US and Asian firms to use as an entry point into the rest of Europe. Access to international shipping ports like Tilbury, Felixstowe and Liverpool, as well as a highly technical and skilled talent pool for firms like Intel to take advantage of, means the UK would have been a smart choice for a new silicon wafer plant to serve the rest of Europe.

However, supply chains need to be reliable. The Brexit vote means the UK can no longer guarantee cheap, easy and robust cross-border trade, which has led to the nation being passed over for EU member states like Ireland and Germany.

Intel’s expansion in the EU comes as a solution to the global semiconductor shortage affecting supply chains of numerous goods, from cars to computers.

September 2021 saw Toyota slash global production targets by 40%, with other manufacturers like General Motors, Nissan, Ford, Honda and Jaguar Land Rover also being forced to slow or stop production at various plants in 2021.

The chip shortage has pushed up the cost of any goods in which microchips are a vital component; it could last until Christmas and is unlikely to stabilise until 2023.

There is some possibility that there may be a few IOUs under the Christmas trees around the world this year.

Just everything is short right now. And even as I and my peers in the industry are working like crazy to catch up, it’s going to be a while. – Pat Gelsinger, Intel CEO

As it stands currently, Taiwan and South Korea produce around 70% of the world’s supply of chips, so Intel is hoping for government subsidies in the US and Europe to address the global reliance on Asia for the supply of chips.

UK Has One Of The Highest Rates Of Infection Of Covid In Europe

As winter begins to encroach on our heating bills, how worried should we be about Covid?

You may not have noticed, but the UK has one of the highest rates of infection of Covid in Europe.

Compared with the big nations of Western Europe, the UK unfortunately holds the unwanted title of the highest number of infections.

But what are the contributing factors? England was the first European country to unlock, with all (bar a couple) social distancing and Covid measures being lifted on July 19th, 2021. The next country after that was Denmark in late August.

This step has only been taken in recent weeks by nations like Norway. Many other nations have kept most of their measures in place; for example, Italy and Germany still have restrictions on large gatherings.

So, with England having led the charge by weeks on the lifting of social distancing, it comes as no surprise that a virus passed on through close human contact has taken off ahead of the rest of Europe.

Vaccine uptake has also slowed down, which has allowed nations such as Spain, Portugal and France to overtake the UK by administering more doses of the vaccine than they have citizens.

This is in part due to the UK only just starting its vaccination of under-16s, a little behind a lot of other countries.

But has the link between catching Covid and becoming seriously ill from it been broken?

Vaccine uptake among those with serious illnesses and older people in the UK is similar to the rest of Europe, which grants a higher level of protection to the more vulnerable.

Simply put, the numbers dying are similar to the rest of Europe.

The current trend shows just over 100 Covid-related deaths a day in the UK, which is similar to what happens in a bad flu season for months on end.

However, death is not the only measure. The effects of ‘long Covid’ are still just as present among those who have not been seriously affected by Covid. While these effects are still being learned about, it is argued that spread should be better contained.

Some experts, such as Prof Mike Tildesley, an expert in infectious disease modelling at the University of Warwick, are now questioning if there is an “acceptable” level of Covid, otherwise we will become reliant on extra measures long-term.

Covid is here to stay – we need to discuss what we are willing to live with. – Professor Mike Tildesley

Another thing to consider is that there has been a variety of approaches throughout Europe, so there isn’t really a control group to compare against, and where we’re heading is just as important a measure as where we’ve been.

For example, in early Spring the UK had one of the lowest rates in Europe because we’d already had our Alpha wave, whereas Europe’s was in full swing.

Covid is one of those situations that can change at an unprecedented rate, whether positive or negative. The UK death rate is falling, even in a society with little to no social distancing happening and mask-wearing is not mandatory. It suggests the virus has been brought under some control, in the sense that those rapid surges of the early days should be behind us as the wider population has immunity.

As it stands, the high rates are apparent among teenagers – particularly those under 16, who haven’t had a chance over the summer to get their vaccination, unlike those in the same age group in other parts of Europe.

The concern is, and always has been, that the younger population could spread the infection into the older populations, as children are at the lowest risk of becoming seriously ill from the virus.

But initial indications suggest this isn’t happening; furthermore, the rise in children may have already peaked. That suggests we can maintain a level of cautious optimism.

So, as winter approaches, we may actually see a continued fall in infections once the wave in teenagers comes to pass.

And this was the argument provided by the UK government and its senior scientists – Prof Chris Whitty and Sir Patrick Vallance – when the decision to reopen was being floated: that we needed an ‘exit wave’ before the throes of winter fully arrived.

The problem, really, is that the NHS doesn’t have much room for even a tiny surge.

This winter, it will not just be a surge in Covid that strains the NHS.

With all the lockdowns and social distancing, the regular, common colds and flus that we encounter in daily life, especially in wintertime, were largely absent, so there is now less immunity in society.

For example, the beginnings of an outbreak of RSV – a virus which can cause up to 30,000 under-fives to be admitted to hospital every winter, six times what that age group has seen from Covid – can already be tracked, and it is circulating at very high levels.

On top of that, flu season is about to begin.

How much room does the NHS have?

 

 

NonProfit Sector Still Needs To Move Past Tokenism

Delegates at the Chartered Institute of Fundraising’s annual convention were told that power needs to be shifted to communities.

Fundraisers have heard that much of the NonProfit sector is struggling to move beyond tokenistic gestures when it comes to highlighting the voices of marginalised groups and the communities it works with.

Jaden Osei-Bonsu, programme manager at the leadership development community interest company the Centre for Knowledge Equity, told delegates at the Chartered Institute of Fundraising’s annual convention that the sector needed to shift power to the communities it supported, rather than telling them how to solve their problems.

Speaking at the online convention during an event focusing on how to be an ally to marginalised groups, Osei-Bonsu called for larger charities to think about how they could work in genuine partnership with grassroots organisations which allowed them to lead programmes, rather than simply advising.

Historically with the charity sector, fundraising usually puts communities in a position where they are being researched or people are trying to tell them what is going to solve their problems. – Jaden Osei-Bonsu, Programme Manager, Centre for Knowledge Equity

Osei-Bonsu added that the conversation should be about shifting power to communities with direct experience of the issues being addressed, as the majority of the sector is struggling to move past tokenistic gestures.

Drawing on her experience in youthwork, fellow panellist Yolanda Copes-Stepney, founder of Speak & Do, said that when engaging with marginalised communities, organisations needed to make a conscious effort to ask what results the communities wanted to see from the engagement.

She also said that, too often, the young people she spoke to believed nothing would come from their involvement and that they would not be listened to.

She added that organisations and individuals need to remember that allyship and supporting marginalised groups was ‘going to be a constant process of learning’.

It’s about asking lots of questions, and never assuming anything for them.

 

 

Facebook Back Tracks On Instagram Plans For Kids

Facebook halts its plans for an ‘Instagram for Kids’ – aimed at ages 10 to 12 – after being accused of ignoring its own research into the harm to children’s wellbeing caused by Instagram.

 

An article reported by the Wall Street Journal accuses Facebook of ignoring and covering up evidence of the harm caused to teenagers, particularly girls, by Instagram. Further on in the article, it reports that an internal Facebook presentation noted that among teenage social media users who reported suicidal thoughts, 13% of British users and 6% of American users traced the issue back to Instagram.

Another presentation from 2019 said, “We make body image issues worse for one in three teen girls,” while a later slide deck added: “Thirty-two per cent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse”.

Facebook has spoken out against the allegations, stating the article deliberately mischaracterises them, and “conferred egregiously false motives to Facebook’s leadership and employees.”

The article increased the scrutiny on the dark side of social media at a time when Facebook is facing criticism from many angles.

It even caught the attention of US politicians who outright called for the company to abandon Instagram for Kids.

Bowing to pressure, on Monday Facebook said in a statement that it would ‘re-evaluate’ the project:

While we believe building ‘Instagram Kids’ is the right thing to do, Instagram, and its parent company Facebook, will re-evaluate the project at a later date. In the interim Instagram will continue to focus on teen safety and expanding parental supervision features for teens. – Statement Issued by Facebook

 

Children under the age of 13 are not supposed to use Instagram, although this is easily circumvented by many simply by lying about their age. The planned new product, known informally as ‘Instagram for Kids’, would be aimed at the ages 10 to 12 demographic with parents having some control over their usage.

The problem with this is that a similar Facebook product called Messenger Kids was found to be open to abuse by strangers who were able to enter chatrooms.

The plans have been steadily pulled back as more and more advocacy groups, parents, and lawmakers have lined up to take aim at the proposed product plan.

In April, advocacy group Campaign for a Commercial Free Childhood wrote to CEO Mark Zuckerberg, saying Instagram for Kids would create “challenges to adolescents’ privacy and wellbeing”.

US Representative Lori Trahan and Senator Richard Blumenthal welcomed the announcement that Facebook would be delaying Instagram for Kids but said in a statement it should go further and ditch the project as a whole.

“We are pleased that Facebook has heeded our calls to stop ploughing ahead with its plans to launch a version of Instagram for children. A ‘pause’ is insufficient, however.

Facebook has completely forfeited the benefit of the doubt when it comes to protecting young people online and it must completely abandon this project.”

 

 

 

Check Out The Benefits Of The Microsoft Catalyst IDEA Framework…

One of the most common complaints we hear from organisations is that they know they need to digitally transform their organisation and become more resilient to change but simply have no idea where to start, what ‘good’ change should look like, how much it’s likely to cost, how long it might take and perhaps worst of all… how they should (or can) measure success at the end of the project.

 

cloudThing have always been (and will always be) happy to help you answer those questions, but recently Microsoft have released something called Microsoft Catalyst that seeks to codify the answering of those questions by taking an organisation on an ‘engagement journey’ through a series of phases curated by a third party (cough, like cloudThing, cough), running the full gamut of Inspire, Design, Empower and Achieve (I.D.E.A.) or, in simpler terms… the IDEA of a solution framework.

The idea (pun intended) behind it is to apply a structured approach so that larger organisations especially can garner real ROI from change instead of making the same old mistakes with slightly newer technology.

 

Microsoft’s Catalyst can best be described by breaking it down into two sections, the what and how, which is what we’ll be showcasing for you today, along with all the awesome benefits it can bring to your organisation.

The ‘What’ Of Microsoft Catalyst

At its core, Microsoft Catalyst seeks to garner a deep understanding of the problems an organisation might be having by leveraging the IDEA framework (Inspire. Design. Empower. Achieve) to create successful solutions for a digital transformation.

Microsoft do this through a series of conversations we hold with an organisation looking to digitally transform which, between Microsoft, the client and ourselves, enable us to create a vision of how better business outcomes can be achieved through a set of components that make up a business application solution.

The ‘How’ Of Microsoft Catalyst

Now on to the good stuff… how Microsoft Catalyst can actually work for your organisation.

The framework underpinning the Catalyst is, as we’ve already said, called IDEA, which stands for Inspire, Design, Empower & Achieve and is designed to take you through a solution envisioning and planning process that utilises everything Dynamics 365, the Microsoft Power Platform and Azure have to offer.

 

  • Inspire: The Inspire phase comes first and focuses solely on envisioning the solution(s) that are right for your organisation’s needs. cloudThing (or someone else… but why would you want someone else?) will work closely with all stakeholders in the project, alongside the members of your wider organisation, to really imagine and define the future state everyone wants for the organisation through a series of ‘design thinking workshop engagements’. If that stage is followed correctly, what you’ll end up with is an Envisioning Workshop Output Template which will be used in the second stage…
  • Design: During the Design phase cloudThing will again work closely with you to quantify the results of the design thinking workshops into a next-steps transformational approach with the technology that will be needed for the solution. All of that is achieved through a series of business value and technical discussions that look to discover what’s possible by aligning technology investments with the organisation’s stated goals.
  • Empower: The Empower phase is the fun bit. This is the first chance everyone across your organisation will have to see your solutions really brought to life through a series of visual assets, solution demonstrations, prototypes and immersive experiences that highlight the transformation’s vision and value to promote buy-in across all teams.
  • Achieve: And then finally, we get to the Achieve stage. This is where a real-world blueprint is created for how to deploy the solution and the process of digitally transforming your organisation is started, finished and assessed.

Benefits Of The Microsoft Catalyst IDEA Framework

We can quantify the benefit(s) of the Microsoft Catalyst IDEA Framework in one word… Transparency.

The IDEA Framework gives all stakeholders a complete overview of everything the organisation will need to succeed in their digital transformation, from the viability of solutions and technical necessities right through to staff’s actual desire for change.

 

Whilst it may seem quite rigid, the other huge benefit of the Catalyst IDEA Framework is its cohesiveness. At every step of I.D.E.A. collaboration is encouraged (even required) from all stakeholders, making the end solution something everyone feels a part of and can be proud of. It also heads off issues that might otherwise be missed (how many CEOs, for instance, really know the day-to-day workings of their payroll department?)

Is The IDEA Framework Right For My Digital Transformation Project?

So… by now, if you’ve read this far, you should be utterly fascinated and ready to digitally transform your organisation using the IDEA Framework.

If you are (or even if you’re not sure), feel free to get in touch.

cloudThing will be happy to go through the process step by step and give you an honest answer as to whether we think we’ll be able to help (we’re pretty sure we will be able to though).

Component Led Development, Or… How To Make Your Organisation Instantly More Resilient

Because having to do things more than once is so old skool

 

Efficiency has always been the key to any organisation’s success… but that goal has never seemed more urgent than it’s been over the last couple of years, with everyone seeking to pivot, adapt and change the way they do business at the same time, whilst also finding new ways of working to future proof themselves by becoming more resilient to further change in their respective markets.

 

That’s a big ask for anyone and unfortunately, as we all know, a lot of companies have struggled.

cloudThing wish there were a silver bullet we could release out into the wild that would make a business instantly profitable, but there’s no such thing, we’re afraid.

What we can do though, as our mantra states, is #BuildFuture and show you a new way of working going forward that will bring immediate results (rather than some nebulous, pie in the sky, six-year, nineteen step digital transformation plan).

And that’s where Component Led Design comes in.

What Is Component Led Design?

Component led design is one of those things that sounds exactly like what it is…

Rather than creating a huge digital project with an end date months or years in the future, released all at once at the end, component led design seeks to break that project down into smaller, much more manageable solutions that can be developed and completed quickly and, here’s the trick, bring instant ROI to an organisation, whilst simultaneously making the entire project faster, better and more efficient.

These individual components, once built, can then be reused multiple times in various other parts of the project (or indeed other projects and solutions) with zero development work needed, with the end goal of achieving a Minimum Viable Product more quickly every time, which can then be improved on as it matures.
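To make that concrete, here’s a minimal sketch (plain Python, purely for illustration; the component and the two processes that reuse it are invented) of the ‘build once, reuse everywhere’ idea: a small postcode-validation component written once and dropped into two otherwise unrelated processes with no extra development work.

```python
import re

# A small, self-contained "component": written once, reused anywhere.
def validate_uk_postcode(postcode):
    """Rough UK postcode check - the component's single responsibility."""
    pattern = r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$"
    return bool(re.match(pattern, postcode.strip().upper()))

# Process 1: a volunteer sign-up flow reuses the component...
def register_volunteer(name, postcode):
    if not validate_uk_postcode(postcode):
        return f"Rejected {name}: invalid postcode"
    return f"Registered volunteer {name}"

# Process 2: ...and so does a completely separate donor import job.
def import_donor_record(record):
    return validate_uk_postcode(record.get("postcode", ""))

print(register_volunteer("Sam", "B1 1AA"))        # Registered volunteer Sam
print(import_donor_record({"postcode": "oops"}))  # False
```

The same principle applies whatever shape the component actually takes in your stack, be that a reusable app control, a shared flow or an API.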

Component Led Design With Microsoft’s PowerPlatform

Whilst any developer can ‘do’ component led design for you (be it an internal or external resource), by reusing the components they build on other projects, it’s actually much easier to reuse components others have already created or build interconnecting components based on proven enterprise design patterns.

That’s where Microsoft steps in with the PowerPlatform.

The ‘Power Platform’, as it’s collectively known, is made up of three Microsoft products: Power BI, PowerApps and Power Automate (what they used to call Flow).

Together, the Power Platform helps both developers and citizen developers (people with little or even no coding experience) easily create, automate, analyse and improve their own apps which can be used in wider digital transformation projects.

Using PowerApps, almost anyone can take a low-code, no-code approach to developing a custom app using an easy-to-grasp ‘point and click’ interface that will be mobile friendly and won’t require the help of under-pressure IT or development teams.

In fact, we say develop, but for a whole range of business processes and solutions, there are already existing components available from both Microsoft and third-party developers that can be assembled as and how your organisation requires them, the IP of which, for many if not most, is free.

This means either your third-party developer or your internal IT Team can take these disparate components and configure them into your organisation’s ideal solution and, by defining the organisation’s most urgent needs through a series of use cases, Component Led Design is capable of driving instant value.

Benefits Of Component Led Design Approach To Development

As already mentioned, by focusing on smaller, bite-size development issues (components) you can bring instant ROI to an organisation (as opposed to waiting till the end of a long development cycle before you see any return on your investment).

As promised in the title, this approach also makes your organisation a lot more resilient to external change.

Getting into the habit of creating (or using) components to incrementally change your organisation will make you much more flexible should you need to change anything due to external (or internal) factors.

The main benefit of Component Led Design though, and the point we really wanted to get across, is that it shifts the focus away from development and on to outcomes.

Component Led Design In The Discovery Phase

As an example, someone at your organisation may have said “right… we need a new CRM/Payroll system/volunteer management system/(any other large digital development)”.

That’s great, but using a Component Led Design approach in the discovery phase you’ll ask the question… why?

 

Why does the organisation need that new CRM/payroll system etc.

 

Is the CRM completely out of date and really in need of replacing, or is it that a couple of new features have been requested by different departments?

If the latter, then there’s really no reason a Component Led Design approach can’t offer real and tangible benefits within days rather than months (or years in the worst-case scenarios).

Configuring individual components is just cheaper.

Cheaper, faster and… it’s much less of a risk to your organisation.

 

How familiar does this sound? You buy an expensive new piece of software, investing time and money into it only to discover at the end of the project it still doesn’t do everything you needed initially.

Component Led Design is much less risk because you’re either developing a much smaller solution or re-configuring a solution that’s already been created.

This gives you much more time to run an effective discovery process, workshopping exactly what the organisation needs and what the solution will need to do to get you there.

That will let you put a prototype together much faster, meaning that when your solution goes into development, you already know it will work because you’ve already seen it working.

Being More Efficient With Robotic Process Automation…

Given the recent drive for efficiency and resilience, it hasn’t surprised anyone in the tech world that RPA (or Robotic Process Automation) has seen a huge surge in demand during the COVID pandemic, freeing up staff’s time so they can focus on much higher value work. And RPA and component led design go hand in hand…

RPA, at its most basic, is a type of automation for all of your organisation’s business processes that’s designed to handle a huge number of repetitive tasks that would normally require a person to complete manually, often utilising the likes of Artificial Intelligence and Machine Learning to get better and more efficient at those tasks.

But…

This article isn’t about RPA. RPA in this instance is one of the goals, not the tool to get you there.

How To Achieve A Single Customer View In 5 Easy Steps

Why is gaining a single, unified view of your customers so important?

 

Modern analytic tools can provide awesome insights into your customers’ behaviours, motivations and buying habits, but for those tools to be efficient they require something known as… a single customer view.

What Is A Single Customer View?

A single customer view (SCV), sometimes also known as a unified customer view (UCV), is the ability to hold and access all the data your organisation might have on a particular individual in a single, easy to use, easy to understand CRM system, with the data all normally viewable on a single page (although a single page might not always be the goal, depending on how much data you hold on said person).

Microsoft have taken that concept even further by storing that data within their Dataverse (previously known as the Common Data Service) and retrieving it with various Dynamics apps depending on your needs at the time.

The beauty of that approach, of course, is that it’s also hooked into Microsoft 365, which means you could build a PowerApp ‘over it’ or even set things up so it pulls data directly from Excel etc.

Basically, anywhere that’s most convenient for you to retrieve and view your data (security concerns permitting of course).
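Just to make that tangible, here’s a minimal Python sketch of pulling one customer’s core details out of the Dataverse Web API by email address. The org URL and token are placeholders, the columns shown are standard Dataverse contact columns, and real code would acquire the token via MSAL, handle paging and expand related tables as needed; treat it as a sketch rather than a finished integration.

```python
import requests

# Placeholders - swap in your own environment URL and a real OAuth bearer
# token (typically acquired via MSAL) before running.
ORG_URL = "https://yourorg.crm11.dynamics.com"
TOKEN = "paste-a-real-bearer-token-here"

HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

def single_customer_view(email):
    """Pull one contact's core details from Dataverse by email address."""
    url = f"{ORG_URL}/api/data/v9.2/contacts"
    params = {
        "$select": "firstname,lastname,emailaddress1,telephone1",
        "$filter": f"emailaddress1 eq '{email}'",
    }
    response = requests.get(url, headers=HEADERS, params=params, timeout=30)
    response.raise_for_status()
    records = response.json().get("value", [])
    return records[0] if records else {}

print(single_customer_view("jane.doe@example.com"))
```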

 

Holding a consolidated view of a customer has become ever more challenging for organisations in recent years, as we’ve all started to hold more and more records of and around a person, across the multiple touch points through which they might interact with said organisation, in various databases (creating not just a Business Intelligence nightmare but also a UK-GDPR/GDPR one!)

Online forms, webchats, face to face meetings, email chains, social media posts, call centre recordings… all that data has to get stored somewhere, but often that somewhere isn’t in the same place, resulting in duplicate or, worse, fragmented views of a customer.
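As a rough illustration of the consolidation problem (and nothing more – the sources and field names below are invented), the sketch merges records for the same person from two hypothetical systems, keyed on a normalised email address, so that later sources fill gaps rather than spawning a second, fragmented record.

```python
# Merge per-person records from multiple hypothetical sources into one view,
# keyed on a normalised email address. Later sources fill in gaps; they do
# not create a second, fragmented record for the same person.

crm_records = [
    {"email": "Jane.Doe@Example.com", "name": "Jane Doe", "phone": None},
]
webchat_records = [
    {"email": "jane.doe@example.com", "name": "Jane D.", "phone": "0121 000 0000"},
]

def consolidate(*sources):
    unified = {}
    for source in sources:
        for record in source:
            key = record["email"].strip().lower()
            merged = unified.setdefault(key, {})
            for field, value in record.items():
                if value and not merged.get(field):
                    merged[field] = value  # keep the first non-empty value seen
    return unified

for email, view in consolidate(crm_records, webchat_records).items():
    print(email, view)
# jane.doe@example.com {'email': 'Jane.Doe@Example.com', 'name': 'Jane Doe', 'phone': '0121 000 0000'}
```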

Why Is A Single Customer View Important?

As time goes on the amount of data collected on your customers will only grow, as will their touch points with your organisation (think social media for instance… where do their tweets about you get stored? Are you storing them at all?).

As the data multiplies exponentially though, various stakeholders throughout the organisation will likely lose visibility of large swathes of it as it gets siloed away in separate databases and systems, causing major pain points for an organisation. Circling back to Twitter as an example: your Customer Service team might document all complaints a client makes to the organisation, but what if they complain on Twitter… does your social media executive log those in the same place?

Having a unified or single view of a customer then, offers several immediate as well as long term benefits to ongoing business processes and strategies.

 

  • More Focussed Targeting: A single view of a customer will offer you much greater insights as to their behaviours, interests and interactions with yourself; meaning your marketing teams can tailor their messages on a much more personal and focused level.
  • Better Decision Making: By doing away with duplicate records, consolidating records and making their access to authorised personnel easier, the single customer view (SCV) will allow your leadership teams to make decisions backed by real (and more importantly) accurate data, all accessible at the click of a (single) button.
  • Increased Customer Loyalty: It isn’t all about how you’ll benefit, however. The customer, receiving a much more personalised experience, will naturally feel more favourable and loyal towards your organisation and brand (which of course will have a feedback effect of benefiting you in the long term).
  • Increased Inter-Departmental Communication: Your disparate departments can’t work together and help each other if they’re not even aware of the various dialogues between customers, strategic partners and the organisation itself. Consolidating that all down into one SCV to allow for better, more ‘joined-up’ communication, massively boosts efficiency across an organisation.
  • Increased Data Quality: The other issue with siloed data is that the quality of the data may vary across systems, making certain aspects of it harder to rely on. Introducing a single customer view alongside something like Microsoft’s Common Data Model makes for much more trustworthy and consistent Business Intelligence.

A Single Customer View: What To Look Out For…

When building an SCV, it’s important that either yourself or your chosen partner (cough, cloudThing, cough) strike the right balance between customer insights, privacy and data security.

A good step to take in helping with this is to create various levels of access from administrator to editor right down to read only.

That will ensure the right stakeholders have the right levels of access to the various types of data available to your organisation on its partners and customers, all in line with any GDPR or data protection legislation that might come into play.
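To illustrate the access-levels idea (and only to illustrate it – the roles and operations below are made up, and a real solution would lean on the platform’s own security model), a deny-by-default role map can be as simple as this:

```python
# Illustrative role-to-permission map. The roles and operations are examples
# only; in practice you'd use the platform's built-in security roles.
ROLE_PERMISSIONS = {
    "administrator": {"read", "edit", "delete", "manage_users"},
    "editor": {"read", "edit"},
    "read_only": {"read"},
}

def can(role, operation):
    """Deny by default: anything not explicitly granted is refused."""
    return operation in ROLE_PERMISSIONS.get(role, set())

print(can("editor", "edit"))       # True
print(can("read_only", "delete"))  # False
```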

Whilst a single customer view is the ultimate goal, it’s worth noting that previously, certain sectors or organisations (think the NHS or Police Force) might have deliberately siloed away elements of an individual’s sensitive data for compliance reasons, so any solution needs to consider that and ensure only authorised parties can access certain types/levels of data (again – Microsoft’s Dataverse can automate this for you depending on an individual’s level of access).

What Does A Good Single Customer View Solution Look Like?

Before you even start to scope out the solution, let alone start to build it, it’s important that you define what success will look like.

Why does your organisation need a single customer view, what will it be used for and how will success be defined?

 

In cloudThing’s eyes a successful SCV (or any solution really) should:

 

  • Deliver immediate value to your organisation.
  • Be reusable and replicable.
  • Be scalable.
  • More specifically it should provide an efficient data capture system (preferably compatible with the Common Data Model or the Dataverse) that’s capable of collecting all customer communications and data across all of their touch points with the organisation, in a way that is both secure and compliant with all relevant legislation.
  • All existing data should be capable of being migrated into the new system securely, efficiently and effectively.
  • All stakeholders should be trained in the use of the new system before it goes live to ensure there’s no interruption to standard business processes.
  • It should be secure, with no one being able to get to things they shouldn’t.
  • It should be resilient… no one wants an all-singing, all-dancing solution that’s going down all the time!
  • It needs to be observable, with someone capable of seeing, and being in charge of, the data that goes in and out of it.
  • And finally, any solution needs to avoid vendor lock-in as much as possible (another reason cloudThing is such a fan of Microsoft’s Dataverse… data from there can literally be exported in just five clicks).

5 Easy Steps To Building A Unified Or Single Customer View

Ok.

So, we now know what a single customer view is, why it’s important, the benefits it can bring to an organisation, what you should look out for when building one and what success looks like, but…

 

How do you actually go about implementing it?

 

The first phase to go through is Consulting.

You need to engage with all the stakeholders within the project to understand how data is currently stored, how it should be stored and what it will be used for going forwards (irrespective of the single customer view or not) and, as we’ve already said, define what success will look like so everyone knows what the end goal is.

 

The next stage is Assessment.

The assessment stage, as it sounds, comes in several steps, most of which can be carried out concurrently by different teams.

 

  • You need to investigate all the data assets your organisation has and decide which will be pulled into the single customer view solution, whilst removing duplicate records and cleaning others.
  • All existing data needs to be assessed for accuracy and relevance, with any possible inaccuracies being documented along with any future impact they might have on Business Intelligence (so this can be taken into account after the solution is finished) – there’s a rough sketch of this kind of check after this list.
  • Although cloudThing always advocate for Microsoft’s Common Data Model, your organisation needs to decide what technology it will use to clean, transform and unify all your disparate data sources into the new solution.
  • You’ll obviously, in conjunction with all key stakeholders, need to decide on the platform your solution will be built on (cough, PowerPlatform’s Dataverse… with its built-in CDM, cough!)
  • You’ll also need to research and confirm any licensing issues the new platform and its users might require.
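Purely to illustrate the kind of accuracy check mentioned in the second bullet above (the sample records, fields and rules are all invented), a simple data-quality pass that flags issues for documentation might look like this:

```python
import re

# Invented sample records and rules - just to show the shape of a simple
# data-quality pass whose findings can be documented for later.
records = [
    {"id": 1, "name": "Jane Doe", "email": "jane.doe@example.com"},
    {"id": 2, "name": "", "email": "not-an-email"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def assess(record):
    """Return a list of data-quality issues found on a single record."""
    issues = []
    if not record.get("name"):
        issues.append("missing name")
    if not EMAIL_RE.match(record.get("email", "")):
        issues.append("invalid email")
    return issues

report = {r["id"]: issues for r in records if (issues := assess(r))}
print(report)  # {2: ['missing name', 'invalid email']}
```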

 

After the assessment stage you’ll need to work on your Platform Integration:

 

  • Assign workflow patterns and responsibilities for who needs to deliver what.
  • Create any integrations that will be necessary to connect your current systems to the new SCV.
  • Enable all the key resilience features the solution will need such as auditing, alerting, caching and security etc.

 

Once all of that is accomplished you can enter the Build/Migration Phase:

 

  • Migrate all of the data across through the integrations you’ve previously built, focusing on the various business capabilities sequentially, prioritised by business needs and the technology available.

 

Following on from the build (the job’s never done!) you’ll enter the BAU or Business as Usual phase.

This is where you’ll monitor and refine your SCV through a process of continuous improvement (an ethos that’s always been close to cloudThing’s heart) to allow it to get better with time/adapt to new or changing needs.

Best Practice For Creating Cloud Flows With Microsoft Power Automate

Written by – Mike Chappell – cloudThing, Principal Solutions Architect – With feedback from Sushil Kudav, Benedikt Bergmann, Éric Sauvé and Matt Collins-Jones

cloudThing’s Principal Solutions Architect lays out the best practices for creating cloud flows when using Power Automate from the Microsoft Stack

 

Power Automate from the Microsoft Stack is a fantastic tool whether you want your automations to be triggered automatically, instantly or via a schedule.

Types Of Cloud Flow

  • Automated flows – An automated cloud flow is triggered by events such as an incoming email from a previously specified individual or perhaps a mention of your organisation on social media.
  • Instant Flows – Instant cloud flows are there for repetitive tasks from either desktop or mobile devices. As an example, they’re capable of sending reminders to your different teams with the push of a single button on your phone.
  • Scheduled Flows – Scheduled Cloud Flows are for recurring automations such as a daily data upload to SharePoint or Dataverse… or in my case, filling in my timesheets!

A Step By Step Guide To Creating Cloud Flows With Power Automate

  • Connection References – Still in Preview at the time of writing, and the ALM story will surely smooth out soon, but meanwhile you still need to make sure you are documenting exactly what they are, how they are connecting (via the user, via a service principal etc) and how they are re-used. You might be limited in the scale of reuse during Preview, but having 3 different connection references to the same Dataverse table in a single flow? You’re creating work for yourself!
  • Secured Inputs & Outputs – It’s absolutely fine to leave this step till after the debug stage but you definitely need to put some thought into what data is visible to people allowed to view your Flow’s run history. Hit the ellipsis in the top right of your action, select Settings and toggle on Secure Inputs and Secure Outputs. Your organisation likely has its own rules on what is considered sensitive but if you’re connecting to Key Vault or working with HR data, I’d be flicking those toggles!
  • Actions Should Have Meaningful Names – When you’re deep in the loops it’s far too easy to lose track of context! Assigning memorable and meaningful names will pay dividends later down the line. You may already have naming policy elsewhere where you work so agree a convention between you and stick to it to help each other out. I’m a fan of descriptive Pascal Case myself.
  • Consider Concurrency In Loops – For an immediate performance boost, head back to Settings and enable concurrency on your loops. Remember it does not tend to play well if you are using variables in there, but I find it’s sometimes beneficial to engineer away from a variable if it means I can knock 80% off the run time!
  • Don’t Nest Too Deep – I know how hard it is to avoid sometimes but try to avoid nesting too deep. PA itself will impose a limit (8 at time of writing) but your bleeding eyes 10 weeks later when you’re debugging probably have a lower threshold!
  • Error Handling – Do you know what will happen if a step fails? Have you captured said failure? Use Configure Run After to have different follow-on actions depending on the success or failure of key Actions. Stephen Siciliano wrote a great blog all about this.
  • Optimise & Look For Efficiencies – It probably goes without saying but you should always aim for the absolute minimum number of calls needed to do the job, especially where you are calling potentially slow or expensive APIs. A good example of this is querying Environment Variables: should that be six calls for the six variables, or could you make one call with a well thought out Dataverse filter and an array? (See the sketch after this list.)
  • Variables Should Have Meaningful Names – I’ve already mentioned that your steps should have meaningful names but the same also goes for your variables. A quick tip here, if you are using PowerApps as your trigger, put “ask in PowerApps” dynamics expressions into an Initialise Variable action and give both the action and variable a good name to reap a much more meaningful prompt to your users over in their Canvas Apps.
  • Cloud Flows Should Be In Solutions – It’s getting better all the time but it can still be confusing to try and find specific flows or work out who can do what with them. Not only do Solutions help you move your creations around but they also help you organise your work into logical features or story collections.
  • Always Think Scale – Sure your cloud flow works when there are only ten records in your table but what happens when you’re dealing with 10 million? This is where Filters and Tops on Lists, timeouts (and concurrency) have to come into play.
  • Keep It Simple Stupid – It’s an old adage but a good one. If you’re seven nests in, then maybe it’s time to start thinking about breaking your cloud flow out into smaller, more discrete functions.
  • Comments – Don’t forget to comment your actions. Like many useful features, Add a Comment is underneath that ellipsis in the top right.
  • Consider Performance / Readability – It’s a fine line: adding 10 strings together in a loop might be easy to understand, but that might take 6 seconds, whereas an expression utilising join is done in milliseconds. If you have a user waiting for the outcome of the Flow, go for speed, and comment and document well.
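To illustrate the ‘one well-filtered call instead of six’ point from the Optimise bullet above, here’s a rough Python sketch against the Dataverse Web API. The environment URL and token are placeholders and the schema names are invented, so check the entity and column names against your own environment; the point is simply that a single filtered request returns an array you can index locally.

```python
import requests

# Placeholders - swap in your environment URL and a real OAuth bearer token.
ORG_URL = "https://yourorg.crm11.dynamics.com"
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

# One request for several environment variables, instead of one call each.
wanted = ["new_ApiBaseUrl", "new_BatchSize", "new_RetryLimit"]  # example schema names
filter_clause = " or ".join(f"schemaname eq '{name}'" for name in wanted)

response = requests.get(
    f"{ORG_URL}/api/data/v9.2/environmentvariabledefinitions",
    headers=HEADERS,
    params={"$select": "schemaname,defaultvalue", "$filter": filter_clause},
    timeout=30,
)
response.raise_for_status()

# Index the returned array locally so later steps do lookups with no further calls.
variables = {row["schemaname"]: row.get("defaultvalue") for row in response.json()["value"]}
print(variables)
```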

Legacy Estate Reduction… Or When To Get Rid Of Old Tech

Is your department suffering under the yoke of outdated systems & technologies?

 

What Is Legacy Technology?

The negative effects legacy technology has on an organisation are felt in all sectors but are a particularly acute problem for those with high levels of bureaucracy, like the Public Sector or Central Government.

The technology aspect of the phrase refers to an organisation’s or department’s IT infrastructure, systems, hardware, software and all their interrelated business processes, and it becomes ‘legacy’ when it:

 

  • Reaches end-of-life
  • Is no longer supported by the original supplier/creator
  • Is no longer possible to update
  • Just isn’t cost effective anymore
  • Is too risky to use, either due to new cyber threats or internal policy changes that the tech can’t keep up with.

 

Why Is Legacy Estate Technology Such An Issue?

That’s a fair question. After all, if legacy tech is such a problem then, what’s stopping organisations or departments from just upgrading it, getting rid of it or just buying better systems to replace it?

Most organisations will (or should) have a policy regarding what to do about legacy technology (although many, many more don’t, choosing to bury their heads in the sand, hoping nothing goes wrong on their watch). If a policy does exist, then within that it should define what’s to be done about legacy technology, such as:

 

  • Retain (or in other words, don’t do anything about it)
  • Retire (or just get rid of it)
  • Re-Host (lift and shift it)
  • Repurchase (buy a better version)
  • Re-Platform (lift and reshape it)

 

The path that’s eventually decided on will likely depend on several factors, most likely based on the size of the organisation, the size of the problem and its complexity (alongside budgetary constraints of course).

That’s what should happen when an attempt is made to reduce legacy estate technology but it doesn’t answer the question as to why so few attempts are made in the first place…

 

Older technology – If you’ve been using the same tech for the last twenty-plus years (believe us, it happens) then, as it nears its end of life, you’ll likely find it growing more and more temperamental, unstable and in need of repair. The problem with that, however, is that due to its age the parts or expertise to repair it might not be easily available, making repairs cost prohibitive.

 

Service Interruption – No piece of tech or system works independently of everything else. This becomes even more true if something has been embedded in your processes for a long time; in fact, you might not even have a clear idea anymore of what exactly it does interact with or what other systems are reliant on it.

That means taking everything off-line to either upgrade or replace it could have multiple unforeseen consequences for both your staff and users… which makes many people freeze with indecision and try to maintain the ‘status quo’ for as long as they can instead; the net result being that the situation actually worsens over time.

 

Data Formats – Different systems will store and retain data differently, in different schemas or formats.

That being the case, a lot of organisations feel ‘stuck’ with their existing system.

They may want to move/upgrade but can’t as all their data is in the wrong format and can’t be migrated to the new system.

Microsoft are trying to do something about this using the Common Data Model but ultimately if you’ve a lot of data on a legacy system you may well need a 3rd party Transformation Partner to help you migrate it all.
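As a very rough illustration of the format problem (the legacy export and the target shape below are invented for the example), a migration usually boils down to an explicit mapping step like this one, translating each legacy record into whatever common model the new system expects before it’s loaded.

```python
# Invented legacy export and target shape - the point is the explicit
# field-by-field mapping needed before data can move to a new system.
legacy_rows = [
    {"CUST_NM": "Jane Doe", "TEL_NO": "0121 000 0000", "EMAIL_ADDR": "jane@example.com"},
]

FIELD_MAP = {
    "CUST_NM": "fullname",
    "TEL_NO": "telephone1",
    "EMAIL_ADDR": "emailaddress1",
}

def to_common_model(row):
    """Translate one legacy record into the target (common model) shape."""
    return {new: row.get(old) for old, new in FIELD_MAP.items()}

migrated = [to_common_model(row) for row in legacy_rows]
print(migrated)
# [{'fullname': 'Jane Doe', 'telephone1': '0121 000 0000', 'emailaddress1': 'jane@example.com'}]
```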

 

Technical Debt – The older a system, the more technical debt it will have accrued over its life cycle. As with our first point, the older a system, or the more ‘legacy’ it is, the more expensive and time consuming it will be to pay this debt back, especially if there’s a lack of the right tools or skill sets in your IT teams.

 

Security Concerns – Migrating all your data and systems to a new environment, if you’re not completely sure what you’re doing, can leave you open to unique migration security concerns. The extra security precautions needed often puts many off even attempting it in the first place.

 

Accurate Documentation – Here’s the thing with legacy technology. It’s legacy.

That means the person who originally installed it has likely moved on or even retired. Those who were first trained on it and know exactly how it works may have moved on or retired.

The same goes for all the upgrades and improvements it’s had over the years but… If you don’t know exactly what it does and how it runs before starting a migration then things can go very wrong, very quickly.

The added expense of documenting a legacy system, with all its functions and where it intersects with other processes within an organisation, is often the biggest factor in why people attempt to keep it rather than improve it.

 

It doesn’t just end there though. Those might be the technical reasons why organisations avoid reducing their legacy technology, but there’s also, unfortunately, a whole host of organisational (non-technical) reasons as well.

 

Organisational Culture – “But we’ve always done it this way”. We’re sure you’ve heard this before, especially in organisations with a low turnover of staff who’ve been using the same systems the entire time, but that kind of intransigent attitude can prove a real blocker to true digital transformation.

Getting rid of legacy technology isn’t just about getting rid of/upgrading the tech.

You also need to bring the users along on the journey or ultimately, you’ll fail even with the introduction of the most bleeding edge technology available.

 

Commercial Pushback – Just because you know upgrading a legacy system is the right thing to do doesn’t mean the organisation can afford it.

As much as we’re proponents of digital transformation, that transformation needs to work for an organisation, at a pace it’s comfortable with and can afford and that fits in with its current priorities.

 

Outdated Policies & Procedures – Legacy tech isn’t the only thing that needs to be transformed, legacy procedures do as well.

Failure to address the second issue in fact will often prove a blocker to the first.

Outdated policies (for example, a requirement that the IT team keep a hard copy of all files) can really limit the types of systems they’re able to consider.

That’s why, when starting a digital transformation project, it’s important to look at both.

 

Lack of Resource – A lot of organisations attempt digital transformation projects in-house (and there’s nothing wrong with this… although a 2018 report from Global Management Consultancy McKinsey said 86% of Digital Transformation projects ultimately fail).

However, it may be that your in-house IT team just doesn’t have the resource, isn’t big enough or doesn’t have the skill sets needed to migrate some legacy technology (as previously mentioned, dependent on its age, the people who originally installed it have likely moved on).

 

Multiple Suppliers – Without the help of an experienced Solution Architect and Project Manager to coordinate multiple suppliers and 3rd party partners, there can be delays in accessing the data, systems, policies or hardware needed to migrate your legacy tech.

That’s why it’s so important someone is designated the role of coordinator from the start, to oversee all the changes.

 

Contractual Agreements – Technology gets out of date, or reaches the point where it isn’t fit for purpose, really quickly these days, especially in very volatile or dynamic sectors that require a much higher degree of flexibility, but… vendor lock-in can be a real problem when it comes to trying to do something about legacy technology.

 

Risk Aversion – Some organisations are just a lot more averse to risk (or what they may wrongly see as a risk) than others.

It could be they still think the cloud is ‘too new/risky’ and decide to stay on-prem, or it could be that all the above-mentioned factors mean they decide to ‘put up with’ the faults in their current legacy system rather than making the attempt to do anything about it, instead hoping the problem eventually goes away on its own (it won’t!).

How Do You Know When It’s Time To Migrate Your Legacy Tech?

As we’ve already mentioned, there are so many different factors involved in migrating legacy technology that it can be almost impossible to know exactly when to push the button on a digital transformation.

There are however some very good indicators of when it’s time…

 

  • Maintaining your current, legacy systems becomes more expensive than replacing them outright
  • Reduced efficiency is having too big an impact on the organisation or department
  • The technology is so old it’s no longer supported, either by the original creator or by 3rd parties
  • The contract or lease you use the technology under is about to expire
  • Maintaining and patching the existing technology becomes too much of a security risk to the organisation.

Getting Your Legacy Systems Ready For Migration

Continuous Improvement – If you’re stuck with a legacy system that you can’t replace wholesale, or are getting ready to do so, then employing a principle of Continuous Improvement will pay dividends in the long run (and in the short term, to be honest).

Not only will this help with any future migrations, but it’ll help prevent the accumulation of legacy technology going forward.

You’ll find the main benefits of continuous improvement to your organisation will be:

 

  • The gradual (and easily managed) retirement of technology. Taking this approach is obviously much cheaper than having to do everything all at once, at the last minute, in a rush.
  • It reduces the security risks to both your systems and your organisation’s infrastructure.
  • It reduces the risk of all future migrations by keeping all tech and operating systems on their most up-to-date versions.

 

However, just as there are many benefits to a continuous improvement approach for legacy estate reduction, there are things you’d need to look out for too, such as:

 

  • Extra time, budget and resources may need to be allocated if you’re adding new functionality to existing systems, to allow for the things that need updating/replacing regularly.
  • All changes must be documented thoroughly so that going forward what has and hasn’t been updated is understood, both by current employees and by anyone who may be involved with the organisation in the future.
  • Reducing legacy estate technology isn’t just about moving forward though. If you’re using a process of continual improvement to migrate only particular aspects of the tech at a time, then it’s vital all new tech you install is also backwards compatible with the legacy technology that’s left, or else everything will come to a crashing halt very quickly.
  • The name of the game with a true digital transformation should be replacing/building your new infrastructure on an almost Lego-like model, with components that can easily be swapped out or upgraded in the future.

 

Document All Your Data Assets – Both Legacy & New – Having a complete (and more importantly accurate) record of all your organisation’s data assets is vital, as the real value to your organisation isn’t the system itself but the data stored within it.

As already mentioned, understanding that data and the format(s) it’s in is a big deciding factor in any successful legacy migration.

Your data asset register should include:

 

  • The type(s) of data you hold
  • Where it’s stored
  • How it’s been secured
  • How it should be handled

 

It should be someone within the organisation’s responsibility (preferably someone senior) to check that asset register regularly and keep it up to date with any ongoing changes.

The nice thing about these checks, with continuous improvement in mind, is that they’re great at highlighting when technology is running the risk of becoming legacy and needs updating/replacing.
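
As a rough illustration of what that register and those regular checks might look like in practice, here’s a minimal sketch in Python. The fields mirror the list above; the asset names, review window and example entries are all placeholders.

```python
# Minimal sketch (hypothetical structure): a simple data asset register and a
# check that flags entries that haven't been reviewed recently - one way the
# regular checks described above can surface tech at risk of going legacy.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DataAsset:
    name: str            # the type(s) of data held
    stored_in: str       # where it's stored
    security: str        # how it's been secured
    handling: str        # how it should be handled
    last_reviewed: date  # when someone senior last checked this entry

def overdue_for_review(register: list[DataAsset], max_age_days: int = 180) -> list[DataAsset]:
    cutoff = date.today() - timedelta(days=max_age_days)
    return [asset for asset in register if asset.last_reviewed < cutoff]

register = [
    DataAsset("Member contact details", "Legacy CRM (on-prem)", "AD group access", "GDPR - personal data", date(2023, 1, 10)),
    DataAsset("Exam results", "Azure SQL", "Encrypted at rest, RBAC", "Confidential", date.today()),
]
for asset in overdue_for_review(register):
    print(f"Review overdue: {asset.name} ({asset.stored_in})")
```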

 

Understand Your Infrastructure – Documenting all your data assets and processes, though, is unfortunately only half the battle.

It’s also really important someone at your organisation is intimately familiar with the practicalities of all your systems, processes, policies, infrastructure and what they’re all capable of.

If this step is skipped, then documenting your data assets may fail, as there’s no backup if something’s missed. It’s important someone within your organisation (preferably multiple people) can do a manual sense check.

A Systems Architect is perfect for this role and will be more than used to deciphering the capabilities of older systems that have evolved over time.

Regular IT infrastructure reviews will also help here and in fact, if you’ve instituted a program of Continuous Improvement, then these reviews can form part of your ongoing review cycles.

When conducting these reviews, you may find it helpful to consider…

 

  • Your organisation’s infrastructure – Does it still meet your and your users’ needs?
  • Performance management
  • Service availability
  • Security management
  • Capacity management
  • Your administration and business processes

 

Above All, Be Flexible – As trite as it may sound, having a responsive service model, capable of reacting instantly to changing situations in the market, as well as being able to adapt to new technologies, is vital.

As part of a legacy estate reduction initiative, it’s likely a lot, if not all, of your systems will be moved to the cloud, with various different suppliers facilitating that.

While that transformation is happening is the perfect time to become more flexible as an organisation by considering:

 

  • Is the current iteration of your service model meeting users’ needs?
  • Your IT team’s governance structures and processes; is their day to day configured around flexibility?
  • Do those governance structures align with the organisation’s goals?

 

And the hardest question to answer around that flexible model is this: if a point comes when your systems become legacy again, will you be able to tell whether it’s the technology that needs changing or your business processes?

What Is Fintech?

The key to understanding fintech is realising it’s not about crypto-currencies or blockchain security but about improving the User Experience for both the organisation and the client using it.

 

Fintech, or financial technology, is an increasingly common phrase being used to describe any new or innovative ‘tech’ that automates, or somehow otherwise makes more efficient, the execution of financial services.

Broken down to its most simple elements, fintech is designed to help organisations (and their clients) make their financial operations more efficient and secure through the use of specialised software, AI, machine learning or other advanced tech.

 

When the phrase fintech was first coined, it tended to only be applied to ‘financial technology’ involved in the back-end systems of financial institutions, but its scope has widened in recent years to also include more client-based solutions (be it customers of a bank or the member accountants of a membership organisation).

Nowadays fintech spans multiple different sectors, industries and occupations, from central Governments or retail banking right through to Nonprofit organisations or the education sector.

It’s also come to represent more advanced financial technologies such as crypto currencies.

Making Sense Of Fintech

Any new technology when it’s first introduced will seem daunting, especially if proponents of fintech start waxing lyrical about Bitcoin and blockchain security, but ultimately it’s no more complicated than improving the user experience of both an organisation and their clients.

 

Since the advent of the internet, financial services have been getting more complicated. First it was online payments and now it’s currencies not backed by any nation state (Bitcoin). Fintech covers all of that as well as the ‘simpler’ services such as automation software, hyper-automation, CRM systems, money transfers and stock trading technologies.

Many of those are needed or are available in other sectors as well but when falling under the umbrella of ‘fintech’ the financial aspect of the tech, and all the complexities and security concerns that go with that, are kept firmly to the forefront.

 

Fintech has also folded into itself a lot of new technologies that may be found elsewhere. Things like artificial intelligence, machine learning, predictive science etc. Anything in fact that can take the guesswork out of making a financial decision.

How much better to rely on an advanced AI to decide if a new small business loan should be offered rather than the decision being left to an agent sat across the desk from a nervous prospect explaining why their business will be the next big thing.

You’ll also find organisations that are early adopters of fintech using a lot of advanced customer service technology like chatbots with AI customer interfaces to help out clients with repetitive tasks whilst keeping their staffing costs down.

 

Money transfers, smartphone transactions, Apple Pay, crypto currencies, online banking, online credit applications, automated credit approval/rejections and remote management of investments are all types of fintech available in the modern world, and if that list makes you want to scream ‘no more’ then fear not, there are plenty of Transformation Partners out there who have specialised in fintech and will be happy to help (cough, cloudThing, cough).

Real World Applications Of Fintech

So as we’ve already mentioned, fintech has become rather ubiquitous in the modern world.

It’s there as both a threat and an added level of security to established organisations; the security of keeping them relevant in a constantly shifting market and the threat of shaking up entrenched or obsolete financial processes that need replacing with more dynamic and efficient functions.

 

The uses to which fintech could be put are only limited by your and your transformation partner’s imagination, but some examples might include:

 

  • A retail store offering short term loans with immediate credit approval for large items, bypassing the credit card companies
  • Automated mortgage application and approvals using AI, cutting out the need for manual approval by staff
  • The IMF even recently suggested the possibility of offering loans to individuals in the developing world not based on their credit history, but based on a deep analysis of the shopping data stored on their smart phones

 

Basically, any aspect of the finance sector… or that even comes close to touching on ‘something’ to do with finance, is a good candidate for a fintech transformation, whether it be something that’s always driven clients insane that could be improved upon or a back-end system that’s still done manually because ‘that’s how we’ve always done it’.

Who’s Currently Using Fintech?

There are currently four (very) broad categories of people that have been early adopters of fintech.

 

  • Financial institutions for their backend processes
  • Financial institutions for their client engagements
  • Small businesses and start ups for their backend processes
  • Millennials

 

Fintech has made possible mobile banking, better financial decision making, better business intelligence analytics and the decentralisation of the entire financial sector, all whilst still in its infancy, which easily explains why so many organisations have embraced it. Then, on the other end of the scale, is its widespread use by the general public.

It’s an oft overused cliché but the younger someone is, the more likely they are to be comfortable using fintech.

Almost all consumer targeted fintech has been developed for millennials for a reason and as that demographic gets older (and starts to earn more) the investment in fintech will only increase.

However, it’s too easy to say older generations aren’t interested in fintech because they can’t or won’t understand it. It could just be fintech hasn’t been deployed yet to solve the types of problems older generations might be dealing with.

Before fintech a small business owner might have had to (and possibly still does) go to their bank manager, cap in hand, for an injection of capital. If they wanted to offer credit to a supplier, they’d likely prefer a personal relationship with them.

Whilst fintech hasn’t completely replaced those interactions, as it becomes more widespread, more credence will be given to AI driven business intelligence and less to ‘gut feelings’ about someone.

An unexpected but welcome benefit of that being a reduction in gender/ethnic/religious bias in financial decision making processes.

The Future Of Fintech

The status quo financial institutions have enjoyed up until now has always involved a high degree of centralisation, each organisation offering a wide range of financial services under the aegis of one brand, from mortgage lending, right through to financial trading on Wall Street.

The future of fintech will completely disrupt that business model, allowing organisations the flexibility of unpacking previous offerings and parcelling them out individually themselves, effectively cutting out the middleman.

As previously mentioned, if a retail store can offer a microloan on the spot, independent of a bank or a credit card company, it makes the entire transaction more streamlined and efficient, saving time for the customer (with an instant decision in store) and avoiding financial fees the store may have previously paid to the bank.

In the coming years (and to a certain extent it’s already happening) fintech will appropriate more and more of the traditional financial control away from bank tellers, bank managers, brokers, traders and salesmen by automating their decision making whilst enabling said processes on mobile devices using hyperautomation and AI tools for instant accessibility.

 

What’s interesting about the future of fintech however is who will be in control of it.

 

Previously, with the advent of new technologies, young start-ups (think Uber or Airbnb) have completely outclassed entrenched competitors, who’ve had too much organisational debt to compete effectively.

This time around though the banking institutions have paid attention and have been investing heavily in fintech themselves… Apple Pay or AI automated decision making on loans being two perfect examples, both now available on an individual’s mobile device.

 

Competing with fintech-inspired start-ups offering a new and truly innovative experience though will likely require a lot more than a simple cash injection.

Making the most out of the possibilities created will require imagination, meaningful changes in thinking and the ability to completely pivot an organisation’s direction in line with sector trends.

How To Get Better At: Online Continuous Professional Development (CPD)

CPD doesn’t have to involve expensive locations, trainers and time away from the office anymore

 

Online CPD (Continuous Professional Development) is becoming one of the (and in fact, during 2020, was probably ‘the’) most popular delivery methods for tutors.

Certainly it’s become the go-to standard for continuous professional development training, both remote and not.

Online learning isn’t a new concept by any means but over the years the technology behind eLearning has continued to improve to the point where its critics have mostly fallen silent and it’s become the de facto ‘go to’ for most membership organisations running CPD courses.

Busy professionals with limited time available, many of whom are mandated to carry out CPD courses throughout their career, are looking for the flexibility that online CPD offers, whilst organisations running the courses want to benefit from the efficiencies and cost savings that it makes possible.

What is Continuous Professional Development?

For anyone reading this not sure what CPD stands for, it’s Continuous Professional Development, but much more importantly… what is it?

CPD is more than a set of initials or a clever acronym. It’s a principle, or a commitment, either by yourself or an employer, to your ongoing career and future professional development.

 

Historically CPD training would take the form of seminars, lectures, classes, talks or self-study. These days however it’s much more likely to be a webinar or online training course.

Either way, CPD is there to train, update and expand on either a key or generic set of skills a professional needs for their day to day role.

It could be practical skills or a more theoretical refresh of existing skills and may or may not be mandated as part of their career path.

It’s how individuals in ever advancing fields (like medicine perhaps) can stay relevant; keeping up to date with the latest advances but it’s also a great way for others to progress in their careers, learning new skills as they work.

Why is Online CPD Needed?

Long before the world’s current woes with COVID-19, online CPD training courses had become the preferred method for providers and professionals alike.

For providers there are massive cost benefits involved whilst the professionals taking the CPD benefit from the convenience of being able to take the course or exam wherever they like.

 

If an organisation requires their staff to carry out CPD then realistically they have two choices. They can either have it carried out in-house or contract it out to external agencies.

In-house means they’ll need permanent members of staff on the payroll that can conduct this training, either as their main function or as an extra to their day-to-day duties.

Having full time members of staff to conduct this training means a big increase in staffing costs whilst assigning it to someone else, likely in your HR department if you don’t have a full time trainer, means pulling people from their regular tasks.

Outsourcing it however can be even more expensive, with agency fees, staff time lost from travelling to the courses, up front exam costs etc.

On the other side of the aisle, the providers that conduct CPD training have staffing costs for tutors to pay themselves, the hiring out of venues, travel expenses and dozens of other ancillary costs that quickly all mount up.

There are pros and cons to all these situations but the point is, having the ability to do many of these things virtually or remotely will mitigate a lot, if not all, of them.

Types of CPD

When discussing CPD training, either online or in-person, it’s worth noting that there are two very distinct types… Mandated CPD and what we’ll refer to as ‘General Improvement’.

MANDATED CONTINUOUS PROFESSIONAL DEVELOPMENT

As already mentioned, many professions require ongoing mandated training to make sure individuals are kept up to date with the latest advances in their chosen field. It’s expected of almost all professionals considered to be working in a ‘professional sector’.

The CPD (and resulting certification) is usually governed by a sector specific regulatory body who’ll define the scope and requirements of the CPD (as opposed to an employer).

These bodies could represent sectors as diverse as healthcare, the law or accountancy right through to architecture or journalism.

Anyone in a regulated sector will most likely need to track and progress their own CPD requirements to maintain either their licence to practise or their professional qualifications (or both). An individual’s employer may help with these, supporting them with paid time off to complete the training (or even paying for the CPD themselves), but this isn’t always the case; within these sectors, the onus is very much on the individual to source and maintain their own CPD requirements.

The regulatory bodies in charge of these qualifications will be responsible for maintaining their industry’s reputation with the public so will take their duties very seriously, making sure their members are fully up to date and taking measures to ensure fairness in the process such as paying for exam proctoring to prevent any hint of foul play.

GENERAL IMPROVEMENT CONTINUOUS PROFESSIONAL DEVELOPMENT

The other type of CPD is perhaps a lot more common but often goes by a variety of alternate names. This is the kind that tends to be run by employers rather than regulatory bodies and is designed to help improve their staff’s knowledge/training.

It’s not a ‘mandated’ part of their career path or required by law, instead the employer does it to keep their staff informed around issues or events that may affect the sector, the company as a whole or their job role specifically.

This type of training could be anything from a refresher course in a warehouse on how to correctly lift to how to prevent phishing scams in an office setting.

What Does Good Online CPD Look Like?

So the world’s heading towards online CPD and we’re all agreed that’s a good thing, but what does/will ‘good’ online CPD look like?

It’s a big question, with no ‘right’ answer… except cloudThing are going to answer it anyway (of course).

 

The first and most important thing is to ensure that any CPD training that’s conducted, whether it be mandated or just general improvement, be part of a structured, long term plan that improves the individual’s knowledge and empowers them to excel in their role rather than a tick box exercise.

Research conducted on the effectiveness of CPD training consistently shows that it’s most effective when it’s sustained, consistent and everyone involved understands its benefits.

CPD is only really at its most effective when it’s fully relevant to the individual.

If you’re planning on running CPD sessions for large groups, then it’s worth considering breaking them up into smaller units with tailored content that best addresses their interests and/or skill levels.

 

The first step is in considering how you’ll conduct the CPD online.

Virtual lessons are perhaps the quickest route, putting several students on a Microsoft Teams call in front of a teacher. This approach is basically a replication of the classroom environment but with the added benefit of remote access.

This makes everything a lot more convenient for everyone involved but adds little to the experience (with many arguing it may even detract from it), although it does mean that most course materials, such as textbooks or handouts, can be offered to the trainee free online, which is a huge benefit to individuals on a budget or to organisations paying for their staff.

So how can technology be used to make online CPD ‘better’ or more efficient, rather than just enabling remote access?

 

The name of the game here is automation!

 

Many of the processes around CPD can be automated, with new improvements being added all the time.

 

Record Your Sessions: Probably the quickest ‘win’ to save time/resources/costs is to record your training sessions. If it’s a subject that gets repeated a lot, rather than conducting a new session every time you can record it once and let people watch it as and when needed.

This type of action is most useful for low level training for subjects like ‘how to correctly lift’ etc that won’t require a lot of follow up questions and can be shown as and when required.

For more in-depth subjects these videos could be used as an additional follow up resource after a ‘live’ session.

 

CPD Reminders: Whether a CPD course is mandated or just part of an organisation’s regular training schedule, the ability to track who’s done what, what’s outstanding and what scores were achieved, with automated reminders sent out, is an invaluable time saver.
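
A minimal sketch of that reminder logic might look something like the Python below. The record format, addresses and courses are invented, and in a real rollout the ‘send’ step would be an email or Teams notification triggered by your automation platform rather than a print statement.

```python
# Minimal sketch (hypothetical record shape, addresses and course names):
# work out who still has CPD outstanding and build a reminder for each of them.
from datetime import date

cpd_records = [
    {"member": "a.jones@example.org", "course": "Safeguarding refresher", "due": date(2024, 6, 30), "completed": False},
    {"member": "b.khan@example.org",  "course": "Safeguarding refresher", "due": date(2024, 6, 30), "completed": True},
]

def build_reminders(records: list[dict]) -> list[str]:
    """Return a reminder message for every member whose CPD isn't marked complete."""
    return [
        f"Reminder to {r['member']}: '{r['course']}' is due by {r['due']:%d %b %Y}"
        for r in records
        if not r["completed"]
    ]

for message in build_reminders(cpd_records):
    print(message)  # in a real rollout, swap this for an email or Teams webhook call
```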

 

Competency Gap Analysis: Moving past the more obvious, a great way of using tech to improve the online CPD experience is through competency gap analyses.

Right off the bat you can have people complete an in-depth self-analysis questionnaire to find out which areas of training they might benefit the most from. The self-analysis can be made as contextual as you like, assigning ‘scores’ and setting goals for each individual, which will then generate automated learning paths for them with links to suggested training materials such as blogs, YouTube videos, peer-reviewed platforms, previously recorded sessions etc. With access to the learner’s and trainer’s calendars it can even schedule in automatic 1-2-1 sessions to review what progress has/hasn’t been made.
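
To illustrate the idea (and only the idea), here’s a minimal Python sketch of turning self-analysis scores into an automated learning path. The competencies, score threshold and suggested materials are all placeholders you’d swap for your own.

```python
# Minimal sketch (hypothetical scores and materials): turning self-analysis
# scores into an automated learning path, as described above.
LEARNING_MATERIALS = {
    "data_protection": ["Intro to GDPR (recorded session)", "Data handling blog series"],
    "excel":           ["Pivot tables video", "Power Query walkthrough"],
    "presenting":      ["Peer-reviewed presenting guide"],
}

def learning_path(self_scores: dict[str, int], target: int = 7) -> dict[str, list[str]]:
    """Suggest materials for every competency scored below the target level (1-10)."""
    return {
        skill: LEARNING_MATERIALS.get(skill, ["Ask trainer to source content"])
        for skill, score in self_scores.items()
        if score < target
    }

print(learning_path({"data_protection": 4, "excel": 8, "presenting": 6}))
# -> suggests content for data_protection and presenting; excel already meets the target
```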

 

Members Requests: An additional benefit to the above is that it’s very easy to run off a report showing which areas individuals struggle with most, so additional content can be created to fill that gap. The self-analysis reports can also be monitored to see which areas of training people most request, which can then be folded into an organisation’s ongoing CPD strategy.

Whichever of these approaches you take, CPD works best when:

  • It’s provided by people with the necessary experience, expertise and skills. These providers may sometimes be colleagues and peers; at other times they may be specialists from inside or outside the organisation.
  • It’s based on the best available evidence about how people learn, including current research, which consistently shows that learners do best when the staff supporting them are motivated, developed and kept up to date.
Benefits To Online CPD Training

As we’ve already touched upon, the advantages tech can bring to CPD training (not least by offering it online) are numerous.

Online CPD means both the trainer and trainee can see the results of tests in real time and respond to them accordingly with follow-up actions; grades and certificates can be accessed instantly; and the course can be optimised much more efficiently, with real-time feedback backed up by powerful analytics creating feedback loops that both improve the course and empower the individuals taking it.

PROS OF ONLINE CPD

  • It’s cost effective
  • It’s flexible
  • There’s less pressure on the individuals, who are able to go at their own pace
  • It’s more accessible for everyone, from shift workers unable to attend in person sessions through to the hearing impaired or those with mobility issues.
  • It allows membership organisations to target individuals outside of their normal catchment area… perhaps even expanding their audience to global scale.

CONS OF ONLINE CPD

  • As already mentioned, if pre-recorded material is being used, it can take time to get answers to specific questions/problems
  • Practical training such as CPR is much harder/impossible to deliver remotely
  • All participants, both trainers and trainees, need a comfortable working knowledge of computers and a strong internet connection.

Finishing up, online CPD is here to stay, and in the coming years those that improve it with AI and machine learning are going to become leaders in their industries.

Its benefits to employers, employees and membership organisations should be clear.

 

For membership organisations and employers CPD helps them share knowledge by keeping their members or staff informed with sector updates whilst ensuring the highest level of professional standards are maintained.

For the individuals taking the CPD, the benefit is to their careers or future aspirations.

Anyone looking to advance within their sector is hugely benefited by showing a commitment to improving their knowledge and skill sets or, put another way… CPD training is a great way to achieve a higher salary!

Virtual Exam Proctoring (Or How To Stop People Googling The Answers At Home)

Virtual Exam Proctoring isn’t just a side effect of 2020.

As Membership Organisations, professional bodies and universities attract more diverse, widespread membership candidates, virtual exam proctoring becomes essential for professional qualifications, continuing professional development and staying relevant in the marketplace.

What Is Virtual Exam Proctoring?

An exam that is proctored virtually is one that’s being supervised/monitored by an approved third party (the proctor) whose role is to confirm the identity of the individual(s) taking the test(s) and ensure the integrity of the test environment.

Virtual proctoring should allow for the same, or better, level of supervision that would occur if the test were being taken in person, just conducted through online monitoring software, allowing the student/candidate to complete the test in a location remote from the examining body and the proctor.

It’s standard practice in a virtually proctored exam for the student to confirm their identity through a webcam, whilst sharing their screen(s) with a proctor throughout the test to help identify any suspicious activity.

Concerns that have historically been raised around virtual exam proctoring, and that modern solutions are now answering, include a student’s privacy being violated and the potential for some candidates to circumvent the monitoring software in order to cheat.

Why is Virtual Exam Proctoring Needed?

Virtual exam proctoring, in the current climate, has unfortunately become a necessity.

Exams had to either go online or be postponed until the world comes out of its various COVID-19 related local lockdowns. Once an exam goes virtual though, there’s a responsibility on the exam organiser to ensure that the right person is taking the test and that they don’t cheat, either by taking extra time or by googling the answers/using resources they shouldn’t have access to.

 

There’s also a tremendous cost benefit to membership organisations and exam boards that can solve the problem of virtual exam proctoring, saving them the expense of various exam centres in proximity to their candidates as well as the cost of on-location proctors.

Professional exam proctors aren’t cheap to hire/employ and, when exams are conducted in person, it’s almost impossible to measure their effectiveness with any kind of meaningful KPI.

Types Of Virtual Exam Proctoring

There are three main types of virtual exam proctoring…

‘LIVE’ VIRTUAL PROCTORING

Live virtual proctoring has been around for years now and is probably the most common type currently in use but it’s also the least cost efficient.

In live proctored exams, the proctor will monitor the student’s video and audio whilst also screen sharing with them in real time. This ensures they take the exam themselves, don’t have anyone in the room helping them and can’t use other screens/browsers to look up the answers.

This enables the students to take the exam from remote locations with a proctor monitoring anywhere between 16 to 32 students at once.

Although this form of virtual exam proctoring does allow for remote exams to take place, it still requires an exam ‘schedule’ with just as much human involvement as though the exam were taking place non-virtually.

RECORDED EXAM PROCTORING

One method that’s becoming increasingly common is to record a student’s audio, video and screen as they take the test in real time. This means the exam can be taken at any time, subject to the convenience of the student.

A trained proctor will then watch the video/audio/screen back (typically at three to twenty times normal speed) looking for any suspicious behaviour or signs of cheating.

The fact exams can be taken at any time is a huge bonus to this method, but it’s still reliant on being watched back by a real proctor, keeping costs similar for the membership organisation, and the fact that the audio/video feeds are watched back at speed can allow a level of human error to creep in.

AUTOMATED OR AI PROCTORING

Using AI to create a virtual exam proctor is the most recent and advanced form of proctoring.

The audio/visual/screen(s) of the students are once again recorded but… instead of being watched live or as a speeded-up recording after the event, as in methods one and two, the AI integrated with the system will monitor for any suspicious activity using advanced audio/visual/screen analytics and flag the recording for manual review by a trained proctor.

The software can do many things… ensuring the student’s attention is on the test and not on an off-camera mobile device being used to look up answers; searching for suspicious noises/shapes in the background, such as a person reading out the answers; and even using facial recognition at the start of the test to authenticate the identity of the student.

Other security configurations that can be added to the software include limiting logins to specific IP addresses, the blocking of copy & paste functionality and/or freezing the screen of the student taking the test to stop them accessing other applications or browsers during the test.

This method allows membership organisations to do away with exam schedules and location constraints whilst also cutting down on the costs of professional proctors, allowing the solution to be scaled effectively.
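
As a purely illustrative example of one of those configurations, here’s a minimal Python sketch of an IP allowlist check of the kind a proctoring platform might run before letting a candidate start an exam. The network range is a placeholder, and real products handle this (along with copy & paste blocking and screen locking) inside their own clients.

```python
# Minimal sketch (illustrative only): check whether a candidate is connecting
# from an approved network before allowing an exam session to start.
import ipaddress

ALLOWED_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]  # placeholder approved range

def login_permitted(candidate_ip: str) -> bool:
    """Return True only if the candidate is connecting from an approved network."""
    ip = ipaddress.ip_address(candidate_ip)
    return any(ip in network for network in ALLOWED_NETWORKS)

print(login_permitted("203.0.113.42"))   # True  - inside the allowed range
print(login_permitted("198.51.100.7"))   # False - blocked, flag for review
```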

What’s Driving The Adoption Of Virtual Exam Proctoring?

As already mentioned, virtual exam proctoring has been around for years now, as the technology involved in methods one and two is relatively low tech and easy to adopt, as long as both parties (student and proctor) have suitable wi-fi speeds/connections. Due to the costs involved though, they’ve never exactly been scalable.

There’s little point swapping a physical solution for a virtual one if the benefits are limited/non-existent.

Automated or AI proctoring however represents a real stride forward in the tech at a time when the demand for e-learning is experiencing huge growth… and not just because of the coronavirus!

The modern student wants to be able to learn wherever they are, according to their schedule… and that attitude extends to their exams as well.

That shift in behaviour comes at a time when cost efficiency has never been more important so the ability to save costs on physical locations as well as physically (remotely or not) proctored exams is a massive enticement for invigilation boards.

What Should You Look Out For When Choosing A Virtual Exam Proctoring Tool?

The biggest thing to look for when evaluating virtual exam proctoring tools is ease of integration.

They’re not easy tools to build from scratch so most likely you’ll end up with an OotB (Out-of-the-Box) solution. The issue there, however, is that they’ll want you to use all their systems/assessment engines instead of building on your existing infrastructure and adapting it into the final solution to save you costs.

The ideal package is a layer of AI exam proctoring over your current systems that can be built on continuously, allowing you to implement it in the shortest amount of time.

The next point to look out for is robustness. With the need for video streaming, recording and screen sharing you can’t have the platform glitching, well… ever! Not only will this adversely affect your candidates, but the reputational harm to your organisation will also be incalculable.

Being able to plug into something like Power BI from the Microsoft Stack will also give you cutting edge analytics and reporting capabilities, allowing you to learn, adapt and advance with every exam taken by staying on top of the candidate (and proctoring) experience.

Who Is Using (Or Can Use) Virtual Exam Proctoring?

There are already a lot of organisations in the market making use of virtual exam proctoring, including professional membership organisations, assessment providers, online education providers (or, as of 2020, we can just say education providers) and certifying agencies.

How Easy Is It To Cheat During A Virtual Exam?

There are sources on the web that give students advice on how to cheat or trick a Virtually Proctored Exam but most of the ‘tips’ these sites give are ludicrously convoluted, requiring a lot of preparation to even stand a chance of working.

As time passes the AI behind virtual proctoring gets smarter and smarter, making even these techniques difficult to get away with and, even if a student does manage it in the short term… you’ll still have the physical recording of their audio/video/screen for a thorough review.

The vast majority of people will just fall in line and take the test honestly.

Candidate Privacy During Virtual Exam Proctoring

As we mentioned at the start of this article, one of the big historic concerns with virtual exam proctoring was that of a candidate’s privacy.

Due to the nature of virtual exam proctoring, there will always be privacy concerns, as candidates will most likely be completing the exams in the privacy of their own homes. The key is to understand these concerns and mitigate them as much as possible as part of the solution.

The first and easiest solution is to implement a principle of minimum access.

We assume all your proctors (virtual or not) will go through rigorous vetting processes anyway, but beyond that the system can be set up in such a way that only those who have been vetted can ever access the data.

There are likely regulations that will need to be conformed to around the length of time recordings can be (or have to be) held for but, again, the system can be created in such a way that only proctors or otherwise authorised staff can see the exam footage and that, after the required/legal time limits have elapsed, the data is automatically deleted.

What’s Next For Virtual Exam Proctoring?

As cliché as the term ‘the new normal’ has become, when it comes to virtual exam proctoring it really does apply.

As more and more professional organisations, universities or certification bodies adopt this tech, more and more students will come to expect it as the norm and anyone not offering it will be left by the wayside.

Software that’s currently being worked on includes more accurate ways to authenticate a student’s ID through the use of biometrics, via hardware like smartwatches or fitness monitors.

In fact, those same pieces of hardware could be used during the exam itself to measure changes in pulse or heartrates as red flags for cheating.

The other side to that is the ruling out of false positives.

Exams have always made people nervous but the problem with a nervous person is that they can look guilty… which is not what you want with a virtual proctor monitoring you!

This kind of technology can be used to overcome these issues. By monitoring things like pulse or heart rate, an AI could be easily taught to recognise the difference between someone cheating and someone who’s just nervous.

 

Once virtual exam proctoring becomes the new normal (and it will) and the technology has bedded in then several other possibilities present themselves as natural evolutions of the tech. One of these could well be Adaptive Examinations.

Just think about it.

You could well be one of the first organisations to offer candidates an adaptive exam experience. Instead of offering them a fixed set of questions, the exam itself could change depending on the candidate, using an adaptive algorithm that offers different approaches and questions based on their previous answers or even how they answered a question.

The questions could become much more subjective, making it that much harder for a candidate to google the answers (cheat, in other words), with the algorithm reviewing the answers using sentiment analysis.
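
To give a flavour of how simple the core of an adaptive algorithm can be, here’s a minimal Python sketch that steps the difficulty of the next question up or down based on the previous answer. The question bank is invented and real adaptive testing uses far richer statistical models.

```python
# Minimal sketch (invented question bank): pick the next question based on
# whether the previous answer was correct.
import random

QUESTION_BANK = {
    1: ["Define double-entry bookkeeping."],
    2: ["Explain how accruals differ from prepayments."],
    3: ["Evaluate the impact of IFRS 16 on a lessee's gearing ratio."],
}

def next_question(current_level: int, last_answer_correct: bool) -> tuple[int, str]:
    """Step difficulty up after a correct answer, down after an incorrect one."""
    level = min(max(current_level + (1 if last_answer_correct else -1), 1), max(QUESTION_BANK))
    return level, random.choice(QUESTION_BANK[level])

level, question = next_question(current_level=2, last_answer_correct=True)
print(level, question)  # moves up to a level-3 question
```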

How Hyperautomation’s Benefiting PAO’s (Professional Accountancy Organisations)

The adoption of hyperautomation by the wider accountancy sector has, in the last few years, resulted in a faster, more secure, more reliable and more efficient service, both in-house and for their clients.

It’s had to though as, in order to remain competitive in a saturated market, the globe’s top PAO’s have needed to provide the very best experiences to their customers, whilst also maximising efficiency and reducing costs wherever possible within their own firms, at the same time as maintaining or increasing their cybersecurity levels.

 

Automation uses technology to automate tasks that once required humans. Hyperautomation deals with the application of advanced technologies, including artificial intelligence (AI) and machine learning (ML), to increasingly automate processes and augment humans. Hyperautomation extends across a range of tools that can be automated, but also refers to the sophistication of the automation (i.e., discover, analyse, design, automate, measure, monitor, reassess). – Gartner, Top 10 Strategic Technology Trends, 2021

 

Hyperautomation has played a huge role in this digital transformation process, speeding up complex accounting processes, reducing time consuming manual tasks and just in general making workflows more organised and, if it doesn’t sound too obvious, automated.

Hyperautomation is particularly good at doing away with manual and labour-intensive back office processes that keep staff from doing other, more important tasks. In freeing up this time from human to machine, a PAO can either reduce staffing costs (if they have a need to) or their teams can instead concentrate on projects that bring real ROI to the organisation.

As no single app or tool can replace a real, thinking human, hyperautomation involves combining robotic process automation (RPA) and intelligent business process management suites (iBPMS) with AI and machine learning (ML), with an end goal of making certain decisions increasingly AI led.

 

PAO’s have to deal with a ridiculous amount of data, much of it personal, sensitive or, oftentimes, both, often using very manual processes. The problem with manual, repetitive processes however is that they’re very prone to human error and, as any accountant reading this will tell you, one small error in the data up the line can lead to huge mistakes further down… even if everything else is correct.

Hyperautomation does away with all of that, minimising manual processes to avoid human error and speeding tasks up to a tremendous degree.

An RPA bot can reduce processing costs in a PAO by anywhere between 30% and 70%.

Where Can Hyperautomation Be Implemented?

PAO’s can make use of hyperautomation in so many different areas that they’re really only limited by their imagination, or the imagination of the Digital Transformation partner they’ve chosen to work with.

Some of the most common areas hyperautomation is applied to in accounting processes, though, are…

 

  • Customer Service – Now we’re not saying customers can be annoying to deal with sometimes. Having to answer the same questions over and over, time and time again. We’re not saying that. What we are saying is that a bit of hyperautomation, combined with some clever AI, could easily answer a large chunk of the questions you get asked repeatedly through a chatbot directing clients to a relevant page on your website.
  • Compliance – Compliance for PAO’s doesn’t just mean GDPR as it does for so many other sectors. No matter what geographic location you operate from, there’ll be a plethora of regulations to adhere to, from both the central Government of the country you’re based in and of anywhere else you do business. Hyperautomation reducing manual processes (and human errors) can massively help with this.
  • Accounts Payable – Accounts payable is probably one of, if not the, most repetitive task any finance team will have to do. It doesn’t need any out of the box thinking, which makes it the perfect task to be handed over to hyperautomation. It can read all the relevant information needed from scanned-in documents using an optical character recognition tool in seconds, validate the information against existing databases and then process the payments, all with little to no human interaction (see the sketch after this list).
  • Fraud detection – We often hear that keeping everything manual reduces the threat of fraud as cyber criminals can’t ‘hack’ manual processes. Hyperautomation can help there too though, by tracking all incoming and outgoing transactions and flagging any suspicious patterns in real time.
  • Report automation – Combining a bit of hyperautomation with Microsoft Power BI means you can have daily, weekly or monthly reports generated and sent out to all stakeholders with no manual processes involved past the initial setup.
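
Here’s the sketch promised in the Accounts Payable bullet above: a minimal Python illustration in which text already extracted by an OCR tool is parsed, validated against a (stand-in) supplier database and queued for payment. The invoice layout, supplier reference and regexes are all invented for the example.

```python
# Minimal sketch (invented invoice format and supplier list) of an automated
# accounts payable flow: parse OCR'd text, validate the supplier, queue payment.
import re

KNOWN_SUPPLIERS = {"ACME-001": "Acme Stationery Ltd"}  # stand-in for a real database

def parse_invoice(ocr_text: str) -> dict:
    """Pull the supplier reference and amount out of OCR'd invoice text."""
    supplier = re.search(r"Supplier Ref:\s*(\S+)", ocr_text).group(1)
    amount = float(re.search(r"Total:\s*£([\d.]+)", ocr_text).group(1))
    return {"supplier_ref": supplier, "amount": amount}

def process_invoice(ocr_text: str) -> str:
    invoice = parse_invoice(ocr_text)
    if invoice["supplier_ref"] not in KNOWN_SUPPLIERS:
        return f"Flag for human review: unknown supplier {invoice['supplier_ref']}"
    return f"Queued £{invoice['amount']:.2f} payment to {KNOWN_SUPPLIERS[invoice['supplier_ref']]}"

print(process_invoice("Invoice\nSupplier Ref: ACME-001\nTotal: £149.50"))
```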

 

Hyperautomation is irreversible and inevitable. Everything that can and should be automated will be automated. – Brian Burke, Research Vice President, Gartner

 

Benefits Of Hyperautomation

  • Cost Savings – As already mentioned, hyperautomation increases efficiency, reduces errors and allows an organisation to either downscale staffing costs for manual, laborious tasks or move those staff on to tasks that will generate more ROI in the long run.
  • Increased Operational Efficiency – Yes, we’ve already mentioned this… but the point is worth belabouring. As a PAO the work you do can have direct and measurable effects on a country’s economy. That means the more efficient and profitable you are, the more so said economy may become through the ripple effect.
  • Make Your PAO More Agile – Accountancy reaches out to every sector of every country on the globe meaning that whilst most firms are robust, the industry can find itself affected by both local and global trends. That means agility and the ability to pivot at a moment’s notice is vital for a modern PAO.
    Hyperautomation will allow a PAO to do that quickly, for any situation that may occur, in a variety of different ways, not the least of which is having staff available to do so, rather than tied up with mundane, repetitive tasks.
  • Little Infrastructure Investment – Due to the nature of hyperautomation you don’t need to spend a fortune changing your current infrastructure. Instead, you can implement it as a ‘layer’ over your current systems, integrating it seamlessly with your office’s processes.
  • Low Code/No Code – Most hyperautomation tools come with easy-to-understand ‘drag and drop’ tech so that even someone with only a passing familiarity with the technology can create automation workflows with little or no coding needed. This makes maintaining it after the initial setup incredibly easy.

 

Hyperautomation lets Professional Accountancy Organisations do so much more for less, reducing HR costs and human errors all whilst increasing efficiency and the all-important ‘bottom line’.

It can provide an ‘edge’ over competitors and the efficiency savings provide growth within your organisation.

Cyber Security For Remote Working… How Everyone Can (And Has To) Pitch In

Cyber Security isn’t, and shouldn’t be, the sole responsibility of your IT team

 

2020 has seen an unprecedented rise in the rate of remote working and whilst only time will tell if this becomes the ‘new normal’, organisations have already had to take drastic steps to lock down security over these new extended remote networks.

Or they should have at least…

 

Many organisations are still failing in their cyber security, and time and time again cloudThing has seen that their main vulnerabilities are their remote workers and the access they need to centralised systems.

One of the key points we stress when it comes to cyber security and remote working is staff awareness… to make your organisation an unattractive target to cyber actors, everyone needs to do their part.

Everyone Needs To Understand Their Part

It’s not just the IT department that needs to worry about cyber security.

In an age of remote working, cyber security needs to be built right into your organisation’s guiding principles, with every member of staff aware of their responsibilities. They need to know, understand and be able to react to the dangers that working from home can bring to a company, from opening a strange email right through to logging on to a public wi-fi network.

Implement Personal Device Policies

In the scramble to enable working from home at the start of 2020 an unprecedented number of organisations allowed staff to use their personal devices to connect to centralised work systems.

In theory there’s nothing wrong with that but, if the correct guidance isn’t followed, it can become a cyber criminal’s dream.

Your leadership team needs to work closely with IT, HR and the wider organisation to institute secure remote working policies that everyone understands and, more critically, follows.

Staff need to know what it means when they use a personal device, what the consequences could be and what steps your IT team will take to secure company data on it. They might need to download specific security protocols or allow IT remote access to wipe the device if it’s lost.

Make Sure There Are Companywide Updates On The Latest Cyber Threats

Cybercriminals are constantly improving their skills and finding new ways to attack your organisation, so if Frank from accounts is remote working these days, a half-hour talk on cyber security when he started two years ago just isn’t going to cut it anymore.

Your IT team will be (or should be) well aware of what’s going on in the wider world, and they need to work with other departments to communicate that knowledge.

For the foreseeable future the two biggest threats to remote working will be phishing scams and network penetrations.

Phishing scams are nothing new, but your staff should still be kept up to date with the latest techniques cyber scammers are employing as they get ever more sophisticated.

The bigger concern is a dispersed ‘home network’ as opposed to the traditionally centralised ‘corporate network’. That kind of setup gives hackers infinitely more entry points to your vulnerable systems, massively increasing the risk of advanced persistent threat attacks and putting all your data at risk.

Home routers, open management interfaces and ridiculously easy to crack passwords are just some of the issues your IT team needs to educate your staff about, enshrining that guidance in your governance.

Utilise HR & Marketing

As already mentioned, cybersecurity works best when all your staff are onboard. How best though, to accomplish that?

Fortunately, you’ll already have two departments within your organisation whose job it is to communicate… HR and Marketing.

HR will already be set up to communicate effectively with staff, and PR is marketing’s core job… selling others on ideas. Have both departments work closely with IT on how best to communicate the concept of cybersecurity to your staff.

Don’t be afraid to pull your experts from their normal roles to help communicate new messages vital to company prosperity.

Have HR Mitigate The Psychological Impacts Of Remote Working

All these new security principles we’ve been discussing have to be implemented but… HR needs to work with IT as it happens, so as not to overburden your staff with unfamiliar concepts.

As important as these are, they need to go hand in hand with a positive staff experience.

Surprisingly, one of the biggest cybersecurity risks in this age of remote working is employee satisfaction.

Staff are worried about furlough, redundancy, job security and a whole host of other issues… all whilst coping with the mental stress of a new working environment… so putting them in charge of their own cyber security, with a whole host of new governances to learn, could well be the straw that breaks the camel’s back.

Insider threats are a very real thing and won’t necessarily take the form of a staff member stealing your data. It could just be that, with everything else going on and a disgruntled or ‘don’t care’ attitude, they ignore new security measures from home, putting the entire organisation at risk.

 

That’s why IT needs to work with HR on the rollout of cybersecurity controls, to make sure they’re implemented in a way that empowers rather than forces staff to adopt them.

Have a third party test security measures before making them compulsory… just because they make sense to someone from IT doesn’t mean they’ll be easy for a member of staff with a non-technical background to work with.

Make your staff aware of why you’re doing these things. Rather than just arbitrarily ordering a new way of working on them, explain what the measures are for… you’ll be amazed how much more accepting they become.

Appeal To Their Self-interest…

If you want people to follow your new cyber security governances, then you need to appeal to their own self-interest.

Work related cyber security concerns can feel distant to employees stuck at home so you need to work extra hard to bring these very real concerns to life for them.

That’s where all the above steps come into play, with good governance, training, careful crafting of the message from marketing and before-and-after care from HR.

What Is An Advanced Persistent Threat (APT) Attack?

APT attacks are when a cyber actor manages to access your networks for long periods of time…

 

Some cyber actors operate on a ‘smash ‘n grab’ principle: getting in, grabbing/damaging what they can from your systems or network, then getting out before they’re caught.

There’s another type of cyber-attack though that goes completely against that principle… the Advanced Persistent Threat.

What Is An Advanced Persistent Threat?

An Advanced Persistent Threat (APT) is when a cyber actor has gained prolonged access to your network or systems without you realising.

An APT attack is never done to just cause damage; its main aim will be to sit there as long as possible, collecting as much data as possible before being discovered.

APT attacks tend to be targeted at larger organisations in the defence, manufacturing or finance sectors or even nation states, simply because these are the organisations/entities with high-value information worth stealing… Personal information for thousands, possibly millions of people, high-end IP, financial details, military software… the list goes on with anything a cyber actor could sell for a profit.

An APT attacker will target a network with the aim of achieving persistent (hence the name) ongoing access.

Recent examples of APTs in the news (and the fines that resulted from them) are those that impacted Marriott Hotels and British Airways.

This type of attack can be quite resource heavy for the hacker though, which is why they tend to target high-value targets; smaller organisations just aren’t worth their time.

Whilst that might seem a positive for many, it does mean that, as your organisation grows larger, you’ll need to beef up your cyber-security to deal with this specific type of attack. And also be aware that APTs have targeted foundational security technologies that many types of organisation may rely on: the RSA APT perhaps being the most notorious example, due to the cryptographic keys securing every RSA SecurID token being put at risk of compromise, resulting in their replacement.

How Do Advanced Persistent Threats Operate?

As already mentioned, APT attacks can be quite resource heavy for the hacker. They’ll typically use advanced methods to gain entry to your networks or systems such as exploiting zero-day vulnerabilities or even highly targeted phishing campaigns against your staff.

Once in though, they then need to stay in for as long as possible.

To accomplish this, they'll have to continuously rewrite malicious code that they've placed within your network to avoid detection. In fact, some APTs are so complex that they require teams of full-time administrators and hackers to maintain their access. And it's common for APT teams to fix any technical vulnerabilities they used to gain access, as they don't like to share their spoils!

Identifying An APT Attack

APT attacks won't be easy to catch but… you do have to catch them, as the cost of not doing so can be incalculable for an organisation, both in financial and reputational terms.

Most cybersecurity experts agree that the best way to detect when an organisation is under attack isn’t by identifying the malicious code in your systems but by monitoring your outbound data for anomalies or discrepancies that could give away the presence of a cyber actor. Data loss prevention technologies can play a role here.

How Do Advanced Persistent Threat Attacks Work?

Anyone looking to breach your networks or systems with an APT would likely have to follow a process something like this (so any steps you can take to disrupt them along the way will massively bolster your cybersecurity) …

 

To start with they’ll actually need to gain access.

Many, if not all, of the steps involved with this can be dealt with under your standard cybersecurity defences. The difference with APTs isn't how they get access… it's how long they're able to maintain access undetected.

 

Once they’re in, they need to stay there for as long as possible.

After breaching your systems/network the first thing a cyber actor will do is take a look around. They’ll introduce malware into your code that will give them continued access without being detected and then attempt to dig themselves deeper.

At this stage it’s worth noting that an APT attacker may well attempt to create multiple points of compromise within your systems. That enables them to maintain access if you think you’ve ‘discovered’ them… meaning you need to be constantly on the lookout.

They'll attempt to access 'deeper' systems by breaking or changing passwords and giving themselves administration rights over the entire network. If things get to that stage, they'll then be able to move around your organisation (digitally) at will. From here they'll start to centralise as much of your data as they can, encrypting and compressing it so they can exfiltrate it as soon as they deem it safe to do so undetected.

This will be repeated for as long as they're able to stay unnoticed… stealing new and more valuable data over and over.

 

One of the hardest things to defend against when it comes to APTs is that the hackers likely won't be using standard, OotB (Out-of-the-Box) hacking tools. The APT will have been tailored uniquely to your organisation, making a cyber defence strategy much harder to formulate.

Spotting An APT Attack

Despite being hard to detect, APTs do come with some key warning signs that you should always be on the lookout for.

Unusual activity on your staff’s accounts is always a good sign… especially if large amounts of data are being sent or it’s being sent from an account that wouldn’t previously have sent data. A lack of multi-factor authentication on 3rd party supplier and staff accounts may have contributed to both the Marriott and BA APT attacks.

Unusual activity on your database is another sign to look out for… sudden increases in database operations involving large amounts of data, new users being added, or permissions being changed… these are all signs you've been the victim of an APT attack – a particular database query led to an alert and the discovery of the Marriott APT, some 4 years after the initial infection.

 

The last warning signal to look out for is unusual data files sitting where you might not expect them, as this can be a sign that data is being collated ready for exfiltration. As already mentioned though, detecting anomalies in outgoing data will always be the most reliable way to spot an advanced persistent threat.

Visualising Your Data Differently With Power BI

Analytics teams using Microsoft Power BI can choose a variety of data visualisation techniques… Are some better than others though?

 

Microsoft Power BI offers users a wide variety of different data visualisation options to help look for meaning in their data.

Given the large amounts of data Power BI often deals with however, accurately, and more importantly usefully, portraying it has been a large driver for Microsoft over the last few years.

If a user can’t tell at a glance what their data’s saying then are they, in fact, any better off than looking at it in its raw form?

 

Accurate and efficient data visualisations have become core to modern analytic and business intelligence processes, with many organisations experimenting with the swathe of new options that an improved CRM data structure and Power BI have made possible.

Of course, the guiding principles behind data visualisation haven’t really changed in decades, but as technology has offered more and more visualisation options to developers and analysts the tendency has been to try and cram in as much information as possible, with the end result that data graphics have become more and more complex and inscrutable over time.

 

Microsoft Power BI cuts through that, allowing data analysts to create their own, custom, data visualisations to help display their data in reports that are a lot more intuitive to grasp.

The problem is that with the range of options available, your data can sometimes get displayed in a way that’s either not the best format or worse, actually misleading to someone reading it.

That’s why we’re discussing some of the best techniques to display different data sets, alongside their associated pros and cons as we know a lot of people really struggle with selecting the right data visualisation technique for the right task.

KPI Visualisations

 

 

Most people using Power BI for reporting won't be looking for a 'deep dive' into their data, but rather a high-level understanding of a key metric in as succinct a way as possible.

That’s what the KPI visualisation is there for.

It will let you highlight a single key datapoint and show it rising and falling over time.

Line Charts

 

 

Starting with the easiest then, line charts are one of the most instantly recognisable ways to display data, allowing users to identify trends with just a glance.

They can display data measured over two axes, with different categories of data being displayed by differently coloured lines.

Line charts are most effective when used to display data over time, for example, rising and falling profits, plotted monthly over the course of a year. In fact, Microsoft Power BI will actually allow you to create time series charts from your data, allowing you to drill down by flipping one axis between yearly, monthly, weekly or even daily instances.

They simplify data into a graph that can be understood at a glance… however that simplification can sometimes be their weakness, as complicated underlying trends can get missed.

Pie Charts

 

 

Need to show something as a percentage breakdown? The humble pie chart is your friend!

Pie charts look great and are helpful in displaying ‘top line’ information but… many data visualisation experts agree that people can struggle to take away detailed information from them, often struggling to process close differences in the different sizes of the ‘slices’.

For high level information at a glance, they’re excellent but if you need more detail, particularly with large data sets, you might want to consider a different data visualisation technique.

Bar Charts

 

Perhaps even more so than line charts, bar charts are one of the easiest ways to display simple data sets. Sans all the different colours, curves, gradients, angles or shapes of other display methods, a simple bar chart is unparalleled for showing off relative sizes of categorical data at a glance (e.g. sales by district).

The bonus of bar charts is that they can be understood by virtually anyone without any specialist understanding of your data or the need for a key at the side of the graph explaining what everything means. On top of that, Power BI allows you to implement variations on the bar chart for stacked and grouped options, allowing you to visualise the makeup of subcategories whilst still maintaining the detail of the overall categories.

Scatter Plots

 

 

Scatter plots, also known as scatterplots, scatter graphs or scattergrams, are used to display the general density of data in a two-dimensional format or the relationship between two values (e.g. outdoor temperatures and ice cream sales).

That means they’re great for displaying relative densities of data as well as overall trends (that you might not have been aware of before) and outliers in an easy-to-read manner.

For instance, along one axis you could plot a sample group's IQ, along the other you could measure the time taken to solve a complex problem. The general density of the plotted points would then let you infer trends based on that information.

Bubble Charts

 

Bubble charts are used when someone needs to accurately display three different dimensions of data by plotting them along an X/Y axis with varying sizes of bubbles, for instance, using the example above, outdoor temperature vs ice cream sales vs ice cream truck size.

A bubble chart can, with a minimum of explanation, provide quite complex information in a condensed and visual manner.

Some ‘advanced tips’ for bubble charts include labelling the separate bubbles for clarity and adjusting the layout/sizes so bubbles aren’t overlapping each other.

Network Diagrams

 

 

Although not native to Power BI, network diagrams are often used in conjunction with other data visualisation techniques to show how the different data sets being represented elsewhere are connected to each other in reality.

They’re particularly good at letting users grasp the inter-connectivity of data that might have been difficult to imagine without visualisation.

Most data visualisation experts agree however that, if not put together carefully, most network diagrams quickly become indecipherable. The trick is to start with a high-level view and then, if needed, let the user drill down on successive network diagrams for greater, expanded detail.

Sankey Diagram

 

 

Sankey diagrams can be used to plot out process flows via lines and arrows that can vary in width to show relative sizes of the data in individual flows.

Depending on the amount of data present, some Sankey diagrams can initially be quite overwhelming at a casual glance, but there really aren't any better data visualisation techniques for representing process flows. Power BI also offers some extra functionality with Sankeys that makes them a lot more interactive than normal, allowing users to explore truly complex process flows.

When using one it’s often worth including an annotation to explain to anyone viewing it how the diagram is set up and what it’s demonstrating.

Treemaps

 

 

Treemaps are used to display data in a hierarchical fashion in which nested blocks are used to represent different data sets.

Those nested blocks are particularly good at identifying trends but if they're not sized correctly, or are ordered in a non-intuitive way, they can be difficult to read.

Circle Packing Charts

 

 

Finally, the circle packing chart is a variation on the treemap, using, as the name suggests, circles rather than blocks to represent the relationships in your data.

By plotting different sized circles and placing them within other circles, you can easily display data ‘three levels deep’.

t-SNE

 

 

Getting a little (a lot) more technical now, t-SNE or, to give it its full name, t-Distributed Stochastic Neighbour Embedding, actually uses machine learning to model high-dimensional data sets as two or three dimensional data points for display in a scatter plot, using colours, shapes or other visual elements to represent any further dimensions.

Data visualisation analysts developed it to combat some limitations of the more traditional scatter plot.

 

Stacked Area Chart

 

 

Much like a line chart, stacked area charts show the breakdown of a trend into subcategories.

A line chart tells you if something’s going up or down but an area chart can tell you what’s responsible for that trend (is it one district having a great month, or are all districts growing equally?).

Maps & Filled Maps

 

 

If you've got geographic data, Power BI has a range of ways to plot this out.

Without any technical set up you can throw some geographical data at Power BI and it’ll recognise the columns corresponding to areas (could be latitude/longitude, could be postcodes, could be zip codes… not to worry, Power BI will work it out) and plot it on a great looking, interactive map, which also handles grouping things up automatically.

Custom Visualisations

Finally, there's a vast library of pre-built custom visualisations available online, covering a myriad of uses, but if you're still not satisfied Power BI will let you build your own, allowing you to display your data in any way you find easiest!

 

As we finish up it’s also worth mentioning the interactivity available in Microsoft Power BI.

All Power BI visualisations allow you to drill down into your data at the touch of a button.

For instance, say your dashboard has a map of sales by county, and a line chart of total sales. You might notice on the map that Yorkshire is way below the rest, so you click through, and hey presto, your line chart will automatically show the sales in just that area.

Database Marketing – What Is It & How Can You Benefit From It?

It’s not the size of your database… but what you do with it

 

Effective use of a CRM in marketing campaigns has come to be known as Database Marketing.

Database marketing though, isn’t just the ‘use’ of a database in your marketing efforts… it can go so much deeper than that.

What Is Database Marketing?

Database Marketing is a much more analytical and systematic approach to marketing, reliant on the gathering, collating and processing of an individual’s data through the use of a cutting-edge customer relationship management (CRM) system like Microsoft Dynamics 365.

Of course, we’d be remiss if we didn’t point out here that if you’re going to use your CRM for database marketing in any kind of meaningful way then you’ll need strong foundations built on Privacy-by-Design.

Database marketing really comes into play for large organisations or those that hold a lot of data on their potential clients.

The more data you have, the more effective your results will be.

 

Whilst most traditional marketing methods will rely on a database of potential and current clients, we can differentiate database marketing by the type and amount of data that will be used.

The information you hold will be a lot more in-depth and contain all the possible touchpoints someone may have with your organisation. Given all that, however, the data will obviously need to be processed differently to traditional methods.

Anyone using database marketing will use the data they already hold to acquire more, learning more about their current and potential clients as they go by using strategies such as customer segmentation and sector targeting, comparing which types of clients offer the highest ROI, and formulating strategies to engage and encourage greater spend amongst all segmentations.

The real key to database marketing is the collection of large amounts of data and an in-depth analysis of that data to provide a highly personalised strategy for each client/potential client.

How Does Database Marketing Work?

As already mentioned, to begin a database marketing campaign your organisation will need to start collecting large swathes of data from a variety of different sources.

Names, geo locations, email(s), phone numbers, order history, website/app touchpoints, engagement levels, browsing data… all of this and more can and should be tracked.

There's dozens of different ways this data can all be collated, so we won't list them all here, but some might come from the data mining of your current CRM, cookie tracking, past subscriptions, past campaign engagement… pretty much any touchpoint they may have had with your organisation.

Once all that data has been gathered it needs to be stored in a database, preferably using the Common Data Model for ease of use (note… if you're a larger organisation with enormous amounts of data, you may want to look into a data warehouse). Data warehouses are capable of receiving and storing data from a variety of different sources (for example, different departments within an organisation), which then needs to be processed for actionable intelligence using software tools like Microsoft Dynamics 365 Marketing or Microsoft Power BI.

For it to be useful that data needs to be kept as fresh and as up to date as possible, with the base assumption that, if it's not updated, it will become outdated over time.

Benefits Of Database Marketing

Hopefully the benefits of database marketing will be easy to identify for most marketers, advertisers or other stakeholders but, for the sake of clarity, some of them might include:

 

  • Identifying the most efficient way to engage with potential clients
  • Identifying potentially new customer segmentations
  • Identifying the most profitable way to increase ROI of existing clients
  • Organising prospect data via previously untargeted demographics on a much more personal level
  • Prioritisation of your most valuable clients
  • Personalised and individual marketing messages
  • Increased client retention
  • Campaign resource efficiency savings as you won’t be sending collateral to unengaged/irrelevant prospects

Database Marketing: Things To Look Out For

Database marketing, especially when backed up by strong AI software, is a strong marketing tool… but that's not to say there aren't things you should look out for:

 

  • Outdated Data: Any data you collect will become out of date fast. You can attempt to head this off by only collecting data that stays relevant (names, email addresses etc) but this severely limits the insights the data can give you. You need to factor the half-life of your data into your analyses and try to keep it as relevant as possible.
  • Incorrect Data: Don't forget, if you're relying on the potential client to fill out the data you might end up with a lot of Mr/Mrs Smiths on your system, all with telephone number 01234 567 891 and an obviously made-up email address. Where possible, you need to limit that with pre-filled drop-down menus or check boxes, but the potential for data to be incorrect needs to be acknowledged in your strategy.
  • Database Costs: Investing in an expensive CRM and database warehouse is likely going to be frowned upon by your accounts team if all you’re storing in it is names and email addresses. To truly maximise the ROI from database marketing you need to collect as many client/prospect touch points as possible.
  • Reduced Engagement: The whole idea behind database marketing is to run personalised campaigns, driving up engagement and ROI. The danger you run however is mislabelling data or grouping contacts together in the wrong subsets, so you end up advertising lawnmowers to a group of prospects who have told you they live in a block of flats. For database marketing to work you need stringent protocols in place to maintain the accuracy of your data or you run the risk of driving clients and prospects away.

Types Of Database Marketing

There are two different types of database marketing and, depending on your organisation, you might need to look at either/or. The thing that differentiates between them is the target audience…

 

B2C or Consumer Database Marketing: B2C is when an organisation sells directly to a customer (Business 2 Customer) and the data collected will contain a high degree of personal information so GDPR and Privacy first principles are vital for staying compliant. When collecting this data you should try to be as specific as possible, collecting all the different touchpoints an individual may engage with you on.

B2B or Business Database Marketing: B2B is when an organisation sells directly to other organisations (Business 2 Business) and the data collected, whilst no less specific, will be of a very different nature, for example:

 

  • Business name
  • Contacts within the business
  • Company revenue
  • Contact job titles
  • Purchase history
  • Contact LinkedIn profiles

 

Whilst a B2B database may contain more generic data, you still need to be careful when documenting individual contacts' details, as if any of the information is personally identifiable then GDPR will kick back in.

 

There are many organisations out there today offering a high degree of personalisation through database marketing.

Netflix, Amazon, Facebook and Google just to name a few of the biggest players.

No matter your size though, there’s some basic points to remember when getting your database marketing campaigns up and running.

 

  • Engage with your front-line staff on the information most worth collecting. They’ll likely know your clients best and what information is worth gathering as a starting point.
  • Put Privacy First. Whilst all this personal data might be easy to obtain, not everyone will be happy you're holding it. Make sure you've conducted your DPIAs and are clear under what basis you'll be processing this data, with clear information held in your policies should anyone wish to read them.
  • And most importantly, invest in the right software… This isn't something that can be done with an Excel spreadsheet!

Microsoft Dynamics Cloud Licensing Options – What’s Available?

Keeping up with Microsoft’s cloud licensing agreements isn’t easy… but that’s why cloudThing are here

 

Let’s face it, Microsoft’s cloud-based licencing agreements are complicated… it’s why we’re here to do it for you.

Lucky for you we're a Microsoft Gold Partner and are happy to share all the behind-the-scenes know-how you could possibly ever need to decide what kind of licencing plan best suits your organisation.

 

Microsoft’s subscription-based cloud licensing agreements are great as they all come with frequent updates (both functional and security related), they require little or no maintenance, they’re easily scalable, flexible and best of all… can be fitted to any budget.

The trick is choosing the right plan for your organisation's current and future needs.

Microsoft also offer something called an EA, or Enterprise Agreement, for organisations that might need 500+ licenses without the hassle of having to manage them all individually.

What Are The Different Types Of Microsoft Cloud-Based Licensing?

Unlike most other vendors, Microsoft won’t just assign you a licence based on what you need, they’ll also do so based on what you do and the sector you’re operating in. Whilst that’s great, and often gives lots of extra benefits, it can add an extra layer of complexity if you’re not sure what’s right for you (again, that’s when it’s best to speak to an expert, ahem us, ahem!).

 

  • Commercial – Commercial cloud licenses are the de-facto standard and most users will fall under these.
  • The Education Sector – You’ll need to be able to prove it, but if your organisation is affiliated with the education sector you might qualify for Microsoft’s extended/discounted education licenses.
  • The NonProfit Sector – To apply for a NonProfit License your organisation must be a registered NonProfit, but again… Once this is confirmed Microsoft (and cloudThing) will be happy to donate a lot of free resources to your chosen cause.
  • Government – Government bodies can all utilise a variety of speciality Dynamics licences. Central Government, Local Government and even the NHS come under this heading.

Microsoft Dynamics 365 Business Central Licensing

Without a shadow of a doubt, Microsoft's Business Central is the best accounting and business software out there today for SMEs.

But guess what… how you go about licensing D365 Business Central is again, rather confusing.

There's basically three different types available: D365 Business Central Essentials, D365 Business Central Premium and Team Members.

Which you opt for would very much depend on your organisation's needs but in essence…

MICROSOFT DYNAMICS 365 BUSINESS CENTRAL ESSENTIALS LICENCE

  • Finance management
  • CRM
  • Project Management
  • Supply Chain Management
  • HR Management
  • Warehouse Management

MICROSOFT DYNAMICS 365 BUSINESS CENTRAL PREMIUM LICENCE

  • All of the above, plus…
  • Service management
  • Manufacturing

Depending on your organisation's needs you'd have to opt for one or the other.

Need finance management, project management, distribution, and warehousing? Essentials will most likely be fine for you.

If you need all the above plus the ability to automate and manage production, manufacturing, and service then Premium might be the better choice.

 

The Team Members subscription is a named user subscription designed for users who are not tied to a particular function but who require basic Business Central functionality.

This license includes read access as well as some write access for select light tasks across Business Central functionality for a given tenant.

The Team Members SL grants a user full read access to Essentials and Premium for a given tenant. In addition, the Team Members SL includes some limited use write access to Essentials and Premium.

Business Central Team Members also includes the Power Apps/Power Automate Use Rights with Dynamics 365 license.

Team Members users can use Power Apps to access Business Central within the bounds of their Team Members license.

 

Team Members requires that at least one other user be licensed with Essentials or Premium.

TEAM MEMBERS USE RIGHTS:

  • Read anything within Business Central
  • Update existing data and entries in Business Central – existing data are records like customer, vendor or item records which have already been created. Entries means entries on which it is specifically allowed, from an accounting perspective, to update specific information (e.g. the due date on customer ledger entries)
  • Approve or reject tasks in all workflows assigned to a user
  • Create, edit, delete a quote
  • Create, edit, delete personal information
  • Enter a time sheet for Jobs
  • Use Power Apps/Power Automate Use Rights with Dynamics 365 license
  • The Team Members application module may be customised with a maximum of 15 additional entities (custom entities or standard Common Data Service entities) available to the Team Members license, per pre-approved application scenarios

One important thing to note with Business Central licenses though is that you can’t mix and match… you have to choose one or the other and also nominate how many users/additional users will be using the service.

Microsoft, within the scope of Business Central, define a user as someone who can access the entire suite of tools, whilst an additional user will have 'read-only' access.

Manage Customer Engagement With D365 Sales, D365 Customer Service and D365 Field Service

Apologies for the long title but Customer Engagement licenses for Dynamics Sales, Dynamics Customer Service and Dynamics Field Service all work on a similar basis so we thought we’d cover them as one.

A while back Microsoft moved away from selling license ‘bundles’ and started offering organisations a lot more choice. They’d found that 80% of their Dynamics 365 Customer Engagement users (Sales/Customer Service/Field Service) were only using one of the applications and just 17% using two so they started to offer them on a much more individual basis.

In real terms that means if you only need D365 Customer Engagement for your sales or customer services team then you'll only need to license D365 Sales or D365 Customer Service. If you need both, you'll just need a D365 Sales license for your sales team and a Customer Service license for customer services, mixing and matching as required.

If a user just needs the one license, then they’ll just need to purchase that. Should they need more they can build on them like Lego bricks. Their first app license would be their ‘base’ license, additional licenses after that can be added on for discounted rates.

Microsoft Dynamics Finance & Project Operation Licenses

As with D365 Sales/Customer Service/Field Service, Microsoft Dynamics Finance and Project Operation licenses are now much more modular than they used to be, with users only needing to pay for apps they’ll actually use.

Again, users just need to purchase a base license and then add on any attach (secondary licenses) that they need.

Your first (or base) license will need to be the most expensive license, purchased at full cost, but attach (or extra) licenses can then be built on top of your base license for a discounted amount.

Those discounted attach licenses can only be purchased with a base license though and, whilst you’ll only ever need one base license, you can have as many attach licenses as you want.

Privacy-By-Design & The Children’s Code

The ICO have just released a data protection Children’s Code (or Age Appropriate Design Code in full) that outlines an organisation’s responsibility for its online practices with regard to children.

The code will cover apps, online games, platforms, websites, social media sites or anything else likely to be accessed by a child.

It came into force on the 02nd September 2020 and organisations now have until the 02nd September 2021 to make sure they're fully compliant with the new code.

 

Here’s what you need to know and what you’ll need to do to make sure your organisation is ‘up to code’…

What Is The Children’s Code?

DEFINITION OF THE AGE APPROPRIATE DESIGN CODE

The Age Appropriate Design Code, otherwise known as the children’s code, is a new statutory code of practice that falls under the General Data Protection Regulations.

It’s there to recognise the fact that children should be given extra consideration around their personal data, whilst helping organisations understand what is and will be expected of them.

The ICO has ‘translated’ what the law says into fifteen standards that any and all organisations providing online services should follow to remain compliant within the law.

Following the principle of Privacy First, the code is there to ensure children have a baseline of protection automatically, by default and by design, to ensure they're protected within the digital world, not from it.

Any organisation that isn’t fully compliant within the code by September ’21 could face penalties from the ICO such as compulsory audits, orders to stop the processing of personal information and fines of up to 4% of their global turnover!

Why Is A Children’s Code Needed?

EXPLAINING WHY THE AGE APPROPRIATE DESIGN CODE BECAME NECESSARY

Most, if not all modern apps, games and websites will start collecting data on their users the moment someone opens/visits them.

That data can then be used to tailor what advertisements a child might see, shape how they’re encouraged to engage with the app/site or even in how they’re ‘persuaded’ to spend more time using an organisations services.

Whilst the digital world can offer truly awesome experiences for younger users to learn and enjoy themselves, it was felt that not enough was being done to create a space within the digital world for children to explore and grow safely.

What Do Organisations Need To Change For The Children’s Code?

HOW TO PREPARE FOR THE AGE APPROPRIATE DESIGN CODE

Service and platform providers will need to acknowledge within their GDPR compliance that children must be treated differently to their adult users.

In the UK, children make up 20% of internet users… even though it was never designed for them or with their needs in mind.

Take the ‘real world’ for instance.

There's plenty of laws protecting children… car seats, film and game ratings, drinking and smoking age restrictions… The Age Appropriate Design Code just follows that thought process through to its logical conclusion by adding those same protections to the digital world.

 

In real terms that means organisations will need to make it clear when a child’s personal data is being used to drive the content they’re seeing/experiencing, whilst recognising and protecting a child’s right to privacy.

The law will compel organisations to:

 

  • Provide privacy settings set to their highest… by default
  • Switch off any geo-location services that could reveal a child’s location to anyone else
  • Cease the use of all ‘nudge’ techniques and notifications to encourage minors to give up additional private data/personal information.

 

Children, or their parents/adult supervisors can, of course, change these settings but they need to be there by default, as set out in the Children’s Code.

Organisations, to remain GDPR and Children’s Code compliant, will be expected to:

 

  • Create an open, transparent, and safe place for children whilst they’re online
  • Comply to a set of standards when designing, developing, or providing online services that are likely to be accessed by children
  • Always consider the best interests of the child when processing their personal data
  • Implement the highest of privacy settings by default whilst using language that is clear and easy for children of different ages and development stages to understand

 

For organisations with high Data Protection standards forming the root of their processing, the Children's Code should not cause any major problems. As with all Data Protection matters, however, organisations must ultimately be able to demonstrate their accountability i.e. that the risks have been considered, steps have been taken, and that the steps taken are justifiable according to the risks as assessed by that organisation. – Jane Rudge, cloudThing Chief Commercial and Compliance Officer

 

When Does The Children’s Code Take Effect?

THE AGE APPROPRIATE DESIGN CODE ENFORCEMENT DATE

It already is!

The Age Appropriate Design Code/Children's Code came into force on the 02nd September 2020, however the ICO have allowed for a twelve-month grace period for organisations to become compliant… by the 02nd September 2021.

What Happens To Organisations Not Compliant With The Children’s Code By 02nd September 2021?

CONSEQUENCES OF NOT COMPLYING WITH THE AGE APPROPRIATE DESIGN CODE…

Remember getting compliant for GDPR back in May of ’18?

The Children’s Code is rooted within GDPR and DPA legislation that the ICO is already enforcing.

Any organisation that operates services accessed by children and is asked to demonstrate their compliance with GDPR or PECR (the Privacy and Electronic Communications Regulations) will struggle to do so if the Children's Code hasn't also been considered within their data protection policies.

As a worst-case scenario, should the ICO get involved your organisation could be looking at audits, assessments, stop processing orders and fines of up to 4% of your global turnover… The ICO is taking this seriously and so should you!

How Does The Children's Code Define A Child?

THE AGE APPROPRIATE DESIGN CODE DEFINITION OF A CHILD…

The Children’s Code will define anyone under the age of 18 as a child for the purposes of compliance.

Many websites in the UK have, up until now, 'thought' of children as those under the age of 13, often citing Article 8 of GDPR.

However, this misconception has always been such… a misconception, and is now clarified in the Children’s Code.

Article 8 of GDPR sets out when a child becomes old enough to provide consent to the processing of their own data, but it’s never set the age of a child as 13.

Does The Children’s Code Change GDPR?

THE AGE APPROPRIATE DESIGN CODE WITHIN GDPR AND DPA

It doesn’t.

Data Protection regulations haven’t changed, the Children’s Code will just refocus specific attention on those under 18 years of age.

Everything within the new code links back to existing provisions within GDPR; it just adds another layer of complexity to what the ICO Commissioner will expect of organisations when dealing with children, in order to remain GDPR compliant.

Which Organisations Will Be Affected By The Children’s Code?

SERVICES & WEBSITES THE AGE APPROPRIATE DESIGN CODE WILL REGULATE…

The Children’s Code will apply to any and all organisations offering ‘Information Society Services’ that are likely to be accessed by children within the UK.

Basically, it won’t matter if your app, game, device, search engine, social platform or website is specifically targeted at children or not. If there’s a possibility a child could use it then the Children’s Code will kick in.

The ICO have also confirmed their default position will be to expect most online services to fall under the Children’s Code.

Will The Children’s Code Only Affect UK Companies?

GEO-LOCATIONS APPLICABLE TO THE AGE APPROPRIATE DESIGN CODE

The Children’s Code will apply to all UK organisations and companies.

It will also apply to any Non-UK organisations with offices, branches or establishments in the UK that process children’s personal data in the context of the activities of that office.

It will also affect any organisations based outside of the EEA, even those without offices in the UK, if they offer services to UK end users (or monitor UK users/collect data on UK users) and those services are likely to be accessed by children.

How Will The Children’s Code Be Enforced?

HOW THE AGE APPROPRIATE DESIGN CODE WILL BE APPLIED BY THE ICO

We’re currently in the twelve-month grace period to prepare for the new Children’s Code.

After that expires (02nd Sep 2021) the ICO will investigate anywhere they’ve concerns for the digital welfare of children, starting with areas with the highest risk of harm.

They’ll also actively be investigating complaints made by parents, teachers, carers, or other adults that have identified possible breaches.

As with their GDPR investigations, the response they take is designed to be proportionate and risk-based but, should they find an organisation showing a blatant disregard for children’s privacy, as already mentioned, fines of up to 4% of global turnover could be applied.

Is There A Specific Legal Definition For ‘The Best Interest Of The Child’?

A wide one!

The United Nations Convention on the Rights of the Child (UNCRC) incorporates provisions aimed at supporting the child's needs for safety, health, wellbeing, family relationships, physical, psychological and emotional development, identity, freedom of expression, privacy and agency to form their own views and have them heard. Put simply, the best interests of the child are whatever is best for that individual child.

 

This will be used in conjunction with Recital 38 of GDPR: Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing…

 

How Should Organisations Apply The Standard ‘The Best Interest Of The Child’?

To make sure your organisation is able to effectively apply the standard of 'the best interest of the child', the ICO suggests you consider the specific needs of the child users of your platform or service and how you can best support those needs in your design and implementation processes, for example:

 

  • Keep children safe from exploitation risks, including the risks of commercial or sexual exploitation and sexual abuse
  • Protect and support children’s health and wellbeing
  • Protect and support children’s physical, psychological and emotional development
  • Protect and support children’s need to develop their own views and identity
  • Protect and support children’s right to freedom of association and play
  • Support the needs of children with disabilities in line with the obligations under the relevant equality legislation for England, Scotland, Wales, and Northern Ireland
  • Recognise the role of parents in protecting and promoting the best interests of the child and support them in this task
  • Recognise the evolving capacity of the child to form their own view, and give due weight to that view

Does The Children's Code Mean I Need To 'Age-Gate' My App/Website/Platform?

Fortunately… no.

The ICO have confirmed they’re not interested in seeing an age-gated internet.

What they want instead is a fundamental shift of how organisations approach the collection and processing of children’s private information, in which the processing of data from apps, websites and platforms takes a child-centric approach, building in relevant privacy protection from the beginning, rather than trying to add it on as an afterthought.

How Can An Organisation Know How Old Their Users Are?

In short… as long as your privacy standards are set high enough as per the Children’s Code, you shouldn’t need to know the age of your users.

If, however, you decide not to go down that route you will need to establish age.

The ICO have set out several appropriate ways for organisations to do this within the Children’s Code:

 

  • Self-declaration
  • Artificial intelligence
  • Third-party age verification services
  • Account holder confirmation
  • Technical measures
  • Hard identifiers

How Will The Children’s Code Work In Relation To Data Minimisation?

There’s nothing set out in the Children’s Code that prevents data minimisation.

Data minimisation isn’t there to stop organisations collecting personal data. If you need to ask the age of a user to verify if they’re a child or not then this is wholly compliant with data minimisation which states you should only collect data you actually need for a specific purpose.

What Happens If Children Lie About Their Age?

The ICO is well aware that no age assurance technique is 100% infallible, so don’t worry too much on this point.

If a complaint were made or your organisation came to the attention of the ICO through some other means then they’d look at whether the age assurance measures your organisation had put in place were stringent enough given the risk of children lying.

In layman’s terms… Has your organisation done enough to try and verify the age of its users and ensure that the personal data of children will be processed in accordance with the Children’s Code?

How To Connect To A Named Sandbox Environment

Darren Austin – cloudThing, Lead Dynamics Business Central Developer

You'll often find you need to connect Visual Studio Code to each of the sandbox environments you're working with

 

As a developer working across multiple projects you'll find that you need to connect Visual Studio Code to each of the sandbox environments you're working with.

The standard launch.json file won't work when you initially start your project as it's generic – it expects you to connect to just one project, and the account you're connecting with is not a partner account, it's a user account.

Connecting to the sandbox from Visual Studio Code is key for your application as you'll need to download symbols – this gives you the symbols from the base Microsoft apps, but you can also extend the app.json file to include any custom apps you need in your project.

To connect to a named sandbox instance, I have found that adding the following lines to your launch.json file does the trick:

  1. Server – this is the HTTP URL of your server – not 100% needed in this process, but I like to add it to see the full detail of what I'm connecting to.
  2. Tenant – this specifies the tenant name of the server; in my example it's the GUID you would find in your full Business Central URL.
  3. SandboxName – this is the name of the sandbox it should publish to; this is key if you have multiple sandboxes in your tenant.

 

 

Having added these lines and saved the file, the end result should look something like this.
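For illustration, here's a rough sketch of how those three properties might sit in launch.json – the server URL, tenant GUID and sandbox name below are placeholders rather than real values, and the remaining fields are just the usual AL publish settings:

{
    "version": "0.2.0",
    "configurations": [
        {
            // Standard AL publish configuration – VS Code's launch.json accepts comments
            "type": "al",
            "request": "launch",
            "name": "Publish to named sandbox",
            "environmentType": "Sandbox",
            // 1. Server – the HTTP URL of the service (placeholder value)
            "server": "https://businesscentral.dynamics.com",
            // 2. Tenant – the GUID from your full Business Central URL (placeholder value)
            "tenant": "00000000-0000-0000-0000-000000000000",
            // 3. SandboxName – the named sandbox to publish to (placeholder value)
            "sandboxName": "MySandbox"
        }
    ]
}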

You should be able to download symbols and publish your applications to your sandbox environment.

If you have multiple sandbox environments in your tenant, then you should only have to change the last property – SandboxName – to the name of your sandbox.

Download symbols and then just publish

What Are The Different Types Of Cloud Licensing Agreements?

Are you considering an ELA, a BYOL or PAYG cloud licensing model? Do you know the difference, and which will work out better for your organisation?

 

If you’re reading this, then you’ve most likely clicked a link online somewhere to learn more about cloud-based licensing agreements.

Cloud migrations are probably one of the most interesting business/technology topics around but, unfortunately, you can’t have a cloud migration without cloud licenses and cloud licensing agreements, shall we say… are less interesting.

 

There’s a host of fascinating cloud-based subjects we could discuss, machine learning, AI, CRM management, volunteer management systems, low code/no code apps… but none of them can happen without first having the right cloud licenses in place.

That means, if you’re an SME or enterprise level organisation, looking at a cloud migration, licensing is probably one of the first things you’ll start to research (which is most likely how you ended up here!)

The problem with discussing cloud-based licensing agreements though is that for every sentence we write, we almost always end up needing to elaborate on about seven different terms within it.

It sometimes feels like the cloud vendors (AWS or Microsoft for instance) try to keep things as confusing as possible.

So, rather than dive straight into a fact-based, 1,000-page-strong document with all the fine print laid bare, we thought we'd take things back to basics and discuss the different types of Cloud Licensing Agreements available on the market today… and if you're still hungry for more you can always give us a call for a free consultation on the best route forwards for your organisation.

Bring Your Own License – The BYOL Model

DEFINITION OF A BRING YOUR OWN LICENSE AGREEMENT

A Bring Your Own License is a licensing model that allows organisations to use all of their licenses flexibly, whether in the cloud or on-premise. It's about making the most of your existing investments in on-prem licensing whilst still getting to benefit from awesome cloud-based services.

WHAT DO YOU GET WITH A BYOL AGREEMENT?

  • Flexibility – With a BYOL agreement organisations can easily switch between services, all without needing to worry about managing multiple licenses across various platforms or services.
  • Lower Upfront Costs – The ability to both share and migrate licenses when you need to upscale your software is a huge benefit when it comes to saving costs. A BYOL model avoids the issue of paying for multiple, concurrent licenses across different services and platforms.
  • Freedom of Use – BYOL gives an organisation a lot more flexibility over how they use a service, perhaps only needing specific parts rather than the whole.
  • Better Reporting – With everything being managed in the cloud, and copies of licenses no longer being needed, tracking your validity and usage becomes a lot easier! As an example, imagine we're migrating you to Microsoft Azure. Microsoft give their users the facility to bring their current Windows Server and SQL Server licenses over to any cloud servers being managed in Azure.

 

See, the main issue with a lot of traditional software licenses is that they can be rather restrictive in what you can and can't do. They're often tied to very specific servers or types of servers, which means you can't re-use them after a cloud migration without violating a host of licensing T&Cs.

The BYOL model gets you around that by allocating your resources on a much more flexible basis.

WHEN TO CONSIDER A BRING YOUR OWN LICENSE MODEL

BYOL agreements lower the cost and minimise your risks when making a migration from on-prem to cloud by utilising any existing licenses you already hold… so are a good choice for anyone going through their first cloud migration.

Pay-As-You-Go Licensing Agreements

Exactly as it sounds, a Pay-as-you-Go cloud Licensing agreement is a payment method that only charges you based on your organisation’s usage.

The biggest benefit to pay-as-you-go licensing agreements is that there won’t be any wasted resource.

You’re literally only paying for what you use, as opposed to other models where you’re paying for resources that you might/might not ever need.

Most pay-as-you-go vendors provide their services by allowing their end users to scope out and architect the virtual machines they need, with the users selecting the CPU, memory, storage, network capacity and security level they need to run their environments.

Breaking down pay-as-you-go cloud licensing agreements further, the three main uses it’ll be put to are…

 

  • IaaS – Infrastructure-as-a-Service, when under a Pay-as-you-Go agreement, is on a per use basis, normally by the hour, week or month although some vendors also charge based on the amount of VM (virtual machine) space used.
  • PaaS – Platform-as-a-Service, under a PAYG model, is typically priced per app, user or gigabyte of memory consumed per hour. Microsoft is one of the only vendors to provide a per-minute pricing structure for its PaaS services, which, depending on the use it's put to, can equate to large savings as well as maintaining the state and configuration of your VMs… you only pay when they're 'switched on'.
  • SaaS – Software-as-a-Service, under a PAYG model can be priced on a wide variety of features such as storage capacity, with many vendors offering this as a ‘go to’ choice.

Cloud Subscription Licencing Agreements

Perhaps one of the simplest models to wrap your head around, there are nevertheless hundreds of different types of subscriptions available, so choosing the one best suited to you can be tricky.

In essence it's an agreement to use the vendor's cloud platform and/or some of their services/apps, which will then accrue a set fee, usually on a per user basis.

There are often trial subscriptions available, but these will expire after a set amount of time, after which they'll normally revert to a paid subscription.

It's possible for an organisation to have multiple different subscriptions for various different vendors and/or services, so costs can quickly mount if your licensing agreement isn't optimised correctly.

Enterprise Licensing Agreements

Also known as ELAs or EAs, enterprise licensing agreements can offer huge discounts for large organisations that can afford to pay for a set number of licenses up front.

ELAs can be great value for large organisations (typically those that need 500 or more users) with a mixture of cloud space and services agreed at the start of the contract. With that many users, rather than having to mess around with hundreds of different PAYG or BYOL agreements, an ELA streamlines the whole process organisation-wide, with easy to understand (and predictable) payment plans to keep the accounts team happy.

How To Export To Text Files From Microsoft's Business Central SaaS

Darren Austin – cloudThing, Lead Dynamics Business Central Developer

Before going down the development route, it’s always good to see what standard has to offer

 

Often during the lifecycle of a project there comes a time where you’ll need to export data from Microsoft Dynamics 365 Business Central in a particular format. Before going down the longer route of development though, it’s always a good idea to see what standard has to offer…

 

In the below example, there are 'Open in Excel' and 'Edit in Excel' options on most pages which may offer the solution for you.

The reason I've chosen to share this is because I often come across the need to export payment journal lines. This will allow the customer to upload payment lines into their bank account to pay their vendors.

The below solution will cut out the process of them having to enter everything line by line into their banking software.

 

As you’d expect, the file the bank is expecting is quite strict in format.

Please note, with the introduction of data definitions this could also eliminate the need for any development, but I've seen many formats which aren't possible using this method, where the only option we have is to develop the solution.

 

Life has changed moving from NAV to Business Central and from C/AL code to AL code. The methods/code we used in the past may not work going forward for D365 Business Central SaaS.

Below is a method we can use to extract data from Business Central to a text file; the approach isn't limited to just text files.

Development:

Firstly, let's create a codeunit and, within it, a function to handle our code.

 

 

To begin this process we need to add some variables to support our export:

  1. One InStream
  2. One OutStream
  3. One as a reference to the standard “Temp Blob” codeunit.
  4. One to handle the file name – this isn't 100% required, as you can handle the file name in many ways.

 

To enable us to write text to this, we will need to create an OutStream using the Temp Blob variable.

Once created, use the WriteText function within the OutStream variable.

The function below is basic but could easily be extended to use two Char variables for line feed/carriage return, enabling you to add more lines.

Lastly, to get the file to the user, we use the Temp Blob codeunit to create an InStream and DownloadFromStream to retrieve the file.
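Pulling those steps together, here's a minimal sketch of what such a codeunit could look like – the object ID, the names and the example text line are placeholders rather than the exact code from the project:

codeunit 50100 "Export To Text File"
{
    procedure ExportTextFile()
    var
        TempBlob: Codeunit "Temp Blob";
        InStr: InStream;
        OutStr: OutStream;
        FileName: Text;
    begin
        // Placeholder file name – build this however suits your scenario
        FileName := 'PaymentExport.txt';

        // Create an OutStream from the Temp Blob codeunit and write our text to it
        TempBlob.CreateOutStream(OutStr);
        OutStr.WriteText('Example payment line');

        // Create an InStream from the same Temp Blob and download the file to the user
        TempBlob.CreateInStream(InStr);
        DownloadFromStream(InStr, '', '', '', FileName);
    end;
}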

To test this is working, I created a simple page extension on the Company Information page.
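As a rough idea of that test harness (again, the object ID, names and the codeunit it calls are placeholder assumptions, not the project's actual objects), the page extension could simply add an action that runs the export function:

pageextension 50101 "Company Info Export Ext" extends "Company Information"
{
    actions
    {
        addlast(Processing)
        {
            action(ExportTextFile)
            {
                ApplicationArea = All;
                Caption = 'Export Text File';
                Image = Export;

                trigger OnAction()
                var
                    ExportToTextFile: Codeunit "Export To Text File";
                begin
                    // Runs the placeholder codeunit sketched above
                    ExportToTextFile.ExportTextFile();
                end;
            }
        }
    }
}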

 

 

 

The file is then downloaded to the user.

 

Technical Debt – The What, Why, When & How To Get Rid Of It

Defining technical debt, how to reduce it and how to get rid of it

 

What Is Technical Debt?

Technical debt, sometimes called design or code debt, is a concept in software development that describes the extra rework and risk you accumulate when using code that's easy to implement in the short term but that you know isn't the best solution for the long term.

The 'debt' is the additional time you'll have to spend fixing these issues down the line, costing staff time and resources.

Just as with fiscal debt, technical debt also accrues interest the longer it goes 'unpaid' (unfixed), making it that much harder to effect true Digital Transformations.

 

Consider a software developer, working hard on a part of the code base they're designing. Any change they make in the code will likely have far-reaching effects on other parts of the codebase or documentation. Changes that are necessary but not made can be considered 'technical debt' and, until the developer (or some other unlucky individual) pays it back, they will continue to cause far-reaching problems.

Whilst it’s mostly a coding term, it shouldn’t be hard to see how this concept can also be applied to other job roles or processes within an organisation.

The Origins Of Technical Debt

This next paragraph won’t help you prevent or get rid of Technical debt but it’s quite interesting, so we’ve included it for you anyway… feel free to skip down to the next paragraph though.

 

The term ‘technical debt’ was first used by a man named Ward Cunningham (who actually developed the first ever Wiki).

When describing the concept for the first time back in ’92 he said:

 

Shipping first-time code is like going into debt. A little debt speeds development so long as it is paid back promptly with a rewrite. Objects make the cost of this transaction tolerable. The danger occurs when the debt is not repaid. Every minute spent on not-quite-right code counts as interest on that debt. Entire engineering organisations can be brought to a stand-still under the debt load of an unconsolidated implementation, object-oriented or otherwise. –Ward Cunningham

 

The Three Types Of Technical Debt

Breaking down technical debt further, there are, broadly speaking, three different categories it can be classified into; naïve, unavoidable and strategic:

 

  • Naïve – Technical debt caused by naivety typically occurs when protocols and governances aren’t followed or correctly applied. Hence, it could also be called negligent technical debt. The reasons this might occur are many and varied: unfamiliarity with protocols, rushed work, odd naming conventions, etc. This kind of debt (whilst not looking to disparage junior developers) does occur more at the junior software developer level… but even the most experienced of devs make mistakes sometimes!
  • Unavoidable – Unavoidable technical debt is caused when tools or processes get upgraded, resulting in more efficient ways of doing things. Whilst great, it does mean older processes may need upgrading to the new, achievable standards. Changes to the scope of a project without adjusting deadlines can also have the same result.
  • Strategic – Strategic technical debt, as it sounds, is when a conscious decision is made to take on technical debt, perhaps to meet a deadline or perhaps because the financial cost of the debt is outweighed by a speedier finish to a project.

What Are Some Of The Common Causes Of Technical Debt?

There are probably thousands of things which can cause an organisation to start accruing technical debt, but some of the most common are:

 

  • Insufficient briefs at the start of a project often lead to technical debt, with unnecessary or incorrect solutions needing to be re-done. This can be exacerbated further with development starting before a brief is even fully scoped out, in a misguided attempt to save time.
  • In ongoing developments or projects that are continuously improved, older solutions sometimes become sub-optimal, as better, more efficient, solutions are found. The technical debt occurs when you need to go back to fix these older elements.
  • Pressure from upper management to get a project completed quickly often leads to a build-up of technical debt, with rushed, inferior, or unchecked code often making it into the final solution.
  • If your organisation lacks strong governances or is simply unaware of the concept of technical debt, it becomes very easy to accrue it without ever realising.
  • Having a sandbox to test solutions in before pushing them live is vital in preventing technical debt. Whilst a solution might look great in theory, it’s only when it’s tested in situ that all flaws are really revealed. Writing code ‘live’ without thorough testing is a great way to collect technical debt.
  • If a solution is arrived at but not documented correctly, then the work and time required to retroactively document it can also be considered as technical debt.
  • Technical debt is very common in organisations that suffer from a lack of collaboration, where data is siloed, and junior developers aren’t mentored correctly, instead just left to ‘figure things out’. Siloed data also means parallel development often occurs on different parts of a solution, with the duplicated work being the technical debt.

How To Prevent Technical Debt

Technical debt isn’t always a bad thing if it lets you get to market faster than a competitor, but it does eventually need paying back, and that process can be painful. Avoiding as much technical debt in the first place is a lot easier in the long run…

 

  • Technical Debt Isn’t A Dirty Word – (Or two words). If your staff are aware of technical debt, are encouraged to talk about it, and know what they can do to prevent it, then they’ll be that much more likely to head it off as they go.
  • Good Governance – cloudThing are huge advocates for good governance. By taking the time to address possible outcomes before they occur, you’re heading off technical debt before it becomes an issue. Make time within your project for technical debt, schedule planning for it into your processes and make limiting Technical Debt a KPI that your developers can work towards.
  • Don’t Run Before You Can Walk – Rushing, or leaping in, is the easiest and quickest way of accruing technical debt. Make sure projects are fully scoped out, plan ahead and above all involve your developers in every step of the process. Let them set realistic deadlines with you to avoid corners being cut (and technical debt accrued) to meet them.

How To Reduce Existing Technical Debt

Reducing existing technical debt is often a long, laborious process of going back over existing work and fixing issues.

It is something cloudThing have specialised in for a long time though and would be happy to advise you on.

You first need to understand what and where your technical debt is, normally achieved through collaborative discovery workshops with your developers.

Once you’ve identified as much of it as possible, you need to develop a strategy to reduce the technical debt through incremental changes; you won’t be able to do everything in one go. When defining that strategy, it’s important you set clear delivery goals with associated metrics and KPI reporting so that everyone has a clear picture of what they’re working towards.

Then… it’s a simple case of going after it!

Ensuring Business Continuity With The Microsoft Stack

Business Continuity isn’t a term to describe dealing with a short-term disaster anymore, but something that needs to be baked into long-term business plans.

 

*Transcript from cloudThing’s Membership Sector Digital Conference

 

Thankfully, Business Continuity is a pretty theoretical subject for most, even if it’s something that technically falls under your remit within the organisation. Even those working in the IT security sector would probably say it’s rare your Business Continuity plan is ever needed. Before 2020 that was…

We currently have over two billion people globally, either living in or slowly emerging from, lockdown.

Business Continuity has never been more important.

That’s why I thought I’d start off by discussing how cloudThing made a seamless transition to a remote working model during the coronavirus crisis whilst still delivering to its customers.

Hopefully, you’ll get some insights that are applicable to your organisation’s own Business Continuity plan, in particular how you can continue to offer support to your staff, members, volunteers and donors with a workforce that’s geographically scattered.

 

Business Continuity isn’t a term to describe dealing with a short-term disaster anymore, but something that needs to be baked into long-term business plans. – Tony Leary, cloudThing Principal Architect

 

The Challenges Of Remote Working

So… the challenges of remote working…

 

A lot (if not most) of the challenges associated with remote working come down to one thing… your organisation’s architecture.

If you’ve a fairly traditional/high security architecture in place it’ll likely be very centralised with all the traffic from your remote workers flowing back through a VPN.

That may be purely for security reasons, but it may also be because the line-of-business apps your users need are only available within that environment. Centralising things that way though creates bottlenecks, which can then cause issues with your bandwidth. It could be you’re sharing bandwidth between members and staff, with the same internet service being used for lots of different things.

If you find yourself in a situation where you need to rapidly increase your bandwidth though you may find it rather difficult.

It tends to be something you can either upgrade really quickly because the capacity is already there in your connection (taking maybe a day or two) or it could take months because you might need a completely new set up.

Even if increasing your bandwidth isn’t a problem it may be that the systems you use have some inherent limits.

Some of those limits can be dealt with through additional licenses but most Firewalls or VPN gateways like that tend to have an inbuilt limit that can’t be overcome without it being completely replaced.

 

Another challenge to successful Remote Working is security.

Security is obviously a vital part of any organisation, but many had to take a more flexible approach during lockdown. If you go down the route of making compromises from a security perspective, however, it needs to be done knowingly, in the right way and for the right reasons.

One example is that of staff using their own devices to work… how did your architecture support/facilitate this?

 

Another issue, and one that often gets taken for granted is communication.

However, a sudden switch to Remote Working can really throw up issues here. Many modern phone systems have increasingly become remotely enabled, but that’s often not the case in an organisation yet to start, or not very far down, their Digital Transformation journey.

Even if the foundation of the phone service is remotely enabled, there tend to be functions layered on top of that, such as a switchboard or contact centre.

So, for a successful switch to Remote Working all these things need to be empowered to work in remote locations.

 

The final problem to be considered for successful Remote Working is personal… the human aspect of it.

Or in other words, your organisation’s culture.

Culture First

When I pulled this all together, I’d only been with cloudThing for seven or so weeks and had started during the middle of lockdown, but I have to say that, for me at least, that really highlighted the importance of an organisation’s culture.

Culture isn’t just something you can ‘do’ once and move on. It needs to be nurtured and sustained and that has to start with the induction process.

As I say, I’m still pretty new, so feel uniquely qualified to speak about this, and I’m happy to say that I received a lot more than the half day’s worth of Health, Safety and security training you might normally expect when starting somewhere new.

In fact, seven weeks in and I was still only in the middle of my induction process as every single person is inducted into every single part of the business.

If you work in sales, you’ll get the software engineering induction and vice versa. I think what’s really great about that approach is that it stops those data/knowledge/culture silos from ever starting in the first place.

Everyone understands what other parts of the business are doing (and more importantly why they’re doing it) and I think that’s invaluable for an organisation’s ongoing culture.

Video is something I think is important to an organisation that’s either chosen to or been forced into a Remote Working model. Being able to see and interact with people on a face to face basis, I feel, is vital for a sense of ‘team’.

I’ve worked in a variety of places where video capability was there but never used; culturally it just wasn’t the norm. At cloudThing it is, which has been really useful for getting to know and connecting with people on a personal level… much more so than an email!

 

Something that’s particularly useful for ‘bedding in’ a culture is writing it down… cloudThing have done that with our Principles.

We’re obviously a software engineering company so a lot of our Principles will speak to that, but not all.

I’m a security guy at heart so no.3, ‘Governance Is Good’, is a personal favourite of mine.

What these Principles really do is codify cloudThing’s culture whilst educating and empowering people about how the company should operate and what’s expected of them… but also what those people can expect from their colleagues.

For example, we as a company are ISO 9001 and ISO 27001 certified. They’re quality standards we certify to, but they’re not just logos that are there for a tender or to show on the website; they’re fully integrated into cloudThing’s day-to-day operations and in my view that’s great from a risk management perspective.

Having those governances locked in as Principles makes for a mature risk awareness, meaning an organisation knows its own risk appetite as well as its obligations to its customers.

That feeds back into cloudThing’s twenty-nine Principles, which I see almost as ‘mini mottos’ for every employee to work to, and I’d encourage any organisation to document their own if they’re looking to grow a particular culture; something that’s especially helpful if your employees are working remotely from around the country, or even around the globe.

The most important thing about these Principles though is that they’re not set in stone.

The very last one, ‘There’s No Perfect Rules’, again empowers staff to question these principles…

Are they right?

Do we need more?

Do we need less?

Why Cloud?

Obviously it’s not all about culture though. We still need tools to help us deliver really cool solutions to our clients.

So a key takeaway I’d like to get across here is that even if the tools we use aren’t Cloud-Native they’re always Cloud-Capable as a minimum.

And that’s the key to cloudThing’s approach to Business Continuity (the clue was in the name of this article).

Having everything in the Cloud means we can keep working wherever we are and whatever may happen, as there’s no geographical reliance; our customers can keep coming to us and accessing all the Cloud-based solutions we’ve built for them no matter what happens in the real world.

 

 

Some of you have probably seen an example of the above graphic before, usually called the Shared Responsibility Model.

I show it here, in reference to our topic of Business Continuity, as it’s really useful for highlighting where the demarcation is with Cloud Platforms.

I’m aware some IT managers shy away from the cloud as they may feel they’re giving away too much power or responsibility, but in reality what a digital transformation represents is moving away from an on-premises environment. It means you as a customer don’t have to manage everything to the nth degree anymore.

Towards the right you get Software-as-a-Service (SaaS), where the stack is managed by the Cloud Service Provider.

However, rather than being worried about that, it should be seen as a freedom for an organisation, as companies like Microsoft spend enormous amounts of money on running these services issue free.

They’ve probably got more certifications for these platforms than most organisations could ever hope to achieve, so from my perspective as a security person I see massive security benefits, as well as operational ones: there’s far less for an organisation to really worry about, meaning you can focus on things that drive actual ROI.

And just carrying on that security theme, looking through my security lens, this is a bar chart that looks at how platforms or operating systems are attacked and the things that attackers do when/if they do gain access.

 

 

You can clearly see the CSPs and Office 365 have a very low attack surface, and that really does speak to how well these platforms are controlled.

 

An important point to note here is that if you decided to use Microsoft Azure, which has a very low number on the above graph, but then picked up Windows and installed it inside Azure, you’d actually end up stacking those attack surfaces on top of each other. You’re not necessarily gaining anything by doing that – you’d probably bump the figure up to around three hundred – and that’s one of the reasons why, at cloudThing, we prefer to stick to Platform-as-a-Service (that low plateau of the cloud providers) as well as SaaS products.

That’s simply because the attack surface, and therefore the management required, is so much smaller. Let’s look at one of these in a bit more detail – this next one is for Microsoft 365, linked from the previous slide.

 

 

Continuing with Cyber Criminals and attackers, what they’ll really be doing is trying to move through these columns.

So on the left is their initial access and on the right is their final objective.

The key takeaway in this diagram though is on the left, the way they get in.

And really, it’s all around accounts and actually the word here is identity. For anyone in the cloud that’s the battleground for security.

Circling back to that shared responsibility model I mentioned, this is a good example of where cloud providers will actually step in, over those demarcations, if they see a spike in usage, as CSPs are really good at detecting that type of abnormal behaviour.

Multi-Factor Authentication & Cloud Security

As identity is so central to cloud-based security, the most important factor becomes controlling that access, which means multi-factor authentication.

Something you know, something you have and something you are.

 

  • Something You Know – Passwords are the traditional factor here
  • Something You Have – Increasingly the standard is becoming our mobile phones
  • Something You Are – Generally this is biometrics, like your face, a scan of your eye, a fingerprint etc.

 

 

More recently another security measure has been added to the three traditional factors and now we have Where You Are as well.

Where You Are could be an office network, or it could be the GPS location taken from a phone. It works using the concept of geofencing, so access can/will be denied to a device if it’s detected somewhere you don’t want/expect it to be.

And then finally, the last of the new security measures is Something You Do. This measure uses Artificial Intelligence to learn your behaviours, such as the times you normally log in and out, and then makes a decision as to whether there’s something abnormal going on and whether entry to the system or device should be denied.

Cloud Identity In Practice

This next graphic is from Microsoft and brings all these things together.

 

 

They combine all five concepts, or signals, which, as a company, cloudThing can build policies around.

For example, take the user & location, device, and application signals. Using tools Microsoft have provided, we can construct security policies around them.

We can say where users should be logging in from… here is fine, but here is not, for instance… or which devices are OK to use. So, it might be set that certain applications can only be accessed from corporate equipment, not people’s own devices.

Then in the bottom right is the ‘secret sauce’ that you get solely from Microsoft: constantly assessing risk in real-time.

What they do is analyse signals from across their entire infrastructure and customer base (currently about 8.2 trillion signals a day).

Those signals allow them to calculate very accurate probabilities of what’s going on. This puts Microsoft in an excellent position to instantly detect new and emerging attacks or threats.

 

And just a final slide on Identity within Security to tie all these concepts together.

I don’t think anyone’s going to be surprised to see where password is (apart from the fact I maybe should’ve put it lower down and possibly far further left).

Here at cloudThing though, we like to operate in that top-left quadrant, in terms of using passwords with multiple authentication factors. We also use Windows Hello, which is a biometric login, to log in to our work devices, and I currently couldn’t tell you what my password is because I never use it!

I don’t have to… The system knows who I am.

Password-less authentication on the right is truly just going without passwords and not using them at all, and I think that should be a future objective for all organisations.

 

So, back to tools…

As a Microsoft Gold Partner no one should be surprised to know that we use the full range of the Microsoft Suite.

Microsoft 365 (formerly Office 365) which is Outlook, SharePoint, the usual suspects for Office apps, but also Microsoft Teams.

Teams is something I’m going to talk a bit more about because, for cloudThing, it’s a really foundational platform for our culture.

 

 

Now you may have seen products like Skype in the past or other video conferencing software but Teams is quite a bit different from all of them.

It brings together things like chat and calling, which you might have had from Skype for Business, but it layers it all with the concept of ‘teams’, which you can see in the column of Microsoft Teams on the left…

 

 

These can be based on your corporate environment or your different departments, but then within them you can create different channels, and we use this as part business tool, part intranet and part social as well. We really do put everything into Teams, and one of its key strengths is around files.

What Teams is doing is providing a really great abstraction over SharePoint. I’ve used SharePoint for years, but making it useful and presentable can take a lot of development effort; Microsoft Teams, by default, really gives a structure to SharePoint.

Importantly, as this is all a Microsoft product, the security model we’ve been discussing is baked in already. All of the different teams I can see, the various channels, all the content… that’s all bound by Active Directory and the policies that you’d create anyway for other parts of your business. My single identity that I use for Outlook etc. still applies here too.

And somewhat interestingly, I found this research from Microsoft back in 2017.

 

 

You can see email and face-to-face interactions drop off from the Boomers on the left to the Gen Zs on the right. In terms of preference, the younger you are, the less you want to send emails and, apparently, the less you want to talk to people face to face as well. And then on the right (not surprisingly) there’s a reverse of that trend, towards more modern communications with online video, chat etc.

The takeaway here is about preference.

It may be that older people prefer to do face to face and prefer email but if an organisation has a tool in place to facilitate both then it will get used.

The danger these graphs show is that if you’re in an environment that has a fragmented approach to communication, whether that’s email or chat or social media etc., then you’re going to be creating new knowledge silos.

You might have silos anyway, based on your organisational structure, but you might also be creating others based on demographics without ever realising it.

In part that’s really what Microsoft Teams is trying to solve.

 

This Microsoft Teams diagram is about commonality across the generations. The unofficial tagline here might almost be ‘you can please most of the people most of the time’ (although don’t ever expect Microsoft to use that).

The benefit of Teams is that you’re centralising a lot of this type of collaboration and communication that might previously have been spread around lots of disparate apps.

The really great thing about Teams though is that rather than just being a static app that you use, it’s extensible. At the moment I think there are over 450 different apps you can integrate into Teams, and you can also develop your own, which is something cloudThing has done for many of our clients.

 

 

For example, if you wanted to use something like Webex, you could.

So, I’ve talked about the culture, I’ve talked about the technology we have and the key technology to both is Teams.

It’s that combination of having our data in SharePoint alongside the chat, the voice and the video, how easy collaboration is and how it really does support the transparency that we want across the organisation… Something I feel really benefits us as a company and certainly could benefit your organisation too.

The Cloud will certainly make remote working easier, but it makes it easier for your customers and members to reach you as well.

What’s The Best CRM For The NonProfit Sector?

Engage new, existing and old donors, build relationships, process donations and manage analytics quickly, efficiently and cheaply

 

It’s no secret that there isn’t a NonProfit out there that can effectively promote their chosen cause without a decent CRM system to manage their donor pipeline and fundraising efforts.

The adoption of cloud-based technology around the globe, with its ever increasing numbers of apps, features, integrations, and bespoke solutions means there’s almost nothing a modern NonProfit can’t do when backed up by a modern CRM.

But that’s the nub of the question isn’t it… what’s the right CRM for your NonProfit?

A CRM needs to work hard for a charity, leveraging new and existing relationships, amplifying fundraising efforts, keeping everything compliant… all whilst reducing donor churn.

With so many different moving pieces, many of which will need a high level of AI and/or robotic process automation to be truly successful… That’s a big ask for any CRM!

What Should A NonProfit CRM Be Able To Do?

It’s not easy operating in the NonProfit sector these days.

The days of being able to just round up some volunteers and station them outside a supermarket with a collection bucket are long gone.

A modern NonProfit needs to be able to, in a single view, consider its fundraising channels (that hasn’t changed at least), organise, manage and engage with its volunteers to keep them motivated, interact with potential, current and ex donors, liaise with third-party partners and persuade media outlets to run (positive) stories about your organisation and cause… all of this in a time of tightening budgets, stifling regulatory oversight, decreased donation levels and a greater need than ever for your NonProfit’s support.

That means building lasting, long-term relationships with all of the above individuals, businesses and organisations isn’t just a ‘would-be-nice’ anymore; it’s absolutely essential to achieve your goals.

 

Whilst the day-to-day business of a charity might differ (community outreach or support, advocacy, research, activism, etc.) all have one thing in common… they’re all dependent on charitable contributions for their operating budgets.

That’s why the core functions of a NonProfit CRM system must focus on donor management. That doesn’t mean however that it won’t be called upon to also perform a lot of other functions.

The good news however is that thanks to data driven technology, layering automation tools over CRM tasks has never been cheaper or easier.

But what exactly should a NonProfit CRM be able to do?

Key Features Of A NonProfit CRM System

CRM software has changed a lot recently and the right CRM can give a NonProfit a real competitive edge in a tough market.

If you’re on the lookout for a new CRM, looking to upgrade/replace an existing system or worried about vendor lock-in, then below are just a few of the features you should be looking out for…

REDUCING DONOR CHURN

Retaining existing donors by minimising donor churn is often an uphill battle for the NonProfit sector.

Donors need to feel appreciated; they need to know their contribution is making a difference and they need to be told so frequently.

That means a key feature of your CRM needs to be keeping them updated as to what their donation is doing for the cause, what’s happening within your organisation and what context it’s happening in in the wider world.

Any NonProfit CRM worth its salt then must include pledge marketing, multi-channel communications, marketing automation and social media integration.

RELATIONSHIP MANAGEMENT

Perhaps the most important aspect of any CRM system, this one is especially important for NonProfits.

Any system you finally settle on should offer you a high degree of personalisation and segmentation to keep your donors engaged (and donating) with the ability to combine that with a high degree of automation (so that engagement can happen automatically when certain actions or events are triggered rather than your staff having to do it all manually for thousands of donors).

MARKETING CAMPAIGNS

We won’t spend time explaining why marketing campaigns are important in promoting your organisation, but we will point out that any CRM you settle on should have built-in Marketing Campaign functions with multiple pre-built templates, to save your staff hours of time designing everything from scratch whilst simultaneously automating different campaigns to different segments of your database.

DONATION FINANCE

Hopefully your organisation receives a lot of donations, which also enables it to claim thousands of pounds in Gift Aid. Those two things though can be very labour intensive.

A good CRM then should have multiple, flexible donation methods to make it as easy as possible for your donors to support you, with integrated credit/debit card processing capabilities.

It should also be able to automate the processing of Gift Aid to cut down on errors on your submissions.

Depending on the amount of donations you take, integrated accounting and budgeting features might also prove useful to you.

EVENT MANAGEMENT

You might find the thought of planning real world events digitally slightly incongruous but if 2020 has taught us anything it’s the power of online events!

Either way, online or in real life, good CRM systems should include event management tools that will let your NonProfit schedule events, invite and manage attendees, create seating plans, track who’s coming and who needs following up and, most importantly, allow your team to access this information on the go, at events, through mobile devices.

BUSINESS INTELLIGENCE TOOLS

Traditional CRMs have always been great at collecting large amounts of data and business intelligence. What they’ve not always been great at is being able to do anything with that information.

When you’re looking at which CRM suits your NonProfit best, make sure you settle on one that can segment, analyse and report on information with ease.

It also helps if the CRM allows you to layer AI integrations over those intelligence reports which will allow you to automate reports, plan ahead and garner in-depth business insights and draw connections you might otherwise not have been aware of.

The Top 8 NonProfit CRMs Available Today

MICROSOFT DYNAMICS 365 FOR NONPROFITS

We’ll start with Microsoft Dynamics 365 as the majority of the NonProfit sector consider it the best charity CRM available on the market (with good reason).

It’s simple to learn, easy to download, great as an Out-the-Box solution but can also be customised to do almost anything you might require, including integrating with third-party applications and services.

It also integrates seamlessly with Microsoft Office and Outlook which the vast majority of both the private and commercial sector operate on. That seamless integration with Office (in this case Excel) also makes Dynamics reporting second to none. Its Business Intelligence tools allow you to drill down and parse data better than any other platform currently on the market, which is easily automatable to create regular reports with charts and graphs that can detect new trends at a glance.

As it’s Microsoft, your staff will be inherently familiar with it, and with the drive towards citizen developers and low-code/no-code software they’ll easily be able to create their own applications.

SALESFORCE FOR NONPROFITS

Salesforce for NonProfits is Salesforce’s dedicated charity CRM offering.

Traditionally, Salesforce is considered one of Microsoft Dynamics’ main competitors for enterprise-level organisations, but for the NonProfit sector a pared-down version is available for free for up to ten users. This makes it a good choice for smaller NonProfits who won’t need a large amount of bespoke customisation (as this can then start pushing costs back up).

It’s a cloud-based solution that focuses on inter-office collaboration, social media integrations, and is also available on mobile devices.

ZOHO FOR NONPROFITS

The Zoho CRM is an efficient and economical CRM choice for smaller NonProfits, with free licenses for up to three users and discounts available for charitable organisations after that.

It’s a straightforward CRM with an intuitive UI that works well on most devices. One of the nice things about Zoho though is the number of Out-the-Box reports available that can be set up and used almost instantly.

THANKQ FOR NONPROFITS

thankQ is a dedicated NonProfit CRM that understands the sector well, specialising in faith-based, environmental, international relief and social service charities.

It’s a good relationship building platform with tools designed especially for the NonProfit sector with a modular design that lets you purchase only those aspects of the platform you feel your charity would use.

HUBSPOT FOR NONPROFITS

Hubspot’s main product is their marketing automation tool and their NonProfit CRM has clearly been created as a gateway to that.

That being said, their CRM offering is free, feature rich and easy to plug in and start using as an out-of-the-box solution so is a great starting point for an organisation not sure where to start.

VTIGER FOR NONPROFITS

Vtiger is a great CRM platform for any NonProfit on a budget as all its basic functions come free.

It’s a great entry level CRM system that lets an organisation get up and running quickly but really comes into its own for any NonProfit that needs to manage inventory.

Vtiger offers excellent inventory management functionality, easily allowing you to manage stock and customers’ orders if your charity sells goods to support or promote itself.

DONORPRO FOR NONPROFITS

DonorPro is another great, cost-effective solution for the smaller NonProfit that comes with some useful features such as donor management, event planning, volunteer management, integrated marketing, analytics and inventory management as standard.

One of DonorPro’s nicer touches (and more popular functions) is its ability to support fundraising events such as hosting online auctions.

CARE FOR NONPROFITS

CARE, much like Microsoft Dynamics, is a good choice of CRM for larger NonProfits as it was specifically designed for enterprise level organisations.

It hosts plenty of streamlining capabilities and time saving functions for your staff with a high level of advanced tech for NonProfits looking for ‘more’.

The trade-off for that functionality though is that it’s an in-house system. You’ll have to work directly with CARE rather than a third party for any bespoke developments you might wish to create, meaning if you’re looking for something bespoke you won’t be able to shop around for developers.

BITRIX24 FOR NONPROFITS

Bitrix24 is an incredibly flexible CRM system for NonProfits. It’s another of the free options (for up to 12 users), making it ideal for NonProfits with smaller numbers of staff.

It comes with a built-in email marketing system, sales automation, invoicing, sales reporting, and donor management tools that are all very easy to use.

The only downside (if you can consider it such) is that if you’re looking for a bespoke solution many third party developers might not be as familiar working with it as they might be with something like Dynamics 365 or Hubspot, making it harder to find a reliable partner to work with.

Rage Donations – How To Engage Past The ‘Now’

A NonProfit creating a ‘story’ has always been an effective fundraising tool, generating an emotional response in potential donors… but can rage donations be leveraged into an overarching strategy?

 

Rage Donations… “What the heck are they?” you’re probably wondering.

You’d be forgiven for never having heard of rage donors or rage donations before (or even for thinking we’d just made the term up), but they’re actually not a new concept to the NonProfit sector.

They’ve been around, under one name or another, since at least WW1 (maybe even before); it’s just that, with the rise of Slacktivism, they’ve become much, much more common and influential.

Also known as episodic donors, they can form a vital part of a NonProfit’s donor stream, but if not managed correctly it can be very hard to factor the possibility of rage donations into any kind of meaningful forecast or donation strategy.

What Is A Rage Donation?

A rage donor is an individual who donates to a NonProfit’s cause in response to a strong emotional stimulus, normally triggered by a natural disaster, viral social campaign, trending headline etc.

It’s a knee-jerk response to donate, driven by a sense of frustration at being powerless to do anything else to help.

 

We all know the Not-for-Profit sector has been struggling during the coronavirus pandemic, with donations massively down, but we also know viral causes tend to elicit a lot more one-time donations than they ever have before.

So is it feasible to run a long-term strategy based on that fact?

The Pros & Cons Of Rage Donations

The main pro to rage donations is the vast amounts of money they can bring into a charity over a very short amount of time (just think of the Ice Bucket Challenge or 2004’s tsunami appeal).

The con is that once that immediate emotional response has gone, or the disaster/headline/viral event has passed, your rage donors tend to dissipate without becoming long term allies of your cause.

The NonProfit sector also doesn’t have a lot (if any) of influence as to when these spikes in donations may appear, as they’re very much driven by outside trends.

This makes long term forecasting and budgeting next to impossible.

The History of Rage Donating

As we’ve already touched upon, rage donating isn’t a new phenomenon.

As far back as WW1, the US Government was cashing in on the emotional state of the public by selling Liberty Bonds to fund the war effort. During that same period the Red Cross raised over $114m to help troops abroad.

 

Emotional responses (and an increase in charitable giving) then, aren’t anything new.

What is new, though, is the scale and speed at which these events can appear and just as quickly disappear.

Reliable TV news coverage helped (just think how big Band Aid was in the 80’s or Comic Relief or Red Nose Day) but the advent of the internet sped this trend up even more.

What really made rage donations a force to be reckoned with though was social media.

Social Media has allowed NonProfits to massively amplify and accelerate the spread of their message with an immediacy that just didn’t exist before.

Within hours of the tragic George Floyd incident for instance, dozens of Just Giving pages had cropped up in support of the BLM movement… In fact, Just Giving themselves are an excellent example of the evolving nature of how the public donates.

How To Capitalise On Rage Donations

Whilst counting on rage donations should never form part of a long-term strategy due to their unpredictable occurrence, a modern NonProfit can’t afford to ignore them either.

And the key to capitalising on rage donations is immediacy.

The most donations will be given in the first few hours or days after an incident occurs (whatever form that may take).

That most likely means your organisation won’t have time to sit down, consider and reject strategies, formulate plans, and put together complex marketing schedules.

Instead, should something occur around your chosen cause, you need to be able to, almost at the click of a button, throw up an online donation page that offers a seamless user experience, can be shared quickly and easily across social media, is fully GDPR compliant and preferably automates Gift Aid as well.

Microsoft Dynamics is particularly suited to this because many of the steps you’ll need (if not most) can be automated ahead of time.

Depending on the actions of your rage donors you can orchestrate their journeys down different funnels depending on how you wish to segment them based on past and predicted behaviours.

Automated, personalised communications such as emails, social media posts, SMS or web pages can all be set up ready to send at the click of a button as the situation that caused the rage donation continues to unfold.

Dynamics 365 Marketing’s pre-built campaign assets means this can all be spun out quickly and efficiently, with follow up communications being automated to build a more meaningful connection with your organisation, to turn your rage donors into regular supporters.

Converting Rage Donors Into Long-Term Donors

The goal should always be to keep as many of these new ‘actigivists’ interested in your cause for as long as possible.

You’ll need to work harder than ever to engage with them after the immediate cause of their donation, steer them in the right direction and above all else show them the impact their continued donations will have, as once the immediacy of the event has passed, the emotions that caused them to donate in the first place will likely have faded away.

 

Whilst a rage donation will typically come from a negative place, it’s important that your follow-on messages take a more positive tone, focussing on what is (or can be) done with their continued donations and the immediate effect they’ve already had.

Your staff should also be trained on your organisation’s key messages should something occur, and be comfortable taking calls or responding to social media posts about them, as the first time your organisation even hears about an event may be when someone tweets or calls you for your opinion.

 

Whilst trying to cater to a rage donation audience on your website will always be a losing battle (there simply won’t be time), documenting all your site content is a good idea so that relevant pages can quickly be updated with new, more relevant information, FAQs can be added or changed and donation CTAs can be put in place as soon as possible.

 

Whilst the term we’ve been using is Rage Donor, donations aren’t the only way people can help your cause.

The key here is the emotional response an individual has, combined with that sense of frustration at not being able to do something.

A rage donation occurs to alleviate that sense of frustration, but you shouldn’t shy away from educating people on how else they might be able to help, whether that’s through volunteering, gift donations or possibly lobbying on your organisation’s behalf.

People will be actively looking to alleviate that frustration or guilt, your job should be to show them how to accomplish that in a way that best benefits your chosen cause.

 

After a big event occurs there’ll always be a huge spike in online traffic looking for information on it. Whilst many of those people may choose to rage donate, many might not… yet.

Email sign-up forms and pop-ups may not be the most popular marketer’s tool, but they still exist because they work. You should be ready to capture as many emails as possible and guide those contacts down your donation funnel, to a point where they become regular donors/supporters.

 

Search engines have become more and more intelligent in recent years but they’re not infallible. In the aftermath of an event people might be finding your site but not information relevant to their immediate concern.

If you want to capitalise on rage donations whilst capturing as many emails as possible, update your homepage as soon as you can with a banner or link to specific pages that let people sign up, learn more or donate.

 

The immediacy of a rage donation can’t be stressed enough.

 

If the goal is to garner as many new followers and supporters as possible then this is the perfect time to trial new tools like Snapchat, Instagram Reels or Facebook Live.

These have been designed to reach out to a new generation of audiences in a much more informal and immediate way and whilst they may not fit into your normal strategy, an extraordinary event is the perfect time to trial them as your audience will be much more forgiving of possible mistakes ‘in the moment’.

It also has the added potential of involving your organisation within the digital conversation, going viral along with the event itself.

 

Once the immediacy of the event has passed it’s important you reach out to your new audience and lead them through your standard donation funnel (more on Donor Churn here) but remember… the key to keeping an actigivist will be in replicating that initial emotional response, showing them that their continued donation can indeed have an effect and ‘alleviate’ the frustration and rage they felt at not being able to help.

 

Dealing with rage donors will, by its very nature, be a stressful time for your staff. This kind of strategy only becomes possible in times of high emotion when something has gone drastically wrong.

Whilst immediacy and a clear follow-up strategy are key, it’s also important to reach out to your staff, volunteers and existing supporters to make sure they’re OK as well, as they’ll be the ones coordinating the day-to-day response of your organisation and cultivating rage donors into, hopefully, much longer relationships with your charity.

What Is It & How To Avoid: Vendor Lock-In

Vendor lock-in can be a huge worry for anyone thinking about Cloud Migration. Knowledge (and the right partner) is key to mitigating that risk

 

Whilst the entire world seems to be going through a Cloud Migration at the moment, all looking for the awesome benefits it brings over their competitors (such as increased agility, flexibility and huge cost savings), many still shy away from it for a variety of reasons.

One of the biggest is something called Vendor Lock-In.

After all, having the bedrock of your entire IT infrastructure in the hands of a third-party would give anyone pause, right?

What Is Vendor Lock-In?

The classic definition of vendor lock-in is a situation where the cost of switching to a different vendor is so high that an organisation is basically ‘stuck’ with its original vendor permanently, normally occurring due to over-reliance on that single vendor.

It’s still technically possible to leave (assuming no contracts with fixed terms are in place) but it just isn’t economically viable to do so.

 

Small organisations (for example a small charity or membership organisation) are especially vulnerable to this kind of situation, as, at the start of a cloud-migration journey, they might opt for an ‘out of the box, one size fits all’ solution, only to find that, as they grow, it’s no longer fit for purpose but still far too expensive to move away from.

 

So, is vendor lock-in just another cost of doing business? Impossible to avoid if an organisation wants to migrate to the cloud?

If that was the case this would be a much shorter article!

 

For anyone doing their due diligence correctly, there are a number of steps that can be taken ahead of a cloud-migration that will let your organisation retain the flexibility it might need to adapt to changes in its sector as it grows.

Why Is Vendor Lock-In Such A Big Deal?

Before we start though, it’s probably worth just going into a little more detail on why vendor lock-in can be so bad.

Why that is will depend very much on who you speak to within your organisation.

Your finance team will give you a completely different answer than your IT team will (though both will be in agreement it’s bad).

It could be the loss of control of the organisations key systems or data.

There could be concerns over cloud-based security or server uptime.

Finance might be worried about the reliance on a single vendor for so many business-critical systems.

The board might have concerns around what happens when the organisation grows… will the current solution be suitable, can it grow as you do and if it can’t, can you get out of the agreement?

What happens if your cloud-provider goes out of business? Can you just pick up and move then? How will you be affected?

 

There are a lot of issues there, so let’s try and unpack some of the more commonly asked-about risks in a bit more detail…

VENDOR LOCK-IN & DATA SECURITY

Take it from us… shifting data from one Cloud Service Provider (CSP) to another is not easy! Trying to do so always leaves you with a pile of complicated questions…

 

  • Whose responsibility is it to extract all the data from its current locations? The existing CSP or the new one? This can be made even more difficult if there’s any bad blood between the existing CSP and yourself as they won’t exactly be bending over backwards to be helpful.
  • What format is the data even going to be in? This can be particularly problematic if, for instance, you’ve made the choice to shift from AWS to Azure as quite significant changes to the data might need to be made to make it compatible.
  • Can the data be transferred without any down time to the organisation’s day to day activities?
  • How long, and more importantly, how much will all this cost?

 

There have been moves in recent years to standardise data interchanges… this was the driving force behind Microsoft’s Common Data Model for instance, but no two organisations are ever alike so it’s never a perfect fit.

APP TRANSFER WITH VENDOR LOCK-IN

If your organisation has Apps or Bespoke Software at play, all leveraging your CSP’s proprietary tech, then reconfiguring all of that can be a scary proposition.

Let’s say you’re running Microsoft’s Power BI on Azure, for example, but have also layered it with Microsoft’s Machine Learning, data lake analytics and Robotic Process Automation (RPA).

Where do you even start in getting all that to run efficiently on a different CSP?

THE ‘PERSONAL ELEMENT’ TO VENDOR LOCK-IN

No one Cloud Service Provider does something the same as the next. Those differences in service can be subtle (but very present) or they can vary wildly.

If your organisation has been with a CSP for a long period of time it’s likely they’ve picked up a lot of background knowledge as to how that CSP’s systems run and can be configured.

Moving CSPs therefore will mean your IT teams need to quickly absorb a whole new set of operating knowledge, and other departments will need to learn new user interfaces; in short, if you’re not careful… it can be chaos!

7 Tips For Preventing Vendor Lock-In

Now we’ve a better understanding of what vendor lock-in is and why it can be so damaging to an organisation with growth aspirations, let’s talk about some active steps that can be taken to stave off the risks associated with it.

DUE DILIGENCE

Migrating to the Cloud is absolutely the right thing to do. Before you do it however it’s vital you make sure both yourself and your IT teams do their due diligence to mitigate as many vendor lock-in risks as possible.

Ask yourself… why are you moving to the cloud now, what do you need a CSP to achieve right now and, perhaps most importantly, what might you need it to do in twelve months or five years’ time?

Can your chosen CSP service all those requirements?

If it can only match some of those needs you’ll have to prioritise what’s important, compare it to other CSP offerings and decide what’s best, whilst perhaps taking some of the other steps below to leave you in the strongest possible position.

 

When it comes time to sign the contracts, make sure you’ve put in place something called Target State Architecture. Agree in advance what the ideal solution looks like and make sure everything after is incrementally building towards that.

If you’re strict on governance and what’s in scope, and keep all your vendor(s)/developers/engineers singing from the same hymn sheet, you shouldn’t go too far wrong even if priorities change over time.

Make sure everyone knows what the big picture is and what they’re working towards. Making everyone accountable to the foundations of your Digital Transformation means they have to integrate, have to be open, and anything bespoke is built to ‘play nice’.

SEE IF YOU CAN SIGN A PRE-NUP

No, we’re not joking (well not really).

A cloud migration can be a little like a marriage. Fingers crossed you’ll both be happy together and live to a ripe old age but… should the worst happen it makes sense to have taken steps to protect yourself.

When looking at the contract make sure you’ve studied the exit costs and that these are factored into any ongoing transformational strategy as a potential cost to growth.

It’s also important when entering into an agreement with a CSP that you go over their exit polices with a fine-tooth comb.

Doing that will be less about the technical aspects and more about your/their legal/financial obligations. Putting data into the cloud is normally quite an easy ‘lift n shift’ process but taking it back out can be a very different matter.

There may be hidden fees for migrating data away from them, it may be in a different format that’s unusable elsewhere or there may be IP that can no longer be used.

Understanding all of that will be vital to any successful exit strategy/cloud migration.

DESIGN YOUR APPS, PROCESSES & GOVERNANCE WITH AN EXIT IN MIND

When you (or a third-party software partner) are designing your organisation’s apps and processes, bear in mind the possibility you might need to leave your CSP one day by making them as platform-agnostic as possible.

Make sure you’ve got great documentation describing all your tech, with staff you’ve specifically upskilled in its use. Then, if you and your provider do ever part ways, you won’t have to waste time and resource figuring out what it’s meant to do or reverse engineering it.

With that in mind, it’s a good idea to choose a CSP with an open platform, so you can see the code behind the service, with APIs that can communicate with it.

WATCH HOW YOUR DATA GETS FORMATTED

When migrating, it’s always an organisations data that causes the biggest issues.

Different CSPs all use different data formats and models, and none of them are very compatible with each other.

As we’ve already mentioned, great strides have been made in recent years into standardising data, but the guidelines and protocols unfortunately aren’t always followed or adhered to.

Therefore, if you want to maximise the portability of your data you need to make sure you avoid any proprietary formatting with your CSP.

Data lock-in is the hardest part of vendor lock-in to overcome so taking steps to mitigate these risks at contract stage pays dividends later on.

JUST USE MULTIPLE CLOUD SERVICE PROVIDERS

No one ever said you were only allowed to use one.

More and more organisations are moving to a multi-cloud solution, leveraging different CSPs’ offerings and tools for different functions within their organisation.

Having a layer of AI from one CSP, your data storage from a second and your processing power from a third leaves you much less vulnerable to vendor lock-in whilst allowing for a cherry-picking of tools that gives you a ‘best of all worlds’ solution.

There is a downside though (isn’t there always?).

Multiple Cloud Service Providers means there’s a much greater strain on your development team in making sure they can all talk to each other; if you’re not careful your security might not be as strong, and of course there may be additional licensing costs.

TREAT YOUR CLOUD-MIGRATION LIKE LEGO

It’s OK, we’ve not gone crazy, nor are we suggesting you try and build a castle or spaceship with your CSP (although…)

What we’re talking about here is something called component-led development.

Build things in isolation, that fit together after… like Lego.

That way, if you decide to stop or change, you’re not stuck without the missing final piece of the puzzle. You’ll also get the great benefit of being able to reuse components for other features.

MAKE SURE YOUR INTELLECTUAL PROPERTY IS REALLY YOURS

When we talk about IP in terms of Cloud Service Providers, we’re talking about the unique dev work you’ve done to make everything compatible with your organisational needs.

One of the risks with cloud migration you need to head off early is discovering that your CSP owns critical parts of your system, meaning you either need to stay put, negotiate to continue using that element or re-build it all from the ground up.

You can contract your way out of this, coming to an agreement whereby you own all your own custom development, but you need to come to that agreement at contract stage.

 

Finishing up, getting locked into a single vendor can be a painful and expensive experience for an organisation.

We absolutely don’t want to scare you away from migrating to the cloud; we believe it’s 100% the right thing to do, but before you do it it’s important you do your due diligence and make sure the move is both right for you now and will be in the future.

That’s why having an independent third-party partner to guide you through the process can be so important.

 

  • Make sure it’s right for you now and in the future (Build Future)
  • Make sure you have an exit strategy just in case
  • Avoid proprietary formatting
  • Consider a multi-cloud solution
  • And above all work with the right partner.

 

Finally, don’t try and reinvent the wheel.

If there’s already a solution out there, don’t rebuild it if you can avoid it; integrate it into your new platform instead.

Follow those steps and you’ll be well on your way to your own Digital Transformation!

The Bad Guys Don’t Care You’re The Good Guys

How to improve your NonProfit’s Cyber-Security – Quickly, easily… and at low cost

 

You may be wondering why the NonProfit sector needs its own guide around Cyber Security. After all… shouldn’t Cyber Security be the same across all organisations?

Whilst this is broadly true, Not-for-Profit organisations have unique concerns around cyber-security that can leave them particularly vulnerable to Cyber Actors and so these concerns deserve to be addressed separately.

 

As a sector, NonProfits hold a tremendous amount of data on people of a personal, commercial, and financial nature, as well as having access to large funds of money (donations) that many cyber actors are incredibly interested in.

Now you may feel your NonProfit is both perfectly aware of these cyber threats and secure against the risks posed by cybercriminals (and it may well be you are) but the National Cyber Security Centre (NCSC) has, on several occasions, publicly stated that many charities, especially smaller ones, don’t realise how tempting a target they make to cyber scammers.

One of the problems facing the sector is that no-one’s quite sure of the scale of the problem.

Whilst some cyber crimes do get reported by NonProfits, many don’t for fear of the reputational damage it will cause amongst their donors and volunteers.

 

NonProfits have a duty to spend as much as they can on their chosen cause, and malicious cyber activity can really impact their ability to do so, whether through Denial of Service (DoS) attacks, more direct methods such as the theft of funds, or indirectly through damaging the reputation of the sector as a whole. After all, who’s going to be happy donating to a charity if they think their money will just end up in the pocket of a cybercriminal?

Who’s Targeting The NonProfit Sector?

As we’ve already said, charities hold both a lot of disposable funds and personal information on their donors and volunteers. Coupled with that they’re also vulnerable to other forms of attack (more on those in a minute) that could hurt their reputations with potential donors.

Now obviously the type of information about donors (or the amount of money in accounts) that’s held will vary widely from charity to charity, depending on their size, cause, structure or stated goals but all will still be vulnerable to attacks such as viruses, phishing emails, ransomware attacks, identity theft and Denial of Service.

 

cloudThing recently wrote an article on the different types of Cyber Criminals at work today (you can read it here), but the types of cyber-actors targeting charities might vary slightly.

Those targeting the charitable sector could very well be advanced ‘cybergangs’ but unfortunately they could also be small time individuals, operating from anywhere on the globe (making them much harder to track down after the fact).

This is why prevention has to be key.

 

The technical skills needed to commit a cyber-offence aren’t anywhere near what they used to be, with multiple tools available to make the job even easier, all available through criminal forums on the Dark Web.

These forums even offer out their services and tools under something known as Crimeware-As-A-Service with specific advice on how to target NonProfits.

Whoever attacks you though will have one thing in common with all other cybercriminals… they’ll be motivated by financial gain.

How they get that though will vary, from the outright theft of funds held by charities right through the gamut of online criminal activity to fraud, bribery and data theft.

This means, in today’s day and age, the charity sector needs to be prepared for both organised gangs and individuals sat at home in their bedrooms, possibly a continent or two away!

Types Of Cyber Attacks NonProfits Are Vulnerable To

Whilst there’s a whole host of attacks that a NonProfit may be subjected to, the ones they’re especially vulnerable to tend to be…

 

  • Ransomware & Extortion – Charities, by their very nature, have to be open and forward facing. Their entire model is predicated on people being able to contact them to either volunteer or make donations. This, however, can leave them open to extortion or Ransomware attacks.
    These malware attacks normally rely on a technique called social engineering to succeed. They’ll attempt to deceive end users into clicking on malware-infested links in phishing emails or by visiting compromised websites.
    In recent years a lot of charities have been targeted directly with these attacks in an attempt to not only steal or deny access to data but to delete or change it for nefarious purposes (something the ICO takes very seriously). Attackers may steal this data and threaten to release it unless a payment is made (or another demand is met). Any NonProfit involved in the protection of vulnerable individuals, or holding sensitive medical data, in particular needs to look out for these kinds of scams.
  • Phishing Attacks – Another common attack (that we’ve spoken of in much more depth elsewhere) aims to trick employees or volunteers who have access to a Not-For-Profit’s donations and funds into transferring them to a Cyber Actor’s account, normally by spoofing an email from the CEO or another high-level staff member requesting it.
    There are multiple variations of this fraud but always remember, it only takes a second to head to someone’s office or pick up the phone to verify if an email is real.
    Criminals may succeed in prompting fund transfers using purely social engineering, but more developed campaigns combine the fraud with the deployment of malware to capture information that can be used to generate even greater returns. Charities operate on a culture of trust and openness, and whilst we wouldn’t want to see that change, it does leave them especially vulnerable to this type of attack.
  • Fake Websites & Apps – A particularly insidious form of attack doesn’t target the charity at all, but rather their supporters.
    Tricking donors into giving money via a fake website, app or email campaign is becoming increasingly common and does untold harm to the reputation of the charity… often through no fault of their own.

How To Prevent Cyber Attacks On A NonProfit Organisation

CULTURE & TECH

Charities, much like other organisations, have a duty of care to safeguard their data and good security is a massive part of that.

That really falls under two categories though… Tech & Culture.

Surveys by the National Cyber Security Centre (NCSC) repeatedly show that NonProfits, as a sector, have a “broad lack of specialist staff with technical skills to cover cyber security, a low awareness of government support available and a low level of digital skills”.

 

Addressing that issue on a technological level is important as Cyber Actors will target organisations they deem as ‘weak’.

Something as simple as an up-to-date firewall can shift their attention away from you to a different target but… ultimately, all the security precautions in the world will be for naught if you don’t bring your staff and volunteers along on the journey (culture).

The best firewall on the planet won’t help you if Jeff from accounts keeps clicking on links in emails from the Sultan of Zimbabwe who needs his help transferring funds out of the country.

We’re obviously not being 100% serious there, but you take our point.

Cyber Criminals are becoming increasingly sophisticated and your staff need to be aware of how they might be targeted so they can be on the lookout for it.

 

There’s a huge gap in understanding the scale and scope of cyberthreats between different organisations in the NonProfit sector and that gap needs to close if trust in the sector as a whole is to continue.

Donors are unlikely to continue to support their chosen cause financially if they begin to fear their donation may go astray.

Although it may seem like an uphill struggle, investment in Cyber-Security doesn’t have to be huge in either money or time, and in the end, whatever resources are applied to the problem will always prove cheaper than repairing the damage after a successful cyber-attack.

Build The Future Of The NonProfit Sector… Today

Cloud-native NonProfits are leading the way in increased donor revenue, financial stability, improved business agility, enhanced volunteer experiences & donor retention

 

How do you future-proof something when the future has never been more uncertain?

 

Our political landscape changes daily, globalisation has increased competition to agonising levels, no one is quite sure what the economic picture will look like post-COVID and the potential of what technology can achieve continues to accelerate at an unfathomable rate.

Then, if we do manage to identify, create and deploy a strategy, the chances are it’ll already be outdated to some degree before it’s even been embedded!

 

It’s easy to understand why so many NonProfits prefer a wait-and-see approach: let others make mistakes first and learn from them while focusing on simpler issues to improve the bottom line. Or braver organisations might embark on a grand transformation, maybe even borrowing to invest in the future – a long-running programme full of big ideas, but of little value until the project is actually delivered to users in two years’ time.

 

Both approaches are full of peril but where there’s a risk, there’s also opportunity.

 

What if you could create a situation to minimise the risk of change – to always expect it, maybe even embrace it?

Could it be possible to be ready, willing and able to effectively capitalise on opportunities the moment they arrive?

Assuming it was, what would this fairy-tale approach entail?

 

Perhaps the secret to success here isn’t an all-singing, all-dancing technological solution, but something we already know we all should be doing more of: investing in our cultures.

 

Identify a clear vision, of course, but don’t wait for the perfect version of this to begin and absolutely expect it to evolve. – Fran Thomas – cloudThing, Chief Technical Officer

 

We’ll need some technology too of course, but culture comes first.

cloudThing have been involved in countless technology, transformation and product development programmes across a wide range of sectors, including blue light, government, education, financial and membership. We, of course, consider the majority of them to have been successful, but the degree to which they were successful varied, and sometimes greatly.

 

Assuming all other things are equal, the overriding factor that predicted this success was the willingness of the organisation to change, and whether it was set up to do this efficiently, safely and often.

 

The current recommended formula at cloudThing (we’re all agile, right?) is to create an environment that’s built to expect continuing, day-by-day and piece-by-piece transformation.

Identify a clear vision, of course, but don’t wait for the perfect version of this to begin and absolutely expect it to evolve.

What’s more important is establishing forums from a cross-section of your business and empowering them to make real decisions. Give them an agile framework rooted in research before action, require measurable results, and make it safe for everyone to fail occasionally – because if we’re going to innovate, we need to be free to experiment.

By doing this, we have a chance at scaling continuous improvement.

We’ll foster deeper ownership of decision-making. And with the people around us having a greater stake in where we’re going, we’ll discover yet more opportunity to improve.

This should also free up more time for business leaders to spend working on, as opposed to in, our businesses.

 

If we expect this agility from our people, we also need to expect it from our technology. It also needs to be ready for change, for change to be of low impact on the business and for the platforms we choose to be delivering their own innovation.

These key factors are some of the core reasons why cloud has been such a success and perhaps why we are seeing more of our customers choose the Microsoft Cloud.

 

We’re on a journey too and today this also means embedding the expectation and foundations to support constant change in everything we do. This is especially true for Microsoft Dynamics 365 and the underlying Power Platform, which is currently enjoying year-on-year growth of more than 40%.

 

If you’re already using or thinking about adopting Office 365, these are logical extensions to rapidly introduce common business applications, an RPA-capable automation platform and a low-code/no-code development environment.

These technologies are all part of a fast-growing pattern that lets power users safely self-direct their own technology needs – clearly of benefit to innovation-driven organisations.

 

It’s simple in theory, but such a shift is demanding on many levels. If your organisation doesn’t specialise in delivering transformation programmes and isn’t exposed regularly to successful approaches and technologies, how can you design a way of working that capitalises on everything that’s come before?

 

You certainly can’t buy a solution off the shelf because there’s no pre-built package that can deliver your business. After all, you exist because you’re unique in the market.

The answer is, of course, to bring in a partner.

 

But today more than ever, make sure this is the right partner for your organisation. Demand that they understand where you want to be culturally.

Demand that they upskill your staff so you can start to deliver change as a business-as-usual activity.

Demand that they are socially and environmentally responsible, because now more than ever we all need to be.

 

And considering all businesses are digital businesses, demand that they specialise in the technologies that will be used to deliver the transformation.

 

When your change partner is also the right digital partner, real impact can be delivered extremely rapidly. – Fran Thomas – cloudThing, Chief Technical Officer

 

The right partner will want to work in this way because they know it’s the most effective recipe for success.

The outsourcing of transformation programmes is a fast-fading idea. Co-sourcing, or creating a virtual team of your own people combined with specialists from your partner, is rapidly becoming the future.

When we manage to join these things together with flexibility of people and technology, and focus on capability as much as delivery, we call this approach “build future”.

cloudThing coined this ideology early in our journey to direct us to focus on delivering quickly and solving our customers’ tangible problems today, but doing so in support of a wider, transformative vision.

 

We’re on a journey too and today this also means embedding the expectation and foundations to support constant change in everything we do.

Isn’t that perhaps the most prepared we can be for the future?

 

After all, if your business isn’t continually improving, it’s already legacy…

Data Protection & Artificial Intelligence: Best Practice

Resolving the conflicts between the masses of data AI requires and an individual’s right to privacy

 

The unlimited potential Artificial Intelligence can bring to an organisation is far too broad a subject to discuss here (although we have discussed it at length elsewhere) but unfortunately, with the benefits of AI also come the pitfalls…

 

The main pitfall we’ll be focussing on here is that of data protection, and all its associated concerns, such as privacy and data security.

 

Under GDPR (or whatever equivalent legislation applies within your territory), using technology for the processing and handling of personal data through complex computer systems (and often rather opaque algorithms) is something you’ll need to consider very, very carefully as an organisation.

 

Let’s make it clear. We’re not trying to put you off Artificial Intelligence, merely stating that its use requires due consideration.

 

Its potential, when applied to a business’ processes through the right partner, is nearly limitless.

But there are things you need to consider (and again… the right partner should be able to help you with that).

 

The following should give you a brief overview of how to mitigate any data protection risks arising from the implementation of an AI project within your organisation, without scaring you so much you lose sight of the benefits such a project can deliver.

If at the end however, you still have questions, feel free to get in touch to discuss your needs further…

Taking A Risk Based Approach To Data Protection & Artificial Intelligence

First things first… This is going to be a lot less complicated than you thought it might be.

 

When you’re assessing the impact Data Protection may have on your Artificial Intelligence project (no matter how complex) it’s worth remembering the questions that will need answering will be the exact same questions that needed answering for all your other projects.

 

  • What data will be used and if there is personal data being processed do I really need it?
  • Is the data being processed under one of the lawful bases for processing?
  • Have I adequately informed my end-users of how their data is being used?
  • Is the data being processed securely?

 

The starting point for this is completing a DPIA (Data Protection Impact Assessment) and deciding (and documenting) what data is relevant and needed for the project. This will be key, alongside the steps you’ve taken to secure said data.

 

If no personal data is being captured, everything becomes much simpler.

 

Don’t forget personal data is any information relating to an identifiable natural person who can be identified, directly or indirectly, from the information being processed. If you absolutely must use personal data, then ensure adequate controls are used to restrict access, keep it safely encrypted and pseudonymise it where possible.
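By way of illustration only (the key and field names below are placeholders, not a recommendation for any particular product), pseudonymisation can be as simple as replacing a direct identifier with a keyed hash, with the key stored separately in something like a key vault:

```python
# A minimal sketch of pseudonymisation, assuming a secret key held separately
# from the data (e.g. in a key vault). Identifiers are replaced by keyed
# hashes, so analysis can still group records without exposing who they are.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-key-from-your-key-vault"  # illustrative placeholder

def pseudonymise(value: str) -> str:
    """Return a stable pseudonym for a personal identifier."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"donor_email": "a.donor@example.org", "donation_gbp": 25.00}
safe_record = {
    "donor_ref": pseudonymise(record["donor_email"]),  # no raw email in the dataset
    "donation_gbp": record["donation_gbp"],
}
print(safe_record)
```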

 

As with all projects, the key to getting this right will be through a principle of Privacy By Design.

If you make it your goal to mitigate privacy risks as part of the initial project design, rather than as a rushed (and potentially bodged) bolt on at the end of the project, it’s likely you’ll be successful in coming up with a valid and compliant Data Protection solution.

 

No matter what the project, good Data Protection governances have always been dependent on specific factors such as…

The types of data being captured, what the data will be used for, how the data will be used, where the data will be used, if there are any special categories of data etc.

Whilst AI technologies do make this trickier and are likely by design to include automated decision-making (AI by its very nature requires as much unfettered data as possible, whilst Privacy By Design focuses on data minimisation), the important thing is to be able to demonstrate the steps you have taken to mitigate as many risk factors as possible.

 

It’s important that this task isn’t just delegated to your Data Scientists or Developers though.

AI developers may have a tendency to prioritise data collection, and a wider view than that will be needed.

These steps can’t be a tick box exercise either; you should never underestimate the amount of time, expertise or resources your AI governance and risk management efforts deserve if you want to be compliant.

Data Protection & Artificial Intelligence: How To Set Your Risk Appetite

A risk-based approach to data protection and AI means you must consider (and document) how you comply with your obligations under the law by taking specific measures that are appropriate to your organisation, and by showing you’ve balanced the risks to an individual’s rights and freedoms against your legitimate business interests.

 

Setting your risk appetite should be intentional and expected to form part of an AI strategy document. This will also affect the possible range of algorithms that can be chosen from to use in your solution.

 

The AI strategy document should scope out frameworks for applying AI within your organisation over a horizon of 3-5 years, and it should assess the risks posed to individuals’ rights and freedoms.

 

When it comes to AI technology, the various risks posed by your project and how data is to be processed will mean you need to take a balanced approach between those competing interests to ensure you remain compliant.

 

But…

That doesn’t mean you need to assume a zero-risk stance.

A zero-risk stance in and of itself would be immensely impractical (the law even recognises this).

What it’s about is assessing your own use of AI and doing your utmost, at an organisational level, to mitigate the Data Protection risks.

Consider the following:

 

  • Have you thoroughly (and accurately) assessed the risks to an individual’s rights and privacy that may come about due to your AI activities?
  • Have you determined how all these risks will be mitigated?
  • How will you collate, store and use this data?
  • What volume and sensitivity of data are you collecting?
  • What’s the final outcome you’re attempting to achieve by collating and processing this data?
  • Have you clearly documented these risk assessments?

 

Whilst this process may feel long-winded and a ‘nuisance’, doing it correctly will give you a much better picture of your organisation’s risk proposition and exposure, and how adequate your governances are in balancing out the various conflicts. It will also help you to justify your actions if you are challenged in the future.

Identifying The Controller Of Your AI Technology

It’s not uncommon to have several different organisations involved in the planning, development and then deployment of Artificial Intelligence technology.

 

Whilst GDPR legislation does recognise that not all the parties involved in the processing of this data will have the same degree of control over the data being processed, it’s still incredibly important to identify who’s the controller, who’s a joint controller and who’s just processing the data… and then document these facts.

How To Make AI Systems Conform To The Data Minimisation Principle

GDPR’s data minimisation principle states you should be storing and processing the minimum amount of personal data you can to achieve your businesses goals.

 

Personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’) – GDPR 5(1)(c)

 

As we’ve already mentioned though, AI technology requires pretty much the exact opposite… so how do you reconcile these two seemingly diametrically opposed stances?

Whilst it may sound like a huge obstacle to overcome, a closer look at the legislation shows the way forward.

 

It clearly states that the data used needs to be limited to only what is necessary to complete your stated goals. Whilst AI certainly pushes that limit, it’s still possible to conform to both.

How you go about determining what is ‘adequate, relevant and limited’ is therefore going to be specific to your circumstances and should be captured within your DPIA.
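As a purely illustrative sketch of data minimisation in practice (the field names are invented for the example), you might enforce the DPIA’s approved field list at the point data is ingested, so direct identifiers never reach the training dataset in the first place:

```python
# A minimal sketch of data minimisation at ingestion time: only the fields
# documented as necessary in the DPIA ever make it into the training dataset.
# Field names here are purely illustrative.
APPROVED_FIELDS = {"age_band", "postcode_district", "donation_history_count"}

def minimise(raw_record: dict) -> dict:
    """Strip everything not on the DPIA-approved list before storage or training."""
    return {k: v for k, v in raw_record.items() if k in APPROVED_FIELDS}

raw = {
    "full_name": "A. Donor",             # direct identifier - dropped
    "email": "a.donor@example.org",       # direct identifier - dropped
    "age_band": "35-44",
    "postcode_district": "B1",
    "donation_history_count": 7,
}
print(minimise(raw))  # {'age_band': '35-44', 'postcode_district': 'B1', 'donation_history_count': 7}
```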

 

Once you understand what data is being captured, the purpose it will be used for, and what measures are needed to manage the security risks when processing that data, Microsoft Azure can help when it’s time to implement AI.

 

We end by simply listing a few products from Microsoft within Azure, Windows or Office 365 that illustrate the options that can be layered up to create defence-in-depth for your data:

 

  • Azure Information Protection
  • SQL Server Transparent Data Encryption (TDE)
  • Dynamic Data Masking
  • Always Encrypted
  • Data Classification
  • Azure Advanced Threat Protection
  • PIMs
  • Group Policy
  • Conditional Access Policies
  • MFA through Azure AD
  • Azure Recovery Services Vault
  • Intune
  • Azure Application Gateway
  • Azure API Gateway
  • Azure Firewall
  • Azure Sentinel

Shadow IT – 9 Things To Look Out For & 1 Unexpected Benefit

Shadow IT will stop your Digital Transformation in its tracks… and you may never even realise it!

 

Anyone who’s worked in IT long enough (or worked with IT teams long enough) will no doubt remember needing permission to do anything on your computer that even slightly deviated from the norm. This could be seen as either a good or bad thing, depending on whether you worked in IT or with IT (we’re definitely not taking sides!).

It was impossible to introduce any kind of hardware or software into the business without signed and sealed approval, possibly in triplicate.

A large part of this was because end-users had neither the knowledge nor the inclination to run their own technology (beyond trying to re-install Minesweeper, of course).

But for better or worse, those days have long since passed…

What Is Shadow IT?

Shadow IT can go by many names.

Embedded IT, Fake IT, Stealth IT, Rogue IT, Feral IT (this might be our favourite!) or, if you work for an agency, Client IT.

No matter what it’s called though, it always means the same thing. Shadow IT is created when departments other than the IT Department create workarounds outside of the Central IT Team’s control (or knowledge) to get around perceived flaws in the system.

These workarounds could be software or hardware based or, increasingly, Cloud based.

What Causes Shadow IT?

The main reason Shadow IT has exploded in recent years is that end-users have become completely comfortable with downloading and installing Cloud-based apps and services in their private lives, so it’s only natural this spills over into the working world, where they seek to make their day-to-day roles easier and more efficient.

How Common Is Shadow IT?

How do you answer that?

By its very nature Shadow IT is almost impossible to measure, or even realise it’s there in the first place. Many departments may even take active steps to hide their activities from their own IT Teams in the mistaken belief what they’re doing is for the best.

Without a thorough review of your Business Architecture and processes you may never catch it all.

Even if an organisation does know… it’s not like they’ll be rushing to publicise those numbers.

However, a report from Gartner back in 2015 did predict that 35% of enterprise IT expenditure for most organisations would be managed outside the central IT department’s budget.

That was back in 2015 though… undoubtedly the situation is much worse now.

Examples of Shadow IT

The list of what does and doesn’t count as Shadow IT is almost endless, but some of the most common and easiest-to-spot examples include…

 

  • USB Flash Drives for the transporting of data
  • Instant Messaging services for communication between staff (FB Messenger, WhatsApp Desktop etc)
  • Private email accounts sending confidential work emails
  • Google Docs
  • WeTransfer

What Are The Implications Of Shadow IT?

Depending on who you ask, Shadow IT can be either a good or bad thing.

For most organisations, Shadow IT will result in a heavily fragmented system and app infrastructure, making any kind of meaningful Business Intelligence hard to come by, as well as potentially sacrificing security, but it can, in some circumstances, make an organisation more agile in overcoming business obstacles.

Problems Caused By Shadow IT

Shadow IT can cause so many issues within an organisation that it would be impossible to list them all here but some of the main offenders include:

 

  • Security Issues – This should be an obvious one. If your staff are downloading who knows what onto your servers then the chances of Cyber Actors targeting your business with malware, ransomware or other viruses go through the roof. It also allows untested devices and applications straight into your corporate infrastructure without any guidance, control or testing from the IT Team. One of the main selling points of SaaS is that the vendor controls the upgrade and release process, meaning a SaaS customer always has the latest, most secure version of the software. But upgrades can still have unintended consequences, and managing updates and testing a new release is hard enough without a layer of Shadow IT getting in the way constantly.
  • Wasted Time – There are a lot of hidden costs associated with Shadow IT. First there’s the time taken by non-IT staff downloading software; time taken by IT teams to fix any problems caused; time taken when someone leaves, is ill or goes on holiday and someone else needs to figure out what systems they were using because nothing is centrally documented; the list goes on and on!
  • Lack of Business Intelligence/Internal Logic – If different departments are collating, storing and reporting on data using a variety of different programs or calculations, then over time small errors will creep in as data is copied, modified, misunderstood, reported on and acted on. Gaining any kind of real-time view of an organisation is impossible when Shadow IT is too prevalent.
  • Wasted Investments – If senior management or the IT Department are unaware of Shadow IT (or its extent) then it’s possible large-scale investments can be made, often in Data Warehousing or Business Intelligence tools, that, through no fault of their own, then never get used as staff are already using their own workarounds.
  • Business Inefficiencies – Shadow IT is almost always a blocker to any meaningful innovation as it prevents the uptake of more efficient business processes. How can you streamline your business processes if you don’t even realise what’s going on in the first place?
  • Data Loss – If Data is being stored outside of the approved methods then chances are it’s also not being backed up correctly, making it far less secure and… should the worst occur, likely unrecoverable by your IT teams.
  • Compliance Issues – Depending on the type of data being stored, this could open your organisation up to all sorts of compliance issues such as GDPR breaches. It could also leave you breaking all sorts of security and software legislation without ever realising it.
  • Slows Digital Transformation – Shadow IT can drastically slow (or even stop) Digital Transformation projects. Assuming all your organisation’s software and hardware has been deployed to fulfil a business need, your IT Team will have backups of it if required, protocols for updating it and procedures for replacing it. If they’ve no idea where anything is or how it’s being used though, Digital Transformation stalls and will be massively prone to errors.
  • Creates Animosity – Shadow IT invariably leads to chaos, which almost always creates a ‘them and us’ attitude between your IT and non-IT staff. It can also lead to attitudes and motivations that really aren’t in the best interests of your organisation, such as data hoarding, skill siloing, self-promotion and favour trading.

How To Prevent Shadow IT

Whilst you may be hoping for a tech solution here, by far the easiest way to stop the use of Shadow IT is to remove the root cause of it, otherwise you’re just treating the symptoms.

Regular reviews from heads of departments, leadership teams and the IT Department are vital in this process.

 

  • Do the departments have the software/hardware they need?
  • Does it all perform as it should (or as the department need it to)?
  • What problems are the various departments currently struggling under?
  • Does everyone that needs to have a clear picture of this?
  • Are there any upcoming challenges that might provoke a new requirement?

 

Most importantly though, you can’t shy away from the pain points.

If you’re finding lots of examples of Shadow IT then ask the end-users why they chose that route rather than contacting your IT Team.

Are IT not approachable?

Is there too much red tape in the procurement/request process?

Benefits of Shadow IT

So now we’ve terrified you about Shadow IT we promised you an unexpected benefit, didn’t we?

The main benefit you can derive from Shadow IT is how much more reactive and agile it can make your staff.

Individual departments can control their own IT resources rather than relying on a central team.

They can also be an important source of innovation.

The person quietly working away in accounts who finds a new macro or designs an app that will automate and streamline part of their day-to-day? That’s Shadow IT.

But roll it out company wide and suddenly they’re an efficiency hero!

The trick is finding the right balance. Your central IT department need to be able to recognise the signs of Shadow IT whilst still controlling the technical environment, guiding the business with enterprise-class Business Intelligence tools.

 

Whilst it may be tempting to try and put a blanket ban on all downloads or unapproved apps without prior IT approval, we live in an age in which anything an end-user might need can be accessed through their personal mobile anyway.

All you’ll really be achieving is stifling your staff’s agility.

It’s a competitive world out there and departments… and ultimately the overall business… need to be agile to survive.

Shadow IT systems can be a source of innovation. They can also be a huge blocker to Digital Transformation.

The real trick is, as with most things, tackle the issue head on at its source and remove the need for your staff to try and circumvent your IT in the first place.

This can best be achieved with a modern digital platform that centralises application management, data storage, security management and other related technical assets, but which also empowers staff with the right tools to extend the platform, so they can develop solutions freely within their department, in the open (not in the shadows) and with all the security concerns involved in solution development and data exchange taken care of for them.

For example, imagine your marketing department being able to create a little app which connects to the central CRM, fetches a desired list of contacts and sends them automatically to another department every Friday in preparation for a campaign.

Your marketing team can crack on happily in the open and your IT team won’t be worried about security issues – in fact, they’ll probably be happier to have less work to do!
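As a rough sketch of what that ‘little app’ might boil down to under the hood (the org URL and token below are placeholders, and in practice a Power Automate flow or a registered Azure AD app would handle the authentication and the Friday schedule for you), the contacts could be pulled from Dataverse, the data store behind Dynamics 365 and the Power Platform:

```python
# A minimal sketch only - the org URL and access token are placeholders.
# It reads contacts from the Dataverse Web API; scheduling and secure
# credential handling would normally be done by the platform itself.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"   # placeholder
ACCESS_TOKEN = "eyJ..."                        # placeholder - obtained via Azure AD

def fetch_campaign_contacts():
    response = requests.get(
        f"{ORG_URL}/api/data/v9.2/contacts",
        params={"$select": "fullname,emailaddress1", "$top": "50"},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/json",
            "OData-MaxVersion": "4.0",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["value"]

# Hand the list to the other department however they prefer - email, Teams, a shared list...
for contact in fetch_campaign_contacts():
    print(contact["fullname"], contact["emailaddress1"])
```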

That’s where the Power Platform and Power Apps can truly digitally transform your organisation…

Organisational Debt & Why It Makes Digital Transformation Hard

You won’t find the debt that’s slowing down your organisation on a balance sheet in the accounts department

 

We’re sure you’re all familiar with the concept of debt (unfortunately). You borrow money and until you pay it back (normally with interest) you’re in debt.

Simple enough, right?

It’s a concept that can be applied equally to individuals, small businesses, enterprise level organisations… and even countries.

In recent times however, the Agile Digital Transformation community has coined a variation on that term; something called… Organisational Debt.

What Is Organisational Debt?

Organisational debt can best be defined as a problem that’s created as an organisation and its business processes grow.

It’s all the compromises (big and small) that are made on a day-to-day basis to get to a minimum viable product (or company, in the case of start-ups).

 

You see, 99% of organisations don’t start ‘big’.

Instead they grow, adapt, pivot and change over time, according to their business needs.

This naturally leads to their business processes, job roles, structures, governances, systems, policies, CRMs and any other day-to-day ‘norms’ growing ever more convoluted and bespoke as they do. However, systems, CRMs and other Business Intelligence tools can be a large investment for a company to replace every year or two, so it’s quite common for these to remain static over time, or at best be patched, patched and patched again as needed.

As you can imagine this can leave a business with bloated processes and outdated software that they’ve either outgrown or are just no longer fit for purpose.

And the larger they grow, the harder everything will be to change or divert down a different path.

This could be due to some kind of vendor lock-in, but more normally it’s because everyone’s so close to the issue that no one can clearly see what the problem actually is (or even that there is one).

A good barometer of whether your organisation has collected too much Organisational Debt is how often you hear the phrase “but we’ve always done it like this”.

Organisational Debt is further exacerbated if your organisation has grown through the process of acquisitions or mergers, where there’ll now be two, three or more sets of bloated processes and systems that somehow need to be made to ‘talk’ to each other before they’ll work as a whole.

The whole thing is then made even more complicated by people leaving a department with no one else left who knows where/what anything is, why it was set up like that, how it works or what to do if something goes wrong.

Is any of this sounding slightly familiar to you?

 

That’s Organisational Debt in all its horrifying complexity.

The more you have, the harder and slower any Digital Transformation effort you try to make will be.

Organisational Debt: Things To Look Out For:

We already mentioned the dreaded phrase “but we’ve always done it like this” but there’s plenty of other signs to look out for when it comes to Organisational Debt…

 

  • If you find yourself waiting forever for changes or improvements that seem obvious to everyone then speak up… If something is good for an organisation, then it shouldn’t take months for it to be implemented.
  • New processes are put in place, not to drive the business forward in the right direction but because the alternative was too difficult at the time.
  • Different departments within your organisation (or even different people within the same department) have their own tools and methodologies for measuring and reporting on the exact same thing.
    If that’s the case, how much longer does it take to get a bird’s-eye view of the business? And you’ve no hope of ever viewing it in real time.
    Ensuring all of your departments work towards The Common Data Model makes Digital Transformation and the Robotic Process Automation of repetitive tasks that much easier.
  • Corners are sometimes cut or steps skipped to meet a pressing deadline.
    We all live in the real world and know that happens sometimes, but the problem occurs if that becomes a habit and those cut corners become standard practice. You’re then accruing a lot of Organisational Debt without ever realising it.
  • When a problem occurs, the root cause isn’t addressed. Rather new software is purchased or new processes are put in place that only address the symptom.

 

You should always be working toward the standardisation of processes. That way you can achieve a predictable result every single time a process is run. If the processes aren’t standardised, then the outputs won’t be either. – Mike Eckersley – Business Architect, cloudThing

 

How To Reduce Organisational Debt

Reducing Organisational debt isn’t easy, but it can be done, either by yourself or with the right partner (hint, hint).

The first step is to realise that you won’t be able to do it all at once. Any organisational transformation will need to be accomplished in incremental steps, addressing Organisational Debt as you go.

It can’t be done just by one person though, or even with a small team.

Organisational Debt is pervasive and if you don’t lay the right foundations will just start to creep back. As such it needs to be tackled as part of a cultural transformation within your organisation.

Without that cultural shift it’s just too easy to look at the problem and think… “Nope! That’s too big and too hard, I’m just going to leave it as it is!”

 

If you do what you’ve always done, then you’ll always get what you always got! If you aren’t improving, then you are going backwards… because all your competitors will be improving. – Mike Eckersley – Business Architect, cloudThing

 

Whilst Organisational Debt will be far too big and hard to tackle with just one sweeping policy change or a quick email fired out by the CEO, it can be fixed.

You can streamline your organisation to compete with new start-ups in your sector if you follow the right steps…

Take It A Step At A Time – The first thing to realise, as we’ve already mentioned, is that you’re not going to do this all in one go.

Recognise you’re going to need to take small, incremental steps.

Start by breaking down and listing all your business processes. Everything every department does and then ask the important question… why?

Why do they do that? Is there a better way it can be done? Is there a more efficient method? Is anyone else already doing it? Is there anything that can be automated? (The answer to that last question will be a resounding yes, and something cloudThing will always be happy to help with.)

We’re also more than happy to help with process review and improvement, or Business Architecture as a Service, which is on our product list.

Very often, if you try to do this internally, you’ll find you’re blinded by not having seen the advantages of best practices elsewhere, or by being too close to the coalface or the internal politics of the organisation.

Having someone external who can robustly but constructively challenge can pay dividends.

Launch a Reward Scheme – It may be that you have some kind of leadership team within your organisation with responsibility for the governances that help shape and run your organisation.

If that’s the case think about instituting a reward scheme for anyone who can identify either processes or governances that are slowing down or stopping them from doing their day to day job or anyone who can suggest ways for streamlining your organisation.

The leadership team can then explore these suggestions and implement (and reward) any efficiency drives that are identified.

 

We often find that the process operators already have a good idea what’s wrong – or at least where the problem is located – which is why our workshops involve gathering as many of the process operators together in one room. It’s also the case that the documented Process Map is what Management ‘think’  is being done – whereas, when you speak to operators, you find a very different picture – this is why we always like to spend time documenting “As-Is” processes before moving on to “To-Be” processes – Mike Eckersley – Business Architect, cloudThing

 

Democratise Your Organisation – Often, the different departments within your organisation will have some kind of mechanism for the monitoring of their own governances.

When individuals are empowered to recommend changes to not just their own role and the rules that affect them, but those of their department and even the organisation as a whole, then the entire organisation will benefit.

No one knows their day to day better than the people doing it. Giving them the chance to suggest changes to streamline their role and make their department more efficient is good for everyone.

 

It’s also important to set time aside to review processes, rather than just relying on ad hoc suggestions. Suggestions can also cause more problems than solutions if they are not considered and rolled out carefully. Process measures of the current process should be in place, with performance data collected, so that it can be determined quickly and easily whether any change is actually an improvement or not. – Mike Eckersley – Business Architect, cloudThing

 

Not Everything Needs To Be Documented To The Nth Degree! – Putting firm governance in place is vital for the reduction of Organisational Debt… but that doesn’t mean every little thing needs a policy attached to it… in fact that’s the exact opposite of what you’re trying to achieve.

The key here comes back to the cultural shift within your organisation. You can get away with a lot less governance and policies if you trust the experts you’ve hired to do a job to… just do their job.

Technology To Help Reduce Organisational Debt

As already mentioned, reducing Organisational Debt is as much about addressing your organisation’s culture as anything else, but there is a lot of tech out there to help as well.

A lot of the tech, like Microsoft’s Common Data Model, Artificial Intelligence, or the Microsoft Power Platform with Robotic Process Automation may seem daunting but with the right partner can launch your organisation and its Digital Transformation to new heights.

Implementing these technologies though, is not a silver bullet. The homework needs to be done on your processes first so everyone has a clear picture of what’s good and bad in your organisation, otherwise you’ll just have the same poor ways of working, but in a shiny new software solution!

 

On the plus-side, these technologies can achieve huge and provable savings for relatively small outlay, if you set about it the right way.

Discussing All Things RPA… Robotic Process Automation

Let your staff focus on higher value work by automating time consuming manual processes with RPA (Robotic Process Automation)

 

Before we dive right into all things Robotic Process Automation (RPA for short) we should probably come up with a definition of what Robotic Process Automation is.

What is Robotic Process Automation (RPA)?

Robotic Process Automation is a type of Business Process Automation technology that can handle huge numbers of repetitive tasks that would normally require a person to complete manually.

 

RPA projects often include:

  • the use of Artificial Intelligence (AI) and Machine Learning (ML) software to help automate decisions in those processes
  • Business Intelligence (BI) tools to track those processes

 

 

The software utilises ‘bots’ to mimic the roles real people would normally fulfil.

RPA software can be used to log into apps, add information to databases, calculate quite complex tasks and even log back out when they’re done.

These ‘bots’ are typically divided up into three distinct types, all performing different types of functions…

 

  • Probots – These bots are designed to complete simple, repeatable tasks such as inputting data.
  • Knowbots – These bots can take commands from a user to scour the internet (or other large databases) for information and then return with it.
  • Chatbots – The final type of ‘bot’, and perhaps the most well-known, is of course the chatbot (sometimes called a virtual agent). These bots tend to exist on social media messenger apps or a company’s live chat and can respond to a customer’s questions in real time… often without the customer even realising they’re not talking to a human.

 

RPA projects are also now sometimes known as ‘Hyperautomation’ projects.

 

In times past, a developer looking to automate an organisation’s business processes might have created a list of specific things to automate and then linked those things to the organisation’s back-office systems through an API.

Robotic Process Automation differs from this in that the RPA system will actually watch an end-user at work in the app’s GUI (Graphical User Interface) and then automate those tasks through the GUI.

This means that tasks that might not be linked to an API can also be automated.

In terms of implementing it, it’s worth noting that RPA software won’t form part of an organisation’s business/IT infrastructure. Instead, it overlays that structure, meaning Robotic Process Automation can be implemented quickly and efficiently without having to re-architect whole infrastructure systems every time a new task needs automating.
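To give a feel for what GUI-level automation looks like, here’s a hedged sketch using the open-source pyautogui library, with invented screen coordinates and field order; a commercial RPA platform such as Power Automate records these steps for you by watching a user work, which is exactly the point made above.

```python
# A minimal sketch of GUI-level automation with pyautogui. The coordinates
# and field order are purely illustrative - a real RPA platform captures
# the clicks and keystrokes by observing a user, rather than hard-coding them.
import csv
import time
import pyautogui

def enter_record(reference: str, amount: str) -> None:
    """Type one record into a legacy desktop form that has no API."""
    pyautogui.click(400, 300)          # focus the 'reference' field (illustrative position)
    pyautogui.write(reference, interval=0.03)
    pyautogui.press("tab")             # move to the 'amount' field
    pyautogui.write(amount, interval=0.03)
    pyautogui.hotkey("ctrl", "s")      # save the record
    time.sleep(1)                      # give the legacy app time to commit

with open("invoices.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        enter_record(row["reference"], row["amount"])
```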

Benefits Of Robotic Process Automation

Probably the biggest game changer with RPA is its ability to adapt and learn from changing situations.

Once your RPA software has been taught to capture data and correctly interpret the intent behind capturing it, it becomes capable of triggering different responses and outcomes, initiating new actions on its own and communicating with all your business systems autonomously.

Other benefits an organisation can enjoy when they make Robotic Process Automation part of their Digital Transformation journey might include:

 

  • It can save your organisation money through efficiencies
  • Better (and much more efficient) customer service
  • Ensuring all business processes and data capture consistently comply with all relevant regulations and legislation (such as GDPR) by eliminating human error
  • Allowing repetitive tasks to be completed at a much faster rate
  • RPA is great at cutting costs associated with repetitive tasks
  • All of the above mean your employees can be much more productive, getting on with valuable tasks with more associated ROI.

What Can Robotic Process Automation Software Do?

The uses to which Robotic Process Automation can be put are as varied as the organisations that use it and are only really limited by the developer’s imagination. Some of the more common uses RPA tends to be put to, however, include:

 

  • Customer Service: As already mentioned, RPA processes can improve an organisation’s Customer Service offering in a variety of ways. Chatbots as a first point of customer contact are a good example, answering simple and oft-repeated questions. Other repetitive tasks could include uploading and storing scanned documents whilst verifying the details within using text recognition, possibly even allowing for automatic approval or denial of certain requests. Think about the last time you applied for anything online… was it a person or a robot you were dealing with? Were you even aware?
  • Finance: Financial Service organisations (or any accountancy department of a business) can use Robotic Process Automation for a variety of functions, including general accounting processes, operational accounting processes, reporting, automatic budgeting and budget adjustments as well as more technical services such as foreign exchange payments, account approvals (or denials) or even automating claim processes.
  • HR: RPA can fulfil a lot of functions in an HR department, such as on- and offboarding processes, keeping employee information up to date, or timesheet completion and submission.
  • Membership Organisations: RPA is great at streamlining the processing of new Membership Applications, as well as automating approval processes where necessary.
  • The NonProfit Sector: RPA can be used for the streamlining of application for grants or handling volunteer expenses approval and payments.

Things To Look Out For When Purchasing Robotic Process Automation Solutions

If you’re looking into Robotic Process Automation for your organisation, there’s a good chance yours is an enterprise-level organisation. That being the case, there are several things you should look out for:

 

  • Scalability: If you’re looking into RPA Solutions don’t select one for a particular problem. What you want is an RPA solution that can cover and scale with your entire organisation, turning its hand to whatever task you put it to.
  • Speed: We’ve already discussed how much quicker RPA processes are when compared to a real person, but the same should also be said for setting them up. Once an RPA solution has been overlaid over your business processes, it should only be the work of a few hours to create, test and put the bots to work at a new task.
  • Reliability: If a Robotic Process Automation solution has been implemented correctly at your organisation, you could have hundreds, maybe thousands of bots performing millions of tasks daily. With that much going on, it’s important the built-in monitoring and analytics of your RPA solution are up to par.

Robotic Process Automation’s Impact On Employment & Job Roles

It’s too easy to get caught up in the lie that Robotic Process Automation will steal jobs from hard-working people in some sort of Asimovian, post-future nightmare where robots have replaced humans. Instead, we’re going to ask you to step back a second and think about what technological advances are for…

Start with a person in a cave, shaping an axe head out of flint to make it easier to chop wood and go on from there.

Pretty much every technological advance ever made has been done with the express purpose of making someone’s life easier, of removing some kind of manual drudgery in a more efficient way.

RPA is no different.

Many people’s initial reaction will be that RPA technology is there to replace their staff, to take their jobs and to save companies money on their staffing expenditure (and depending on the kind of CEO you are that may be seen as a good or bad thing).

Either way it’s just not true though.

Consider your sales force… front-line staff taking clients’ details, inputting them correctly into a CRM system, making sure the data capture is done in a compliant way and then selling your company’s product. Now imagine the first half of that sentence is already done for them, so that all your sales staff have to do is sell… it’s a novel idea, isn’t it?

RPA doesn’t replace your staff, it frees them up to do the tasks with a lot more value to your organisation.

 

According to a Harvard Business Review report on RPA technologies, most operations groups adopting RPA have promised their employees that automation wouldn’t result in redundancies; instead it would free up their time to do more interesting work.

 

Introducing Robotic Process Automation into an organisation can be scary which is why it always needs to be done with a culture shift, ensuring everyone is brought along for the journey.

Good RPA technologies will be available to all members of staff, who should be easily able to configure and deploy a bot to remove some of the repetitive tasks from their daily workload.

To successfully implement RPA technologies, all your staff need to be on board.

How To Get Started With Robotic Process Automation

Many of the things done by RPA software are very simple, but they can vastly improve the efficiency of organisations if used around process bottlenecks.

 

 

A typical RPA project will involve some Process Analysis to fully understand your business processes, and data analysis to review process metrics and find those bottlenecks. From there a business case is built out for the work.

Then these processes would be delivered in a specific RPA platform, such as Microsoft Power Automate, that’s open-ended and allows your processes to be built within it… Or if you’re still not sure, give us a call!

Creating A Low Code App Using PowerApps & The Power Platform

Written By: Mike Chappell – Dynamics Solution Architect, Matt Hollingworth – Product Manager & Les Greenhalgh – Dynamics Solution Architect

Demonstrating the building of a low-code/no-code Microsoft PowerApp with text recognition, using Azure Cognitive Services

 

The cloudThing team recently attended an all-day Microsoft Hackathon to demonstrate how easy it is to use the Microsoft Stack, Power Apps and the Power Platform to create low code/no code apps.

The various teams involved weren’t given a brief till first thing in the morning (so there was absolutely no prep time) with just four hours to design and create a solution.

For those of you who have never attended a Hackathon hosted by Microsoft before, these briefs typically take the form of a business scenario or problem to solve… You then go away and, as a team, come back with your finished solution, built using the Microsoft Stack, present it to the other 70 or 80 attendees, all of whom have done the same thing, and then (the fun bit) have your solution judged.

 

The brief on the day was to create an app that could be useful during a COVID-19 world.

Our intrepid cloudThing team decided to build an app that would help get people, their cars and their rubbish into and out of recycling centres whilst in lockdown.

They chose this because the scenario had actually happened to one of the team recently, who’d (believe it or not) had to use a re-purposed theatre booking system to book a time slot at his local tip; then, on arrival, the staff were waiting with big clipboards and reams of paper and license plate numbers to check people in and out.

Not an ideal situation we’re sure you’d agree!

 

In traditional cloudThing fashion they rolled up their sleeves and decided to fix that problem… in under four hours!

The Hackathon was on!

 

Their chosen solution was a website (Power Apps Portal) within the Microsoft Power App Platform that an end user could access to book a time slot at a recycling centre.

Their data would then be stored in the Common Data Service (more on that in a minute) but the main thrust of the solution on the day was automatic text recognition for the staff checking in cars.

The app would allow staff at the recycling centre to take a photo of a car coming in and automatically check if that person had a booking and if so, what time they were booked in for.

 

The first thing they did was to head to make.powerapps.com where they created a demo environment.

The great thing with PowerApps is that you can just go on to the Microsoft Dynamics trial section and say… “I’d like to play around with Dynamics and the Power Platform please” and it’ll create a temporary area for you to play in that’s free for the first thirty days.

 

The next step was to head to entities and, as you can see below, select bookings.

 

 

From there, the team guessed a few details that would be helpful to know about someone visiting a recycling centre (or tip if you prefer).

 

 

At this point (just in case any existing or potential clients are reading this) we should point out that cloudThing aren’t in the habit of guessing anything. If this were a real project we’d have conducted a lot of research into your organisation, with our Business Architects and Solution Architects building out both staff and customer personas for how the app would be used… but the team only had four hours, so we’ll give them a bit of a break!

Our Power App Pro builders, as they’d been trained to do, first defined the problem (managing queues in and out of recycling centres) before building user personas and designing a solution with a clearly defined outcome of what the app should achieve as an end result.

 

Once the fields they wanted to capture had been structured (and adding new fields was literally just a case of clicking the add button, typing the field name and pressing save) they just needed to head to the Apps tab (shown below) and start building.

 

 

As mentioned, all the data was to be held in the Common Data Service, so once the entity was connected Power Apps would automatically start to build a default app with loads of options for customisation.

 

 

Once you click connect (bottom right in the image above) PowerApps will head off to read the entity and create a mini app that should allow you to view records within that entity, edit them and add new ones.

Back in the bad ol’ days of yore (and depending on the sector you’re in) this could have taken a developer anywhere from a day or so to six months, but with Microsoft’s PowerApps Studio and a bit of cloudThing know-how you can see the team accomplished the same result in minutes.

Customising and then integrating the app into specific business processes is a more complicated matter and will likely require a Power Apps Pro builder’s expertise but, as you can see, the basics aren’t insurmountable for anyone with access to the Microsoft Power Platform.

 

 

From here you can see Power Apps has gone away and created three screens.

A browse screen in the middle with a list of all the bookings…

 

 

A detail screen with an expanded view displaying more information about each booking

 

 

And an edit screen that would allow you to amend any booking details you might wish to.

 

At this point not one line of code has been written for this app.

It took about four or five minutes to go from having no app to having a perfectly functioning data-capturing app, and that’s the power of the Power Platform and Power Apps itself!

 

So far, the above is the basic framework you’d need to get an app up and running. We’re now going to demonstrate how easy it is to expand it out.

 

The first screen we’ll add is a blank landing screen for when an end user first opens the app.

To do that, it’s as simple as clicking new screen (you’ll note we’ve still not had to write any code)

 

 

And choosing the kind of screen we want

 

 

We take this blank screen and move it to the very ‘top’ of the application

 

 

As we’ll want this to be the very first thing users see when they open it up on their phones.

From here we can add both images, text and functioning buttons.

 

 

 

(Still no real coding!)

These buttons will give the staff at the recycling centre the option to view bookings or book people in.

Now, making these buttons ‘do something’ is actually incredibly easy.

All you need to do is select the button you want to work on and head over to advanced settings.

 

 

If you choose the OnSelect box, which fires when the user ‘does’ something with the button (i.e. presses it), you can describe what you want to happen.

In this case we want them to navigate somewhere (one of the many pre-built options within the Power Apps toolkit).

 

 

Power Apps will then very helpfully list all the screens you’ve previously built for you to select which one you wish the button to navigate to (if you’re feeling fancy you can even add in a pre-built transition fade).

Right… now let’s get on to the fun bits…

 

 

Rather than manually typing in the number plate you can head to media and select the camera option.

 

 

On a mobile this automatically defaults to your device’s camera, and it’ll do the same thing if you’re on a tablet or laptop.

Once that image is taken by the app it can be saved in a database or sent off to another API or service. Power Apps has always been able to do that, though.

What’s really new and cool is this bit here called AI Builder…

 

 

Power Apps being Power Apps, there’s then lots you can do within this section with the analytics, like:

 

  • Business card reader
  • Form Processor
  • Object detector
  • Text recogniser

 

 

As we all know, there are loads of things within Microsoft Azure’s Cognitive Services that will try to learn or recognise things for you, and many of those have started to be added into the Power Apps framework.

The one we used for this app was text recogniser…

 

 

This camera option, like the last, will automatically default to your device’s standard camera or you can just choose to upload an image.

 

 

Whether you take the image with your phone or upload it directly, it will be saved to Azure in the background and, as you can see from the blue lines, any text that is detected will be automatically read… in this case the registration, the make and GB (as it’s a GB license plate).
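For anyone curious what’s happening under the hood when that image hits Azure, here’s a minimal, illustrative Python sketch of calling the Azure Computer Vision Read API directly. This isn’t the team’s app (no code was needed on the day); the endpoint, key and API version shown are placeholders you’d swap for your own.

    # Sketch only: what the AI Builder text recogniser roughly does behind the
    # scenes - send an image to the Read API, then poll for the extracted text.
    import time
    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "<your-cognitive-services-key>"                             # placeholder

    def read_number_plate(image_bytes: bytes) -> list[str]:
        headers = {
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        }
        # Submit the image; the service replies with an operation URL to poll.
        submit = requests.post(
            f"{ENDPOINT}/vision/v3.2/read/analyze", headers=headers, data=image_bytes
        )
        submit.raise_for_status()
        operation_url = submit.headers["Operation-Location"]

        # Poll until the asynchronous OCR operation has finished.
        while True:
            result = requests.get(
                operation_url, headers={"Ocp-Apim-Subscription-Key": KEY}
            ).json()
            if result["status"] in ("succeeded", "failed"):
                break
            time.sleep(1)

        # Flatten every detected line of text (registration, make, 'GB', etc.).
        lines = []
        for page in result.get("analyzeResult", {}).get("readResults", []):
            lines.extend(line["text"] for line in page["lines"])
        return lines

AI Builder wraps this kind of call up for you, which is exactly why the team could use it in a four-hour Hackathon without writing any of the above.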

The next step is being able to ‘do’ something with that information.

 

 

From here we select the text recogniser option on the left and, in the OnChange box, tell it to put whatever the user has selected into a variable.

Variables don’t have to be created in advance (we’re very much still in the realm of low code/no code here); all you need to do is type set, then a variable name such as number plate, and Power Apps will do the rest for you.

 

 

 

As you can see, the end user still has the ability to override the license plate if it’s read wrong due to it being dirty or the photo being blurry etc.

 

 

As the App works in real time, once the registration is recognised, either through the text recogniser or from being manually submitted by the end user, it will search the database for the booking with that registration and bring it up.
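In the app itself that lookup happens through the Common Data Service connector with no code at all, but as a rough, hedged illustration of what it amounts to, here’s a Python sketch of querying a bookings table over the Dataverse (Common Data Service) Web API. The organisation URL, entity set name and field names below are hypothetical; a real app would use whatever was defined when the entity was created.

    # Sketch only: look up a booking record by a recognised number plate.
    import requests

    ORG_URL = "https://<your-org>.api.crm.dynamics.com/api/data/v9.2"  # placeholder
    ACCESS_TOKEN = "<oauth-bearer-token-from-azure-ad>"                # placeholder

    def find_booking(registration: str) -> dict | None:
        """Return the booking matching the recognised registration, if any."""
        response = requests.get(
            f"{ORG_URL}/cr_bookings",  # hypothetical entity set name
            headers={
                "Authorization": f"Bearer {ACCESS_TOKEN}",
                "Accept": "application/json",
                "OData-MaxVersion": "4.0",
                "OData-Version": "4.0",
            },
            params={
                "$filter": f"cr_registration eq '{registration}'",  # hypothetical field
                "$select": "cr_registration,cr_slotstart,cr_name",  # hypothetical fields
                "$top": "1",
            },
        )
        response.raise_for_status()
        records = response.json().get("value", [])
        return records[0] if records else None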

And there you have it.

A perfectly functioning app in just under an hour.

Obviously there’s a lot more that could be done in terms of styling and, depending on what you’d like the app to do, there’s a whole host of really cool functions you can throw in, but hopefully this has shown you just how easy Microsoft’s Power Apps Platform is to use if you’re a citizen developer looking for low code/no code solutions.

Now’s The Time To Get Excited About Cognitive Search

Cognitive search employs artificial intelligence (AI) to extract relevant information quickly and efficiently from disparate data sets.

 

Before we explain why you should be so excited about Cognitive Search, we should probably take a minute to explain what it is.

What Is Cognitive Search?

Cognitive Search (formerly Azure Search) is the next generation of search, utilising cutting edge artificial intelligence (AI) to both dramatically improve users’ search queries and aid in the extraction of relevant information from disparate data sets.

Its search capabilities go far beyond what a traditional search engine is capable of though, by bringing together various data sources whilst also providing automatic tagging and personalisation of information for your marketing team to use, vastly improving how an organisation can discover, collate and access their data.

How Is Cognitive Search Different From Previous Search Iterations?

At its core, Cognitive Search is based on the same Lucene engine as many other platforms, but Microsoft has added some serious enhancements on top to help you build intelligence into search apps quickly. The default query engine provides many advantages over plain Lucene queries, with built-in NLP-based scoring across text fields.

 

The real power comes in the form of ‘Cognitive Skills’ which allow you to build pipelines for cracking and extracting structured data from your search records before they are indexed. There are some configurable skills which allow you to plug directly into Azure Cognitive Services for common tasks around image analysis and text recognition, but if you want to go further, you can include any kind of web service in the pipeline. This is a powerful option, as it enables you to run custom Machine Learning workflows over your search data without having to worry about writing lots of custom data pipelines.
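To make that concrete, here’s a minimal, hedged Python sketch of querying an index over the Cognitive Search REST API once a skillset has enriched and indexed your content. The service name, index name, api-version and field names are assumptions for illustration only.

    # Sketch only: a free-text query against an enriched Cognitive Search index.
    import requests

    SERVICE = "https://<your-search-service>.search.windows.net"  # placeholder
    INDEX = "enriched-documents"                                  # placeholder
    API_KEY = "<query-key>"                                       # placeholder

    def search(query: str) -> list[dict]:
        response = requests.post(
            f"{SERVICE}/indexes/{INDEX}/docs/search",
            params={"api-version": "2021-04-30-Preview"},  # assumed api-version
            headers={"Content-Type": "application/json", "api-key": API_KEY},
            json={
                "search": query,                   # free text, scored by the engine
                "select": "id,title,keyPhrases",   # fields the skillset populated
                "top": 5,
            },
        )
        response.raise_for_status()
        return response.json()["value"]

A call like search("licence plate recognition") would return the five highest-scoring documents, including any metadata the Cognitive Skills pipeline extracted before indexing.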

What Are The Benefits Of Cognitive Search?

Whilst there are many benefits to Cognitive Search, the main one in our opinion will be the knowledge discovery you can leverage out of your data.

It both improves the relevance of information extracted from data sets whilst also improving the performance of query responses. You can also enable the Knowledge Store, which will store a view of all the metadata extracted in your pipelines, so you can gain additional insight into the data you’ve indexed.

 

Cognitive search makes it really simple to write engaging, semantically rich search applications with minimal effort. I see it as a set of modular tools which can be combined with other ML services to allow users to query data in ways which were previously very complex to engineer – Greg Roberts – Data Scientist, cloudThing

 

Why Is Cognitive Search Important?

Due to the constant advancements in Machine learning and Artificial Intelligence technologies, systems using either a keyword-based search or a traditional enterprise search can no longer keep up with the amount of data that organisations hold and process.

In fact they’ve actually started to hinder organisations that need to quickly process large amounts of data by constantly returning too many results for search queries, returning results that are often irrelevant, incomplete or too vague, thus wasting employee time as they then have to manually sift through the returned results for what they were actually looking for.

What’s the point of making a search if you then have to search through the search results?

 

That’s where Cognitive Search steps in.

 

With Cognitive Search technology, the AI behind it is able to delve deeper and understand the user’s intent, pull advanced meaning from content and learn from past searches to consistently provide concise and relevant search results.

Other benefits to Cognitive Search are:

 

  • Increased Productivity: Cognitive Search enables a single search functionality, which means users no longer need to switch between apps whilst looking for results.
    This saves time not having to re-enter log-in credentials, saves time switching between apps and saves time by receiving better search results the first time round.
  • Employee Satisfaction: If your employees are spending their entire day searching through your databases, then sifting manually through returned search results for what they actually needed, they’re going to be feeling pretty frustrated.
    Doing away with that will raise employee satisfaction levels, make them more productive as a result and, as a bonus, increase your staff retention rates.
  • Business Operation Costs Are Lower: Not to belabour an obvious point but increasing productivity will decrease your organisation’s operational costs as time and resources are saved whilst gathering data.

How Does Cognitive Search Work?

The great thing about Cognitive Search is that it can be built on top of existing enterprise search infrastructures, and not having to re-build your entire IT architecture is always a nice bonus!

 

Cognitive Search’s AI tech will just layer on top of your current search functionality, allowing your staff to find relevant information across all of your organisation’s data.

Where it really comes into its own though is its use of NLP (Natural Language Processing) to analyse and decipher your organisation’s unstructured data, heterogeneous document data, and rich media like video or images to return meaningful search results.

Its Machine Learning then learns from your employees’ frequent searches, returning even more relevant results as it becomes more embedded in your business processes.

 

Depending on the size of your organisation or the amount of data you hold, Cognitive Search is either a ‘would be nice’ luxury or a ‘must have’ necessity.

 

If you’re bringing in hundreds of new customers daily it won’t be long before your CRM becomes bloated, and deriving actionable business intelligence from all that data will be nigh on impossible without the help of Cognitive Search, especially as you parse your data up into smaller and smaller cohorts or segments.

The same can be said for organisations that attract relatively fewer clients, but those clients hold a lot of data… the health or legal sectors for instance.

If You Don’t Have An Automated Deployment Process… You’re Already Obsolete

Speed, quality, trust and adaptability are how a modern release process should be described

If you’ve ever been involved with the deployment of software, whether as a Software Designer, Project Manager or as part of the Quality Assurance team then you’ll know it includes a myriad of different processes and complexities, all of which need to come together at just the perfect moment for a truly successful release.

That could include a release of a software application, building cloud infrastructure from code or even the continuous deployment of both application and infrastructure code into the cloud.

 

However, being able to stay on top of all that and still push releases manually just isn’t viable in today’s world of fast paced Digital Transformations; where if a solution takes too long to complete, it’ll be obsolete before it’s ever released.

That’s why an Automated Deployment process long ago became the standard for software organisations like cloudThing.

It streamlines the entire release process and makes a deployment measurable, predictable, repeatable and problem free.

 

The beauty of an automated deployment is that you can code out the process once, make sure everything is exactly how you envisioned it and then embed said process into your organisation’s governance, so that it doesn’t matter if someone’s been there ten years or ten days, your processes for a release are both easy to understand and run with.

 

Automated deployments tend to go hand in hand with companies that have adopted a DevOps approach to working.

DevOps is a great way to ensure your software developers are working hand in hand with your IT teams, and an automated deployment pipeline is the springboard into DevOps.

What Is An Automated Deployment?

An Automated Deployment cycle then, is what makes a culture of DevOps and Continuous Improvement possible.

It creates a route to the shortest possible release cycle.

 

It tends to help if you think of it in terms of a row of dominoes….

Not everything can be automated of course, but if you imagine the software developers setting up a line of dominoes (a release of software) then an automated deployment is the moment someone knocks that first domino over.

 

Solutions these days, especially when using Microsoft Dynamics, can be very complicated, with each individual change affecting everything else in a myriad of (sometimes unexpected) ways.

Automating that process means everything is quicker and more reliable, meaning you get to continually improve your systems rather than waiting and doing everything manually, which can often take months.

Barriers To An Automated Deployment

So if automated deployments are so great, why isn’t everyone using them?

It’s a good question.

A lot of organisations are put off the idea of an automated deployment process by the worry that it’ll cost them too much money.

Money to set it up, money to configure it just-so and then more money to maintain it.

In their eyes, the long-term benefits never seem to outweigh the short-term costs.

Well…cloudThing only have one thing to say to that…

 

Focusing purely on the short term is no way to future proof your organisation, something we like to call Build Future, but… if you still need help convincing someone of the benefits to an Automated Deployment Process, then read on…

5 Benefits To An Automated Deployment Process

  • Automated Deployments Cut Out Human Error – Manual deployments require human intervention and unfortunately, no matter how great your team… to err is human (at least according to Alexander Pope).
    Vital steps can be missed, quality assurance can be skipped or out of date versions of the software can be uploaded.
    The benefits of an Automated Deployment are that they don’t deviate from stated governance.
    Once it’s configured, it’s configured.
    If the deployment works the first time, it’ll still be working the 1,000th time.
  • Anyone Can Do It – The knowledge of how an Automated Deployment works isn’t siloed away in a senior developer’s brain, it’s programmed right into the system. That means it doesn’t have to be the responsibility of just one person, falling to pieces if they’re off ill, on holiday or have left.
    With an Automated Deployment system all you need do is hit the big red button that says ‘Deploy’.
  • Developing Software Can Mean Developing Software Again – Creating, testing and then deploying a manual release can be a painful, time consuming, often thankless process.
    It’s also a job that often gets given to the original software developers whose time is far better spent on other tasks like developing really cool solutions for your organisation.
  • Automated Deployments Are Agile – Not only are automated Deployments repeatable, they’re also infinitely configurable.
    That means that, whilst the underlying release process stays the same, the intended environment or machine it’s released to can be changed as needed.
    This is ideal for releasing to test environments in-house or multiple clients (if you’re an agency) without having to overhaul the entire system every time.
  • Automated Deployments Are Fast – Automated Deployments empower Continuous Improvement.
    A single deployment, once configured, has little or no overhead.
    A release process with no overhead can be repeated much more frequently leading to a continuous stream of awesome upgrades and improvements being released onto your system.

 

Remember those overheads you were worrying about?

If you’re considering an Automated Deployment process, then it means one of two things.

You either have in house software developers or you’re looking for a partner to help you.

 

If you have in-house developers, then the only real overhead is in their time to initially configure the automated deployment and that time will be paid back in dividends once it’s all set up.

And if you’re looking for a partner with an Automated Release process well… feel free to get in touch…

Tips & Tricks To Creating Successful Volunteer Management Systems

A volunteer management process needs to save time & money whilst empowering the volunteers

 

Most NonProfit organisations couldn’t exist without volunteers. Even if they could, their fundraising (or other vital activities) would be severely curtailed.

And yet, many charities still have only a basic (or even non-existent) understanding of how their volunteer management processes work.

That flaw has been particularly highlighted in a post COVID-19 world, with an increase in volunteerism combined with a need to effectively manage workers remotely.

 

An effective volunteer management system is vital, then, to getting the most out of your volunteers.

If set up correctly, a volunteer management system can increase volunteer engagement, reduce volunteer churn and, most importantly, free up volunteer administrators from the hum-drum, day-to-day activities of managing volunteers and let them get on with the work that can really make a difference to your chosen cause.

 

Where to start though?

The secret to any successful volunteer management system is, before you begin, to set out very clear and achievable goals.

This will help the entire process keep on track with everyone involved knowing where you’re heading and why.

Those goals could include:

 

  • Increasing volunteer acquisition rates
  • Reducing volunteer churn
  • Improving volunteer engagement
  • Improving communication channels between your organisation and its volunteers
  • Attracting new donors

 

The best route for both increasing volunteer acquisition rates and decreasing volunteer churn is to ensure an engaged volunteer base.

Volunteers who actively engage with your organisation will obviously be less likely to churn but also be much more likely to become outspoken advocates for both your organisation and cause, helping with volunteer acquisition and most importantly, the long-term goals of your charity.

The flip side to that is one of disengaged volunteers, who can cost both time and money in terms of managing them, training them and then trying to retain them.

 

The best way to achieve an engaged volunteer base, on any kind of large scale, is to have a modern, automated engagement portal for your volunteers to engage with you on and for you to be able to communicate with them on a day-to-day basis.

It should go without saying that this kind of portal is also particularly effective at communicating with your volunteers remotely, especially if they happen to be scattered geographically.

 

An online volunteer management app/system is great for driving engagement as it’s a cheap and easy way of communicating with your volunteers en masse.

It’s also a great place to drive engagement:

 

  • Your organisation could offer incentives/rewards through the portal for those that have donated their time
  • The benefits of instantaneous communication should never be overlooked for encouraging an engaged volunteer base. If someone’s prepared to donate their time (and likely money) to your cause, then nothing will leave them feeling less appreciated than not hearing from you from one month to the next.
  • Another benefit to an online portal for managing your volunteers is that you can also enable communication amongst themselves. Nothing builds a sense of community and camaraderie like the ability to make friendships with like minded people.
  • An online portal will also allow your volunteers to create and customise a profile. That profile could include a list of skills that may be useful to your Non-Profit.
    Running an event in a foreign country? At the click of a button you can search your volunteer database for anyone with the relevant language skills.
    Constructing a building?
    With the click of a button you can search your volunteers’ profiles for anyone proficient in carpentry.
  • Nothing is more frustrating than wanting to help a charity but not knowing where to start. An efficient volunteer management process will make for a pain-free, streamlined onboarding process for new volunteers, cutting down on the overheads of having to do it manually.
  • Automating your volunteer management systems means you can also integrate them with your Not-for-Profit’s CRM system, which will allow for in-depth and actionable insights on both your volunteers and donors.

 

We mentioned at the start of this article the need to reduce administration costs.

Not-for-Profit organisations have a duty to spend as much of the money they’ve raised on their chosen cause as they can. To a lot of people that can mean avoiding any large outlays of capital.

However, a fully automated volunteer management system can massively reduce manual overheads.

Think for a minute about how many workhours go into the administration of new volunteers, completing paperwork, adding them to the CRM, onboarding them.

Now think how much additional work your staff could get done if that entire process became automated…

 

Creating and utilising an effective volunteer management system is an investment a modern Not-for-Profit organisation can’t do without, especially in a post COVID-19 world.

People today demand instantaneous communication and the ability to make a difference at the touch of a button… your volunteers will be no different.

 

Get the system right and you’ll improve relations with your volunteers, potentially encourage more donors (as most volunteers tend to be both), increase retention rates, get actionable intelligence and still reduce overall costs.

What Is Solution Architecture?

Solution Architecture is vital in minimising the number of steps in a project that need to be changed, fixed or created.

 

The phrase Solution Architecture can mean many different things to many different people, with the entire matter being further complicated by Solution Architects themselves defining the role differently depending on the project they’re working on at the time or the company/sector they work within.

 

You see, there is no one ‘official’ industry definition of what Solution Architecture is… but when has that ever stopped us?

So today we’re going to take a crack at defining what Solution Architecture is anyway…

What Is A Solution Architect?

At cloudThing we define Solution Architecture as scoping out and documenting the ‘architecture’ of a system, delivered in such a way as to provide a solution to a specific problem (or set of problems).

Said solution can encompass the entire system or just its specific parts.

Our Solution Architects use a set of guiding principles that shape our solution design and the decision-making processes in our Digital Transformation programmes, informing our Enterprise Architecture, development policies and governance as a means to help realise common programme benefits.

We try to keep our architecture principles as high-level definitions of fundamental values.

Those principles then enable us to clearly articulate our high-level approach and efficiently deliver high-quality solutions for internal product development and customer projects alike.

The principles also direct some of our key organisational strategies, whilst being simple, consistent, flexible, enduring and useful.

The idea being they should never hinder progress but instead, support it by directing decision making and resources.

We also review these solution principles annually to ensure they remain relevant to current best practices and our customers’ strategic goals.

What Does A Solution Architect Do?

To put that in a way that makes sense: cloudThing’s Solution Architects will be involved with your project from the initial brief right through to the final releases and sign-off of the project, considering along the way multiple viewpoints including business, information and technical.

They’ll work closely with all the key stakeholders involved in the project to scope out the brief and then translate that into a detailed solution vision.

 

That means they’ll be involved in the solution ideation, the solution design and the solution implementation.

 

  • Solution Ideation: Our Solution Architects will look beyond the initial brief to put the project into a wider business context for the end solution and then define the vision and requirements needed for that solution to become a reality.
  • Solution Design: During the solution design process the Solution Architect will formulate and elaborate on potential options which may or may not include initial prototype developments. They’ll then select the optimal solution for success and finalise a roadmap for the chosen solution.
  • Solution Implementation: During the implementation process the Solution Architect will maintain communication with all the project stakeholders and guide the implementation team to ensure the original vision is maintained.

Why Do I Need A Solution Architect?

A new project your organisation wants to implement, or a problem it may be having, won’t ever sit in isolation.

Understanding your business architecture and putting the project into a wider context is what cloudThing’s Solution Architects do.

Using these processes they gain a deeper understanding of how your business works, understanding the whys and wherefores of what you need, and then use that understanding to ensure all the necessary processes and actions happen so that the desired end goal is reached.

All project or product design and delivery activity benefits significantly from collaboratively creating and agreeing to operate using a set of shared foundational beliefs… cloudThing just happen to have codified and named theirs.

 

Solution Architecture is vital in minimising the number of steps in a project that need to be changed, fixed or created.

Ultimately this deeper understanding of your business results in less work during the design and implementation of the project, allowing our clients to make efficiency savings from the outset.

If Nothing’s Going Wrong, You’re Not Innovating Enough

Digital Transformation has existed as a mainstream term in the IT sector for a long time and even before that, the phrase used was ‘channel-shift’.
Mike Eckersley – Business Architect, cloudThing

 

Digital Transformation is a term for a project or strategy taken on by an organisation to change the way it provides services to customers, or how its staff interact, moving to a new, more efficient medium using technology.

That’s taken many iterations over the years, including moving paperwork to CRM systems, remote working, encouraging customers to use web chat instead of phone calls and too many more to count.

 

I worked in the Housing sector for over twelve years and have seen IT make a big difference to the quality of service delivered to tenants through numerous new approaches.

However, this hasn’t always been the outcome with every project. I saw first-hand projects that went awry despite the best of intentions.

 

Sometimes a project just doesn’t land well with users. It’s either too complicated, too steep a learning curve or such a big change that it becomes a white elephant and users have to end up working around the solution using ‘Shadow IT’ or simply going back to what they always did before.

In a modern world, where people use technology in new ways all the time in their personal lives, this simply shouldn’t be happening with enterprise-level investments in transformational technology.

 

In my opinion, most Digital Transformations come down to mindset, attitude and approach. Mike Eckersley – Business Architect, cloudThing

 

By labelling strategies and projects with ‘transformation’ or ‘channel-shift’ it’s too easy to view them as an end-goal or destination that can be reached like any other milestone in a project.

Back when I was in the Housing sector we had to pick and choose where we spent budgets and we simply didn’t have the resources to go after ‘transformation’ as an end goal in and of itself… and nor should we have.

 

Transformation implies a total re-imagining of an organisation, from the top to the bottom, with new ways of working throughout… an approach that risks ‘throwing the baby out with the bath water’.

We of course wanted to continue to support our tenants in well maintained, safe properties and ensure the organisation was paid on time, through universal credit or otherwise.

 

Whilst what we delivered may look totally different in ten years’ time, it’s through Continuous Improvement and incremental changes that transformation must be led… not pursuing it through one project alone.

 

Since moving from the Housing Sector to software development and Microsoft Dynamics 365 Gold partner cloudThing, I’ve seen projects delivered through truly agile methodologies.

I believe this is an approach still not mastered by most and whilst individual projects may be ‘labelled’ as agile, the true benefits of the approach come from an entire organisation adopting agile to achieve quick wins and incremental steps towards an overall transformation.

 

I’ve been lucky enough to be involved with customers across multiple sectors in recent years and I’ve seen first-hand how a different approach to transformation and IT projects in general can make a dramatic difference to success.

Our work with the Institute of Chartered Accountants in England and Wales, as well as the South African Institute of Chartered Accountants, involves big transformational deals lasting multiple years, but these have been split up into discrete project phases, each with measurable success factors.

This granular approach to the big picture as well as the small means that every feature of each solution must be delivering today, whilst also building for tomorrow.

This means that if priorities or resources change, we can shift what will be delivered when, whilst still moving towards the organisation’s overall goals.

 

Treating Digital Transformation as a project managed with a ‘waterfall’ approach is simply never going to achieve the required result.

Waterfall relies on agreed and locked-down success factors from top to bottom and working back from there, which can work with software development but not when applied to a business strategy involving technologies which can change in as little as twelve months.

Proclaiming that, to transform, an organisation must have 90% of customers interacting with it through digital methods makes sense as an overall goal, but digital can take so many forms that it’s in the steps to achieving it that we arrive at genuine business value.

 

With the advent of chat bots, voice assistants, A.I and more, technology has never been more accessible to everybody.

Siloed projects that deliver these expensive new technologies or services which don’t integrate with existing infrastructures often lead to simpler, more cost-effective solutions being missed altogether.

Through modular, incremental improvements, creative SMEs can really make a difference.

Instead of committing budget to a huge transformational project; starting small and building on each success will ensure user buy-in and a transition to digital that is manageable and built on processes that are familiar to staff involved.

 

What I’ve seen over the years is that the success of any project is dependent on an ability to quickly prove its business value to the wider organisation.

Taking the approach of building incrementally on existing infrastructure, through various pieces of software and integration, means that the sector can shift away from reliance on large, inflexible outsourcing agreements and software and instead deliver fixed-cost, manageable investments with clear goals and minimal risks on the journey to transformation.

 

By lowering the risk to innovation through smaller ongoing investments, both SMEs and enterprise-level organisations are free to quickly test what works and what doesn’t, without the worry that jobs are on the line if something fails.

Through cloud technology, we’re free to spin up and test solutions which were in the realms of science fiction only a decade ago.

These projects need no added hardware investment, only some integration work to build into existing systems, meaning there are no added screens for users, no skill gaps for supporting new hardware… just room to test and see what makes an impact on improving services.

 

This, for me, is the best approach to deal with IT projects going wrong.

We must accept that when trying new ‘things’ sometimes they’ll fail.

Lowering the risk, pushing forward and searching for the projects that truly work is how we’ll best transform over time.

Working with innovative SMEs and their creativity, combining it with familiar, off-the-shelf solutions, and not being afraid to make mistakes along the way is the real route to Digital Transformation.

The Five Types Of Cyber Criminals

What are the different types of Cyber Actors and how can you protect against them?

 

Although the term Cyber Criminal gets thrown around a lot, most people don’t realise that a Cyber Criminal is only one heading in a much larger category of individuals and groups known as Cyber Actors.

What’s A Cyber Actor?

Cyber Actors can be either individuals or part of a much larger group, normally characterised by the desire to damage a person’s or organisation’s computers, devices, systems or networks (in short, not very nice people)!

The broader term Cyber Actor can be used to stand for them all, or often gets broken down into five distinct groups, with the different categories being defined by their motivations rather than the tactics or tools they use.

Fortunately, that means strong Cyber Security protocols usually work the same against them all.

What Are Cyber Criminals?

The first, and most common, term used is Cyber Criminal.

No doubt you’ll have come across this before, may even have fallen victim to one of their scams in fact.

They tend to be motivated by profit and greed and all Cyber Security experts agree, they pose a very real and present danger to users everywhere.

Common tactics employed by Cyber Criminals include selling illegally obtained data, disrupting systems through Denial of Service (DoS) attacks, holding systems and data to ransom, and other nefarious scams involving phishing for data such as social engineering, business email compromise (BEC), botnets, brute force password attacks, exploit kits, malware, ransomware and unfortunately a lot, lot more.

What’s A Malicious Insider?

A malicious insider is at once the easiest and hardest of the categories to protect your organisation against.

They tend to be disgruntled (or just malicious) ex-employees, contractors, agencies or anyone else who may have had access to your systems, networks or data.

A Malicious Insider is best defined as someone who intentionally misuses or exceeds the access you’ve granted them, either for personal profit or in an attempt to hurt your organisation.

It should be pointed out that there is a clear difference between a ‘Malicious Insider’, acting in the full knowledge of what they’re doing, and an ‘Unwitting Insider’ accidentally clicking on a link in a dodgy email.

 

Steps to combat Malicious Insiders usually involve governance that instantly revokes the credentials of anyone whose business with your organisation is done, as well as changing any joint passwords they may have had access to (although preferably your organisation wouldn’t have any joint passwords in the first place).

What Are Nation State Actors?

The vast majority of people need not worry about Nation State Actors as you’re unlikely to ever come to their attention, although in recent years you may have read in the news about Nation State Actors on social media trying to influence foreign elections and the like.

However, depending on the size of your organisation or its prominence within a geographic/political territory, you may wish to make specific Cyber Security plans against Nation State Actors, as they can be very well funded, with a lot of resources at their disposal.

A Nation State Actor can be most easily recognised by their targeting of public and private sector networks in an attempt to compromise, steal, change or otherwise destroy data (espionage, in other words), and by being motivated by political, economic or military ideologies.

They can be both a direct department of a nation state or just receive covert funding, direction or technical advice from one.

 

Some Cyber Security experts still use the term Nation State Actor interchangeably with Advanced Persistent Threats (APTs), but APT tends to refer to a specific type of activity which can be conducted by a variety of different Cyber Actors, as it’s normally defined as someone who’s gained long-term access to your system or network.

What’s A Hacktivist?

Hacktivists are individuals or groups that tend to have a lot of self-taught cyber skills behind them and can be best defined as ideologically motivated Cyber Criminals.

Rather than doing what they do out of a desire for personal gain though; they do it from a standpoint of political, social or ideological motivation; targeting both individuals or organisations whom they feel deserve their wrath.

Common methods used by Hacktivists will include Denial of Service attacks, doxing (the practice of researching and then publicly broadcasting private or identifying information about an individual or organization) and website defacements.

If your organisation works in a sector that inspires a lot of extreme feelings, on either side of the fence, then it’s well worth considering Hacktivist attacks in your Cyber Security strategies.

What Are Cyber Terrorists?

The final category of Cyber Actors is Cyber Terrorists.

Sometimes confused with Hacktivists, in reality a Cyber Terrorist is just another word for a terrorist. Whilst cyber attacks by terrorist organisations are becoming more common, their primary motivation online currently remains the dissemination of their ideologies and goals, as well as the recruitment of new members.

 

The talent, knowledge, abilities and resources of the various different types of Cyber Actors can vary wildly, as can their motivations for targeting your organisation.

As part of your Cyber Security strategy you need to consider what you do, why you do it and what kind of Cyber Actor that may encourage to try and attack your organisation.

From there you can take steps to protect your systems, networks and sensitive data.

If your primary goal is profit led, it stands to reason you’re more likely to attract the attention of a Cyber Criminal.

If you’re a Non-Profit organisation it may be you attract more hacktivist attacks.

 

All these things need to be considered when putting (or updating) your Cyber Security Protocols in place.

Privacy By Design – What You Need To Know

Privacy-By-Design… We can hear you groaning already!

If you have absolutely anything to do with handling private data in your organisation then we’re sure the 25th May 2018, the day GDPR came into force, has been indelibly burned into your mind.

Companies, organisations and businesses were scrambling to secure their data to comply with the new regulations (and you’d be surprised how many still are), but it didn’t have to be that difficult.

That’s where Privacy-By-Design steps in to help…

What Is Privacy-By-Design?

Privacy-By-Design is an approach to creating a system that empowers data protection, privacy compliance and an individual’s right to privacy from the get-go.

Under Privacy-By-Design, protecting and anonymising data isn’t something that’s just bolted on at the very end of a project (if at all). Instead it becomes an integral part of both the current project and your organisation’s culture going forwards.

It’s worth noting here that although Privacy-By-Design isn’t called for by name under GDPR (the regulation instead refers to ‘data protection by design and by default’), the benefits to its implementation within your organisation will be immeasurable when conforming to Data Privacy legislation (we’ll come back to this point at the end of this article).

 

Problems with Privacy-By-Design come when attempting to implement it with older, less secure systems.

Many organisations still struggle with legacy issues when introducing the principles of Privacy-By-Design and that’s where the experience of a privacy specialist partner can prove invaluable (*cough, shameless cloudThing plug, cough).

The reason organisations struggle is that a lot of older systems can’t enable or support modern data security best practices which help maintain confidentiality, integrity and the availability of data.

The solution then becomes one of either trying to add patch over patch to make it work, or stepping back and seeking a way to integrate it into those same legacy systems that mitigates data breaches and keeps your organisation compliant with GDPR (or whichever legislation is applicable to your region).

Privacy-By-Design’s Foundational Principles

Privacy-By-Design can, perhaps, best be defined by looking deeper into each of its seven foundational principles…

 

  • Proactive Not Reactive; Preventive Not Remedial – Any approach to Privacy-By-Design should be proactive, not reactive. Rather than responding to privacy concerns as they occur, a Privacy-By-Design enabled system should try to anticipate and then prevent any invasive practices before they occur. It’s not there to help you respond to risks or breaches once they’ve occurred; its purpose is to make sure they don’t occur in the first place.
  • Privacy As A Default Setting – A Privacy-By-Design system should put an individual’s privacy first (the clues in the name!). If an individual does nothing, their privacy should still remain intact without having to sign in, opt out, re-register or unsubscribe. The individual’s privacy needs should come first, by default, never as an afterthought.
  • Privacy Embedded Into Design – Privacy-By-Design shouldn’t just be a cultural goal for your organisation. It should be embedded into the very design and Business Architecture of your IT systems and entire organisation. It should never be seen as a nuisance or a reactive protocol but instead a core component of all your Business and IT architecture.
  • Full Functionality; Positive-Sum, Not Zero-Sum – Any Privacy-By-Design system that you implement into your organisation should, by default, seek to support all legitimate interests and goals your organisation has in a positive-sum (or win-win) manner. Conversations should never be held about trade-offs regarding goals, functionality or privacy (a zero-sum approach). Privacy-By-Design skips over any seemingly contradictory goals, such as privacy vs security, instead making sure both are possible to achieve in a complementary fashion.
  • End-To-End Security: Full Lifecycle Protection – Privacy-By-Design isn’t a one-time thing that an organisation can just ‘do’ then move on. It’s something that should extend throughout the lifecycle of the data you hold and the systems you hold it on. It should ensure that your systems are compliant for the entire lifecycle of the data you hold, erasing it in a timely fashion as well as ensuring your system stays private and secure with any future updates you might implement.
  • Visibility & Transparency, Keep It Open – Privacy-By-Design should give an organisation confidence in their business practices, technology and culture. Confidence that they’re being operated in a way that aligns them with the organisation’s goals whilst providing complete transparency to both staff and end users alike.
  • Respect For User Privacy, Maintain A User-Centric Perspective – Before anything else though, Privacy-By-Design should require all Business Architects involved with an organisation (both internal and external) and system operators to keep the interests of the end-user at the forefront of their mind.

Why Is Privacy-By-Design So Important?

As we’ve already mentioned, Privacy-By-Design isn’t (yet) necessary to be GDPR compliant.

However, implementing a Privacy-By-Design culture within your organisation will help you both be, and stay, compliant much more successfully than any other method.

It’s a powerful tool in both mitigating potential GDPR breaches and building trust with the public.

Creating a Privacy-By-Design system that places privacy above all else has multiple benefits, including…

 

  • It helps identify privacy risks early, allowing developers to adapt and change your systems to address issues before they become organisation-wide (and thus much more costly to fix).
  • It will increase awareness of data protection, GDPR and privacy in general across your organisation, helping with brand reputation.
  • It will have immeasurable benefit in showcasing how your organisation has met its legal obligations should you be called upon to demonstrate them, either by the ICO after a Data Breach or by a potential new client undertaking due diligence.

 

Ultimately GDPR will continue to evolve (and more and more countries will adopt similar legislation if they haven’t already).

Privacy is going to be the key issue that concerns consumers in the coming years.

 

Instead of adapting to new regulations as and when they become law, Privacy-By-Design allows your organisation to get ahead of that and focus on more important goals by future proofing your business now, something we at cloudThing refer to as Build Future.

 

We talk a lot about Big Data, Machine Learning, Deep Learning and Artificial Intelligence and in the coming years those terms will become standard for most sectors and industries but are going to open an organisation up to a world of hurt if they haven’t yet sorted out protecting an individual’s right to privacy.

That’s why Privacy-By-Design is the solution you need; if not now, then soon.

Security-By-Design: Or… Better Safe Than Sorry!

Far too often security is the final afterthought of a Digital Transformation project

 

Software Developer: “I’ve built this really cool ‘feature’; now I must make it secure!”

Security Architect: Facepalm!

 

Sound familiar?

You may have come across it in your own Digital Transformation project or, (hopefully not) been a victim of this kind of thinking further down the line when it was far too late to do anything truly effective about it without spending a fortune in time and resources retrofitting a new solution.

That’s where one of cloudThing’s guiding principles comes in… Security-By-Design.

 

In recent years it’s been good to see that Security-By-Design has started to gain a lot more prominence, becoming a mainstream development approach for many that aims to make a system secure from the very start, rather than scrambling to patch up vulnerabilities as they’re noticed, either at the end of a project or, worse, during a breach.

It’s an approach to software (and hardware) development with a stated aim of making a system as free from vulnerabilities as possible; ideally making it impervious to attack through measures such as Continuous Improvement (or, in cloudThing’s parlance, Build Future), Continuous Testing, multifactor authentication safeguards and strict adherence to software development best practices.

 

Sounds great doesn’t it?

Unfortunately, Security-By-Design is still very much in its infancy, with many developers still only giving it a passing acknowledgement.

Far too often at cloudThing, when speaking to new clients, our software developers come across the same security errors and vulnerabilities time and time again.

 

Does this mean software developers are just lazy by nature? Or incompetent?

Of course not!

 

The problem is often one of culture and what various, different departments are held accountable for.

When starting a project, the development team will be asked to build a ‘feature’ and all their time and effort will likely go into making that ‘feature’ as great as possible.

Often security won’t be an issue till long after the ‘feature’ has gone live, so it receives little attention in development stages.

 

You see the problem that cloudThing’s founders saw a long time ago though, don’t you?

That’s no way to future proof a business – or Build Future as we say here.

What Is Security-By-Design?

Security-By-Design is the opposite of Security-After-The-Fact.

Security-By-Design is defined as an approach to software development in which security is built into the system from the very beginning.

When considering a Digital Transformation project, a company that prioritises Security-By-Design (*cough, cloudThing, cough) will create software that’s been built from the ground up to be secure.

A risk led approach will favour considering, adapting, rejecting, testing and finally optimising multiple, different, security controls and then ensuring only the very best are built into the project’s architecture throughout its design, whilst being used as guiding governance by the software developers involved. With each new release or patch that comes after that, the security of the release and how it interacts with the system as a whole will be a primary concern.

 

You see, Cyber actors/Cyber criminals are lazy.

They’ll always target organisations that offer them up the path of least resistance.

That means, when attacking a system, they’ll likely use well known and predictable tactics, tools and patterns, known in the industry as reusable techniques.

Any Security Auditor worth their salt can apply security controls to combat these threats against a system by utilising approaches such as enforcing multifactor authentication, authorization, confidentiality, data integrity, privacy, accountability, safety and non-repudiation requirements for if/when your organisation comes under attack.

 

Think of it as though you’re building a bank if you like…

Of course, you want a beautiful building, with gorgeous architecture to attract clients but you don’t just say “oh… we’ve built it now, better throw a padlock on the front door”.

When building it you construct foundations that can’t be tunnelled through, walls that are blast proof, all entrances covered by hi-tech security and a great big, state of the art vault in the middle of the building.

That’s the real difference between Security-By-Design and Security-After-The-Fact.

Why Is Security-By-Design Important?

Well, as already mentioned, the obvious answer to that question is that a system built to Security-By-Design principles is much more secure… by several orders of magnitude in fact.

And, although that’s a great reason, it’s not the only one…

 

Security-By-Design will actually reduce your overall costs and mitigate many future risks.

Think about the last project you were involved in.

We’re willing to bet that the last month or so is where you faced the most budget and time constraints.

Right?

Ask yourself… Is that really the best place to be considering the security of your entire system and organisation? (You don’t need to answer that by the way, the answer’s pretty obvious).

 

A Security-By-Design system will always end up more resilient than a hastily added patch at the end of a project as, by implementing security measures in a step by step process throughout the project, you allow your designers to identify security flaws as they go, enabling them to quickly, easily (and cheaply) fix them, rather than having to overhaul the entire project at the end.

Identifying security related bugs early means they can be on the lookout for similar flaws, preventing further problems in the build process, or worse production.

 

Finally, the last point many forget when building a new system is that it isn’t an ‘end-goal’ in and of itself. It will continue to organically grow, adapt and evolve over time as your organisation does.

If you’ve taken a Security-After-The-Fact approach then any future modifications to your system may well invalidate your entire security protocol without you even realising it, creating new risks for your organisation as well as multiple opportunities for malicious cyber actors.

That doesn’t happen with a Security-By-Design approach as your security is an inherent part of the system, not a bunch of controls stuck on around the edges.

Building A Culture Of Security-By-Design

All the above is well and good but skips over the most important step of all, building a culture of security within your organisation.

It has to start with a positive relationship between those commissioning the project and those building it, with everyone’s goals and values being aligned from the off.

 

Security-by-design breaks down traditional development/security silos, making security part of everyone’s role, which means everyone is both empowered and responsible for delivering a secure solution. – Tony Leary – cloudThing Principal Architect

Reducing Member Churn & Delivering Member Insights With Data Science

Discussing Membership Churn and how to deliver valuable Membership Insights with Data Science.

*Transcript from cloudThing’s recent Membership Sector Digital Conference

 

Today I’ll be talking about reducing Membership Churn and how to deliver valuable Membership Insights with Data Science.

Hopefully you’ll come away with an overview of how Membership Organisations (or any organisation, really) can use technology, Data Science and a data-led approach to deliver a really modern way of managing Membership Churn whilst maintaining (and growing) engagement, and how all of that fits into the broader context of data-driven decision making as part of an ongoing strategy.

 

I’ll start by discussing the type of data you should be tracking and why that’s important before walking through some concrete examples of how data can be used to build a Churn management program.

Finally, I’ll cover the crucial piece of this entire topic: organisational ‘buy-in’, and how to build a positive culture around the proactive use of data to drive decisions at a strategic as well as a tactical level.

First though…

What Is Membership Churn?

Membership Churn is the likelihood of an individual ceasing to engage with, or pay dues to, a Membership Organisation for any of a variety of reasons, which results in the organisation having to spend extra resources attracting new members.

It’s also known as Membership Churn Risk, Member Attrition Risk or Member Turnover Probability.

How To Calculate Membership Churn

There are a few different ways of tracking Churn, which I’ll talk about later, but the most straightforward formula for calculating Membership Churn is:

 

  • Membership Churn = No. of members lost over a 12-month period / No. of members at the start of that period
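As a minimal sketch of that formula in code (Python here, purely for illustration rather than any particular tool):

```python
def membership_churn_rate(members_lost: int, members_at_start: int) -> float:
    """Churn rate for a period: members lost divided by members at the start of it."""
    if members_at_start == 0:
        raise ValueError("members_at_start must be greater than zero")
    return members_lost / members_at_start

# Example: 150 members lost over 12 months from a starting base of 2,000
print(f"{membership_churn_rate(150, 2_000):.1%}")  # 7.5%
```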

Membership Churn & Retention… The Challenge

I’d like to really emphasise just why Membership Churn (and retention) is such an important topic, especially for Membership Organisations.

 

As paid for (PPC) channels become the default standard, the cost of new member acquisition is just going to go up and up and as more and more organisations embrace modern multi-channel engagement patterns, the competition for ‘eyeballs’ on your content and adverts is going to increase as well; making proactive membership retention strategies critical to strategic thinking organisations. – Greg Roberts – cloudThing Data Scientist

This applies across all sectors, but especially organisations that are dependent on a recurring Membership model as their primary revenue stream.

 

The first piece of the puzzle you’ll need to solve is in understanding what’s actually going on in your organisation on a day to day basis.

And really, that all comes down to data.

The more complete a picture you have the better equipped you’ll be to make impactful decisions in developing a winning strategy.

The Single Customer View

 

 

You may have heard the ‘Single Customer View’ referred to by different names… The Single User View, The Single Member View… Whatever you call it though, it’s a data object that should be at the heart of your strategy.

It doesn’t matter whether it’s a glorious, all singing, all dancing enterprise data warehouse with dozens of real time data feeds coming in or just a centralized set of spreadsheets that you update manually; the crucial point is that the data is consistent and correct.

Maintaining accurate (correct) data isn’t easy though.

Often you’ll have lots of different systems all talking to each other, making it much more difficult to maintain a single version of the ‘truth’, but consistency and accuracy have to go hand in hand, as maintaining a Single Customer View is all about having one central place where everything matches and gives you a full omni-channel view.

This also means you need to learn to see it as a continually evolving project instead of just a one-off, ‘Big Bang’, get-everything-in-one-place-and-it’s-done job.

That mindset will allow you to create an asset that you can use to drive your strategic goals forwards.

 

You can start this process with just some very basic data that’s easy to get a hold of.

 

  • Demographic information about your members
  • Membership revenue history

 

Even that basic level of data is valuable information that can help you make more informed decisions if viewed correctly, as it starts to hint at what drives people’s behaviour and engagement, and what makes them churn or not.

Once you have a good process established for gathering this level of data (and more importantly for using the data in your day-to-day decision making) you can start prioritising other streams of information to pull in; more aspects of your Members that you want to understand.

 

Acquisition Source is an important one to understand and can probably be accessed fairly straightforwardly. Having access to that data will allow you to segment your churn rate by source, allowing you to better understand the performance and ROI of different acquisition channels.

 

Outbound Engagement is always going to be a central source for any marketing team.

It doesn’t matter whether this is just email opens and clicks or if you can analyse the data right down to fine-grained engagement metrics, it’s not only a useful way of digging into the performance of your outbound activities but can also reveal critical early churn signals.

 

Inbound Engagement can be defined as people reaching out for support or about accessing services or having questions about those services.

This is where things are likely to get a bit more difficult in terms of accessing data.

Inbound Engagement is the area where most organisations will have a multitude of disparate systems that need tying together. Unfortunately, a lot of the time, those systems won’t be set up to make this an easy task but will offer a wealth of information when done correctly.

 

Touchpoint Tagging is the Holy Grail of a Single Customer View.

Once you have all of this data around who people are, what they’ve been doing, how they’ve been engaging with you and the type of thing they’ve been engaging with, you can start to think about categorising those touchpoints in terms of what that engagement actually means for your organisation.

This could be something domain specific like the types of content your members are engaging with or more generic like requests for membership info or even some kind of referral engagement or the offering of a discount leading to additional engagement.

Being able to aggregate by categories of engagement will immediately show the impact of your strategy and help you review the actions taken, making sure that what you’re doing is working, is impactful and is delivering value for both you and your Membership base.

 

In my view that’s what a Single Customer View should always aspire to look like.

Once you have some of that data in place you can start thinking about what you need to measure and (more importantly) how you’ll measure it.

That takes us to something called Key Value Measures…

Key Value Measures

 

 

The way to think about all that data you have in your Single Customer View is as ‘inputs’.

It’s the data that directly refers to touchpoints and is very tangible. It’s contextual and relatable; you likely know what it means in relation to your organisation and what actions you can (or should) take to influence it.

The measurements that you have of your data… those are the Outputs.

They’re the things you want to influence with your different strategies by changing things which can affect the inputs.

That’s why it’s important to have a good strategy for measuring each of these.

 

We’re now going to take a high-level look at some of the more important metrics, how to keep track of them and how to best calculate them.

Subscriber Churn

What is Subscriber Churn?

Subscriber Churn is a way of measuring how many people your organisation loses. The simplest way of measuring that is to take how many members you’ve lost (or have churned) over a certain period and divide that by the total number of people you had at the start of the period you wish to measure.

The key point about this is that it’s generic as it doesn’t define a specific period and it doesn’t necessarily define a member, allowing you to measure this at different levels. It means you could measure this on a weekly, monthly or yearly basis.

How you might break it down is governed not only by your organisational structure but also by what you’re trying to measure. Measuring over a shorter timeframe can tell you how particular campaigns are affecting churn but is subject to some volatility. Measuring over a longer timeframe will smooth out this volatility and allow you to see the longer-term trends.
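As a rough illustration of measuring that over different periods, here’s a small pandas sketch; the member records and column names are hypothetical, but the same function works for weekly, monthly or yearly windows:

```python
import pandas as pd

# Illustrative member records; 'churned' is NaT for members who are still active.
members = pd.DataFrame({
    "member_id": [1, 2, 3, 4, 5],
    "joined":  pd.to_datetime(["2019-01-10", "2019-03-02", "2019-06-15", "2019-07-01", "2019-07-20"]),
    "churned": pd.to_datetime(["2019-08-05", None, "2019-09-12", None, None]),
})

def churn_for_period(df: pd.DataFrame, start: pd.Timestamp, end: pd.Timestamp) -> float:
    """Members lost between start and end, divided by members active at the start."""
    active_at_start = df[(df["joined"] < start) & (df["churned"].isna() | (df["churned"] >= start))]
    lost_in_period = active_at_start[active_at_start["churned"].notna() & (active_at_start["churned"] < end)]
    return len(lost_in_period) / len(active_at_start) if len(active_at_start) else float("nan")

# Monthly view for August 2019; swap the window for a weekly or yearly view.
print(churn_for_period(members, pd.Timestamp("2019-08-01"), pd.Timestamp("2019-09-01")))  # 0.2
```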

Lifetime Value

Lifetime Value is a phrase that should resonate with anyone involved in the Membership Sector. This form of the LTV calculation is specific to a recurring model, so it measures LTV in terms of Membership Churn.

To calculate LTV (Lifetime Value) you just need to take your average margin over a specific period and divide it by your subscriber churn over the same period.

As before, calculating this over different periods means you can aggregate at different levels.

You can show LTV as a monthly view, a cohort view or a segment view, allowing you to consider trends at different levels.
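A minimal sketch of that calculation, assuming the margin and churn figures cover the same period:

```python
def lifetime_value(avg_margin_per_period: float, churn_rate: float) -> float:
    """LTV for a recurring model: average margin per period divided by churn rate for that period."""
    if churn_rate <= 0:
        raise ValueError("churn_rate must be positive")
    return avg_margin_per_period / churn_rate

# e.g. £8 average monthly margin per member and 4% monthly churn -> £200 expected lifetime value
print(lifetime_value(8.0, 0.04))  # 200.0
```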

Total Cost Of Acquisition

Next up is another classic metric known as Cost of Acquisition.

This is a fundamental calculation for the Membership Sector because its value comes from applying it in a segmented way.

COA (Cost of Acquisition) can be applied to all your different channels to get an idea of how they’re performing against each other or it can be used to measure different segments to see what sort of value and cost of opportunity you’re getting for different activities.
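As a short illustrative sketch of that segmented view (the channel names and figures below are entirely hypothetical):

```python
def cost_of_acquisition(spend: float, new_members: int) -> float:
    """Cost of Acquisition: total spend divided by the number of members acquired."""
    return spend / new_members if new_members else float("inf")

# Hypothetical per-channel spend and acquisitions
channels = {
    "ppc":    {"spend": 12_000, "new_members": 300},
    "social": {"spend": 4_500,  "new_members": 90},
    "events": {"spend": 8_000,  "new_members": 400},
}

for name, figures in channels.items():
    print(name, cost_of_acquisition(figures["spend"], figures["new_members"]))
# ppc 40.0, social 50.0, events 20.0 - in this made-up example, events are the cheapest channel per member
```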

Margin Churn

Margin Churn, sometimes known as Revenue Churn or Recurring Revenue Churn, is a metric that comes out of the SaaS software business model, but it’s really applicable anywhere that has this kind of recurring structure.

To calculate it you take the initial margin (the margin that you made in the previous period) and then look at the next period. You then need to ask: how much margin did we lose through members churning, and how much new margin did we generate from new acquisitions?

The point about measuring this as opposed to just subscriber churn is that it allows you to get a view of where you’re losing or gaining more value.

It could be the case that you’ve noticed your subscriber churn rising quickly so you send a big 50% discount on renewals to those in some segment you’ve identified as being at risk. As a result of doing that you’ve potentially reduced your subscriber churn, but at a cost of people taking that discount when they weren’t actually going to churn, meaning all you’ve really done is reduce your margin going forward.

Or in other words you’ve reduced Subscriber Churn by driving up Margin Churn.

Calculating and focusing on Margin Churn explicitly alongside Subscriber Churn and these other calculations will give you a view on what activities provide better value, especially when it comes to Member Retention.
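One way of expressing that calculation is sketched below; the split into a ‘gross’ figure (churned margin only) and a ‘net’ figure (offset by new margin) is my framing of the description above, and all the numbers are hypothetical:

```python
def gross_margin_churn(margin_lost_to_churn: float, initial_margin: float) -> float:
    """Margin lost through churned members as a share of the previous period's margin."""
    return margin_lost_to_churn / initial_margin

def net_margin_churn(margin_lost_to_churn: float, new_margin_gained: float, initial_margin: float) -> float:
    """As above, but offset by the new margin generated from new acquisitions in the period."""
    return (margin_lost_to_churn - new_margin_gained) / initial_margin

# e.g. £100k margin last period, £8k lost to churn, £5k gained from new members
print(gross_margin_churn(8_000, 100_000))       # 0.08 -> 8% gross margin churn
print(net_margin_churn(8_000, 5_000, 100_000))  # 0.03 -> 3% net margin churn
```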

KVMS – Dimensions To Measure

 

 

I’ve gone into a fair bit of detail there on how to segment these different measurements and the different ways you can divide them up to get a granular picture of engagement, retention and churn across your database.

 

Dates are the most obvious dimension to measure by – week, month or year (I’ll come on to the other dimensions in a minute).

Each of those will paint a slightly different picture, with more or less volatility and more or less granularity in terms of seeing the impact of small, specific, time-boxed events.

MoM or YoY (month-on-month or year-on-year) snapshots are another great way to get a comparison across time without having to go over your charts with a fine-tooth comb.

Let’s say you’ve noticed a yearly pattern on one of these charts. Taking the time to build that month on month picture within the year gives you a view of how it’s changed over certain periods allowing you to ascribe a RAG status to any trends that may need your attention.

Rolling averages are another good way of looking at the medium-term picture while smoothing out some of the volatility that you’re likely to see in more granular pictures.
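As a quick pandas sketch of that smoothing (the monthly churn figures here are made up), a three-month rolling average could be calculated like this:

```python
import pandas as pd

# Hypothetical monthly churn rates for a year, expressed as fractions
monthly_churn = pd.Series(
    [0.041, 0.052, 0.038, 0.060, 0.047, 0.055, 0.043, 0.066, 0.049, 0.058, 0.044, 0.061],
    index=pd.period_range("2019-01", periods=12, freq="M"),
)

# 3-month rolling average smooths the month-to-month volatility
smoothed = monthly_churn.rolling(window=3).mean()
print(pd.DataFrame({"monthly": monthly_churn, "3m_rolling": smoothed}).round(3))
```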

Another way you could segment Membership data is by cohorts…

What are Cohorts?

A Cohort refers to one contiguous set of people.

They let you look at the behaviour of people when grouped by when they first appeared in your database, by what channel or campaign they first came through (to get a view of your channel performance), or, even more generically, by the category of the first touchpoints on which you encountered them.

(This works even better if you have these touchpoints tagged and categorized as I mentioned earlier).

Segments

The third way you can segment member data is by Segments.

Segments are very distinct from Cohorts in that, with a Cohort, a person stays in the same Cohort forever.

A Segment, however, is defined by what a person is doing at a particular time. This means people can move around between different Segments based on their behaviour, which makes measuring the size of your Segments, and how people move between them, a critical activity.

Typical types of Segments include level of engagement – based on how many things someone interacts with in each week/month etc.

Content Preference – This could be channel preference or a content category, identifying people by the sort of thing they interact with/how often they interact with it.

Segments are an area where Data Science can really help out, with things like clustering algorithms, as something like Content Preference is going to be a really fuzzy measure, but it’s still useful to be able to aggregate those fuzzy signals into concrete, actionable business intelligence.
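As a hedged sketch of that clustering idea, here’s scikit-learn’s KMeans applied to some hypothetical per-member engagement counts (the categories, counts and number of clusters are all illustrative assumptions, not a prescribed recipe):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-member engagement counts by content category
# columns: [news articles read, events attended, webinars watched, forum posts]
engagement = np.array([
    [12, 0, 1, 0],
    [10, 1, 0, 1],
    [1, 4, 5, 0],
    [0, 5, 6, 1],
    [2, 0, 0, 9],
    [1, 1, 0, 11],
])

# Scale the features so one busy category doesn't dominate, then cluster into segments
features = StandardScaler().fit_transform(engagement)
segments = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(features)
print(segments)  # e.g. three content-preference segments, one label per member
```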

 

The final thought I’d like you to take away around Segmentation is that it’s most valuable when you start applying domain- or sector-specific knowledge to build Segments around data that’s important to you rather than, perhaps, some of the generic ideas mentioned here.

In a perfect world you’d end up with a Single Customer View, which contained everything you could want to know about your Members and all the channels they’ve interacted with throughout their history with you.

You’d then be able to apply that data to any of the Key Value Measures mentioned above, segment, split or group it by any Segment or other dimension that might apply to your organisation and, by doing that, get to a point where you have an easy way to perform an in-depth analysis that will explain behaviour patterns.

What Causes Membership Churn?

I’m going to take a moment now to talk about how to actually get to grips with all that data you’ve collected and analysed and what kind of actions to take from it.

I’m also going to touch on what causes Membership Churn in the first place, how you can predict it, and then what sort of actions you can take to reduce it.

How To Predict Membership Churn

The first key point is to understand how to predict Membership Churn before it ever happens.

I’m going to walk through one example of how you might do this, how you might build a Membership Churn signal predictor using an engagement score.

 

The above diagram shows a typical customer journey with a Membership Organisation.

You can clearly see someone joining and then engaging, but eventually also churning with engagement split out by channel and by month.

In the first month you can see their sign up, they’ve received their welcome email, they’re reading loads of your content and they’re even attending events.

Over the next few months engagement is fairly consistent. They may be engaging slightly less but you’d expect that because people have lives to lead.

Eventually some of that engagement starts to drop off.

They might sign up to an event which they then don’t attend, they’re reading less content per month, maybe they’re not engaging with your emails as much and then by month six… BOOM… they’ve churned.

They’ve cancelled their subscription and you’ve lost them as a Member and they go back to the top of the funnel as someone that you need to think about acquiring again.

So where did it all go wrong?

How To Stop Membership Churn

Finding and calculating an engagement score can help you answer that question.

You could say that it went wrong as soon as they didn’t attend an event or you could say it went wrong as soon as their engagement dipped down from where it previously sat.

The best way to define that score however is by accounting for all channels of activity and using that data to build a score.

For instance, you could say looking across all these channels, the engagement score starts off high.

Great!

They’re highly engaged with your activities but then… oops… You start to see some signals indicating they’re not quite as fully engaged as they were and that pattern continues, getting worse and worse and Boom… They’ve churned.

Even from this high-level analogy you can see that, if you calculate something as simple as the number of newsletters opened or events attended, minus the number of newsletters sent but not opened or events signed up for but not attended, there are key turning points in their Membership journey. And because you’re reducing it to an actual data problem, you can then say, “OK, we’ve got this. Once people reach a yellow engagement score, we need to start taking action.”

You may feel that there’s no need to do anything when someone’s on a green engagement score and that by the time someone’s on a red engagement score it’s already too late to do anything.
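To make that concrete, here’s a minimal sketch of an engagement score with RAG thresholds; the inputs follow the newsletter/event example above, but the threshold values themselves are purely illustrative assumptions you’d tune against your own data:

```python
def engagement_score(opened: int, attended: int, unopened: int, missed_events: int) -> int:
    """Simple net engagement score: positive interactions minus ignored ones."""
    return (opened + attended) - (unopened + missed_events)

def rag_status(score: int, yellow_below: int = 3, red_below: int = 0) -> str:
    """Map a score onto a RAG zone; the thresholds here are purely illustrative."""
    if score < red_below:
        return "red"
    if score < yellow_below:
        return "yellow"
    return "green"

# Month 1: highly engaged; month 5: the signals have slipped badly
print(rag_status(engagement_score(opened=6, attended=2, unopened=1, missed_events=0)))  # green (score 7)
print(rag_status(engagement_score(opened=1, attended=0, unopened=4, missed_events=1)))  # red (score -4)
```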

 

The real (data led) solution you’re looking for is the best way of targeting each of these zones whilst still being responsive to the preferences of individuals, providing great value for all your Members and not just giving everything away for free.

To do that you’ll need to calculate the opportunity costs of different Churn Management Strategies.

Let’s say you’ve identified a Cohort with a hundred of your members in it who have low engagement.

They’re in the yellow zone, about to crossover to the red zone and past data tells us that when that happens, about 20% of those members can be expected to churn within the next three months.

So, what’s at risk here?

From the 20% figure you can calculate the potential Margin Churn, allowing you to make a calculated decision on incentives to keep that segment.

 

“We’ve got this pot of money.

It’s at risk.

So let’s offer a discount on renewal to that whole Segment.”

 

Then, looking again at past data, you can see that 30% of Members who receive that discount will convert.

However, that means if it’s sent to a hundred members, you’re most likely giving that discount to ten people who weren’t going to churn in the first place.

Being able to calculate that will give you the ‘real’ cost of the discount you’re offering.

Giving away too much of a discount may not be efficient so calculating an accurate Margin Churn figure will allow you to fine tune the discount percentage offered, meaning you can calculate an ROI in a predictable manner.

And all of this can be calculated before you offer anyone the discount in the first place, thanks to the data you’ve collected.
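Here’s a rough sketch of that kind of pre-campaign calculation. The 100-member segment, 20% churn likelihood and 30% take-up come from the example above; the ‘save rate’ among would-be churners, the annual margin and the discount value are assumptions you’d estimate from your own past data:

```python
def discount_campaign_value(
    segment_size: int,
    churn_probability: float,
    take_up_rate: float,
    save_rate_among_churners: float,
    annual_margin_per_member: float,
    discount_cost_per_member: float,
) -> dict:
    """Rough expected value of offering a renewal discount to an at-risk segment."""
    expected_churners = segment_size * churn_probability
    margin_at_risk = expected_churners * annual_margin_per_member
    members_taking_discount = segment_size * take_up_rate      # includes members who weren't going to churn
    members_saved = expected_churners * save_rate_among_churners
    return {
        "margin_at_risk": margin_at_risk,
        "margin_retained": members_saved * annual_margin_per_member,
        "discount_cost": members_taking_discount * discount_cost_per_member,
        "net_value": members_saved * annual_margin_per_member
                     - members_taking_discount * discount_cost_per_member,
    }

# 100 members, 20% likely to churn, 30% take the discount, half of would-be churners are saved,
# £120 annual margin each, 50% discount worth £60 per taker (all figures hypothetical)
print(discount_campaign_value(100, 0.20, 0.30, 0.50, 120.0, 60.0))
```

In this particular made-up scenario the discount actually costs more margin than it saves, which is exactly the kind of trade-off the Margin Churn calculation is there to expose before you press send.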

 

Summing that up in a slightly different way: you now have different segments of engagement score with green, yellow and red RAG statuses and, for each of those segments, you’ve identified the likelihood of Members churning, expressed as a percentage.

That means if a Member has a green score, the likelihood is only 10% of them will churn in the next three months. With a yellow score that might rise to 40% and when it reaches red that might rise to 60%.

Now what you need to do is define the Opportunity Costs for a campaign directly targeting each of these segments.

 

You need to decide where’s best to deploy your resource to generate the most value.

 

You might decide that it’s really not worth doing anything to try and stop your green segment from churning but that it’s somewhat more worthwhile going after the members who are in the red zone.

But your data also shows that by the time people have ‘turned red’ they’re not likely to respond to anything you do. It may even be they’ve set up a spam filter on your emails.

That means you still need to find the ‘sweet spot’ to prevent Membership Churn.

 

For the Members in your Green Zone, you don’t really need to worry about Churn Management.

You just want to think about how to get them more engaged so you can do really cool stuff with personalisation and segmentation.

Then, for the members where you have the highest likelihood of delivering value by addressing them as potential Churners, you can focus on a re-engagement campaign, building an actual content funnel to send people through, with progressively better discounts to get them to re-engage.

And finally, for members who you’re almost positive will churn no matter what you do, you might start thinking about targeted outreach campaigns… picking up the phone, sending a personal email or maybe even sending a survey to better understand why those members churned, allowing you to adjust your value proposition strategy based on the data you receive.

The Cold Start Problem

The Cold Start Problem is when a member comes into your organisation’s ecosystem that you need to make recommendations to, but you don’t know anything about them yet.

You don’t know who they are or what they’re interested in, so you’ll be leaping on the smallest of signals to make them recommendations.

How To Solve The Cold Start Problem?

 

 

All Recommendation Engines essentially work on one of two principles.

They’re either content based, which works on the principle of, “you’ve read X, X is related to Y so let’s suggest that you read Y.”

Or they’re collaborative based, working on the principle of, “you’ve read X, someone else has read X who then also read Y… Let’s show you Y”.

 

Both approaches have their pros and cons…

A content-based filtering approach is very useful, but it requires having a lot of metadata attached to your content.

Collaborative filtering can also be a useful approach as it will surface the similarities between your pieces of content.

The difficulty with more collaborative based approaches is that you need lots of tagged touchpoints. You need all of that data to be available to you in a way that joins up with your other data (our article on The Common Data Model might be worth reading here).

So, in terms of solving the Cold Start Problem, most of the time a hybrid approach between the two will be required, as neither solves the problem entirely on its own.
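As a toy sketch of what such a hybrid might look like (the articles, tags and read history are invented, and the blending weight is an assumption you’d tune), this blends a content-similarity score with a ‘read together’ score:

```python
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Content-based side: items described by (hypothetical) tag vectors
item_tags = {
    "article_A": np.array([1, 0, 1, 0]),   # e.g. tags: policy, events, careers, research
    "article_B": np.array([1, 0, 0, 1]),
    "article_C": np.array([0, 1, 0, 1]),
}
item_order = list(item_tags)

# Collaborative side: which members read which items (rows = members, columns = items)
reads = np.array([
    [1, 1, 0],   # member 1 read A and B
    [1, 1, 1],   # member 2 read everything
    [0, 0, 1],   # member 3 read C only
])

def hybrid_score(read_item: str, candidate: str, weight: float = 0.5) -> float:
    """Blend content similarity with 'read together' similarity; the weight is a tunable assumption."""
    content = cosine_sim(item_tags[read_item], item_tags[candidate])
    i, j = item_order.index(read_item), item_order.index(candidate)
    collaborative = cosine_sim(reads[:, i], reads[:, j])
    return weight * content + (1 - weight) * collaborative

# A new member has just read article_A - which should we suggest next?
for candidate in ["article_B", "article_C"]:
    print(candidate, round(hybrid_score("article_A", candidate), 2))  # B scores higher here
```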

 

One possible solution to the Cold Start Problem is to take a heuristic approach, setting new members with a default profile based on content that other members have engaged with before.

You could also designate some content as featured content that you know existing members already engage with.

A third, and perhaps better, solution to the Cold Start Problem is one I’ve had a lot of success with in the past: engineering ‘diving off points’ – central hubs where people only need to express a small amount of interest in a type of content before they’re directed down a well-made marketing funnel of content, almost immediately giving you data on what the new member will engage with.

 

Those are just a couple of examples of the sort of thing you could try and, whilst there are plenty more, what it really comes down to is having a joined-up, omni-channel view of your Membership Base.

The best Membership Churn strategy will come from taking all of the data you’ve generated and all of the segments and metrics you’ve defined and feeding them back into your engagement channels, so that when you send out a new campaign it’s just the touch of a button to decide what will be of interest to which segment.

 

Ultimately, the value of data comes when you can turn it into actionable wisdom.

Going up from raw data in files and tables to aggregating that data in segments and cohorts and then using it to provide context to ongoing organisational strategies.

Becoming more and more mature at this process will allow you to do more automation at each stage, using data science, but it’s crucial to get the foundations solid before you start on that journey.

The Common Data Model

Ed Yau – cloudThing Solution Architect & Predictive Science Team Leader

Building a Common Data Model across a Membership Platform under a single view, with multiple lenses whilst avoiding Vendor Lock-In​

*Transcript from cloudThing’s recent Membership Sector Digital Conference

 

 

The  Common Data Model that I’ll be discussing today is one of those things that can often get overlooked but can also massively accelerate a Digital Transformation project if considered early enough.

 

I’ve been lucky enough in my career to work on some fascinating problems dealing with data, Big Data, Data Lakes and all the associated Data Architecture that comes with it and currently I’m working with an amazing team of Data Analysts and Data Scientists to find innovative solutions for the Membership sector.

 

If you want to avoid rework and technical debt in your project when it comes to making use of your data, one of the first steps should be to design your data architecture around how you will be using that data – and to make sure you have coverage of all your domain entities with a data schema. Getting this right may be the basis of numerous workshops between different teams of stakeholders.

 

But what if you could skip that effort, because it has already been done for you? Read on to find out more.

 

 

So, to begin the journey let’s talk about… ‘Data as the new oil’.

 

That’s an idiom coined in recent years, recognising the fact that data holds tremendous value for an organisation when utilised correctly.

 

It’s worth protecting with secure engineering practices and if set up correctly allows you to dive in for insights as to how your organisation should be making decisions.

 

Unfortunately, the decision as to how data is stored (rather than used) is often left quite late into a Digital Transformation project, by which time it’s often too late to make any fundamental changes to correctly optimise it for its use in AI or ML (Artificial Intelligence & Machine Learning) projects.

The Common Data Model

 

 

That’s where the Common Data Model comes in.

In a nutshell it’s there to make sure your Data Architecture is as simple, useable and transferable as possible. Trust me… no one benefits from a complicated Data Architecture!

 

Talking about Data Architecture, I imagine, is a little like plumbers talking about plumbing to their wives… if you’re the person in charge of it at your business, you will feel like no one else cares until it goes wrong! – Ed Yau – cloudThing Solution Architect & Predictive Science Team Leader

 

That’s why I’m talking today about Data Architecture’s unsung hero… the Common Data Model (or the Common Data Service) that quietly operates behind the scenes of all of Microsoft’s Power Platform products, and why consideration of it should be a fundamental part of your next Digital Transformation project.

 

Before I dive into talking about the Common Data Model specifically though, it’s worth setting the scene as to why it’s such a good thing.

Realising We are All in Debt

Debt is a word with unpleasant connotations.  In general, when it comes to creating solutions, as in life, the lower your debt the cheaper it is to deliver.

 

 

If you leave your Data Architecture as an afterthought to a project, you’ll effectively be incurring something called Solution Debt from its inception.   It’s something you’ll eventually have to pay back, most likely whilst you’re trying to make improvements to your Platform in the future.

 

In other words, good Data Architecture will prevent conversations with your Financial Director that sound like…

 

“Why does this Data Analysis project cost so much?

Why can’t I have this data in real time?

Why is our GDPR compliance so complicated?

Why can’t I find the data I’m looking for?

Why are our reports so complicated to run?”

 

What can you do on your next project to avoid ever being asked those questions?

Plan out your Data Architecture now, preferably using the Common Data Model as the basis.

 

As I’ve mentioned, far too often Solution Debt isn’t considered when planning a Digital Transformation project.

 

Consider for a moment…

Why do ‘disruptors’ entering a marketplace for the first time do so well against large established brands?

 

Surely an established organisation should have the capital reserves to pivot and reinvent themselves in a competitive market?

So why is it they so often struggle to move as quickly as a newer, younger competitor?

 

I would argue it’s because when an organisation is born, it is swift and agile. It has not yet acquired the Solution or Organisational Debt that occurs as a business grows. It may go through a merger or two, make several acquisitions or grow naturally as successful businesses do.

Eventually though, making a change becomes a lot more difficult because there’ll be data everywhere, with duplicate data stored in different places, meaning that making just one simple change involves having to access four different places.

 

That’s when you find you can’t even move away from these systems anymore, because you’re stuck in contracts that last for years with various vendors and providers… sound familiar to anyone?

It’s a difficult problem called Vendor Lock-In and it’s a problem cloudThing have a lot of experience in solving.

 

I will refer to a few specifically by name, as there are many different types of debt an organisation accumulates naturally as it grows, but that’s not to say that, with the correct architectural decisions early on, you cannot aim to minimise the impact they have on your organisational effectiveness. – Ed Yau – cloudThing Solution Architect & Predictive Science Team Leader

 

Enter then… The Common Data Model!

Benefits Of The Common Data Model

This is where the Common Data Model shines. When implemented early in a programme, it can really ease the headaches of future change.

Benefit #1: It Helps with Data Debt

 

 

The first type of debt the Common Data Model can combat is Data Debt but to understand that we’ll need to go back to first principles and consider a concept called Data Gravity.

 

It’s a phrase originally coined by Dave McCrory, a well-known pioneer of NoSQL databases, back in 2010. In a nutshell… the more data you have in one place, the greater your Data Gravity will be.

The greater your Data Gravity, the easier it’ll be to use your data in apps or to analyse it and garner insights from it. If you have your data spread around and siloed, you have a low Data Gravity situation, and it will cost you to bring it together before you’re able to fully unlock all the value held within.

 

If you only take one thing away from this article, I’d hope it was this as it’s such an important concept.

 

For instance… Have you ever been caught in a scenario where you wanted to integrate two different membership databases but couldn’t because the tables holding those members had different columns or even completely different data types, meaning it wasn’t as simple as just combining all the records together?

 

Or maybe you’ve got the data and you want it in your application database, but you can’t add it until the data has been sanitised because, for instance, there may be missing or broken records.

 

Those are just a couple of examples of how Data Debt can harm your organisation.

 

It makes the simple act of using your own data in just a slightly different form from the one it’s currently in difficult. You have to pay off that debt before being able to do the things you really want to with your data. – Ed Yau – cloudThing Solution Architect & Predictive Science Team Leader

 

So how does the Common Data Model help with that?

 

First of all, it’s true the Common Data Model is a Microsoft design, but I should point out that it’s been designed as a non-vendor-specific standard that they created for all their business applications.

There are elements which Microsoft consider ‘core’ such as Account and Safety, Contact, Currency, Email, Letter, Notes etc… Things that every organisation has.

Then they’ve got segments with things like Sales, Service, Finance, Supply Chain, Commerce, Marketing, Emails and Marketing Pages.

So a Contact in one of those segments would be the same as the Contact in your core, which is how you can share data across both of those domains. There are also tables of entities specifically created for that standard that include many solutions for common sector problems, e.g. Finance, Supply Chain, Marketing and the Healthcare sector.
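To illustrate the underlying idea (this is not the actual Common Data Model entity definition, just a Python sketch of mapping two differently shaped membership systems onto one shared, CDM-style ‘Contact’ shape, with illustrative field names):

```python
from datetime import datetime

# Two source systems holding members with different column names, formats and conventions
crm_record    = {"FullName": "Sam Smith", "EmailAddress": "sam@example.com", "JoinedOn": "2018-05-04"}
legacy_record = {"name": "Priya Patel", "email": "priya@example.com", "member_since": "12/11/2016"}

def to_common_contact(record: dict, mapping: dict, date_format: str) -> dict:
    """Map a source record onto one shared 'Contact' shape (field names here are illustrative)."""
    return {
        "fullName": record[mapping["name"]],
        "email": record[mapping["email"]].lower(),
        "memberSince": datetime.strptime(record[mapping["joined"]], date_format).date().isoformat(),
    }

contacts = [
    to_common_contact(crm_record,    {"name": "FullName", "email": "EmailAddress", "joined": "JoinedOn"}, "%Y-%m-%d"),
    to_common_contact(legacy_record, {"name": "name", "email": "email", "joined": "member_since"}, "%d/%m/%Y"),
]
print(contacts)  # both records now share one schema, so they can be combined and analysed together
```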

 

The main takeaway I want to get across though is that any Development Team can implement this and, if they follow the Common Data Model guidelines, then any application that uses the standard will fit together with other apps like Lego. – Ed Yau – cloudThing Solution Architect & Predictive Science Team Leader

 

 

However… Microsoft being Microsoft, they’ve taken it way beyond that!

 

One of the big benefits of the Common Data Model is that there is already a ready-to-use implementation that Microsoft has built called the Common Data Service.

That means (circling back to Data Gravity for a second) that if you store your data in the CDS (Common Data Service) you literally have to do nothing to be able to use the standard. But let’s go over the other details that get us geeks excited…

Benefit #2: The Common Data Service Talks to Almost Everything

 

 

The next benefit of the Common Data Service is that you have a way of getting data from almost anywhere. Within the Power Platform ecosystem there are already over three hundred pre-built connectors, and that includes 3rd parties like Gmail, Twitter, LinkedIn, Salesforce and DropBox, as well as all the usual connectors into Microsoft Office 365.

 

Because you have so much Data Gravity in your CDS, these connectors let you assemble a phenomenal amount of value with relatively little effort.   You can literally treat it as the centre of your ‘Octopus’ with its ‘tentacles’ pulling data from all sorts of sources. – Ed Yau – cloudThing ​Solution Architect & Predictive Science Team Leader​

Benefit #3: Out-the-Box DataLake Integration

 

 

Whilst a clear benefit of the Common Data Service is that it’s great for building apps where you need your data in real time, it’s also great if you want to do very, very deep analytics. When you do offline analytics you might be combining many different data sources to augment your source data, and some of these may not be sanitised yet. You won’t want that polluting your application database, but it may be considered OK for analysis purposes.

 

Step in the out-of-the-box feature to plug straight into Microsoft Azure’s Data Lake, which is Microsoft’s store for big data and deep analytics using Databricks. So, you get a vital component of your data architecture for free. – Ed Yau – cloudThing Solution Architect & Predictive Science Team Leader

 

Benefit #4: Enterprise Ready

 

 

So, you may have all your data in one place, but now you’ve realised it’s become a very juicy target for attack. If you rolled your own data architecture, you now have to think about designing security features. If you forgot to do this from the beginning, that’s technical debt right there.

 

Well Microsoft have thought about that by building in a range of Enterprise features.

Security for instance… having all your data in one place makes it a massive target for cyber actors unless it’s very well secured.

Privacy features… Who’s got that data, who’s using it and for what?

 

All of these things are provided by the Common Data Service as an out-of-the-box solution, so in the end your team needs to design a lot less than they would if they went down a customisation route. You still need to design the appropriate policies, but at least you know that, if created correctly, your data is safe.

Benefit #5: Easy to Build On

Another type of debt to consider is Process Debt.

 

This is not so much debt caused by bad solutions architecture, as a natural debt that occurs as the processes in an organisation form; especially as they undergo mergers and end up with multiple systems doing similar jobs.

 

CDS is also at the heart of solving this, through its ability to be the centre of your Power Platform universe and the way you can use the connectors, Power Apps and Power Automate to ‘glue’ your disparate systems together in an orchestrated way. There’s also AI Builder, Microsoft’s low-code way of building automated AI decision-making modules to link into your process automation.

 

Within that system we can create apps, analyse data with Power BI, create bots through Power Virtual Agent and have the ability to knit together any number of systems using Power Automate. Power Automate is a great way of stringing together multi-stage processes which can link together as many different systems as you like.

 

A recurring pattern in the way cloudThing use the Common Data Service is as a single source of truth, linking into Microsoft Dynamics 365 but also into other, non-Microsoft, systems as well. – Ed Yau – cloudThing Solution Architect & Predictive Science Team Leader

 

 

So if you’ve got five or six different payroll systems, for example, you might use Power Automate to keep those in sync, but ultimately it works much better if the Common Data Service is your single source of truth.

 

Imagine an email coming in from a volunteer with an expense request for example.

Power Automate can funnel it into an approval process which triggers a PO to be raised in your finance package; a response can then automatically be sent back to the requester saying it’s been authorised. You could even imagine Power Automate ordering an item from a supplier if these were recurring orders.

 

Those are the types of things that Power Automate is good at, whilst still bringing everything back into your Common Data Model for analysis later.

Power Virtual Agent

Whilst Power Automate is for gluing together other systems, there are other tools at your disposal.

Power Virtual Agent has been designed to let you run a customer-facing, pre-scripted bot, tracking conversations and their results all in one place.

Power Apps

Finally Microsoft’s Power Apps is great for building low code apps with all that data flowing straight back into your CDS.

 

So you can, for instance, personalise that app for a member, tracking how they engage with it so you can understand what features, activity and information they might be showing interest in, and Power BI can report on what’s occurring in those other apps you’ve built; whether through Power Virtual Agent, Power Automate or Power Apps, it’s ultimately all stored in one place for you to analyse.

 

This has been a journey that I hope you’ve enjoyed; we started off with a fairly tenuous concept for data nerds, but we end up concluding that the strength of the Common Data Model is the strength of the Microsoft ecosystem that surrounds it. Your organisation will undoubtedly care more about the fancy shapes you can build, but the Common Data Model is the most important piece of Lego: the base that you start from. – Ed Yau – cloudThing Solution Architect & Predictive Science Team Leader

Leading With Technology In The Membership Sector

The world has changed and everyone’s gone to the digital rapture

 

*Transcript from cloudThing’s recent Membership Sector Digital Conference

 

The world has changed…

And actually, I don’t think it’ll ever go back to exactly the way it was before COVID-19, with people embracing new ways of working, socialising and living.

A great example of this is an event we were planning with Microsoft recently and had to cancel. Rather than just waiting for lockdown to be over, though, we decided to run it virtually and, instead of twenty or so people within thirty miles of London turning up, we’ve had attendees from Ireland, South Africa, the USA, Finland and Canada.

We’ve all faced challenges thrown up by the current climate but with the right technology and attitude there are still opportunities out there for an organisation to pivot and grow.

 

In fact, one of the big things I’ve noticed over the last few weeks is that there’s been a massive trend towards ‘eyeballs’ on a business’s webinars and content.

We’re all at home after all.

Maybe some people will have a bit more time than others but the number of requests I’ve had to attend a webinar or various other meetings has increased massively and they’ve all seemed to be on the same topic or theme.

No matter how far along our digital transformation journey each of us might be, COVID-19 has changed the way we approach things.

It’s probably made us go a little bit faster and a little bit harder into Digital than we might have before but as I said, there’s real opportunities out there for those who are ready to pivot and grab them.

 

cloudThing was founded as a vehicle for change in a fast-paced world with an ethos of #BuildFuture.

But what does that mean?

That means thinking about the long-term.

It means when you’re making decisions today, think about how you can pay it forward. Think about what you might want to achieve in five years’ time because you can potentially make a choice today that will make it exponentially easier to achieve a long-term goal.

And you should try for that mind set every day; don’t just be tactical.

 

Always try and think about your long-term view because, quite often, there’ll be no additional cost to doing something slightly differently today that’s really going to make a difference down the line.

Why Is Leading With Technology Important?

As that’s the main point I want to get across, I started by thinking how I use technology in my own life (and as you can imagine, running a tech business, there’s quite a lot of it!)

And I concluded that it pretty much permeated every single aspect of my life.

In fact, with the advent of the coronavirus and the rapid changes to the world we’ve all gone through I’d say that’s even gone faster and deeper in recent weeks.

Take banking for example.

 

I can’t honestly remember the last time I went into a bank, instead using my card or iPhone (in fact I’ll probably try not to use cash at all now going forward).

Does that mean cash is dead?

Is that one of the things that’s going to change?

Should we all be thinking about the various ways we enable people to pay for services with us?

Because let’s be honest, at least in the short term, who’s going to want to touch things?

Just think about how you order a takeaway.

 

I literally can’t stand picking up the phone to my local pizza place, trying to spell my address to them three or four times and then repeating the order over and over again, only to find they can’t find my address and, when they finally do, the order’s still wrong. It’s so much easier to just order via an app.

 

That physical disconnect is literally everywhere, even in the way I socialise now.

Absolutely some physical socializing is great but how much would we all struggle without WhatsApp or Facebook or all the other platforms that people use daily?

COVID-19 has really focussed that, with a massive uptake in the way everybody is now consuming Zoom or Microsoft Teams.

It almost feels like everyone’s become a guru on using these platforms, changing their backgrounds etc, being empowered to engage with these types of technology.

I remember my parents struggling when lockdown first came into effect, but they’re now experts in using various video conferencing apps.

 

But why are technologies like this permeating our life so much, even before the coronavirus crisis?

Well, I think there’s a number of factors.

 

  • It removes unnecessary interactions – Not everybody wants to pick up the phone; some people just aren’t comfortable with that. Technology gives us the opportunity to get what we need when we need it.
  • It’s mobile – I know we’re not particularly mobile at the moment but the great thing about tech is that it’ll go wherever you go. Being able to interact with the services you want, wherever you might be is important.
  • It’s personal – Tech has the ability to be personalized and tailored, remembering who you are and what you like and what you don’t.
  • It reduces human error – It makes services better in the way they’re delivered.

 

So it really is everywhere.

Sometimes it’s done well, sometimes not so well.

One thing I do know though is that whatever we’re doing digitally will be compared to what others are doing digitally, whether that be their platforms, company or organisation.

We both judge and are being judged all the time which means new technology should absolutely be part of all our strategies, now more than ever.

 

During the COVID-19 era we’ll likely be judged on how well we deliver on our promises and how we deliver our services.

Therefore, it’s incumbent on us to take a leadership approach to the way in which we use technology.

Ultimately, we have no choice in that, as expectations have exploded over the past ten years or so. People’s expectations of what good looks like and how they should receive their services have massively increased, and those expectation levels have risen across all sectors… including the Membership sector.

Membership Expectation

 

 

Engagement, Engagement, Engagement!

It’s easy to say that a member of an organisation probably wants more engagement, but they’ll only want that engagement if it’s good engagement.

Being bombarded by poor engagement is very, very frustrating.

So, let’s assume we all want to increase engagement with our Members.

First off, is there a base line to start from; to measure our ROI?

What do we try and measure, can it be measured and if it can how do we bring that together into a cohesive view?

 

  • Website visits?
  • Event signups?
  • Click throughs?
  • Self-service request purchases?
  • Upgrade requests?
  • Support requests?

 

But then… even if we can measure it and get all that data into one place, what do we do with it then?

How can we easily use that data and turn it into something that automates a Member’s personalisation experience?

Personalisation Is Key

Members want an experience that’s personalised with fresh content served up to them on demand. Content that they know is aimed at them as part of a journey that they can recognise is tailored to themselves.

It’s too easy to fall into that trap of thinking it’s a onetime thing, that you kind of understand who somebody is and then that’s it forever.

 

Actually, it’s not.

 

Membership engagement should be a real time thing that adapts and grows with members as they do.

We need to be able to understand that they’ve changed and change with them.

Now those changes won’t always be obvious, so it needs to be scientific, with decisions being made based on the data we hold.

That means we need to think about how we understand our Members and, to do that, we need to think about our Data Architectures, how we can bring those all together and, once we have, how we make the best use of them.

And that’s where the key differentiators really are. It’s more than just knowing somebody’s name.

You know that’s not good enough.

We need to know how they interact, how they navigate, what their preferences are, past behaviours and purchase histories.

Ultimately however the Holy Grail of data is being able to make predictions and that’s where you need to be ahead of everyone else, using AI (Artificial Intelligence) and ML (Machine Learning) to make them.

Social Media In Membership

I’m sure we’re all doing ‘some’ social but how well are we doing it?

Do we have the right tools?

Is it integrated with our CRM?

How does it affect what we do?

Is it just for putting content out for the sake of it or is it a bit more strategic than that?

Will the CRM integration allow us to collect more (useful) information about our members journeys?

Operational Overheads

I think Operational often gets overlooked.

In my experience Members want to know that you’ll be delivering a great value proposition but they also want to know that you’re doing that with maximum efficiency so that their subsidies or the money that they pay in is being spent efficiently.

How many reading this can truly say they’ve looked at their business processes, reined in off-system processes and looked at the opportunities around robotic process automation?

Can menial tasks be automated so that those staff can be adding more meaningful value to the business elsewhere?

And again, Predictive Science and platforms like Power Automate can be a really big help there as well.

Value Propositions

And finally, all those points come together in your value propositions.

Many of us are clear about what our value proposition is and how well we’re communicating it.

But has it been received and understood by our membership base?

 

There’ll be many instances where what we’re delivering isn’t a physical product, or certainly not entirely physical.

Often it can be a little more amorphous or difficult to evaluate as it’s evolving over time, which makes it both more important and more difficult to get people to judge its value, and that’s probably been heightened during the COVID-19 pandemic.

But I guess I’d also say it’s a two-way commitment between yourselves and your members.

You know right now your members will need you for guidance just as much as you need them for the revenue.

 

And we should all remember that we’re in this together by asking, how can we help each other?

 

We know some people may have more time at home and are looking to upskill/re-skill and certainly looking for guidance on how they can achieve that.

So valued and trusted leadership is absolutely being looked for.

 

Ultimately however, technology is both part of the solution and part of the problem.

My experiences with our customers across all industries (but particularly the Membership sector) is that tech has become the biggest blocker within their organisation.

There are too many legacy systems that have prevented companies from providing a good experience to their Membership base.

Too often at the moment, with Legacy Technologies we hear it can’t be done or it will be very, very difficult.

So we need to change that round to help really define what value propositions are in the new normal and really seize those opportunities in the new world.

 

 

Everyone reading this will have technology in their business already.

We might have some.

We may have a lot.

We may have all the bits of the jigsaw that we’d expect to see in a modern solution.

It may not all be new; the IT infrastructure may be old and it may be siloed or it might not be doing what we really need it to do. It’s likely it’s not integrated.

All in all, it’s very unlikely any of us have a completely green field IT landscape.

 

We need to really think about how many applications we do have.

One of the things that we see in a lot of organisations, before we even get started on the actual digital transformation journey, is the need to understand what they’ve already got, and we always find all sorts of really interesting things: apps that have been proliferating in the background that IT knew nothing about, or leadership teams that didn’t know anything about the things that IT were doing.

And we’ve even had situations where customers have applications they didn’t know they had or  thought they had applications that had been retired years ago.

Those kinds of issues really are out there proliferating and one of the things we need to do is get a grip on that.

How can we bring all those disparate ‘things’ together if we don’t even know what they are?

We also find that a lot of these legacy applications are burning through resources whilst causing a lot of problems for our customers.

 

  • They might not be supported anymore.
  • Everything else has moved to the Cloud but this particular application.
  • It can’t be upgraded because it’s been extended so much over the years that it’s now too many versions behind
  • The people that really understood that application have left

 

So quite often there’s some real key drivers there besides the market itself that you may need to address.

 

We also find there’s a lot of chaos around processes; quite often when we get involved with a Membership organisation, they’ll say “here’s a pack with a list of all of our processes that covers everything”.

We’ll then start looking at those and find that’s only about a quarter of the processes they have because they’ve only focused on the ones that are on-system and not those that are off-system or the ones that are actually just work arounds because the systems are blocking them from doing what they really need to do.

 

That means it’s often about understanding what those processes really look like and what the opportunities are to deliver a better experience, both internally in your back office and more importantly in terms of the value proposition to the membership community that you’re trying to engage with.

Data Chaos

We see a lot of data chaos.

One of the first tasks we often get involved with is mapping out a data landscape to understand what data an organisation has available.

What data is out there and what’s it doing?

Typically, we find a lot of it is again, off-system, being shipped around on spreadsheets as well as duplicated in multiple databases and directories.

 

We all know how much power there is in data and I’m sure we all appreciate that you can’t really leverage data without first having the right data architectures. That’s why it’s so important to tame an organisation’s data first.

But as well as badly architected data structures, there might be suboptimal UX legacy experiences preventing people from engaging with you in the way that they want to which often leads to Member frustration.

All of that comes together in preventing an organisation from really leveraging the full potential and making a difference.

 

So I think one of the big challenges in terms of leading with technology has to be about taming the chaos that already exists out there.

It’s not always about just throwing everything away and starting again but instead finding a way to tame the chaos that already exists both inside our businesses and outside.

 

 

 

Why Are You Digitally Transforming?

The first thing is really understanding why we’re trying to transform in the first place.

According to some measures, over 80% of transformation projects will fail and in my experience a large proportion of those will be simply down to not really understanding why they needed to transform or what they were trying to achieve in the first place.

That means understanding those reasons and getting them right is vital. Having different reasons to transform can lead to different routes to getting there.

 

  • Are you being defensive?
  • Are you trying to be disruptive?
  • Are you trying to enter new geographical markets?
  • Are you trying to enter new vertical markets?

 

All those things really play into the way in which your digital transformation journey needs to be approached.

What Does Success Look Like?

It’s important to understand what success looks like and be clear about it before you start as measuring benefits is quite a difficult thing to do.

It’s easy with things like finance or new member acquisition or retention or your membership churn rate or the number of processes that have been decommissioned or automated… but much more difficult with the softer benefits around the value proposition.

That’s why you need to be clear about what success would look like, so you can actually tie what you’re doing towards those measures of success and ensure that everything that you do is a step along that journey.

Choosing The Right Platform Ecosystem

cloudThing are a Microsoft House and although we do use lots of other, different cloud environments, we favour the Microsoft platform because of the ecosystem that sits around it.

The platform offers a huge range of services: Dynamics 365, Azure and the Power Platform, to name a few.

They all allow a host of different approaches to building complex, big data solutions or low code/no code solutions.

Build Future

Circling back to cloudThing’s Build Future ethos that I mentioned; architecting for the future, both for your solution architecture and your data architecture, is massively important.

Small decisions made today can have a huge impact down the line, which is why we always encourage our clients to have a vision of where they’d want to be five, ten or even fifteen years from now.

Even if you don’t accomplish everything you set out to, making good core decisions today can really help mitigate cost and ensure re-platforming won’t be necessary in future.

Everything should be geared towards incremental and continuous improvement nowadays rather than ‘Big Bang’ moves from one system to the next to the next to the next.

Consume Solved Problems

I’d always suggest trying to consume solved problems.

What do I mean by that?

There are things that people are great at which instantly can solve a problem.

Rather than trying to re-invent the wheel yourself, focus on the things that really matter to your business. A great example is SQL.

Microsoft SQL as a service is something you can easily adopt (or consume) by using Azure. You see, Microsoft are pretty good with stuff like that.

They’ve designed and built a database that they’ll also run for you.

Why would you want to build and run a database when somebody’s already solved that problem and can do it for you faster, better and more securely, meanwhile letting you focus on the things that are important to the organisation?
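To make that concrete, here’s a minimal, illustrative sketch of what ‘consuming’ a managed SQL database looks like from Python with pyodbc; the server, database and credentials are placeholders rather than a real environment. Your code simply connects and queries, while patching, backups and high availability stay Microsoft’s problem.

```python
# A minimal sketch of 'consuming the solved problem' of a managed database:
# connecting to an Azure SQL database from Python with pyodbc.
# Server, database and credential values are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-server.database.windows.net;"  # hypothetical server name
    "DATABASE=your-database;"                   # hypothetical database name
    "UID=your-user;"
    "PWD=your-password;"
    "Encrypt=yes;TrustServerCertificate=no;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # Ordinary T-SQL works as normal; the running of the database is handled for you.
    cursor.execute("SELECT TOP 5 name, create_date FROM sys.tables")
    for name, created in cursor.fetchall():
        print(name, created)
```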

The same probably goes for Cognitive Services for AI and ML as well.

OOTB (Out Of The Box)/Configure/Integrate/Dev

This is where you need to think about how you want to approach your development lifecycle.

Does an Out of the Box solution come first, which then gets configured and integrated, leaving custom development as the last resort?

Of course, the less development you do and the more Out of the box configuration you can use, the less overhead and technical debt you’re creating.

If you do need to develop, just make sure you do it in the right way, following accepted, industry best practices so as not to accrue problems for yourself down the line.

MVP (Minimum Viable Product)

In terms of how you get to market and make changes, Minimum Viable Product is always a mantra that we would recommend.

It’s too easy to get caught up in the ‘bells and whistles’ and the ‘wouldn’t it be nice’ of a solution.

In my experience it’s much better to get a minimum viable product to market as quickly as possible and then have a period of continuous improvement afterwards in which you continue to develop and enhance your services.

MVP empowers that whilst also allowing you to gather feedback from your membership along the transformation journey.

That means your solutions can become more targeted because they’ll be based on the feedback and engagement that you get from your Members.

It also comes with the added benefit of making Members feel their requirements are being listened to and acted on, giving you a continuous feedback cycle for deployment.

Embedding A New Culture and WoW (Way Of Working)

To do any of the above you’re going to need a new type of culture and way of working.

Changing your culture to one of continuous development, continuous improvement and continuous deployment will offer continuous benefits to your members, manifesting in constant releases and constant upgrades for them to engage with.

Multi Discipline Suppliers

And finally look for the right supplier(s).

It should all be about being multidisciplinary in today’s world (and tomorrow’s as well).

It’s not all about just having a software provider or a cloud hosting provider or a system integrator or a DevOps provider or a data specialist.

A good supplier these days needs to be all those things.

That’s the only way to arrive at the best (and most efficient) solutions.

Impact Of The Coronavirus On The Membership Sector

 

 

Finally, I thought I’d quickly discuss how COVID-19 may have affected the membership sector.

Actually, my view is that it probably hasn’t changed the challenges that we all face, but it might be shining a light just a little brighter on some areas where things aren’t quite as good as they could be.

 

I’ve already mentioned that people’s expectations have increased.

They’re consuming more digital because that’s the only way they can consume services now (and they probably also have a little bit more time to do so in).

 

One of the strangest dichotomies to come from the coronavirus pandemic is probably the fact that most businesses want to be a little bit more cautious in the way they invest when in fact our members are looking for all of us to go faster and quicker and harder; creating challenges for us all.

So really, it’s probably not about change. The challenges haven’t changed.

 

What has changed is probably the priority of those challenges, forcing us to re-evaluate our priority lists.

We all need to make sure we can get the most value out as quickly as possible because clearly the sector is vying for attention whilst everyone’s at home.

And as I said at the beginning, we’re all being judged.

The decisions we make now, in this tumultuous time, will be judged, and the actions we take today will very much be remembered by our members and form their views of the kind of organisation we are going forward.

 

During and post COVID-19, organisations within the Membership sector have a real opportunity to grow their digital estate and not just go back to an old way of doing things because that’s the easy thing to do.

It’s a real opportunity both for the sector and its members.

Get the data right and then really think about Machine Learning to help reduce member churn.

Just a 5% reduction in that churn can lead to quite significant increases in revenue.

Existing Members are more likely to try recommended products.
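As a purely illustrative worked example (all figures are hypothetical, and the 5% is treated here as five percentage points of churn), the arithmetic behind that claim looks something like this:

```python
# Illustrative only: every figure below is hypothetical, not a benchmark.
members = 10_000        # current membership
annual_fee = 200        # average revenue per member per year (GBP)
churn_now = 0.20        # 20% of members leave each year
churn_improved = 0.15   # after a five-point reduction in churn


def retained_revenue(member_count: int, fee: float, churn: float) -> float:
    """Annual revenue from the members you keep."""
    return member_count * (1 - churn) * fee


before = retained_revenue(members, annual_fee, churn_now)
after = retained_revenue(members, annual_fee, churn_improved)
print(f"Retained revenue before: £{before:,.0f}")
print(f"Retained revenue after:  £{after:,.0f}")
print(f"Uplift: £{after - before:,.0f} ({after / before - 1:.1%})")
```

On those assumptions the uplift is around £100,000 a year, before you even count the extra cross-sell revenue from retained members.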

 

My final thought would be: how many of us can claim to truly measure the value of a member over the lifetime of their journey, plan for that and reduce churn whilst also increasing efficiency and revenue off the back of that data?

 

What Do IAAS, PAAS & SAAS Stand For?

What are IaaS, PaaS and SaaS and what can they do for you?

 

If you’ve ever found yourself researching Cloud-based solutions for your business or organisation then you’ve probably come across the terms IaaS, PaaS and SaaS and may be wondering what they actually mean and how they can help you.

It wasn’t so long ago that all of a company’s IT needs would have been met on-premise and clouds were something artists added to make their landscape paintings pretty.

These days, if you want it to, a Cloud-Platform can replace all your on-premise IT needs (with the added benefit of saving your business money in the process).

 

IaaS, PaaS and SaaS are the three main platforms a cloud-based solution will be built on.

 

Hopefully, by the end of this article, you’ll know:

  • What IaaS means
  • What PaaS means
  • What SaaS means
  • What the benefits of each are for your business
  • How cloudThing can help implement them into your organisation.

Infrastructure as a Service

IAAS DEFINITION

IaaS offers businesses and organisations various services and solutions such as pay-as-you-go Cloud storage, networking and VMs (Virtual Machines).

It offers a cloud-based alternative to on-premise infrastructure so that businesses don’t have to invest in expensive on-site equipment.

 

To put that into simpler language: picture your IT department for a moment. Do they have a server room somewhere in the building?

That’s on-premise infrastructure.

Now imagine that room wasn’t there anymore and those ‘servers’ were virtual machines on the internet. That’s IaaS.

Infrastructure as a Service delivered over the internet.

BENEFITS OF IAAS

Having on-site infrastructure (like servers) is expensive, labour intensive, takes up valuable office space and, with hardware depreciation, isn’t a great long-term investment.

It can often involve an expensive up-front cost for the physical hardware plus the cost of installing and then, on a day to day basis, maintaining it.

The benefit of IaaS is that you only need to buy what you need as you need it, purchasing more as your business expands.

This makes it a flexible and scalable solution that can be upgraded or replaced as business demands change, compared to expensive physical hardware that, due to its high initial cost, you may need to ‘get the most out of’, even if it’s long past its best.

EXAMPLES OF IAAS

  • Microsoft Azure,
  • Amazon Web Services (AWS),
  • DigitalOcean,
  • Linode,
  • Rackspace,
  • Cisco Metapod,
  • Google Compute Engine (GCE)

DO I NEED IAAS?

In a word… Yes

Given its scalable nature IaaS is the perfect solution for any size or shape of business as it gives you complete control and freedom over your infrastructure and also operates on a pay-as-you-use model meaning it can fit all budgets.

IaaS, simply put, is a safer, cheaper, more reliable investment for a business.

There’s no physical hardware on your premises that can be damaged or go wrong and with most IaaS platforms you’ll have access to a 24/7 help desk.

If you’re looking to save money and future proof your business, IaaS is the solution you’re looking for.

Platform as a Service

PAAS DEFINITION

Platform as a Service (PaaS) provides both virtual hardware and software (VMs plus development tools) which can then be used to build applications.

Because of this PaaS is predominantly used by developers.

 

A PaaS solution is often built on top of an IaaS solution, so again, putting that in more understandable terms, it might help to think of PaaS as a virtual workshop.

The workshop ‘space’ plus all the ‘tools’ will be there for you already but what gets built in the workshop will vary drastically from developer to developer, user to user and business to business.

BENEFITS OF PAAS

A PaaS solution gives your Dev Team the ability to create unique software or applications that can be customised to your business’ exact needs.

The big selling point of PaaS is that they won’t need to start from scratch for every project by writing time-consuming ‘basic’ code.

Instead, they can take a ‘template’ and then customise it to exact specifications and needs, which as you can imagine saves a lot of time (and money).

This makes PaaS the perfect choice if your business wants to develop a unique piece of software/app without having to spend a fortune or have to create it from the ground up.

Other, more tangible, benefits of PaaS are that it’s accessible by multiple users and, as with IaaS, it’s scalable in line with your business needs, growing or shrinking as projects and business needs may require.

EXAMPLES OF PAAS

  • Microsoft Azure
  • AWS Elastic Beanstalk
  • Heroku
  • Force.com
  • Google App Engine
  • Apache Stratos
  • OpenShift

DO I NEED PAAS?

If your business does some or all its own development in house then PaaS will be, by far, the most cost and time effective way for your developers to create amazing software.

It gives them the space to focus on the creative side of software and app development without having to worry about the more technical side like software updates or security patches.

With a PaaS solution that work is already all done for them meaning all their efforts can be focussed into creating, testing and finally deploying the software or app.

Software as a Service

SAAS DEFINITION

SaaS platforms such as Salesforce or Dropbox make software and apps available to businesses and organisations over the internet, normally on a monthly subscription model.

 

Putting that into simpler terms again: imagine all the different programs you currently have on your computer. Now also list the ones that are exclusive to your business and imagine the time and resources needed for IT to install them on every machine in your organisation.

With SaaS all of those programs will be on the Cloud (online) with someone else responsible for making sure they all work on a day to day basis.

BENEFITS OF SAAS

As mentioned, with SaaS you don’t have to have anything installed on your computer to run the applications.

That means with just a login and password you can usually access everything from any machine with internet access, should you need to.

Your IT department won’t have to waste hours downloading new applications onto everyone’s machine every time one’s needed, or configuring everything for new starters; all they need to do is create a login and password for the user, set their permission levels and they’ll be ready to go.

Another benefit is the software will always be up to date, no longer requiring regular updates or security patches.

SaaS and the Cloud takes care of it all for you!

 

As with IaaS and PaaS, SaaS is operated on a subscription model, normally with a fixed monthly fee which will allow your Accounts Team to budget and plan ahead accordingly, with most providers also including maintenance, update and security services within that fixed monthly cost.

Most SaaS solutions also tend to be ‘out of the box’ meaning the initial set up is quick and simple to implement, often able to be deployed within a couple of hours.

EXAMPLES OF SAAS

  • Google Apps,
  • Dropbox,
  • Salesforce,
  • Cisco WebEx,
  • Concur,
  • GoToMeeting

DO I NEED SAAS?

Can you guess what we’re going to say here?

Of course you do!

SaaS platforms are incredibly flexible, allowing users to access everything they need from any machine with internet access, making a SaaS solution perfect for any business working on a Disaster Recovery (DR) plan, streamlining or future proofing themselves.

As with IaaS and PaaS it’s scalable based on your business needs with pricing tiers available for small, medium and even enterprise level businesses.

This all makes it ideal for when you want software to run quickly and reliably with minimal input from your IT team.

 

So, in summary; IaaS will provide your business with cost savings on hardware combined with scalability and flexibility of infrastructure.

PaaS, often built on top of IaaS, will save your development team time (thus saving you money), allowing them to focus on developing award-winning applications rather than spending all their time on basic coding jobs.

And SaaS will give you quick, flexible and easily deployed, ‘out of the box’ solutions to meet a particular business need.

What Is Business Architecture?

Mike Eckersley – Business Architect, cloudThing

Defining Business Architecture and how cloudThing can empower your organisation

 

Business Architecture At cloudThing

Normally we (Business Architects) get involved right at the start of a project, however quite often we get brought in at a later date which can make it harder to convince a client of the benefits that Business Architecture can bring.

When we get brought in at the start of a project, we can make a real difference to inform the story-writing stage and produce better and more accurate stories that are aligned with exactly what the client wants and the business benefits that they expect the solution to deliver.

To clarify, when we say stories, we mean the development projects taking on the voice of the customer and what their business needs.

It’s about taking the persona of a particular type of user and then working out what they want and why they want it; that’s the story-writing process.

What we need to do is understand what the business needs and what the users need so that we can speak to the Solution Architects (SAs) who will write the stories.

 

At cloudThing the Business Architects are the voice of the customer although it’s important to have a foot in both camps, one for the client and the other for cloudThing.

It’s about getting under the skin of the client’s business very quickly so you can get a deeper understanding of the company, how they operate, and what their business processes and structures are so that you can effectively brief the SA’s.

Benefits Of A Strong Business Architecture

Overall, we can save a business hundreds of thousands of pounds per year by removing waste and inefficiencies and by maximising the automation possibilities of a solution.

So, because we know what Microsoft Dynamics 365 can do, this allows us to ask the customers questions like, ‘why don’t you do it this way?’.

If you do, we can then get the system to do steps X, Y and Z, which removes someone having to manually key the steps in and automates tasks. The minute you automate something, you remove waste and improve efficiency.

It’s about making processes within businesses and technology more efficient.

We start by holding structured workshops to understand all the business processes within the client’s organisation and we start with how they do it now and how they think they do it.

What do we mean by this?

Very often, when we talk to the managers, we get a piece of paper saying ‘this is our business process’, but when you go down and talk to the people actually actioning the processes, they say they kind of do it that way, but first they have to download a spreadsheet and manipulate it and then upload it somewhere else. And that’s not all, because Mavis has something extra that she has to do, and so on. Once we’ve established what really happens, we can map it out and start looking at the process.

We try to look for sources of variation, because a good process is done the same way every time, no matter who does it; it should be a standard process.

If there is variation in the way a process operates, this is where problems arise.

Once you have a standard process mapped out, you can then start to look at where the waste occurs, and we have various mnemonics and models for how to combat certain types of waste.

When the waste is stripped away, we have the backbone of the process and we call this the ‘As-Is’ process (how the process works at the moment).

 

Now comes the fun part: we go back and have another meeting to discuss how the client actually wants the business processes to be, hence the name for the new desired processes, the ‘To-Be’ process.

This is where we can start to streamline the process and evolve it into a new process that takes into account the capabilities of Dynamics. This is where we slot in the automated steps and remove the manual ones.

At first, clients do find it difficult to start afresh but with our help, encouragement and knowledge, we give them a proverbial magic wand and say, ‘how would you like it to operate?’

It’s great because at the end of the session, the client can see just how much better their lives are going to be when this new system comes into action.

As a result of that, it improves the buy-in when we come to implement it instead of having any resistance to change or people trying to find a fault within a new system, which does happen because they haven’t been involved in the journey.

Now, you’ve got a load of people who are really motivated and can’t wait to see this new solution that they’ve contributed their thoughts to.

Clients are best placed to know where the problems are in the processes they are currently operating so you’ve got an ideal situation where with a bit of help, they’ve improved that process and made it more efficient.

What makes a good Business Architect?

It’s about that blend of business experience and system knowledge, plus the ability to quickly form relationships with staff you’ve never met before; and in order to do that, you must work at it.

For example, adding humour, judging your audience and level of abilities, forming a positive relationship, constructively challenging them and bringing them along with you.

You also have to have credibility.

There’s no point in going in and saying, ‘I’m not sure if we can do that in Dynamics’.

 

Business Architecture is about getting to understand what the user wants and what the business wants out of the software but by doing it in a way that maximises the capabilities of the software.

Because we know what Microsoft Dynamics 365 can do and what the business is trying to do/achieve, we can make the two come together and work harmoniously.

We want to make sure that they get the most out of their investment in this project and achieve what they wanted from day one (for example, saving money).

 

Once you start achieving this goal, we want to make sure that the solution continues to deliver this, so we get the processes locked in and make it standard.

What Is Data Gravity? (And How Your Organisation Can Benefit From It)

Everything you ever wanted to know about Data Gravity, including the answer to the question… What the heck is Data Gravity?

 

What Is Data Gravity?

First of all, let’s answer the question ‘what is Data Gravity?’

The term Data Gravity was first coined by Dave McCrory as an analogy between how data in IT systems attracts more data and apps and how objects with more mass, in real life, will attract those with less.

 

Data Gravity, then, is the process by which large amounts of data in a network, system or an organisation’s processes will attract more applications, services or additional data to itself.

The gravity analogy comes into play because, as more data is stored, more software or business processes will grow around it, in turn attracting more data, in turn drawing in more applications at an ever increasing frequency.

 

It’s also worth noting that, when coining the phrase, Dave McCrory made a distinction between naturally occurring Data Gravity and Artificial Data Gravity, which he defined as occurring as a result of external forces such as legislation, throttling and manipulative pricing.

 

In practical terms, the further spread out data is, perhaps over different systems or networks, the more it will impact on the ability of users or applications to utilise it effectively.

To maximise work efficiency then, it makes sense for an organisation to store all their data in one, easily accessed location, with any associated applications, services or business processes attached to it in the same place.

It’s also worth noting that, from the perspective of representing your business as a going concern, the more data you hold in a particular repository, the greater its perceived value will be, both commercially and in terms of the results analytics tools and AI can spin out of it.

 

If you’re looking to digitally transform your business (or perhaps more ambitiously digitally disrupt your sector), Data Gravity is an issue you’re going to need to consider at a strategic and technological level.

 

At a strategic level, it’s an issue that will affect the sequence in which you approach your transformation if you want it to be a sustainable effort.

 

At a technological level, it’s about selecting the right technology platform to ensure you don’t back yourself into a corner.

 

Any organisation operating today will generate a tremendous amount of data, often to an extent where it’s unrealistic to manage it with a traditional approach to CMSs or analytics. This is because data analytics platforms tend to live in their own hardware/software stacks and the data they use will be accessed through direct-attached storage (DAS).

 

A lot of analytic platforms though (Splunk, Hadoop, TensorFlow etc) like to ‘own’ their data which means for any large scale digital transformation, wholesale data consolidation and migration becomes essential before you can run any really cool analytics, AI or ML on it.

Benefiting From Data Gravity

The first thing you’ll need to realise about data gravity is that you can’t stop it. Much as gravity is a fundamental law of physics, data gravity is a phenomenon to be understood and used to inform your digital transformation.

 

Choose a platform such as Azure that has a range of data storage options such as Azure SQL, Data Lake and Cosmos DB; data processing options such as Databricks, Python and Data Factory; data visualisation and analysis options such as Power BI, Python and Azure Analysis Services; and AI capabilities in Azure ML. This lets you build an architecture for consolidating your data in a way that starts small but scales up fast. Once in place, your organisation will be in a much stronger position to bring advanced analytics and AI to bear on it.
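In a real Azure build the heavy lifting would sit in services like Data Factory, but the core consolidation step can be sketched in a few lines of Python with pandas; the file and column names below are hypothetical:

```python
# A minimal sketch of the consolidation idea: pull member data out of several
# disparate sources into a single place before any analytics or ML.
import pandas as pd

sources = [
    "crm_export.csv",           # e.g. an extract from your CRM
    "events_spreadsheet.csv",   # the off-system spreadsheet someone emails around
    "legacy_database_dump.csv",
]

frames = []
for path in sources:
    df = pd.read_csv(path)
    df["source"] = path  # keep track of where each record came from
    frames.append(df)

consolidated = pd.concat(frames, ignore_index=True)

# De-duplicate on whatever identifier you trust (hypothetical column name).
consolidated = consolidated.drop_duplicates(subset=["member_email"])
consolidated.to_parquet("member_master.parquet")  # one place, one copy of the truth
```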

System Scalability For Data Gravity

 

 

It should be an obvious point (but it’s normally the obvious ones that get overlooked, isn’t it?): when architecting a system or network with data gravity in mind, its core feature should be scalability.

 

If the whole point is to gather data at exponential rates, then your architecture will need to be able to handle that. You’ll also want a solution in which personnel and infrastructure costs don’t scale with your increases in data.

 

This is where a Cloud Based solution becomes your friend and where cloudThing are happy to step in and advise you on your best course of action.

Future Proofing & Data Gravity

It sometimes feels like data analytics apps and AI/ML platforms change on a weekly basis, which obviously needs to be considered for any end solution. The data must be accessible across multiple platforms, be built to open standards and adhere to compliance standards, including any your organisation currently uses but also any it might use in the future.

Data Gravity Key Takeaways…

  • Data Gravity will occur no matter what you do
  • The more data you have in one place the more powerful it will become
  • If your organisation’s data is spread around different, discrete, networks and systems it’ll be more costly to access and utilise, much harder to secure and exponentially harder to analyse as your transformation continues.

cloudThing recommend solutions based on Microsoft Azure. Azure Data Lake is already architected to help you tame your data gravity by giving you tools such as Data Factory to perform ETL processes that fetch your data from disparate sources into one place.

Similarly, CDS (Common Data Service), part of the Power Platform, allows you to centralise all your app data in one place to create maximum gravity for building your Apps, RPA or BI from it.

 

 

How To Fix DateTime Stamps In Microsoft Dynamics 365

Craig Seymour – Dynamics Practice Lead, cloudThing

Our dedicated testers have recently been reporting a one day offset between two custom date fields.

 

Some of the quality assurance team in our cloudThing India offices have recently started noticing a one day offset between two custom date fields (‘Contract Start Date’ and ‘Invoice Schedule Start Date’) in Opportunity, even though the second was being set as a direct copy of the first.

Looking into the problem a little deeper I realised that the Contract date was set as a ‘normal’ DateTime field but with Time not shown in the UI, whilst the Invoice date was set as a TimeZone independent DateTime field.

Once I knew what the problem was all I had to do was find a solution… ah the fun bit!

 

The Contract Date

When a user creates a Date without a timestamp in Microsoft Dynamics 365 (or any model-driven Power App) through the User Interface (UI), it will automatically assume a timestamp of 00:00 in their current locale (local time zone preference and daylight saving).

However, when it writes that date to the database, it converts it to the UTC value, taking into account the user’s locale.

So, for example, dates created in India will be stored as 18:30 the day before.

When Dynamics 365 then reads it back out, it does the reverse.  So, from the Indian user’s point of view, the displayed date is always correct.

Continuing with the example, that means for records created in India, the Indian user sees 2020-01-20 in the UI (with @ 00:00 hidden), but the value held in the database is 2020-01-19 @ 18:30.

That means if I look at this record in Dynamics 365 from the UK I actually see ‘2020-01-19’, because my TimeZone correction (from UTC to BST) is +1 hour, so Microsoft Dynamics 365 is displaying 2020-01-19 (with @ 19:30 hidden).
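A quick way to sanity-check that conversion outside Dynamics 365 is with plain Python and zoneinfo (this is an illustration of the arithmetic only, not what Dynamics runs internally):

```python
# Reproducing the Contract Date behaviour described above: a date entered as
# midnight in India is stored as 18:30 UTC on the previous day.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

entered_in_ui = datetime(2020, 1, 20, 0, 0, tzinfo=ZoneInfo("Asia/Kolkata"))
stored_in_db = entered_in_ui.astimezone(timezone.utc)

print(entered_in_ui)  # 2020-01-20 00:00:00+05:30 (what the Indian user entered)
print(stored_in_db)   # 2020-01-19 18:30:00+00:00 (what the database holds)
# A UK user reading the raw stored value back sees the 19th, not the 20th.
```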

The Invoice Date

When the field is of type TZ independent, Dynamics 365 ignores the locale and stores in the database exactly the same date as you see in the UI.

As with the Contract date, Dynamics 365 assumes a timestamp of 00:00 and then stores this without correction as 2020-01-20 @ 00:00.

When this is read in India, the UK, or anywhere else for that matter, Dynamics 365 will show us 2020-01-20.

Copying the Contract Date To the Invoice Date

When we create the Invoice date we’re doing a straight copy (using the API) of the field value ‘from the database’ from a field which Dynamics 365 knows it should apply the correction for locale to it, to one which it knows it shouldn’t.

So our database value is 2020-01-19 @ 18:30 and Dynamics 365 studiously ignores the 18:30 part and simply shows us 2020-01-19 in the UI.

And the fix?

Well, in theory we should just do the same as Microsoft Dynamics 365 (by that I mean apply the locale correction before writing the Contract date) but in practice we can’t, because Dynamics 365 doesn’t store the locale the data was created in, just the UTC DateTime, so we don’t have the information we need to do this.

One hack which might work would be simply to ’round’ the Contract date up or down to the nearest midnight when we copy it, which will work everywhere except for dates created by users in New Zealand, Eastern Siberia and a handful of islands in the Pacific.

Or…

We could convert all Contract dates to have a timestamp of noon; this works for most Time Zones but again breaks for New Zealand, Eastern Siberia and a handful of islands in the Pacific.
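As a hedged sketch of that ‘noon’ workaround (illustrative Python, not the Dynamics 365 API itself), the copy step might look like this:

```python
# A sketch of the 'noon' workaround for copying the Contract date into the
# TZ-independent Invoice date. Illustrative only.
from datetime import datetime, time, timedelta, timezone


def to_tz_independent_date(stored_utc: datetime) -> datetime:
    """Recover the intended calendar date from the stored UTC value, pinned to noon.

    Adding 12 hours before taking the date is equivalent to rounding to the
    nearest midnight. It works for locales between UTC-12 and UTC+12, but (as
    noted above) breaks for New Zealand, Eastern Siberia and a handful of
    Pacific islands.
    """
    intended_date = (stored_utc + timedelta(hours=12)).date()
    return datetime.combine(intended_date, time(12, 0), tzinfo=timezone.utc)


# The Indian example: stored as 18:30 UTC on the 19th, copied as noon on the 20th.
stored = datetime(2020, 1, 19, 18, 30, tzinfo=timezone.utc)
print(to_tz_independent_date(stored))  # 2020-01-20 12:00:00+00:00
```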

If you’re in the UK (or any other country in the GMT timezone) and looking for another solution you could simply convert the Contract date to TZ independent.  Luckily for us, GMT and UTC are exactly aligned, so all our UK created dates happen to be set at 00:00 anyway!

The downside is that all the dates created by the QA Team in India will start to show a day early.

6 Easy Steps For Promoting A Culture Of Cyber Security

Cyber security is vital for a business with any kind of digital presence, no matter how small (or large) they are

 

Modern firms that want to make their digital estate secure will often spend fortunes on Security by Design solutions such as firewalls, MFA (Multi-Factor Authentication), anti-virus and anti-malware software. What many fail to realise however (sometimes to their detriment) is that the biggest weakness in their cyber security isn’t digital in nature at all; it’s their staff.

 

From Simon in Accounts, clicking a link in a dodgy email for the nineteenth time, to Brenda in Sales logging onto an unsecured Wi-Fi network in a coffee shop without using the VPN because she just ‘had’ to get her caffeine fix and wanted to check her emails;  it’s unaware or untrained staff that will cause (but can also prevent) the majority of your problems.

 

Fortunately, there are several basic cyber security tips you can pass on to your staff to make your business a much less appealing target for cyber scammers, hackers and other unsavoury types.

Secure Passwords

We discussed this at length in a previous article so won’t belabour the point too much but it really is vital your staff realise that Password1234 (or even Password4321 if they’re trying to be clever) isn’t secure.

The majority of your problems are going to come down to good culture and governance, making sure your staff are digitally aware and empowered to report anything they’re not sure of.

And if you’re worried, it’s a relatively easy job for your IT team to block the use of certain passwords and make sure they have to be changed regularly.
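As a minimal illustration of the kind of check involved (the banned list here is tiny and hypothetical; a real deployment would use a much larger breached-password list and your identity platform’s own policies), something like this could run whenever a password is set or changed:

```python
# Sketch of a banned-password check. Illustrative only: the list and the
# minimum length are placeholder values, not a recommended policy.
BANNED = {"password1234", "password4321", "letmein", "qwerty123"}
MIN_LENGTH = 12


def is_acceptable(password: str) -> bool:
    """Reject short passwords and anything containing a banned phrase."""
    if len(password) < MIN_LENGTH:
        return False
    lowered = password.lower()
    return not any(banned in lowered for banned in BANNED)


print(is_acceptable("Password1234"))                  # False - straight off the banned list
print(is_acceptable("correct-horse-battery-staple"))  # True
```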

It’s also important your staff realise they shouldn’t be sharing passwords with each other.

In an ideal world you’d be able to trust all your co-workers but a strong cyber security culture has to start with good governance, which sometimes means putting rules in place to protect against that 1% outside chance.

Educate Your Staff On Phishing Scams

Gone are the days of Nigerian Princes asking for your bank details because they’re so impressed with your business acumen… the modern cyber criminal is a lot savvier than that.

Your staff will need to be on the lookout for multiple scams, across the phone, email and social media.

There are specific things they should be looking out for (which we cover in more depth here) but again, our biggest tip would be making sure the right culture exists in your workplace.

 

Get that right and everything else falls into place.

 

Your IT Team should be running cyber security awareness sessions at least every six months to show your teams examples of good and bad practices as well as updating them on new security measures and common email scams they should be looking out for.

What To Do When Someone Leaves

The majority of cyber security precautions your staff need to take are for ‘what if’ situations. Chances are they’ll never occur but they need to be prepared if they do.

 

In reality, if Sarah from Marketing retires after twenty years of loyal service, not changing your company’s Facebook password probably won’t harm you… but you can never be too careful.

 

Having good governance in place, with a file showing which member of staff has access to what application is always a good idea.

If they leave, whether that be through choice or especially if not, standard practice should be to immediately revoke their access and change all the passwords for systems they had access to.

Yes it’s annoying, yes it’s ‘faffy’ (especially in companies with a high turnover of staff) but in the long term it’ll pay dividends.

It only takes one disgruntled member of staff for your entire organisation’s cyber security to be put at risk.

Appoint a Cyber Security Advocate For Each Team

By now it should be obvious that good cyber security is everyone’s responsibility.

On a day to day basis though it can fall back in the priority list behind someone’s regular duties.

To combat this (and depending on the size of your organisation) it may be worth appointing Cyber Security Advocates into each team within your business.

They can almost act as an extension to the IT team, with a much greater understanding of the day to day operations of their department and able to more efficiently spot potential cyber security risks.

Do Your Staff Know What To Do If The Worst Does Happen?

Eventually someone will make a mistake.

The very worst thing that can happen if someone does make a mistake, though, is for it not to be reported. The sooner a security breach is identified, the sooner steps can be taken to mitigate or solve it.

Discussing culture again, your staff need to be aware that the first port of call should be your IT Team and that they won’t be in ‘trouble’ for clicking a dodgy link or downloading a piece of software they thought was safe. The important thing is fixing the problem, not punishing anyone for it.

 

Human error will occur and it’s your job to make sure that it doesn’t compound a problem.

Reward Your Employees

Continuing with the theme of how important culture is, if a staff member does identify a malicious email, or you notice someone championing cyber security then reward them!

 

Once employees realise they won’t get in trouble if they do happen to make a mistake, and that they can be rewarded for identifying holes in your security, you’ll find it much easier to bed down a culture of strong cyber security within your organisation.

 

In the modern office cyber security must be both everyone’s responsibility and everyone’s priority but employers need to realise that staff skill levels won’t all be the same.

 

Empowering your staff by upskilling them on cyber threats whilst promoting a culture in which threats can be openly discussed and mitigated is the biggest step you can take in protecting your organisation.

Taming Digital Chaos

86% Of Digital Transformation Projects Fail – How To Make Sure You’re In The 14%

 

According to an IDC report published in 2018, it’s estimated that businesses will spend over $2 trillion on digital transformation projects this year but another report (from McKinsey in 2018) states that over 86% of Digital Transformation projects will fail.

That means across all regions and industries, including businesses of all sizes (SME’s and Enterprise level) only 14% of Digital Transformation projects will succeed; or put another way, $1.7 trillion potentially wasted, causing digital chaos within the organisation that started the project. We’ve never wasted $1.7 trillion here at cloudThing but can imagine feeling rather sheepish if we did!

 

Clearly then, the ideal is to be in the 14% that make a success of their Digital Transformation project, but with so many companies failing it does raise the question of how best to go about it…

 

It’s a complicated issue that won’t have a single, ‘silver bullet’ of an answer; with multiple variables depending on the industry, individual business or type of transformation required, but it can loosely be simplified down into four interconnected topics.

WHY CHANGE?

It’s 2020 and you keep hearing buzz words like Cloud Based Migration, Dynamics 365, Automation, Microsoft Azure or AWS, Data Science and Big Data… of course your business needs a complete digital transformation… Right?

Far too many business leaders however are starting their Digital Transformation projects without first correctly defining why they want/need to change or even what success might look like when they have.

CHANGING YOUR BUSINESS

Once you’ve decided on the case for digital transformation, you next need to decide on the Solution Architecture of how you’ll actually go about transforming your business to best future proof it.

The need for rapid deployment of changes in an ever changing digital market is why many leaders will choose to transform their business, but few realise that it’s less a piece of hardware/software that needs implementing and more a fundamental change to mindsets and cultural approaches within their organisation.

CHANGING PEOPLE

A digital transformation can’t and won’t happen in isolation.

Questions will need to be answered about your current workforce: how much upskilling will be needed, and how you’ll address the shortfall in digital skills created by the new processes and systems being introduced.

CHANGING LEADERSHIP

Getting a digital transformation project started is only half the battle. To see it successfully completed will also require the right kind of leadership, with the ability to understand and implement sweeping changes to an organisation’s systems, people and processes.

Do You Know Why Your Business Needs A Digital Transformation?

Gone are the days of ‘digital dabbling’; digital transformation projects in 2020 have the power to transform entire industries almost overnight.

It’s a bit of an overused example now but just look at the success of Uber, a taxi company that owns no cars, or Airbnb, an accommodation business that owns no properties.

One good idea from an agile start-up combined with the right deployment of technology can revolutionise an industry and leave established competitors scrambling to catch up in its wake.

 

Are you aware of the things that are likely to change your industry in the next 12/24 months?

Is your business ready to compete with them when they occur?

 

It’s surprising how often wide sweeping changes can take established industry leaders by surprise, with competition from smaller, more agile competitors that will often have a business model not reliant on turning a traditional profit in their first few years.

It’s not just the scope of the changes though; often it’s the speed with which they become the go-to choice of end users that’s hard to adapt to.

 

Perhaps the most fundamental questions to ask, however, when questioning why you’re Digitally Transforming your business, are what the transformation will look like and how you’ll be defining success within it.

Will your digital transformation be a form of digital enablement, moving from onsite servers to a PaaS/SaaS based Cloud model whilst you migrate your current CRM onto it, or will it fundamentally change how your business operates and competes within your industry?

 

The first, known as Digital Enablement, can certainly speed up certain processes within your business and reduce operating costs but if not done right runs the risk of leaving you with a tangle of digital ‘things’ that don’t talk to each other and with no one in your organisation really knowing how they work or worse, what they’re truly capable of.

 

The second type of Digital Transformation, if done well, has the potential to revolutionise your business and keep you relevant in a world of start-ups disrupting traditional markets.

 

The most important questions you need to answer have to be:

 

  • What is the case for doing this, and why?
  • What does success look like?

 

If you can’t answer those there’s a very good chance your Business Transformation project will either not deliver the results you were hoping for, fail completely or leave you with digital chaos within your organisation.

 

Digital Chaos is what a lot of businesses are left with after a badly run Digital Transformation process, with a multitude of new systems, applications and abilities but little or no strategy on how to implement them within the current business structure.

Worse, this new ‘Digital chaos’ can actually slow existing processes down.

 

Imagine that after your business transformation all your previous applications (internet access, email, CRM access, email builder etc) had different authentication methods. If no one has stepped back during the process and asked:

 

  • What are these systems for?
  • Can they talk to each other?
  • Can my staff use one log in to access them all?

 

Then you’ll end up with a form of Digital Chaos. Everything’s now in the Cloud and is undeniably faster, but your staff will be spending twice as long on tasks they used to accomplish easily, with no clear strategy going forward.

Taming that Digital Chaos is a big step in a digital transformation and requires thinking about from multiple angles.

How Can You Digitally Transform Your Business?

Change is never easy but once you’ve made the bold decision that your business needs a Digital Transformation to stay relevant for the future then the process of how you go about that change becomes paramount.

 

The first thing to realise in adopting an AGILE business model is that your transformation shouldn’t have an end point.

The idea is to adopt a system or culture of Continuous Improvement within your business which enables real gains immediately through a series of ongoing smaller releases over time, rather than one large release that could take 12/18 months before it’s completed (by which time it’s most likely outdated anyway).

This Agile model will also leave you in a much stronger position to respond to changes (threats to your business) as they occur either internally or externally.

How best to approach making those changes happen though?

 

According to a study conducted by Gartner, companies will move twice as fast in bedding down Digital Transformation projects when there’s a shared buy in amongst the senior leadership team, strategists, other individuals involved in the project and the organisation as a whole.

It’s important they all know the purpose of the transformation and understand both the road map to accomplishing it and their role within it.

True Digital Transformation Isn’t Easy.

It will most likely require looking at solutions far outside the scope of your current business practices by envisioning changes, not from a starting point of where you are now, but by defining the ideal future point of the business and then retroactively working out how that ties in to the present and how best to realise getting there.

 

But that approach won’t be easy…

 

It means leaving preconceptions and long held assumptions by the wayside whilst deciding where and how to allocate spending and resources for investing in the future (without jeopardising current business profitability, of course).

 

Once everyone is onboard and you’ve understood/accepted that change will require cultural as well as process and system changes the next step is in transforming your business.

 

Things that will require extra consideration are if you have the skill sets within your organisation to make those changes or if you’ll need an external partner to help you.

 

There’ll be pros and cons either way.

 

The skill sets required for a Digital Transformation are, unfortunately, in short supply due to a digital skills gap here in the UK.

If you’re lucky enough to have someone within your organisation that can accomplish your transformation, it’s important you plan for what happens if they leave.

Have you upskilled your other staff enough within your organisation to manage the different functions they’ve implemented if they do?

Siloed knowledge is anathema to an agile business model!

 

The other option is to work with a partner to manage you through the transformation process, which will address the skills gap issue and mean the transformation will most likely happen a lot quicker than an internal project. But even if the management of most of the functions created can eventually be taken in house, you’ll probably end up needing that partner on an ongoing basis for any changes, which obviously has ongoing cost implications that need to be considered carefully.

 

In their recent report, Speed Up Your Digital Business Transformation, Gartner recommend these three steps:

 

  • Question tried and trusted approaches to business model transformation.
  • Emerging digital trends invalidate many of the common assumptions you operate under and impact how you select and execute critical transformation initiatives and activities.
  • Look to anchor transformation efforts around potential areas of future competitive strength. Identifying those areas takes a new approach — one that goes far beyond just implementing new technology.

How Do You Bring Your Current Staff With You On A Digital Transformation Project?

As we’ve already mentioned, an intrinsic part of any Digital Transformation is changing the culture within your organisation, not just the processes and systems it uses.

 

Those changes will create a slew of issues with your current workforce though and you’ll need to decide which are relevant to you and how you’ll be addressing them ahead of time.

 

If you’re going down the Digital Enablement route you’d be forgiven for thinking that not much should change, however, even with a small amount of automation you’ll need to decide what’s to be done with the extra staffing capacity that’s been created.

If all of your systems do migrate to the Cloud you’ll need to manage the roll out to your staff and make sure they’re all fully trained on it.

As we mentioned earlier, no one likes change, and you’ll need to lead them through the transformation process as well as manage the frustration with new systems in existing members of staff who may have been using your current system for years.

 

An important step in managing that frustration will be in empowering your staff by democratising their ability to take the business forward.

 

Citizen Developer is a relatively new term but can be defined as a user who can create new business applications in the Cloud by using Microsoft’s Power Platform.

These apps need no previous design experience or the ability to code but let the people in charge of the day to day running of the business make tangible changes.

 

Oft cited examples are a small team at G&J Pepsi who, with no previous app development experience, created auditing apps that saved the company over $500,000. Another is an individual at Auto Glass who designed an app to help his own workflow processes which, when it was noticed by his supervisors, was rolled out company-wide. He now manages his own in-house team of citizen developers, making other apps for the company.

This approach will also help a business address the digital skills gap and mean solutions can be reached in a timely and agile fashion (where they can make a real difference), but it will mean the right building blocks and architecture need to be laid down as part of your original solution.

 

The second type of Digital Transformation involves fundamentally changing the way you do business, but then a question needs answering: do your staff’s existing job roles even exist in the new structure?

If not, can they be retrained, or will you need to make mass redundancies? If that’s the case, you’ll need to plan for how that can affect the morale of the staff you’re retaining, as well as the years of BI (Business Intelligence) you’ll be losing with the staff that go.

Do You Have The Correct Leadership Skills For A Digital Transformation Process?

A good leader will need to be able to embrace change but the key in a Digital Transformation is to be able to recognise the need for change ahead of time and be able to drive it, reacting proactively rather than reactively.

 

Digital Disruption and Business Transformation will be/are the driving factors in this new era and those that are best able to utilise cloud-based technologies with AI driven data will be the market leaders that others scramble to emulate.

 

A strong leader needs to be flexible and agile, willing to work within organisational structures that are much more horizontal than vertical; taking advantage of the possibilities remote working offers for the recruitment of key skills needed by the business to move forward.

 

However, implementing new technologies for technology’s sake will most likely be pointless unless cultural changes to the business can also be instituted.

Whilst the leaders of tomorrow need to be able to identify and make use of emerging technological trends in sectors such as AI, Automation, Big Data and Cloud Computing (combined with sufficient expertise and vision to use these resources most effectively) they’ll also need to be able to inspire their teams into embracing change before their competitors disrupt an industry with their own tech savvy workforces.

 

Good leaders then will need to encourage early adoption behaviours by empowering their staff to gain qualifications in relevant skills and engage their workforces in wanting to actively transform the business for the better.

 

An effective Digital Transformation will touch all points within a business and its success will rely on internal support for a lot of its execution within wider functions.

If a leader doesn’t understand their own role in enabling that change and align their thinking with it, then the transformation will falter, with the leadership team becoming an obstacle to change rather than an agent of it.

Key Take-Aways

  • When considering the types of changes made within your business’s Digital Transformation it’s vital to have a clear understanding of why you need to change, with a clear view of where you wish to get to; a solid understanding of how you’ll get there and what skill sets will be required (and whether they’re available internally or a third party will be needed); and finally a clear picture of the leadership team’s role within the transformation process.
  • Understand the extent of change to your existing business model, how the changes will affect existing workforces, whether you’ll be upskilling existing staff to fill newly created capacity gaps or downsizing, how you’ll cope with the loss of valuable business intelligence, and how you’ll monetise your business going forward with your new model.
  • As the majority of businesses realise the power Digital Transformation has to disrupt existing markets, it’s important a good leader stays abreast of the possibilities new technologies offer and finds ways to accelerate their early adoption into their business model.

To Chatbot Or Chatnot?

Ed Yau – Solution Architect, cloudThing

Chatbots make it easier to engage with customers by removing barriers to services

 

Many organisations, including those in the Housing Sector, are interested in Chatbots.

They make it easier to engage with customers by removing barriers to their services; there’s no App install required and if someone knows how to use Facebook Messenger, then they know how to use your bot…

 


As always though, with such a new solution it’s easy to get caught up in the hype and assume that a chatbot is the best and only delivery route for everything.

There are specific ways Chatbots could offer real value to the Housing Sector but only by examining the customers it would serve can we understand the unique business opportunities and challenges presented by them.

 

A good place to start would be to understand what a Chatbot is naturally good at and where we can use existing frameworks to speed up the development of a solution.

 

Pro Tip: Be wary of the Wix-type chatbot makers if you want to create a bot that will scale with the needs of your organisation!

 

Some tasks will be quick and easy to complete by a Chatbot; whereas more complex requests may need some assistance from the existing contact centre team to complete them correctly and thoroughly.

This doesn’t mean that Chatbots shouldn’t feature in the process; a common mistake is to assume that the Chatbot is the whole solution; often it’s simply adding a more efficient way for customers to access/process information that already exists.

 

Another advantage is that Chatbots are available 24/7, which makes the tenant/customer feel that there’s someone they can contact at all times, ultimately providing comfort and security that they don’t have to wait until working hours to raise a concern.

 

Why The Housing Sector?

We’ve discussed the needs of the Housing Sector in previous blogs but to summarise, the IT department should, at all times, be working with the business to:

 

  • Lower costs
  • Improve customer satisfaction
  • Ensure that services are optimised

 

With the challenge of lowering rents by 1% year on year and the rollout of Universal Credit, using technology to increase the efficiency of services and lower costs should be at the forefront of an IT department’s priorities.

Chatbots can help here by speeding up the route to information and solutions for tenants/customers and enabling contact centre staff to focus on more high priority cases.

One of the problems that social landlords face is that there are too many incoming enquiries to handle across multiple channels, on subjects such as:

 

  • Rent
  • Repairs
  • Emergency help (for broken boilers etc)
  • Complaints

 

Without a Chatbot, all these questions need to be answered by a member of staff in the contact centre, either by phone, social media or webchat.

A Chatbot is a great tool for dealing with the typical ‘FAQ’ enquiries across multiple channels, while your contact centre staff can deal with more intricate requests.

A bot can shield your contact centre from high volumes whilst handling enquiries from across Facebook Messenger, webchat or even a digital speaker in a tenant’s home, ensuring the simple enquiries are dealt with end-to-end by the bot while the more demanding enquiries are handed to an experienced contact centre agent.

Another clear benefit is that a bot can carry on working for you outside normal business hours. This translates into better productivity and an improved service from your organisation.

However, Chatbots can be put to work in more targeted situations as well as in a reactive, point-of-contact approach. With the need to keep on top of customers who may be late with rent payments, an automated chatbot could be triggered to reach out if rent was late: making first contact to triage why rent is delayed and alerting a member of staff if further action needs to be taken.
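As a hedged, highly simplified sketch of that triage idea (the keywords and responses are hypothetical, and a production bot would sit on a proper bot framework with natural language understanding), the routing logic boils down to something like this:

```python
# Illustrative triage sketch: route simple FAQ topics to canned answers and
# hand everything else to a human agent. Keywords and replies are hypothetical.
FAQ_ROUTES = {
    "rent": "You can check your balance and pay rent online at ... (link).",
    "repair": "To report a routine repair, reply with the room and a short description.",
    "boiler": "No heating or hot water is treated as an emergency; I'll raise it now.",
    "complaint": "I'm sorry to hear that. I'll open a complaint case for you now.",
}


def handle_message(message: str) -> tuple[str, bool]:
    """Return (reply, handled_by_bot). Unmatched messages go to a human."""
    lowered = message.lower()
    for keyword, reply in FAQ_ROUTES.items():
        if keyword in lowered:
            return reply, True
    return "Let me pass you to one of the team who can help with that.", False


reply, handled_by_bot = handle_message("My boiler has stopped working")
print(handled_by_bot, reply)
```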

A Focus On Business Value

The key to delivering value to housing through a Chatbot is to not overcomplicate its design just to show off what technology can do.

Focusing on the genuine business problems within Housing and solving them with an intuitive Chatbot is better than building a very complicated approach to a problem that does not need solving.

Sometimes less is more and most tenants are happy to talk to a bot if the conversation feels natural and they feel it is going to speed up the service, even if there are still a few things that the bot needs to redirect to a human.

When the user experience has been well-designed and thought through, any solution that uses familiar technology and removes barriers is going to be quickly adopted by end users.

This becomes especially effective when we blend the bot into existing contact centre processes by allowing operators to jump in when required to deal with complicated queries.

A simple user experience that works across multiple channels for housing customers will attract more customers than a complicated one.

To keep it simple, a bot that focuses on quickly providing the required information in natural language, or handing off to a member of staff, won’t require any prior knowledge from the customer, or a great deal of technical know-how to interact with it. If you end up with a web of complex loops and hooks in your bot’s conversation experience, the chances are you’re going to lose your users along the way.

If you notice the Conversation Flow getting complicated then, much like with code, you may find it easier to draw a box around that section of the bot and treat it as a separate bot in its own right, redirect from the bot to your webpage, or escalate to a human operator.

Whether it’s paying rent, reporting a repair or raising a complaint, Housing Associations want their customers to be able to complete these tasks with ease and efficiency, and not have the bot make the process any more rigid or complicated than it would otherwise be.

15 Cyber Security ‘Things’ To Safeguard Your Business

It doesn’t matter what size your business is or how much money you’ve invested in Cyber Security; the harsh truth of the world we live in is that at some point you’ll be targeted by Cyber Scammers.

 

However, that doesn’t mean you should just shut up shop and stop thinking about cyber security, or even that the hackers will be successful; it’s just one of the factors to consider when doing business in a modern, digital world.

 

Fortunately, there are quite a few things you, your IT Team and/or your other employees can do to make a scammer’s job much harder.

By their very nature they’re not nice people (and we’re aware how much we’re understating that), so it’s likely that if you make things too difficult for them they’ll just go and seek out an easier target (perhaps someone who hasn’t read this article?).

 

Here then, is cloudThing’s checklist for improving your business or organisation’s cyber security…

Back it up, back it up, then back it up again!

Starting with a worst-case scenario: should the worst happen, it’s important that (where possible) you’re not negatively affected.

To help alleviate the risks of a successful cyber attack it’s vital that you’re making regular backups of all your key systems and data.

Additional storage doesn’t have to cost a lot these days, so making sure you have copies in a secure offsite location or (even better) in the Cloud means that should the worst occur you can be back up and running straight away, without having to deal with any kind of ransomware scam.
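As a very rough illustration of what ‘copy it offsite’ can look like in practice, the sketch below zips a data folder and pushes it to Azure Blob Storage on a schedule. The folder path, connection string and container name are placeholders, and any cloud storage provider would do just as well.

```python
# Rough sketch: zip a local data folder and push it to cloud storage so an
# offsite copy always exists. Connection string, container and paths are
# placeholders - swap in your own values and run it on a schedule.
import datetime
import shutil
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

DATA_FOLDER = r"C:\CompanyData"                    # what to back up (placeholder)
CONNECTION_STRING = "<your storage connection string>"
CONTAINER = "nightly-backups"

# 1. Create a dated zip archive of the folder.
stamp = datetime.date.today().isoformat()
archive_path = shutil.make_archive(f"backup-{stamp}", "zip", DATA_FOLDER)

# 2. Upload it to blob storage, keeping one blob per day.
service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
blob = service.get_blob_client(container=CONTAINER, blob=f"backup-{stamp}.zip")
with open(archive_path, "rb") as data:
    blob.upload_blob(data, overwrite=True)

print(f"Uploaded {archive_path} to {CONTAINER}")
```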

Update needed

We all know how annoying those ‘Update Needed’ pop-ups in the corner of your screen are when you log in, but new security patches for your OS (operating system), web browser and all your other software and hardware really are important.

Cyber Criminals are on a continuous lookout for weaknesses they can exploit in systems, and these updates are deliberate attempts to stop them once such a weakness has been identified by the manufacturer.

Ignoring them is an open invitation to a hacker.

Have you covered the basics?

It should go without saying but make sure your entire system, network and all individual devices have trustworthy anti-virus and anti-malware software installed and then make sure that it’s regularly updated to keep the devices safe.

Is your password on the naughty list?

If you take nothing else from this article then take this… please use a strong password and then make sure it’s changed regularly!

In this day and age it’s still amazing how many people use simple passwords from the ‘naughty list’ for convenience’s sake, thinking it’ll never happen to them.

If you’re a system administrator then it’s good practice to require that all employee passwords include both capital and lower-case letters, non-sequential numbers and a symbol.

The more complicated it is, the harder it will be to crack with a brute force attack.

It may also be worth putting automatic rules in place to prohibit even the partial use of the most popular password choices (a rough sketch of such a check follows the list below)…

MOST POPULAR PASSWORD CHOICES:

 

  • 123456
  • 123456789
  • Qwerty
  • 12345678
  • 111111
  • 1234567890
  • 1234567
  • Password
  • 123123
  • 987654321
  • Qwertyuiop
  • Mynoob
  • 123321
  • 666666
  • 18atcskd2w
  • 7777777
  • 1q2w3e4r
  • 654321
  • 555555
  • 3rjs1la7qe
  • Google
  • 1q2w3e4r5t
  • 123qwe
  • Zxcvbnm
  • 1q2w3e
  • 12345
  • 12345678
  • Password
  • Password1
  • Admin
  • Admin1
  • Their name or surname
  • Their birthday
  • ABCDE
  • 696969

 

Never forget, a badly chosen password doesn’t just have the power to compromise one laptop but potentially your entire organisation’s data, as well as that of your clients, suppliers and partners!
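To show how simple the automatic rules mentioned above can be, here’s a rough sketch of a password check that enforces mixed case, a number and a symbol, and rejects anything containing an entry from a ‘naughty list’ like the one above. The 12-character minimum and the shortened blocklist are just examples, and the ‘non-sequential numbers’ rule is left out for brevity.

```python
# Minimal password policy sketch: enforce length, mixed case, a digit and a
# symbol, and reject passwords containing any entry from a common-passwords
# blocklist (e.g. the list above, loaded in full in practice).
import re

BLOCKLIST = {"123456", "qwerty", "password", "admin", "111111", "123123"}  # shortened example

def is_acceptable(password: str) -> tuple[bool, str]:
    if len(password) < 12:                                   # example minimum length
        return False, "Must be at least 12 characters."
    if not re.search(r"[a-z]", password) or not re.search(r"[A-Z]", password):
        return False, "Must contain both upper and lower case letters."
    if not re.search(r"\d", password):
        return False, "Must contain a number."
    if not re.search(r"[^A-Za-z0-9]", password):
        return False, "Must contain a symbol."
    lowered = password.lower()
    if any(bad in lowered for bad in BLOCKLIST):
        return False, "Contains a commonly used password."
    return True, "OK"

print(is_acceptable("Password123!"))               # rejected - contains 'password'
print(is_acceptable("Tr1cky-Magenta-Lampshade"))   # accepted
```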

Want a tip from the experts?

The ideal situation would be to have a separate, randomised password for every device and application an employee has access to. That’s obviously impractical to remember, but by using a password manager like LastPass or 1Password it’s possible to have secure passwords whilst only having to remember and update one.

And as a final tip, if your employees are in charge of setting their own passwords and they do look after sensitive data, tell them to stay away from using middle names, pet names or their children’s names or birthdays.

It’s scary what a determined scammer can learn about someone after a quick search of their social media… but it happens far more often than you’d think.

Passwords for real life

Passwords need protecting in real life too!

Do your employees ever work away from the office?

Have you ever been tempted to log in and check your emails whilst stood in line at Costa?

Even in the workplace have you ever had to let a disgruntled employee go?

It’s important that your employees protect their passwords not just in the digital world but in the real world too.

It’s far too easy to look over someone’s shoulder as they type out a password (especially if it’s something simple like ABCD1234). Make sure they know to take a look around before typing in their password and that they’re aware of who might be watching. It also goes without saying that they should never share it with anyone.

EVER!

Preventing phishing scams

Always have one eye out for Phishing Scams.

Phishing scams are fraudulent attempts to obtain sensitive information such as usernames, passwords or credit card details, made by scammers disguising themselves as a trusted person via email or other digital communication, and they’re getting more sophisticated every year.

It’s important your IT team, or another knowledgeable individual within your organisation, teaches your staff what to look out for in ‘dodgy’ emails.

Unfortunately, they won’t come from Nigerian Princes these days!

THE EMAIL DISPLAY NAME

A common tactic is to ‘spoof’ a senior member of the organisation in the ‘from’ box.

Just because their name is displayed doesn’t mean the email is really from them.

A good step to take in preventing this is to empower your staff to speak to the supposed sender and double check whether the email really came from them, especially if it’s requesting information (particularly payment details) or asking for a link to be clicked.

DON’T CLICK A SUSPICIOUS LINK. JUST… DON’T!

It’s not foolproof, but a quick check is to hover your mouse over the link (without clicking it!). This will display the address the link will actually send you to. If it looks spammy then it probably is.

Another common tactic of these spammy links is to direct you to a fraudulent homepage of a trusted site (maybe a fake PayPal?) asking you to login again.

If this does happen and you’re unsure, either check with your IT Team or go directly to the actual website itself rather than trusting the link in the email. It’ll take an extra 15 seconds but will prevent you from giving your details out to a phishing scammer.

LOOK OUT FOR SPELLING, GRAMMAR AND SYNTAX ERRORS

It’s not a hard and fast rule, so be careful, but a lot of scammers won’t have English as a first language.

If the email is badly worded or spelt there’s a chance it’s not to be trusted.

STRANGE GREETINGS

To save time a lot of scammers send out multiple emails at once.

If your boss normally addresses you by your first name but you suddenly get an email from them that starts ‘Dear Valued Employee’ or ‘Dear Important Client’, be instantly suspicious of it, especially if it’s asking you for something.

URGENT RESPONSE REQUIRED

We’ve all had that email from the boss that needs actioning immediately, but urgency is also a scare tactic used by cyber scammers to knock you off kilter and make you easier to manipulate.

Whilst it may be genuine and need urgent attention, picking up the phone or walking to their office to double check isn’t going to hurt and may save the company a lot of money… particularly if the email wants you to do something you wouldn’t normally do, like pay an invoice or log in to an account.

SOMETHING’S NOT QUITE RIGHT

Sometimes you’ll just look at an email and it won’t feel quite… right.

Maybe the logo is pixelated, or the images or layout just feel ‘off’. If something does feel wrong, trust your instinct and run it by your IT Team before you do anything with it.

SUSPICIOUS DOMAIN NAMES AND URLS

Many email scammers will try to spoof existing domain names to make their scams seem more credible. Instead of Amazon.com you might get an email request asking you to log into Amaz0n.com.

It’s easy to miss if you’re not being vigilant, and even if you’ve already clicked the link the landing page may seem legitimate, so it’s important you keep an eye out for these as it’s a common technique.
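If you want to catch the Amaz0n.com style of spoof automatically, one simple approach is to compare the domain in a link against the domains you genuinely deal with and flag anything that’s suspiciously close but not identical. The sketch below does exactly that; the trusted-domain list and the similarity threshold are illustrative assumptions.

```python
# Rough sketch: flag link domains that are close to, but not exactly,
# a domain you trust (e.g. amaz0n.com vs amazon.com). The trusted list
# is an example - use your own suppliers and partners.
from difflib import SequenceMatcher
from urllib.parse import urlparse

TRUSTED_DOMAINS = {"amazon.com", "paypal.com", "microsoft.com", "cloudthing.com"}

def check_link(url: str) -> str:
    domain = (urlparse(url).hostname or "").lower().removeprefix("www.")
    if domain in TRUSTED_DOMAINS:
        return f"{domain}: matches a domain you already trust"
    for trusted in TRUSTED_DOMAINS:
        similarity = SequenceMatcher(None, domain, trusted).ratio()
        if similarity > 0.8:  # close but not identical - classic spoof
            return f"{domain}: SUSPICIOUS - looks like a spoof of {trusted}"
    return f"{domain}: unknown domain - treat with care"

print(check_link("https://www.amaz0n.com/login"))
print(check_link("https://www.amazon.com/orders"))
```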

STRANGE ATTACHMENTS

As email scams become more sophisticated, the scammers are relying less on you clicking a link and more on you opening an attachment infected with some kind of malware. If you’re sent an email from a source you don’t recognise, or even one you do but the attachment looks strange (a file type you weren’t expecting, for instance), it may be suspicious. Before opening it, ask a member of your IT Team to look it over.

Good software, bad malware

If you make it the responsibility of the IT Team to check, download and install new programs, then your staff can’t ever accidentally download something that poses a security risk.

Unfortunately, many staff will believe it’s safe to download a program as long as they know what the program is (let’s use Microsoft Excel as an example). The problems come when they don’t check where they’re downloading the new software from and instead just Google ‘download Microsoft Excel’ and click the first link.

The truth, however, is that programs downloaded this way can often be riddled with viruses, spyware, malware, trojans and worms.

To reduce the risk of something like this accidentally being downloaded onto a work machine, our advice would be to implement a complete download protocol where staff are unable to download or install anything without IT’s permission.

It may take a little more time but it will keep your sensitive data a lot more secure.

The Great Firewall of…

If you don’t have a firewall installed, get one; if you do, make sure it’s kept up to date and the latest firmware is installed.

If you’re using a Wi-Fi network in your office make sure it’s encrypted (with something like WPA2) and make sure you regularly change the password, especially if visitors are logging on to it. Whilst you may trust your guests implicitly, there’s no way to tell if their devices are infected until it’s too late.

Lastly, if your staff ever work remotely, out of the office or from home make sure they log in through a VPN (Virtual Private Network) to avoid any issues with open Wi-Fi networks.

Current and ex-employees might be your biggest vulnerability

In a perfect world there would be no scammers and we’d be able to trust all our employees 100%.

Sadly, we don’t live in that world which means we have to take several uncomfortable steps to protect the workplace from cyber-attacks.

If someone wanted to deliberately download malicious software then chances are they wouldn’t do it from their own machine. For that reason it’s always best to educate your staff as much as possible around cyber security and to implement a policy of locking their devices whenever they step away, never sharing their passwords with anyone or giving remote access to their computer without IT’s permission and, although it may sound silly, never leaving their password on a post-it note on their desk.

If someone does leave, especially on bad terms, it’s important to change all the passwords they had access to immediately, to prevent possible breaches of your secure data.

It’s good practice to keep a record of who has access to what, so you know exactly what to update when it’s needed.

Are you using MFA?

Multifactor Authentication sounds a lot more complicated than it actually is, especially when weighed against the increase in security it offers.

Simply put, the more barriers you can put in place to make it harder for hackers to access your networks and systems, the better off you’ll be.

Those additional barriers are the point of MFA (Multifactor Authentication).

By requiring two or more independent credentials to access data (something the user knows, like a password, and something the user has, like a swipe card or other security token) you dramatically increase the security of your data.

Depending on the sensitivity of the data you’re storing, you could even go a step further and require biometric verification, such as facial or fingerprint recognition.

It’s all about creating different layers of defence so that even if one is compromised, cybercriminals still have another layer or two to hack.

Implementing some form of MFA is a quick win in increasing your cyber security and doesn’t need to be complicated.

It can be as simple as combining a password with a fingerprint scan or even a security question only known to the user.
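As an illustration of the ‘something the user has’ factor, the six-digit codes produced by authenticator apps are just time-based one-time passwords (TOTP, RFC 6238) derived from a shared secret and the clock. The sketch below shows roughly how they’re generated and checked; in practice you’d lean on your identity provider’s MFA service rather than rolling your own.

```python
# Illustration only: how a time-based one-time password (TOTP) is derived from
# a shared secret and the current time (RFC 6238 / RFC 4226). Use your identity
# provider's MFA in production rather than implementing this yourself.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() // interval)            # which 30-second window we're in
    msg = struct.pack(">Q", counter)                  # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def verify(user_code: str, secret_b32: str) -> bool:
    return hmac.compare_digest(user_code, totp(secret_b32))

SHARED_SECRET = "JBSWY3DPEHPK3PXP"  # example base32 secret, as shared during QR-code enrolment
print("Current code:", totp(SHARED_SECRET))
print("Verified:", verify(totp(SHARED_SECRET), SHARED_SECRET))
```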

Do you even https?

It’s pretty common now, but you should still double check that any website you visit starts with https instead of just the old http.

If it doesn’t, it’s not secure, so don’t put in any confidential details like credit card numbers, passwords or addresses.

Ever heard of malvertising?

Malvertising is a relatively new way for cyber criminals to add malicious code or malware to your computer. They put viruses and other malicious payloads into pop-up ads, then add those ads to legitimate online advertising networks and websites.

This means you can be doing everything right, just innocently browsing a perfectly legitimate website and still have your computer attacked, often without you even realising it.

Whilst the ad networks themselves do their best to weed these out, you or your IT Team can also help by installing an adblocker on all your work machines and making sure your antivirus programmes are up to date.

Your IT Team might be a vulnerability

Hopefully we’re not making you too paranoid here but if you are the victim of a cyber attack chances are you’ll be attacked from a direction you never even considered.

Whilst your IT Team are your most valuable asset in preventing cyber crime within your organisation, that very level of expert knowledge and access can also be a vulnerability. Most members of your IT Team will probably have admin rights and access to every piece of hardware and software within your company… and that’s fine, but…

If they’re just working on a day-to-day basis, browsing the internet and so on, make sure those admin rights are locked down under a different profile.

There should be no reason they need admin rights on a day-to-day basis, so having them locked away under a different profile and password adds an extra layer of protection and defence. If, on the off chance, their computer is then compromised, at least the hacker hasn’t gained access to the entire organisation.

Understand and utilise your activity logs

As we’ve already stated, encouraging a culture of digital awareness is vital in protecting your company from cyber criminals. That’s why we’d recommend you teach all your staff how to check the activity logs of their email accounts and, if used for work, their social media accounts as well.

These will show them what browsers and devices they’ve accessed their accounts from and even from what IP address.

If there’s anything they don’t recognise, they can immediately terminate the session and reduce the risk of a scammer having unfettered access.

What do you do with all your old devices?

It seems like these days you only need to buy a phone or laptop for it to be out of date 6 months later… but does your company have a protocol for recycling old electronics?

If you’re getting rid of anything that once held any kind of sensitive data, it all needs reformatting and returning to its original factory settings.

Scammers go out of their way to buy second-hand office equipment for this very reason, as so many companies don’t follow this vital step.

Hopefully this won’t have left you feeling too paranoid, glancing over your shoulder every time you boot up your laptop or unlock your phone.

A lot of the points we’ve mentioned should be standard practice for most IT Teams; the main thing is to promote a culture of awareness around cyber attacks in your organisation, so that protecting the business becomes everyone’s responsibility.

Microsoft Dynamics 365: Settings In solutions

Craig Seymour – Dynamics Practice Lead, cloudThing

When you’re exporting a solution in Microsoft Dynamics 365, you can choose to include organization settings… but there’s no documentation

 


So do you know exactly what it is you are exporting?

Never fear, Craig Seymour is here!

cloudThing’s Dynamics Practice Lead has the answers you seek…

 

At cloudThing, we follow the guiding principle that the deployment of D365 solutions should be automated wherever possible and if it’s not possible, figure out a way to make it possible – hence the birth of our buildThing tools.

 

One exception to this has been organisation settings: when you’re exporting a solution, you can choose to include organisation settings but there’s no documentation… so how do you know exactly what it is you are exporting?

With that uncertainty, and as these settings are generally a one-off change, we’ve covered them by documenting the necessary steps in the release note.

But I’d really rather we didn’t have manual steps in the process, so when I had a bit of spare time recently I decided to take a bit of a goosey gander…

So, if you’ve ever wondered what’s included with each checkbox, read on!

 

 

 


Diving In…

So, how do you find out what settings are actually going to get transferred with your solution?

The first thing I tried was simply exporting the solution twice, once with the settings checked and once without. This gave the XML output at the bottom of this post, but I was left a bit unsure about what each setting was and where it had come from. For example, there seemed to be a lot of settings from the UI which were missing. Perhaps it doesn’t export ones which are set to the default?

However, when I looked online for references to the XML attributes, I found the SDK documentation for the Request classes which can be called when generating a Solution programmatically. They’re a bit hard to digest, but they do help you understand exactly what you’re going to get.
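If you want to repeat the ‘export twice and compare’ experiment yourself, a rough sketch like the one below will do it: export the same solution with and without the settings boxes ticked, then diff every XML file in the two zips to see exactly what the checkboxes added. The file names are placeholders.

```python
# Rough sketch of the "export twice and diff" approach described above:
# export the same solution with and without the settings boxes ticked,
# then compare every XML file inside the two export zips.
import zipfile
from difflib import unified_diff

def read_xml_files(path):
    """Return {name: lines} for every XML file inside a solution export zip."""
    with zipfile.ZipFile(path) as z:
        return {
            name: z.read(name).decode("utf-8-sig").splitlines()
            for name in z.namelist()
            if name.lower().endswith(".xml")
        }

without_settings = read_xml_files("MySolution_without_settings.zip")  # placeholder file names
with_settings = read_xml_files("MySolution_with_settings.zip")

for name in sorted(set(without_settings) | set(with_settings)):
    diff = unified_diff(
        without_settings.get(name, []),
        with_settings.get(name, []),
        fromfile=f"{name} (settings unticked)",
        tofile=f"{name} (settings ticked)",
        lineterm="",
    )
    for line in diff:
        print(line)
```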

 

And the answer is?

The short answer is, exactly the same list as above; the longer answer is as below:

 

 

Organization.WeekStartDayCode : Designated first day of the week throughout Microsoft Dynamics 365
Organization.DateSeparator : Character used to separate the month, the day, and the year in dates throughout Microsoft Dynamics 365.
Organization.ShowWeekNumber : Information that specifies whether to display the week number in calendar displays throughout Microsoft CRM.
Organization.DateFormatCode: Information about how the date is displayed throughout Microsoft CRM.

 

 

Organization.MaxAppointmentDurationDays : Maximum number of days an appointment can last.

 

 

Organization.TimeFormatCode : Information that specifies how the time is displayed throughout Microsoft CRM.

Organization.CalendarType : Calendar type for the system. Set to Gregorian US by default.

Customisation

 

 

 

Organization.IsAppMode : Indicates whether loading of Microsoft Dynamics 365 in a browser window that does not have address, tool, and menu bars is enabled.

Email Tracking

 

 

Organization.TrackingPrefix : History list of tracking token prefixes
Organization.TrackingTokenIdBase : Base number used to provide separate tracking token identifiers to users belonging to different deployments.
Organization.TrackingTokenIdDigits : Number of digits used to represent a tracking token identifier.
Organization.MaximumTrackingNumber : Maximum tracking number before recycling takes place.

Organization.IgnoreInternalEmail : Indicates whether incoming email sent by internal Microsoft Dynamics 365 users or queues should be tracked.

 

 

Organization.RenderSecureIFrameForEmail : Flag to render the body of email in the Web form in an IFRAME with the security=’restricted’ attribute set. This is additional security but can cause a credentials prompt.
Organization.AllowUnresolvedPartiesOnEmailSend : Indicates whether users are allowed to send email to unresolved parties (parties must still have an email address).

General

 

 

Organization.IsAutoSaveEnabled : Information on whether auto save is enabled.
Organization.FullNameConventionCode : Order in which names are to be displayed throughout Microsoft CRM.
Organization.IsPresenceEnabled : Information on whether IM presence is enabled.
Organization.PricingDecimalPrecision : Number of decimal places that can be used for prices
Organization.ShareToPreviousOwnerOnAssign : Information that specifies whether to share to previous owner on assign

Organization.BlockedAttachments : Prevent upload or download of certain attachment types that are considered dangerous.

 

 

Organization.NumberFormat : Specification of how numbers are displayed throughout Microsoft CRM.
Organization.NegativeFormatCode : Information that specifies how negative numbers are displayed throughout Microsoft CRM.

 

 

Organization.CurrencySymbol : Symbol used for currency throughout Microsoft Dynamics 365.

 

 

Organization.CurrencyFormatCode : Information about how currency symbols are placed throughout Microsoft Dynamics CRM.

 

 

Organization.GlobalHelpUrl : URL for the web page global help.
Organization.GlobalHelpUrlEnabled : Indicates whether the customizable global help is enabled.
Organization.GlobalAppendUrlParametersEnabled : Indicates whether the append URL parameters is enabled
Organization.GetStartedPaneContentEnabled : Indicates whether Get Started content is enabled for this organization.

Organization.MobileClientMashupEnabled :

 

Marketing

Organization.AllowMarketingEmailExecution : Indicates whether marketing emails execution is allowed.

 

 

Organization.AllowAutoResponseCreation : Indicates whether automatic response creation is allowed
Organization.AllowAutoUnsubscribe : Indicates whether automatic unsubscribe is allowed.
Organization.AllowAutoUnsubscribeAcknowledgement : Indicates whether automatic unsubscribe acknowledgement email is allowed to send.

Organization.TagPollingPeriod : Normal polling frequency used for email receive auto-tagging in outlook.

Organization.TagMaxAggressiveCycles : Maximum number of aggressive polling cycles executed for email auto-tagging when a new email is received.

Organization.AllowOutlookScheduledSyncs : Indicates whether scheduled synchronizations to Outlook are allowed.

 

 

Organization.EmailSendPollingPeriod : Normal polling frequency used for sending email in Microsoft Office Outlook.
Organization.MinOutlookSyncInterval : Minimum allowed time between scheduled Outlook synchronizations.
Organization.AllowOfflineScheduledSyncs : Indicates whether background offline synchronization in Microsoft Office Outlook is allowed
Organization.MinOfflineSyncInterval : Normal polling frequency used for background offline synchronization in Microsoft Office Outlook.
Organization.AllowAddressBookSyncs : Indicates whether background address book synchronization in Microsoft Office Outlook is allowed.
Organization.MinAddressBookSyncInterval : Normal polling frequency used for address book synchronization in Microsoft Office Outlook.

 

Sales

 

 

 

Organization.CreateProductsWithoutParentInActiveState : Enable Initial state of newly created products to be Active instead of Draft
Organization.UseInbuiltRuleForDefaultPricelistSelection : Flag indicates whether to Use Inbuilt Rule For DefaultPricelist.
Organization.MaxProductsInBundle : Restrict the maximum no of items in a bundle
Organization.OOBPriceCalculationEnabled : Enable OOB pricing calculation logic for Opportunity, Quote, Order and Invoice entities
Organization.DiscountCalculationMethod : Discount calculation method for the QOOI product
Organization.MaximumDynamicPropertiesAllowed : Restrict the maximum number of product properties for a product family/bundle

Relationship Roles

You’re not still using these, right? 🙂

Conclusions

If you bothered to read all the detail above, you’ll have noticed that there are a few incomplete bits, and my intended next step was to go through and track down the missing or uncertain items, but actually I think I’d be wasting my time.

We can see that:

  • Not all the visible settings in the UI under the tab of the same name as the checkbox in the Solution Export dialog are taken through (e.g. if you tick ‘General’ in the Solution Export, you don’t get all the settings in the ‘General’ tab in Application Settings).
  • Functionally related settings are scattered throughout the UI (e.g. some of the ‘General’ settings are in the currency and locale areas of the UI)
  • There are settings for things which you shouldn’t be using (e.g. Relationship Roles)

All of which makes me reluctant to use this functionality – without careful checking, there’s no certainty about which settings are actually going to make it through.

And with the dynamic nature of Dynamics 365 CE updates, it could change tomorrow…

So where next?

I think that cloudThing needs to write a buildThing which iterates through the whole list of Organization Settings (not just those exposed as specific Request classes), and allows you to set them programmatically in your ALM pipeline.
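As a starting point, the sort of thing I have in mind is sketched below: read the desired organisation settings from source control and PATCH them onto the organization record via the Dataverse Web API as a release pipeline step. The attribute names shown are assumptions (generally the lower-cased logical names of the Organization.* properties listed above), the environment URL is a placeholder, and token acquisition is left to whatever your pipeline already uses.

```python
# Very rough sketch: apply organisation settings via the Dataverse Web API as
# part of a release pipeline. Attribute logical names and values below are
# assumptions - check them against your own environment before relying on them.
import json
import requests

ENV_URL = "https://yourorg.crm11.dynamics.com"   # placeholder environment URL
ACCESS_TOKEN = "<bearer token from your pipeline's service principal>"

DESIRED_SETTINGS = {
    "weekstartdaycode": 1,          # e.g. Monday (check the option set values in your environment)
    "trackingtokenidbase": 500,
    "ignoreinternalemail": False,
}

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}

# There is only one organization record; grab its id first.
org = requests.get(
    f"{ENV_URL}/api/data/v9.2/organizations?$select=organizationid",
    headers=headers,
    timeout=30,
).json()["value"][0]

# Apply all the settings in one PATCH.
response = requests.patch(
    f"{ENV_URL}/api/data/v9.2/organizations({org['organizationid']})",
    headers=headers,
    data=json.dumps(DESIRED_SETTINGS),
    timeout=30,
)
response.raise_for_status()
print("Organisation settings updated.")
```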

 

Watch this space

What The Heck Is DevOps?

DevOps is the buzzword on everyone’s lips in the software industry and should be the goal of all IT organisations; but what actually is it?

 

What Is DevOps?

DevOps is a set of automated software practices that combine development (Dev) and IT operations (Ops) – see what they did there?

Its aim is to shorten the development life cycle of resilient systems whilst providing continuous and scalable features, fixes and updates, delivering superb software in line with a business’s stated goals.

 

Or… In plain English, it’s a set of practices for delivering the highest possible standard of IT support that reduces the time between committing to a change f