The onset of a new wave of innovation and technology can induce a mix of reactions among C-level decision-makers in any business. For every CEO, CIO, or CFO who chases the latest technological ‘shiny thing’ and plans to implement meaningful change within their organization, there are countless others who would prefer to take little or no action on new technological advances.
Don Schuerman is CTO and Vice President of Product Marketing at Pegasystems.
Here’s the thing: as much as some senior leaders might prefer to stick their heads in the sand, new technology trends are like the seasons of the year; they are inevitable. In fact, the biggest risk far too many organizations run is that, by implementing innovations half-heartedly or ignoring them completely, they lose ground to more progressive competitors who are harnessing new technologies to differentiate themselves and gain a competitive advantage.
The key to avoiding this is to emphasize accountability. This is more than a ‘buck stops here’, one-size-fits-all approach. Accountability can come in many forms – through integration, innovation adoption, or even governance.
The emergence of artificial intelligence (AI) in modern organizations is a great example of where effective governance is needed. While it’s now more widely accepted as an emerging technology that can help drive meaningful change and add value for consumers, AI regulation is still something of a grey area. In a recent global study of C-level executives, almost two thirds (65%) felt that the current level of external AI governance isn’t sufficient to manage the growth of the technology.
To that end, 78% advocated for an equal share of responsibility for regulation between government and the private sector as their ideal scenario. However, when asked what they felt the balance would actually look like in five years’ time, 75% expected the government to be largely or fully responsible for AI governance. Why? Because only 27% said they currently have a designated AI governance leader within their organization, and only 25% are managing a formal policy at the C-suite level. All of this suggests a failure at C-level to step up and take accountability for AI governance.

For companies to become leaders in AI governance, business leaders must make sure they are kept in the loop on any internal processes made more autonomous through AI; otherwise, these findings suggest, the private sector will lose control of regulating the technology entirely. A strategy grounded in technical expertise and outcomes can help keep that governance responsive to new challenges.
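What might ‘kept in the loop’ look like in practice? One common pattern is an audit trail around automated decisions. The sketch below is purely illustrative and not drawn from the study or any vendor’s product: the decorator, the model name, and the ‘owner’ field are all hypothetical. The idea is simply that every AI-assisted decision leaves a record an accountable leader can review later.

```python
import functools
import json
import logging
from datetime import datetime, timezone

# Hypothetical audit logger: every AI-assisted decision is recorded so that
# accountable leaders can review autonomous processes after the fact.
logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("ai_governance.audit")

def audited_decision(model_name: str, owner: str):
    """Wrap an AI decision function so each call leaves an audit record.

    `model_name` and `owner` (the accountable executive or team) are
    illustrative fields; a real governance policy would define its own schema.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            audit_log.info(json.dumps({
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "model": model_name,
                "owner": owner,
                "inputs": repr((args, kwargs)),
                "decision": repr(result),
            }))
            return result
        return wrapper
    return decorator

@audited_decision(model_name="loan-scoring-v2", owner="chief.risk.officer")
def approve_loan(credit_score: int) -> bool:
    # Placeholder decision logic standing in for a real model.
    return credit_score >= 650

if __name__ == "__main__":
    approve_loan(700)  # emits an audit record for later review
```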
Accountability also matters in ensuring that businesses can successfully integrate new technologies, such as hyperautomation, which helps reduce costs and increase efficiency in areas like case management. Many see hyperautomation having a longer-term impact as well: our study found that 32% are using it to improve workflow and case management today, a figure that almost doubles to 61% when respondents look five years ahead.
However, for hyperautomation to have the impact many hope and expect it to, businesses must take accountability for integrating it successfully. At present, significant concerns exist around their ability to do so, with 58% of respondents citing integration with existing legacy systems as their biggest automation challenge, while 40% point to compatibility with third-party technologies as their biggest concern. The success of hyperautomation deployments also depends on keeping operations and processes consistent, even when situations such as the coronavirus pandemic arise and organizations are required to develop new automated solutions quickly.
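To make the legacy-integration challenge concrete, here is a minimal sketch of the classic adapter pattern; the fixed-width legacy record format, field widths, and all names here are assumptions for illustration, not any particular vendor’s system. The point is that the automated workflow talks to a clean, typed interface and never parses the legacy format itself.

```python
from dataclasses import dataclass

# Hypothetical legacy interface: returns case records as fixed-width strings,
# e.g. "0042OPEN  2021-11-24". Field widths are assumptions for illustration.
def legacy_fetch_case(case_id: int) -> str:
    return f"{case_id:04d}{'OPEN':<6}2021-11-24"

@dataclass
class Case:
    """Modern, typed representation used by the automation workflow."""
    case_id: int
    status: str
    opened: str

class LegacyCaseAdapter:
    """Adapter that hides the legacy record format behind a clean interface,
    so automated workflows never handle fixed-width strings directly."""

    def get_case(self, case_id: int) -> Case:
        raw = legacy_fetch_case(case_id)
        return Case(
            case_id=int(raw[0:4]),
            status=raw[4:10].strip(),
            opened=raw[10:20],
        )

if __name__ == "__main__":
    print(LegacyCaseAdapter().get_case(42))
    # Case(case_id=42, status='OPEN', opened='2021-11-24')
```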
It’s just as important to ensure that business leaders take accountability for recognizing technologies that are still in the early stages of emerging on their radar. After all, innovation is only innovative as long as it’s new. Once a technology has become mainstream, businesses can quickly find themselves playing catch-up in fully understanding how it can be used effectively within their organizations.
Take extended reality (XR). Virtual reality headsets and augmented reality have long been emerging technologies for consumers, but when business leaders were asked how they were using XR to interact with customers, just 35% said it was changing the customer experience. This might be expected for such a relatively new technology, but when asked about the outlook five years from now, a different picture emerges: almost a third (30%) of C-suite respondents say XR will become essential to customer engagement, and more than half (52%) believe XR will become a competitive differentiator. The takeaway is that even though a technology may not yet have achieved widespread adoption, there’s no excuse not to be prepared; it’s never too early to hold oneself accountable for innovation. As XR begins to take shape, business leaders can also score early wins through a market-focused strategy that identifies valuable opportunities before the technology matures.
The same can be said when it comes to investing in IT infrastructure that can support emerging technologies. For example, it’s no surprise that 73% of survey respondents said current remote and mobile work trends have made cloud deployments a priority, nor that over half (51%) said mobile and remote functionality will continue to be one of the drivers of extended edge technology. However, if these technologies are to be effective, organizations need to adopt and advance complementary technologies along the way: 41% of C-level respondents said the maturation of AI, automation, and machine learning is necessary for the cloud and extended edge to achieve deeper success. Today, only 22% rate their distributed cloud technology as ‘intelligent’ or ‘mature’, while 18% say the same of extended edge technology, demonstrating the extent of the challenge ahead.
The bottom line is that technology accountability requires more than just standing up and taking ownership. It’s a process of proactive investigation, preparation, and study into what’s best for an organization. One thing’s for sure: technology will continue to evolve, and new trends will continue to emerge. Businesses that aren’t ready to embrace accountability and maximize the value of those trends could be left behind.
Nvidia is supposedly planning to resurrect the RTX 2060 GPU with double the VRAM, but a modder has already made such a graphics card, taking the DIY route to cram the same amount of extra video memory on-board.

This is another effort from Russian hardware modder VIK-on, who has previously boosted the VRAM on various graphics cards, including sticking 22GB on Nvidia’s RTX 2080 Ti and doubling up the RTX 3070 to 16GB. As VideoCardz reports, this time around the modder has put 12GB on an Asus RTX 2060, with the GPU successfully identified by tools like GPU-Z and that memory configuration validated.

The DIY card apparently works ‘just fine’, but there is an issue with it sending the host PC into the occasional black screen crash (a bug seen with previous similar VRAM mod efforts, and one which can be worked around).

This project is interesting because, as mentioned at the outset, the rumor mill insists that Nvidia is about to unleash its own rereleased RTX 2060 with 12GB of VRAM instead of the 6GB loadout on the original Turing GPU, the idea being that this will help the card remain relevant for those looking for a decent 1080p option (and frustrated by ongoing stock issues around contemporary GPUs).

Nvidia should be able to supplement Ampere products with this resurrected RTX 2060 when it theoretically emerges on December 7, but as ever, treat this prospect with caution – though VideoCardz claims that purported launch date is still on track, and that Overclocking.com recently confirmed it with their sources. We’ll know soon enough if Nvidia’s RTX 2060 with 12GB of VRAM is real, given that the launch date is (just) inside a fortnight now. The new version of the GPU isn’t expected to make any other changes save for this RAM boost.

When VIK-on doubled the VRAM on the RTX 3070, it was shown to run Watch Dogs: Legion noticeably more smoothly, but this time out, the beefed-up RTX 2060 wasn’t put through any gaming paces. The modder did, however, test the 12GB 2060 in crypto-mining and the Unigine Superposition benchmark, finding roughly similar performance to the vanilla 6GB model.

Having the VRAM doubled to 12GB may not make any difference in some games, either, but those titles which push memory requirements harder should benefit from the extra headroom when graphics settings are cranked up. Apparently we may see more testing on this DIY 2060 model in the near future, so it’ll be worth keeping an eye out for that, and any findings on gaming performance that might be aired.
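If you want to run a similar sanity check on what memory configuration your own card reports (akin to the GPU-Z validation mentioned above), a minimal sketch using NVIDIA’s NVML Python bindings looks like this. It assumes a single NVIDIA GPU at index 0 and the `pynvml` module installed (via the `nvidia-ml-py3` package); this is just one way to read the figure, not the modder’s method.

```python
# Requires: pip install nvidia-ml-py3 (provides the pynvml module)
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes GPU 0
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older bindings return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    # mem.total is reported in bytes; convert to GiB for readability.
    print(f"{name}: {mem.total / 1024**3:.1f} GiB total VRAM")
finally:
    pynvml.nvmlShutdown()
```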
Analysis: Does this tell us anything about what to expect from Nvidia’s RTX 2060?