
Daily Tech Digest - January 29, 2022

BotenaGo Botnet Code Leaked to GitHub, Impacting Millions of Devices

Researchers also found additional hacking tools, from several sources, collected in the same repository. Alien Labs called the malware source code “simple yet efficient,” able to carry out malware attacks with a grand total of a mere 2,891 lines of code (including empty lines and comments). In its November writeup, Alien Labs noted that BotenaGo, written in Google’s open-source Golang programming language, could exploit 33 vulnerabilities for initial access. The malware is light, easy to use and powerful: those 2,891 lines are all that’s needed for an attack, including, but not limited to, installing a reverse shell and a telnet loader used to create a backdoor for receiving commands from its command-and-control (C2) operator. Caspi explained that BotenaGo automatically sets up its 33 exploits, presenting an attacker with a “ready state” for attacking a vulnerable target and infecting it with an appropriate payload based on target type or operating system. The source code leaked to GitHub features a “supported” list of vendors and software that BotenaGo uses to target its exploits at a slew of routers and IoT devices.


The best IT skill for the 2020s? Become an 'evergreen' learner

For starters, the "soft" skills will matter in the months and years ahead. These include professional skills such as communication, leadership, and teamwork, says Don Jones, vice president of developer skills at Pluralsight. Then there is a need for "tech-adjacent skills, like a familiarity with project management and business analysis." Jones urges an "evergreen" approach to skills mastery, as technology evolves too quickly to commit to a single platform or solution set. "The biggest-impact skill is the ability to learn," he says. "There's no single tech skill you can invest in that won't change or be outdated in a year; your single biggest skill needs to be the ability to update skills and learn new skills." This also means placing a greater emphasis on emotional intelligence, as many emerging systems will be built on artificial intelligence, analytics, or automation that mimic human processes, thereby augmenting human workers. "Anyone can be taught to swap out memory, but the skill of communication and responding to human emotion is not a skill so easily taught," says Chris Lepotakis.


Three things Web3 should fix in 2022

Web3 backers love to talk about how blockchain networks are computers that can be programmed to do anything you imagine, given superpowers by the fact that they are also decentralized. Ethereum was the first of these computers to get real traction, but it was quickly overwhelmed by traffic. Traffic is managed by charging fees to use the computer, and the fees to complete a single transaction on the Ethereum network can run over $100. Imagine spending $75 to create a “free” Facebook account and another $75 every time you wanted to post something, and you have a sense of what it would be like to participate in a social network on the blockchain today. Ethereum is in the midst of a transformation designed to make it more efficient — which is to say, faster, less expensive, and less wasteful of energy. In the meantime, technologists routinely appear announcing that they have built a more efficient blockchain. Solana, for example, is a company that raised $314 million last year to build what it calls “the fastest blockchain in the world.” With that in mind, let’s check in on how the fastest blockchain in the world was doing on Sunday, when the aforementioned crypto crash led many people to use it to buy and sell assets.


Five Data Governance Trends for Organizational Transformation in 2022

There is a growing challenge to better govern data as it increases in variety and volume; by one estimate, 7.5 septillion gigabytes of data are generated every single day. Moreover, organizations are creating silos through multiple data lakes or data warehouses without the right guidelines, which will eventually make this data growth hard to manage. To achieve nimbleness, we can simplify the data landscape by using a semantic fabric, popularly called a data fabric, based on a strong Metadata Management operating model. This can make data interoperable between divisions and functions while turning it to competitive advantage. Data fabric simplifies Data Management across cloud and on-premise data sources, even though data is managed as domains. In addition, data democratization can be a strong enabler for managing data across domains with ease and making data available as well as interoperable. Allowing business users to source and consume relevant data for their immediate reporting or generation of insights can significantly reduce the turnaround time of traditional data acquisition.


How the metaverse could impact the world and the future of technology

The metaverse could potentially use virtual reality, or augmented reality as we know it now, to immerse users in an alternate world. The technology is still being developed, but companies like Meta say they are building and improving these devices. Meta's Oculus Quest, now in its second model, is one such device. "When you're in the metaverse, when you're in a virtual reality headset, you will feel like you're actually sitting in a room with someone else who can see you, who can see all of your nonverbal gestures, who you can respond to and mimic," Ratan said. Immersive worlds and online avatars are nothing new; games like Grand Theft Auto Online, Minecraft and Roblox have already created virtual universes. Meta's plan, announced last October, aims to go beyond entertainment and create virtual workspaces, homes and experiences for all ages. "What's happening now is the metaverse for social media without gaming," Ratan said. "The new metaverse is designed to support any type of social interaction, whether that's hanging out with your friends or having a business meeting."


Use the Drift and Stability of Data to Build More Resilient Models

Data drift represents how a target data set differs from a source data set. For time-series data (the most common form of data powering ML models), drift is a measure of the “distance” between data at two different instances in time. The key takeaway is that drift is a singular, or point, measure of the distance between two different data distributions. While drift is a point measure, stability is a longitudinal metric. We believe resilient models should be powered by data attributes that exhibit low drift over time; such models, by definition, would exhibit less drift-induced misbehavior. To capture this property of drift over time, we introduce the notion of data stability: stable data attributes drift little over time, whereas unstable ones do the opposite. Consider two different attributes: the daily temperature distribution in NYC in November (TEMPNovNYC) and the distribution of the tare weights of aircraft at public airports (AIRKG). It is easy to see that TEMPNovNYC has lower drift than AIRKG; one would expect less variation between November temperatures in NYC across years than between the weights of aircraft at two airports.
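The point-versus-longitudinal distinction is easy to make concrete. Below is a minimal sketch, not the authors' implementation: it assumes Wasserstein distance as the drift metric and mean consecutive-window drift as the stability summary, both illustrative choices.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

# Drift: a point measure of distance between two data distributions.
nov_year1 = rng.normal(loc=10.0, scale=2.0, size=1_000)  # TEMPNovNYC, one year
nov_year2 = rng.normal(loc=10.5, scale=2.0, size=1_000)  # TEMPNovNYC, next year
print("point drift:", wasserstein_distance(nov_year1, nov_year2))

# Stability: a longitudinal summary of drift across successive windows.
def mean_drift(windows):
    """Average drift between consecutive windows; low values = stable attribute."""
    return float(np.mean([wasserstein_distance(a, b)
                          for a, b in zip(windows, windows[1:])]))

temp_like = [rng.normal(10.0, 2.0, 500) for _ in range(12)]                  # stable
airkg_like = [rng.normal(rng.uniform(0, 50), 5.0, 500) for _ in range(12)]   # unstable
print("stable attribute mean drift:  ", mean_drift(temp_like))
print("unstable attribute mean drift:", mean_drift(airkg_like))
```

Any distribution distance (population stability index, KL divergence) can be swapped in for the distance function without changing the structure.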


How to become an AI influencer

An influencer has big responsibilities to fulfill. As someone with a large following, it is important to understand the kind of impact they can have on their target audience, especially if that audience is young or just starting out in their careers. Venkat Raman, co-founder of Aryma Labs, a data consulting firm, lists a few things influencers should keep in mind while creating their content. First, don't give people false hope. He adds, “I see many posts and tweets where some influencers proclaim that one does not need to know advanced math to break into data science. The poor aspirants believe it, and when they face the tough curriculum, they give up. I think we need to be honest. This will help set the correct expectations.” ... Many influencers in the field teach statistics through their content. Statistics is one of the core foundations of data science. Raman adds, “I have seen even the most popular YouTubers teach statistics wrongly.” The foundation can’t be left shaky; influencers owe it to their audience to teach the right material. Unfortunately, in the chase for follower counts and under pressure to constantly create content, they end up producing substandard work.


‘Dark Herring’ Billing Malware Swims onto 105M Android Devices

On the technical side, once the Android application is installed and launched, a first-stage URL is loaded into a webview, which is hosted on Cloudfront, researchers said. The malware then sends an initial GET request to that URL, which sends back a response containing links to JavaScript files hosted on Amazon Web Services cloud instances. The application then fetches these resources, which it needs to proceed with the infection process — and specifically, to enable geo-targeting. “One of the JavaScript files instructs the application to get a unique identifier for the device by making a POST request to the “live/keylookup” API endpoint and then constructing a final-stage URL,” according to the analysis. “The baseurl variable is used to make a POST request that contains unique identifiers created by the application, to identify the device and the language and country details.” The response from that final-stage URL contains the configuration that the application will use to dictate its behavior, based on the victim’s details. Based on this configuration, a mobile webpage is displayed to the victim, asking them to submit their phone number to activate the app (and, with it, the direct carrier billing (DCB) charges).


4 ways to mature your digital automation strategy

Immature strategies focus on simple tasks. That’s a great place to start, but to get the most out of automation, the strategy needs to grow. To evolve these task-based automations into automated workflows, applications and systems need to communicate with each other. Steadily adding connected systems provides the opportunity to build increasingly complex, end-to-end workflows. As more processes are connected, you will need a platform to manage the increasing complexity. Fortunately, vendors in different segments of enterprise IT are converging on offerings of business process automation (BPA) suites that include integration libraries and automation and workflow capabilities. This trend provides support for organizations building out their strategies and validates the importance of automation paired with connectivity. RPA bots are very popular because they are powerful and easy to use. This is both a blessing and a curse, because RPA is often used when it shouldn’t be, leading to poorly designed processes.


Integrating IoT in Your Business

If you look at the LoRaWAN ecosystem as a whole, we now have a few hundred hardware partners that have created off-the-shelf products. So the first thing we say is: don’t start by building your own hardware; look at what’s already there. And of course, we have experience with a lot of these devices and we’ve highlighted them, and as a company we also know which ones are of higher quality and which are of lesser quality. But this abundance of availability makes sure that you can choose, and also makes sure there’s a market. Second, you may want to move into, let’s say, custom hardware development, because the sensor is not out there, or because you want to build up IP, or, I mean, you can think of many reasons. What you now see is that in the LoRaWAN ecosystem there are a lot of libraries, a lot of tools, a lot of modules, and that also makes it easier to build your own hardware. So we’ve started an open-source initiative called the Generic Node, where we offer the ecosystem an example of what we feel should be the perfect LoRaWAN device, and you can use it for inspiration or we can help you further.



Quote for the day:

"A company is like a ship. Everyone ought to be prepared to take the helm." -- Morris Wilks

Daily Tech Digest - August 28, 2021

Why scrum has become irrelevant

The purpose of the retrospective is just that: to reflect. We look at what worked, what didn’t work, and what kinds of experiments we want to try. Unfortunately, what it boils down to is putting the same Post-its of “good teamwork” and “too many meetings” in the same swim lanes of “what went well,” “what went wrong,” and “what we will do better.” ... Scrum is often the enemy of productivity, and it makes even less sense in the remote, post-COVID world. The premise of scrum should not be that one cookie cutter fits every development team on the planet. A lot of teams are just doing things by rote, with zero evidence of their effectiveness. An ever-recurring nightmare of standups, sprint grooming, sprint planning and retros can only lead to staleness. Scrum does not promote new and fresh ways of working; instead, it champions repetition. Let good development teams self-organize to their context. Track what gets shipped to production, add the time it took (in days!) after the fact, and track that. Focus on reality and not some vaguely intelligible burndown chart. Automate all you can and have an ultra-smooth pipeline. Eradicate all waste.
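The measurement proposed here, track what ships and how long it took in days, is trivial to automate. A minimal sketch, with an invented ship log and field names:

```python
from datetime import date

# Hypothetical ship log: (change, work started, shipped to production).
ship_log = [
    ("checkout-fix",   date(2021, 8, 2),  date(2021, 8, 4)),
    ("search-filters", date(2021, 8, 3),  date(2021, 8, 12)),
    ("login-captcha",  date(2021, 8, 10), date(2021, 8, 11)),
]

# Record, after the fact, how many days each shipped change took.
durations = [(name, (shipped - started).days) for name, started, shipped in ship_log]
for name, days in durations:
    print(f"{name}: {days} days")

# One honest number instead of a burndown chart.
avg = sum(d for _, d in durations) / len(durations)
print(f"average days to production: {avg:.1f}")
```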


Your data, your choice

These days, people are much more aware of the importance of shredding paper copies of bills and financial statements, but they are perfectly comfortable handing over staggering amounts of personal data online. Most people freely give their email address and personal details, without a second thought for any potential misuse. And it’s not just the tech giants – the explosion of digital technologies means that companies and spin-off apps are hoovering up vast amounts of personal data. It’s common practice for businesses to seek to “control” your data and to gather personal data that they don’t need at the time on the premise that it might be valuable someday. The other side of the personal data conundrum is the data strategy and governance model that guides an individual business. At Nephos, we use our data expertise to help our clients solve complex data problems and create sustainable data governance practices. As ethical and transparent data management becomes increasingly important, younger consumers are making choices based on how well they trust you will handle and manage their data.


How Kafka Can Make Microservice Planet a Better Place

Kafka was originally developed under the Apache license, but Confluent later forked it and delivered a robust version of it: Confluent Platform, the most complete distribution of Kafka. Confluent Platform improves Kafka with additional community and commercial features designed to enhance the streaming experience of both operators and developers in production, at massive scale. You can find thousands of documents about learning Kafka. In this article, we want to focus on using it in a microservice architecture, and we need an important concept named the Kafka topic for that. ... The final concept we should learn about before starting our stream processing project is KStream. A KStream is an abstraction of a record stream of key-value pairs, i.e., each record is an independent entity/event. In the real world, Kafka Streams greatly simplifies stream processing from topics. Built on top of the Kafka client libraries, it provides data parallelism, distributed coordination, fault tolerance, and scalability. It deals with messages as an unbounded, continuous, and real-time flow of records.
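Kafka Streams itself is a Java library, so the KStream API is not available in Python; still, the consume-transform-produce loop it abstracts can be sketched with the confluent-kafka Python client. This is a rough analogue, not the Kafka Streams API, and the broker address and topic names are placeholders:

```python
from confluent_kafka import Consumer, Producer

# Placeholder broker and topic names: adjust for your cluster.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "order-enricher",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["orders"])

# Treat the topic as an unbounded stream: read each record as an
# independent key-value event, transform it, and emit downstream.
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        enriched = msg.value().upper()      # stand-in for real business logic
        producer.produce("orders-enriched", key=msg.key(), value=enriched)
        producer.poll(0)                    # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```

In the Java Kafka Streams API the same pipeline collapses to a few calls over a KStream (stream, mapValues, to); the sketch just makes the unbounded record-at-a-time flow explicit.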


What Happens When ‘If’ Turns to ‘When’ in Quantum Computing?

Quantum computers will not replace the traditional computers we all use now. Instead they will work hand-in-hand to solve computationally complex problems that classical computers can’t handle quickly enough by themselves. There are four principal computational problems for which hybrid machines will be able to accelerate solutions—building on essentially one truly “quantum advantaged” mathematical function. But these four problems lead to hundreds of business use cases that promise to unlock enormous value for end users in coming decades. ... Not only is this approach inefficient, it also lacks accuracy, especially in the face of high tail risk. And once options and derivatives become bank assets, the need for high-efficiency simulation only grows as the portfolio needs to be re-evaluated continuously to track the institution’s liquidity position and fresh risks. Today this is a time-consuming exercise that often takes 12 hours to run, sometimes much more. According to a former quantitative trader at BlackRock, “Brute force Monte Carlo simulations for economic spikes and disasters can take a whole month to run.” 
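To see why such simulations strain classical hardware, consider a toy Monte Carlo valuation (illustrative only: the geometric Brownian motion model, parameter values, and path counts are assumptions for demonstration, not any bank's model). Accuracy improves only with the square root of the number of sampled paths, so precision is bought with brute-force compute:

```python
import numpy as np

def monte_carlo_call_price(s0, strike, rate, vol, maturity, n_paths, seed=0):
    """Estimate a European call price by sampling terminal prices
    under geometric Brownian motion and discounting the mean payoff."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    st = s0 * np.exp((rate - 0.5 * vol**2) * maturity + vol * np.sqrt(maturity) * z)
    payoff = np.maximum(st - strike, 0.0)
    return np.exp(-rate * maturity) * payoff.mean()

# Error shrinks ~1/sqrt(n): 100x more paths buys one extra digit of accuracy,
# which is why portfolio-wide re-evaluation can run for hours.
for n in (1_000, 100_000, 10_000_000):
    print(n, monte_carlo_call_price(100, 105, 0.01, 0.2, 1.0, n))
```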


Can companies build on their digital surge?

If digital is the heart of the modern organization, then data is its lifeblood. Most companies are swimming in it. Average broadband consumption, for example, increased 47 percent in the first quarter of 2020 over the same quarter in the previous year. Used skillfully, data can generate insights that help build focused, personalized customer journeys, deepening the customer relationship. This is not news, of course. But during the pandemic, many leading companies have aggressively recalibrated their data posture to reflect the new realities of customer and worker behavior by including models for churn or attrition, workforce management, digital marketing, supply chain, and market analytics. One mining company created a global cash-flow tool that integrated and analyzed data from 20 different mines to strengthen its solvency during the crisis. ... While it’s been said often, it still bears repeating: technology solutions cannot work without changes to talent and how people work. Those companies getting value from tech pay as much attention to upgrading their operating models as they do to getting the best tech. 


Understanding Direct Domain Adaptation in Deep Learning

To fill the gap between source data (training data) and target data (test data), a concept called domain adaptation is used. It is the ability to apply an algorithm trained on one or more source domains to a different target domain, and it is a subcategory of transfer learning. In domain adaptation, the source and target data have the same feature space but different distributions, while transfer learning also includes cases where the target feature space differs from the source feature space. ... In unsupervised domain adaptation, the learning data contains a set of labelled source examples, a set of unlabeled source examples and a set of unlabeled target examples. In semi-supervised domain adaptation, along with the unlabeled target examples we also take a small set of labelled target examples. And in the supervised approach, all the examples are supposed to be labelled. A trained neural network generalizes well when the target data is distributed similarly to the source data; to accomplish this, a researcher from King Abdullah University of Science and Technology, Saudi Arabia, proposed an approach called ‘Direct Domain Adaptation’ (DDA).
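Stated as data, the three settings differ only in which labels are available at training time. A schematic sketch (invented arrays and split sizes, purely illustrative, not the DDA method itself):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
# Source and target share a feature space but differ in distribution.
source_x, source_y = rng.normal(0, 1, (n, 8)), rng.integers(0, 2, n)
target_x, target_y = rng.normal(1, 1, (n, 8)), rng.integers(0, 2, n)

# Unsupervised DA: labelled source + unlabeled target.
unsupervised = (source_x, source_y, target_x)

# Semi-supervised DA: additionally a small labelled slice of the target.
k = 50
semi_supervised = (source_x, source_y, target_x, target_x[:k], target_y[:k])

# Supervised DA: every example carries a label.
supervised = (source_x, source_y, target_x, target_y)
```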



AI: The Next Generation Anti-Corruption Technology

Artificial intelligence, according to Oxford Insights, is the “next step in anti-corruption,” partially because of its capacity to uncover patterns in datasets that are too vast for people to handle. Humans may focus on specifics and follow up on suspected abuse, fraud, or corruption by using AI to discover components of interest. Mexico is an example of a country where artificial intelligence alone may not be enough to win the war. ... As a result, the cost of connectivity has decreased significantly, and the government is currently preparing for its largest investment ever. By 2024, the objective is to have a 4G mobile connection available to more than 90% of the population. In a society moving toward digital state services, affordable connectivity is critical. The next stage is for the country to establish an AI strategy. The coming national AI strategy will include initiatives such as striving toward AI-based solutions that offer government services for less money, or introducing AI-driven smart procurement. In brief, Mexico aspires to be one of the world’s first 10 countries to adopt a national AI policy.


Introduction to the Node.js reference architecture, Part 5: Building good containers

Why should you avoid using reserved (privileged) ports (1-1023)? Docker or Kubernetes will just map the port to something different anyway, right? The problem is that applications not running as root normally cannot bind to ports 1-1023, and while it might be possible to allow this when the container is started, you generally want to avoid it. In addition, the Node.js runtime has some limitations that mean if you add the privileges needed to run on those ports when starting the container, you can no longer do things like set additional certificates in the environment. Since the ports will be mapped anyway, there is no good reason to use a reserved (privileged) port. Avoiding them can save you trouble in the future. ... A common question is, "Why does container size matter?" The expectation is that with good layering and caching, the total size of a container won't end up being an issue. While that can often be true, environments like Kubernetes make it easy for containers to spin up and down and do so on different machines. Each time this happens on a new machine, you end up having to pull down all of the components. 
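Note that the privileged-port restriction is enforced by the operating system, not by Node.js itself, so any non-root process hits it. Here is a quick illustration (sketched in Python for brevity; on a default Linux configuration the behavior is identical for a Node.js server binding the same ports):

```python
import socket

def try_bind(port):
    """Attempt to bind a TCP socket as the current (non-root) user."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind(("0.0.0.0", port))
        print(f"port {port}: bind succeeded")
    except PermissionError:
        print(f"port {port}: permission denied (reserved port, non-root user)")
    finally:
        s.close()

try_bind(80)    # reserved port: fails without root or CAP_NET_BIND_SERVICE
try_bind(8080)  # unprivileged port: succeeds; let Docker/Kubernetes map 80 -> 8080
```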


Now is the time to prepare for the quantum computing revolution

We've proven that it can happen already, so that is down the line. But it's in the five- to 10-year range until we have that hardware available, and that's where a lot of the promise of these exponentially faster algorithms lies. These are the algorithms that will use fault-tolerant computers to basically look at all the options available in a combinatorial matrix. So, if you have something like Monte Carlo simulation, you can try essentially all the different variables that are possible, look at every possible combination, and find the optimal solution. That's really practically impossible on today's classical computers: you have to choose what variables you're going to use, reduce things, and take shortcuts. But with these fault-tolerant computers, we can look at essentially all of the combinations in the solution space. So, you can imagine an almost infinite amount, or an exponential amount, of variables that you can try out to see what your best solution is.


Ragnarok Ransomware Gang Bites the Dust, Releases Decryptor

The gang is the latest ransomware group to shutter operations, due in part to mounting pressure and crackdowns from international authorities that have already led some key players to cease their activity. In addition to Avaddon and SynAck, two heavy hitters in the game, REvil and DarkSide, also closed up shop recently. Other ransomware groups are feeling the pressure in other ways. An apparently vengeful affiliate of the Conti gang recently leaked the group’s playbook after alleging that the notorious cybercriminal organization underpaid him for doing its dirty work. However, even as some ransomware groups are hanging it up, new threat groups that may or may not have spawned from the previous ranks of these organizations are sliding in to fill the gaps they left. Haron and BlackMatter are among those that have emerged recently with intent to use ransomware to target large organizations that can pay million-dollar ransoms to fill their pockets. Indeed, some think Ragnarok’s exit from the field isn’t permanent, and that the group will resurface in a new incarnation at some point.



Quote for the day:

"Leadership is a matter of having people look at you and gain confidence, seeing how you react. If you're in control, they're in control." -- Tom Laundry

Daily Tech Digest - April 06, 2020

How DevOps is integral to a cloud-native strategy

Containerisation allows applications to be made environment-agnostic and eliminates application conflicts between developers and operations teams, in turn allowing greater collaboration between developers and testers. Breaking down monolithic applications into constituent microservices also increases agility and creates a common toolset, terminology, and set of processes between development and operations teams, which makes it easier for these teams to work with one another. This enables the advanced automation of processes and contributes to an organisation’s move towards agile software development (defined by the continuous delivery of software created in rapid iterations). It’s important to stress that these technologies will only be successfully implemented if that cultural shift happens too, which is where embracing DevOps becomes key. Going cloud-native is a gradual process and a learning experience. Most organisations have established IT environments that use on-premise applications.


"An increase in state digital surveillance powers, such as obtaining access to mobile phone location data, threatens privacy, freedom of expression, and freedom of association, in ways that could violate rights and degrade trust in public authorities -- undermining the effectiveness of any public health response. Such measures also pose a risk of discrimination and may disproportionately harm already marginalized communities," the joint statement said. "These are extraordinary times, but human rights law still applies. Indeed, the human rights framework is designed to ensure that different rights can be carefully balanced to protect individuals and wider societies. "States cannot simply disregard rights such as privacy and freedom of expression in the name of tackling a public health crisis. On the contrary, protecting human rights also promotes public health. Now more than ever, governments must rigorously ensure that any restrictions to these rights is in line with long-established human rights safeguards." As part of the statement, the signatories set out eight proposed conditions for all governments to adhere to if increased digital surveillance is used to respond to the COVID-19 pandemic.


Fog and Edge Computing: Principles and Paradigms provides a comprehensive overview of the state-of-the-art applications and architectures driving this dynamic field of computing while highlighting potential research directions and emerging technologies. Exploring topics such as developing scalable architectures, moving from closed systems to open systems, and ethical issues arising from data sensing, this timely book addresses both the challenges and opportunities that Fog and Edge computing presents. ... The Cloud Adoption Playbook helps business and technology leaders in enterprise organisations sort through the options and make the best choices for accelerating cloud adoption and digital transformation. Written by a team of IBM technical executives with a wealth of real-world client experience, this book cuts through the hype, answers your questions, and helps you tailor your cloud adoption and digital transformation journey to the needs of your organisation. ... The updated edition of this practical book shows developers and ops personnel how Kubernetes and container technology can help you achieve new levels of velocity, agility, reliability, and efficiency.


Applications: Combining the old with the new


There are a few reasons why mainframe applications cannot be migrated to public cloud infrastructure easily. Cresswell says mainframe applications will not run on the underlying cloud hardware without significant refactoring and recompilation. “They are typically compiled into mainframe-specific machine code and the mainframe instruction-set architecture is substantially different from the x86 platforms that underpin almost all cloud services,” he says. “Legacy mainframe applications rely on infrastructure software to manage batch and online activity, data access and many other legacy mainframe features. Like the applications themselves, this infrastructure software is also tied to the physical mainframe hardware and will not run in a conventional x86 cloud environment.” Another barrier to migrating mainframe systems is that the mainframe software development pipeline cannot support many of the rapid deployment features that cloud-native applications rely on, says Cresswell, and it is virtually impossible to spin up testing environments on mainframes without extensive planning.


7 Key Principles to Govern Digital Initiatives


An important starting point is to take an inventory of digital initiatives. This may sound like a straightforward task, but it is often quite challenging. People are reluctant to share information for fear they may lose control over their initiatives. Thus, it is helpful to stress that the inventory phase is about the centralization of information about digital initiatives, not control over them. Fred Herren, senior vice president, digital and innovation at SGS, the world’s largest provider of inspection, testing, and certification services, understood that applying a top-down approach to rules rarely works in decentralized cultures. He noted, “I think it’s necessary to walk the talk rather than give instructions. I’ve managed to get a lot of information because I’m not telling employees to stop [their activities]. I walk around and ask people what’s new and I always react positively.” ... Establishing appropriate key performance indicators (KPIs) is a critical exercise, particularly for digital initiatives that are highly dependent on strategic priorities related to the company’s future vision, success, and implementation objectives. However, when we asked leaders how they measure the performance of digital initiatives, most of them answered in one of two ways: either “we don’t” or “it depends.”


Emerging from AI utopia

Facial recognition is a good example of an AI-driven technology that is starting to have a dramatic human impact. When facial recognition is used to unlock a smartphone, the risk of harm is low, but the stakes are much higher when it is used for policing. In well over a dozen countries, law enforcement agencies have started using facial recognition to identify “suspects” by matching photos scraped from the social media accounts of 3 billion people around the world. Recently, the London Metropolitan Police used the technology to identify 104 suspects, 102 of whom turned out to be “false positives.” In a policing context, the human rights risk is highest because a person can be unlawfully arrested, detained, and ultimately subjected to wrongful prosecution. Moreover, facial recognition errors are not evenly distributed across the community. In Western countries, where there are more readily available data, the technology is far more accurate at identifying white men than any other group, in part because it tends to be trained on datasets of photos that are disproportionately made up of white men. Such uses of AI can cause old problems—like unlawful discrimination—to appear in new forms. Right now, some countries are using AI and mobile phone data to track people in self-quarantine because of the coronavirus disease 2019 pandemic. The privacy and other impacts of such measures might be justified by the scale of the current crisis, but even in an emergency, human rights must still be protected. Moreover, we will need to ensure that extreme measures do not become the new normal when the period of crisis passes.


Is Blockchain Necessary? An Unbiased Perspective

Bankers hate blockchain. It’s obvious why they would; the greatest advantage of blockchain is that it cuts down on costs, requiring only infrastructure costs. No transaction fees, no maintenance charges, nothing. Effectively, blockchain makes banking obsolete, and honestly, I feel it should. The banking industry has remained unchanged over millennia. It is an integral part of society whose mismanaged monetary transactions have incited myriad wars. Unfortunately, the banking industry is in a pathetic state. Bankers have too much power, control and streams of revenue. It needs to topple. It’s a legacy system, and the pain points of this system haven’t changed since the days of Venetian merchants. There is so much abuse of power involved, and the fact that it is legal paints a grim picture. For example, the man who invented the credit card never wanted interest rates to go over 8%. Today, banks on average charge from 12% to 18%, not including transaction, processing and various other fees. Blockchain can destroy and recreate this system. However, this brings us to the greatest chink in blockchain’s armor: this transformative process is expensive and decentralized.


Remote Working: What It Means For RPA


RPA still has considerable risks with remote working. If anything, companies will need to engage in even more planning with their systems. “Enterprise grade security needs to be baked into any RPA platform from the start, which helps provide greater resilience and business continuity,” said Jason Kingdon, who is the Executive Chairman at Blue Prism. There will also need to be more attention paid to managing bot development and deployment. Otherwise there could be much more sprawl across an organization, lessening the benefits of the technology. This is why it’s important to have a Center of Excellence or COE (you can learn more about this from one of my recent Forbes.com posts). “You need to have a group of champions who control the system, and monitor what bots are being built and who is building them,” said Tabakman. “It’s best to provide regular training around bot design and consider an approval process, where your champions review bots before they’re deployed. You’ll want to ensure that a bot being created doesn’t create more problems than it solves, such as bots that go into infinite loops, resulting in more work for IT teams.”


Overcoming flat data to unlock business insight and productivity

Artificial intelligence is eliminating entire swathes of manual intervention in the processing of documents, and, more importantly, adding context to them. It’s not enough to simply scan a document and store it along with a reference number: the technology must be able to add meaning to it and to create links with other related data, structured or unstructured. This type of technology falls into a category that we call Context Driven Productivity. At its core is the ability to extract information from flat data and transform it into semantic data, whereby links are created to other data sources, both internal and external, building relationships, connections and additional meaning. Semantic data allows humans or AI robots to gain contextual information automatically, rather than having to rely on a limited number of hard-wired connections. In practical terms, the possibilities are enormous. Not only will administrative workers be freed from the tedious task of manually processing incoming documents, but the resulting context-driven data will be infinitely more useful to any organisation.
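A tiny sketch of what transforming flat data into semantic data can look like in practice, using the rdflib Python library (the invoice fields, namespace URL, and property names are invented for illustration; the article does not prescribe RDF specifically):

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/")  # illustrative namespace

# Flat data: what a plain scan-and-store pipeline would keep.
flat = {"doc_id": "invoice-42", "supplier": "AcmeCorp", "amount": 199.0}

# Semantic data: the same fields expressed as links to other entities,
# so humans or AI agents can traverse relationships automatically.
g = Graph()
doc = EX[flat["doc_id"]]
g.add((doc, RDF.type, EX.Invoice))
g.add((doc, EX.issuedBy, EX[flat["supplier"]]))   # a link, not just a string
g.add((doc, EX.amount, Literal(flat["amount"])))
g.add((EX[flat["supplier"]], RDF.type, EX.Supplier))

print(g.serialize(format="turtle"))
```

The point of the example is the shift in shape: instead of an opaque record keyed by a reference number, each extracted field becomes a relationship that other internal or external data can connect to.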


How cloud computing is changing the laboratory ecosystem


Cloud computing allows labs to tap into immense computing power without the cost and complexity of running onsite server rooms. Switching from an onsite solution to the cloud alleviates the costs of IT infrastructure, reducing the cost of entry into the industry while also levelling the playing field for smaller laboratories. Moreover, cloud computing allows data to be extracted from laboratory devices and put in the cloud: device integration between lab equipment and cloud services allows real-life data from experiments to be collated in a cloud system. One of the most popular products on the market is Cubuslab, a plug-and-play solution that serves as a laboratory execution system, collecting instrument data in real time and managing devices remotely. Collecting data in these quantities requires a centralised system that integrates scientists’ protocols and experimental annotations. The electronic lab notebook is starting to become a common tool in research by allowing users to organise all their different data inputs and retrieve the data at any point. This also allows large R&D projects to keep effective control of their data as they scale.



Quote for the day:


"The art of communication is the language of leadership." -- James Humes