Basil Leaf Technologies is still working towards creating a Tricorder in the way that most people think of it: a single device that can diagnose a range of conditions. For a real-life Tricorder to serve as a universal diagnostic tool in the way that Star Trek envisioned, it would need to analyse far more biomarkers than the DxtER currently does. Handily, scientists are also working on expanding the capabilities of Tricorder-like devices. Earlier this year, researchers from the University of Glasgow created a handheld sensor device based on a CMOS chip that can measure a number of metabolites in blood or urine and use those readings to diagnose conditions including heart attacks. Elsewhere, companies are working on Tricorder-type hardware with a focus on infectious disease: the Q-POC, made by QuantumDx, is expected to launch next year, bringing handheld diagnostics for bacterial and viral infections.
“The increasing use of hybrid cloud environments by enterprises also lines up nicely with the software-defined data center story, which HCI is certainly a large part of,” Lagana says. HCI has become a suitable platform for broader use thanks to many underlying improvements in the technology, he adds. At the same time, many enterprises have gone through an IT “refresh cycle,” and HCI seems like a natural transition. “We’ve spoken with some HCI adopters and, in some cases, folks we’re talking to are upgrading multiple-generation-old infrastructure running on old, sometimes now unsupported software,” Lagana says. “At that point, if the old server and/or storage technology they’re using is that far behind what’s now available, it becomes a matter of the level of complexity they’re seeking in their new environment.”
The best thing you can do is acknowledge that your wireless ecosystem has security holes in it. This is even more likely when you have users connecting to random wireless hotspots at home, while traveling and so on. Even if you eliminate all of the above vulnerabilities and implement WPA3, your business can still be exposed to someone mimicking a legitimate AP -- the "evil twin" vulnerability, which has been around since the inception of Wi-Fi. Not only can an evil twin attack exploit network systems and information, but when it happens you'll likely never know about it. The evil twin vulnerability can be mitigated with a wireless intrusion prevention system, offered by many of the big networking vendors. Still, these systems won't protect your mobile users when they are out and about.
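At its core, the detection a wireless intrusion prevention system performs is simple: the same SSID appearing from a radio (BSSID) that isn't on the approved list. A minimal sketch of that heuristic in Python follows; the SSIDs, MAC addresses, and the `APPROVED_APS` allow-list are all hypothetical illustrations, not data from any real product.

```python
# Evil-twin heuristic: flag any beacon whose SSID matches a managed
# network but whose BSSID (the AP's MAC address) is not on the
# approved list. All names and addresses here are illustrative.

APPROVED_APS = {
    "CorpWiFi": {"aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"},
}

def find_evil_twins(observations):
    """observations: iterable of (ssid, bssid) pairs seen in beacon frames."""
    suspects = []
    for ssid, bssid in observations:
        known = APPROVED_APS.get(ssid)
        if known is not None and bssid.lower() not in known:
            suspects.append((ssid, bssid))
    return suspects

seen = [
    ("CorpWiFi", "aa:bb:cc:00:00:01"),  # legitimate corporate AP
    ("CorpWiFi", "de:ad:be:ef:00:99"),  # same SSID, unknown radio -> suspect
    ("GuestNet", "11:22:33:44:55:66"),  # unmanaged SSID, ignored
]
print(find_evil_twins(seen))  # [('CorpWiFi', 'de:ad:be:ef:00:99')]
```

A real system feeds this comparison from continuous over-the-air beacon monitoring, which is exactly the capability lost when a mobile user roams beyond the sensors' reach.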
Regulator action will take time – six months is too early to get a proper read. Yet we can still get a feel for what is going on by looking at what’s happening in a given country. The UK is interesting: its Information Commissioner predates GDPR, as the UK's privacy regulations go back to 1998. The UK commissioner is currently publishing findings and leveling fines after investigations into activities dating back to 2016. That gives us a feel for how long investigations may take under GDPR. Perhaps we will not know the full impact, or the magnitude of fines levied, for another two years. Facebook was lucky in that its Cambridge Analytica troubles fell under the prior law, resulting in a £500K fine rather than the billions allowed by GDPR. Breaches at British Airways and others, which took place since GDPR became active, are being carefully monitored to see whether they were in fact properly reported to the UK commission within 72 hours of being discovered.
When working on complex challenges, you’ll need to try doing new things (new offerings) and doing old things in new ways (new processes). But this risk-taking has to be prudent. At my firm, new team members must have the diligence and humility to learn the established way of handling a problem before they invent a new way. We try small experiments in safe contexts (tweaking established offerings and processes with trusting and trusted partners) before trying big experiments in dangerous contexts. In Mexico, for instance, although the work involved a unique situation and lots of trial and error, a foundation of decades of relevant experience enabled us to advance. You can improvise well only if you have practiced a lot. ... Often, you can’t rely only on your own perspective. Ask for feedback: from your colleagues, clients, and anyone else involved with the problem you’re trying to solve. Ask casually and formally, verbally and in writing, and with specific and open-ended questions.
While most organizations are moving business operations to Software as a Service (SaaS) and cloud computing solutions, some retain dependencies on legacy platforms and the software that runs on them. Maintaining access to these deprecated platforms can be a source of frustration for IT, as aging hardware and software often require scavenging websites such as Craigslist or eBay for decades-old parts. However, modern parts and software can stand in for the originals, making the process easier. For software that requires an older operating system, VirtualBox can readily be used to virtualize the OS and application, allowing the legacy environment to run on modern hardware. VirtualBox also includes a built-in Remote Desktop Protocol (RDP) server, allowing users to connect remotely to a VirtualBox VM, and it is more adept at handling virtualization for legacy software than QEMU/KVM or other modern hypervisors.
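Enabling that remote-access path is a short `VBoxManage` exercise. A sketch, assuming a VM named "LegacyXP" (a hypothetical name) and the Oracle VM VirtualBox Extension Pack, which provides the VRDE remote display server:

```shell
# Enable the VirtualBox remote display (VRDE) server on an existing VM.
# "LegacyXP" is a placeholder name; substitute your own VM.
VBoxManage modifyvm "LegacyXP" --vrde on
VBoxManage modifyvm "LegacyXP" --vrdeport 3389

# Start the VM without a local GUI; users then connect from any
# standard RDP client to the host's address on the port above.
VBoxManage startvm "LegacyXP" --type headless
```

Running headless this way lets the legacy application live on a server in a closet while users reach it over RDP, much like any other remote desktop.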
In a worst-case scenario, the credentials for an admin account could grant access to an advanced threat actor – once they are in the environment, they can move laterally, placing backdoors, RATs and other software to become persistent, and exfiltrate the data of employees or customers to resell or use for their own financial gain. Though phishing and spear-phishing remain staple techniques, particularly when combined with social engineering, malware use is often more efficient in terms of volume and timeliness. Though more complex skills are required for this tactic to be effective, many malware families are openly sold as-a-service – AgentTesla, for example, is marketed at between $6 and $15 per month, with customer support and updates available, lowering the barrier to entry. Advanced attackers may use malware to infect machines and move laterally in an organization’s network.
The trojan itself is a giant shell script of over 1,000 lines of code. This script is the first file executed on an infected Linux system. The first thing the script does is find a folder on disk to which it has write permissions, so it can copy itself there and later use it to download other modules. Once the trojan has a foothold on the system, it uses one of two privilege-escalation exploits, CVE-2016-5195 (also known as Dirty COW) or CVE-2013-2094, to gain root permissions and full access to the OS. The trojan then sets itself up as a local daemon, even downloading the nohup utility if it is not already present. After the trojan has a firm grasp on the infected host, it moves on to the primary function it was designed for: cryptocurrency mining. The trojan first scans for and terminates the processes of several rival cryptocurrency-mining malware families, then downloads and starts its own Monero-mining operation.
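The trojan's first step, locating a directory it can actually write to, amounts to probing candidate paths with a test write. A benign Python sketch of that probe is below; the candidate list is illustrative only, and a real sample would walk many more locations.

```python
import os
import tempfile

# Probe candidate directories and return the first one that is truly
# writable. Rather than trusting os.access (which can be misleading on
# read-only mounts), we confirm by creating and deleting a temp file.
CANDIDATES = ["/usr/local/bin", "/var/tmp", "/tmp", os.path.expanduser("~")]

def first_writable(paths):
    for path in paths:
        if not os.path.isdir(path):
            continue
        try:
            fd, probe = tempfile.mkstemp(dir=path)
            os.close(fd)
            os.remove(probe)
            return path
        except OSError:
            continue  # directory exists but refused the write
    return None

print(first_writable(CANDIDATES))  # prints the first writable candidate
```

Once such a directory is found, the script has a staging area for copying itself and fetching additional modules, which is why locked-down, noexec-mounted temp directories blunt this class of malware.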
Probably the main reason it is so difficult to predict where the IoT market at large is going is that there's no general agreement on a precise definition of the boundaries of that market. Hence the large number of large numbers purporting to describe the "size of the IoT market," which frequently measure very different aspects of it. “Everyone knows it’s going to be big,” said Alan Griffiths, principal consultant with market researcher Cambashi. “And no one’s got the faintest idea, in my opinion, of how big it’s going to be.” He talks to top technical people – CIOs and CTOs – for his estimates of the IIoT market, which gives him a better read on who’s buying what. Griffiths’ research on the IIoT market highlights another important point: IoT trend predictions focused on more specific market segments, or on particular technologies, tend to be a lot more digestible. The details needed to create such an analysis are easier to get, and it’s harder to make guesswork look presentable.
The problem lies with well-intentioned people in the business and technology communities who are still in awe of the promises of Bitcoin. They are now hurting the cause and becoming a burden by forcing a single mindset, or a single fixed checklist, onto every Blockchain implementation. I think technology should be allowed to evolve organically and not be made the prisoner of the ‘original idea.’ I believe identifying the business problem you want to solve will be the key to the success of any Blockchain implementation (rather than the phrase ‘Blockchain implementation,’ it should really be ‘Blockchain network setup’ plus ‘application implementations’ on that setup). Eliminating intermediaries is a Utopian idea: it asks the parties to a business transaction to trust the set of programmers behind a Blockchain platform rather than an entity that can be dragged into a court of law in the event of a dispute.
Quote for the day:
"Don't focus so much on who is following you, that you forget to lead." -- E'yen A. Gardner