Daily Tech Digest - April 12, 2020

AI (Artificial Intelligence) Projects: Where To Start?

You don’t want to spend time and money on a project and then realize there are legal or compliance restrictions. This could easily mean having to abandon the effort. “First, customer data should not be used without permission,” said Debu Chatterjee, who is the senior director of platform AI engineering at ServiceNow. “Secondly, bias from data should be mitigated. Any model which is a black box and cannot be tested through APIs for bias should be avoided. The risk of bias is present in nearly any AI model, even in an algorithmic decision, regardless of whether the algorithm was learned from data or written by humans.” In the early phases of an AI project, there should be lots of brainstorming. This should also involve a cross-section of people in the organization, which will help with buy-in. The goal is to identify a business problem to be solved. “For many companies, the problem is that they start with a need for technology, and not with an actual business need,” said Colin Priest, who is the VP of AI Strategy at DataRobot. “It reminds me of this famous quote from Steve Jobs, ‘You’ve got to start with the customer experience and work backwards to the technology. You can’t start with the technology and try to figure out where you’re going to sell it.’”


How to Reduce Remote Work Security Risks

Employees should remain cautious about downloading random applications or software, to avoid malware, viruses, or insecure protocols. If they’re unsure, they should check with IT support or their security team. Also, remind remote workers to be careful when sharing confidential data: they should use company-issued apps for file sharing, storage of confidential documents, and communication. Let them know this is for their own safety, too; the company has protective measures around these apps and can monitor for suspicious behavior. Communicate with your employees consistently. Ultimately, keeping everyone informed about how to secure their home technologies and practice security in their everyday lives matters more than any technology. Maintain communication across a variety of channels to keep employees up to date on the latest security threats and on how to reduce the risk to their personal and company information. Make sure your security and IT experts are household names, available for questions and for raising red flags.


Automated Machine Learning Is The Future Of Data Science

The objective of AutoML is to shorten the cycle of trial, error, and experimentation. It works through an enormous number of models, and the hyperparameters used to configure those models, to determine the best model for the data presented. This is a repetitive and tedious activity for any human data scientist, no matter how skilled. AutoML platforms can perform this tedious work more quickly and more thoroughly, arriving at a solution faster and more effectively. The ultimate value of AutoML tools isn’t to replace data scientists but to offload their routine work and streamline their process, freeing them and their teams to concentrate their energy and attention on the parts of the process that require a higher level of reasoning and creativity. As their roles change, it is important for data scientists to understand the full life cycle so they can shift their energy to higher-value tasks and sharpen the skills that further raise their value to their companies.
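As a rough illustration of the kind of search an AutoML platform automates, the sketch below (assuming scikit-learn and a stand-in tabular dataset) tries a couple of model families and hyperparameter settings and keeps the best cross-validated candidate; real AutoML products add feature engineering, ensembling, and far larger search spaces.

```python
# Minimal sketch of automated model/hyperparameter search with scikit-learn.
# Dataset and candidate grids are illustrative, not any vendor's product.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)  # stand-in tabular dataset

# Candidate model families and their hyperparameter grids.
candidates = [
    (LogisticRegression(max_iter=5000), {"C": [0.01, 0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0),
     {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}),
]

best_score, best_model = -1.0, None
for estimator, grid in candidates:
    search = GridSearchCV(estimator, grid, cv=5, scoring="accuracy")
    search.fit(X, y)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(f"Best cross-validated accuracy: {best_score:.3f}")
print(f"Best model: {best_model}")
```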



How Hyperscale Storage Is Becoming More Accessible

It is a scale-out solution that enables you to scale compute and storage independently, and it does so through software-defined storage. So you can pick any client, any server, any network; we can run on a Quanta server, HPE, or Dell; we can run with an Intel CPU, on AMD, or even on Arm. There are two main components I want to touch on. The first is NVMe over TCP. This is a standard that we invented together with Facebook, Dell, Intel, and a few others, and today the standard is already fully ratified. What we have here is a highly optimized user-space TCP stack that, combined with the NVMe stack, gives us the ability to support, in a very large data center, thousands of connections and thousands of containers in bare-metal or virtual environments. The second very important layer is the global FTL. FTL is a flash translation layer; it is the layer you can find in every SSD. At a high level, it performs the translation between the logical transactions sent to the storage system and the physical transactions to the flash.
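To make the flash translation layer idea concrete, here is a deliberately simplified sketch (a toy model, not any vendor's implementation) of the logical-to-physical mapping an FTL maintains: writes always land on a fresh physical page, and the map redirects each logical block to its latest location.

```python
# Toy illustration of a flash translation layer (FTL): a logical-to-physical
# map where every write lands on a fresh physical page. Real FTLs also handle
# garbage collection, wear leveling, and power-loss recovery.
class ToyFTL:
    def __init__(self, num_physical_pages: int):
        self.free_pages = list(range(num_physical_pages))  # unwritten pages
        self.l2p = {}       # logical block address -> physical page
        self.store = {}     # physical page -> data (simulates the flash)

    def write(self, lba: int, data: bytes) -> None:
        page = self.free_pages.pop(0)   # flash pages aren't overwritten in place
        self.store[page] = data
        self.l2p[lba] = page            # redirect the logical address

    def read(self, lba: int) -> bytes:
        return self.store[self.l2p[lba]]

ftl = ToyFTL(num_physical_pages=8)
ftl.write(0, b"v1")
ftl.write(0, b"v2")   # same logical block, new physical page
print(ftl.read(0))    # b"v2"
```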


COVID-19 is accelerating CI/CD adoption

As it turns out, the stakes are much higher given the pervasive work-from-home arrangements most organizations now embrace. In a phone interview, Rose stressed that even after years of DevOps discussion, “You still have a lot of companies that are doing most of their software testing on-prem and behind the firewall. The big installed base remains Jenkins in a proprietary data center.” This wasn’t ideal, but it was workable when developers and operations professionals worked in an office environment, within the firewall. In a remote-only situation, getting access to the application development workflow is “tricky,” he stresses, in part because there’s no guarantee that you’ll be able to VPN in. And so companies are moving much faster than planned from private data centers to public clouds, in an effort to move workloads to a place where modern CI/CD can happen. “All the timelines have shrunk,” Rose says. Over the last two years companies have realized they need to move faster, but perhaps still struggled to get started. “Now every company is trying to get apps to be cloud-enabled or cloud-native,” he stresses.


Zoom Promises Geo-Fencing, Encryption Overhaul for Meetings


In response to Citizen Lab's report, Zoom immediately promised to implement geo-fencing to ensure that no keys get routed via China, except for China-based users. Yuan attributed the routing of keys via China to a development error made as the company rapidly scaled up to meet a surge in demand. That surge began in China, where the COVID-19 outbreak started, and led the company to allow much greater free access to its tool, in part to support medical professionals. (Free versions typically otherwise have a 40-minute time limit for meetings.) "In February, Zoom rapidly added capacity to our Chinese region to handle a massive increase in demand," Yuan says. "In our haste, we mistakenly added our two Chinese data centers to a lengthy whitelist of backup bridges, potentially enabling non-Chinese clients to - under extremely limited circumstances - connect to them (namely when the primary non-Chinese servers were unavailable). This configuration change was made in February." He says Zoom fixed this problem immediately after learning of it via Citizen Lab. "We have also been working on improving our encryption and will be working with experts to ensure we are following best practices," Yuan says.
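As a purely hypothetical sketch of what geo-fenced connection routing can look like (the server names and regions below are invented, and this is not Zoom's code), the idea is simply to filter candidate bridges by the client's permitted regions before ever considering backups:

```python
# Hypothetical geo-fencing sketch: pick a meeting bridge only from regions
# the client is allowed to use. Hostnames and regions are made up.
from typing import Dict, List, Optional

SERVERS: List[Dict[str, str]] = [
    {"host": "bridge-us-1.example.com", "region": "US"},
    {"host": "bridge-eu-1.example.com", "region": "EU"},
    {"host": "bridge-cn-1.example.com", "region": "CN"},
]

def pick_server(client_region: str) -> Optional[str]:
    # Non-China clients must never be routed to China-region bridges,
    # even when falling back to the backup list.
    allowed = {"CN"} if client_region == "CN" else {"US", "EU"}
    candidates = [s for s in SERVERS if s["region"] in allowed]
    if not candidates:
        return None
    # Prefer a bridge in the client's own region, else any allowed backup.
    same_region = [s for s in candidates if s["region"] == client_region]
    return (same_region or candidates)[0]["host"]

print(pick_server("US"))  # bridge-us-1.example.com
print(pick_server("CN"))  # bridge-cn-1.example.com
```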


DevOps proponent lays it on the line: stop the madness and start automating


The final three steps are where many development teams tend to stumble, Davis says. "The most blissful thing about writing code or doing a complex admin task and so forth is when you get everything in your head, and you can see how everything fits together, and the world disappears, and you know exactly how your org works, and anybody could ask for any change and you can fix things. Developers live for that blissful feeling -- to know everything and fix anything." The catch is, a particular project ends, distractions distract, new projects begin, and time passes, Davis continues. "That disappears out of your working memory, right? There may be a day, or a week, or a month of delay before you know that you broke something. By the time three weeks have elapsed, you forget that you even built that thing. And if you remember that you built it, you forget how you built it, you forget exactly why you built it. You can make another change of course, but then it might take you another three weeks until you can get that back to your users." Multiply this by hundreds or even thousands of change requests within a large organization, and it's easy to see how things can go awry. DevOps brings order and flow to this potential madness, and Davis boils it down to a three-step process: development, innovation delivery, and operations.


New machine learning method could supercharge battery development for electric vehicles

"Computers are far better than us at figuring out when to explore—try new and different approaches—and when to exploit, or zero in, on the most promising ones." The team used this power to their advantage in two key ways. First, they used it to reduce the time per cycling experiment. In a previous study, the researchers found that instead of charging and recharging every battery until it failed—the usual way of testing a battery's lifetime -they could predict how long a battery would last after only its first 100 charging cycles. This is because the machine learning system, after being trained on a few batteries cycled to failure, could find patterns in the early data that presaged how long a battery would last. Second, machine learning reduced the number of methods they had to test. Instead of testing every possible charging method equally, or relying on intuition, the computer learned from its experiences to quickly find the best protocols to test. By testing fewer methods for fewer cycles, the study's authors quickly found an optimal ultra-fast-charging protocol for their battery.


How Big Data and IoT Are Connected


Sensors upon sensors will crop up in all sorts of technologies, if they aren’t already there. Gigabytes and terabytes of information will whizz between devices at a frightening speed, and big data technologies will work even harder to store, process, and extract value from the collected, yet often unstructured, sensory information. Endpoints in numerous locations will unlock an almost unlimited amount of data, and what happens to that data will be decided by those who work in the IoT and big data industries. This interaction is likely to create two winners: the businesses that can profit from the information provided, and the end users who have better information to act on. Ultimately, businesses seeking to build IoT into their products are also seeking greater profits, more productivity, higher efficiency, and reduced costs. The development of big data technologies works in favor of IoT companies, with both shaping the ways in which we see and use data sets. As for the customer or end user, they will (if they aren’t already) benefit from more useful information, as well as improved customer service and experiences.


In a related twist, customers will, unsurprisingly, first call their ISP whenever there is any connectivity problem. Providing that service means a larger call staff. But what if the problem is a specific device? More complex still, what if it is a specific application running on the phone? An ISP that can quickly identify the root cause of the issue can either fix its own problems or point the customer towards the appropriate firm for service, and doing that efficiently will save enormous amounts of money. Identifying technical issues is a clear use case for AI. The question that needs to be answered is how close to the devices an AI system can run. Running on the ISP’s servers, there is enough distance to obscure some issues; it would be much better to run AI on an individual home’s modem or, better still, a router. The question then becomes the footprint: even inference-time AI is not known for highly efficient resource usage, and many companies have been working to address that for IoT applications. One company addressing the issue for the connected home is Veego, which claims to have AI inference that runs on home routers and modems in order to identify performance issues.
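As a hedged illustration of how small an on-device check can be (this is not Veego's method, and the window and threshold below are invented), a router could flag connectivity degradation with something as lightweight as a rolling z-score over recent latency samples:

```python
# Tiny, illustrative anomaly detector for a resource-constrained device:
# flag a latency sample that deviates strongly from the recent rolling window.
from collections import deque
import math

class LatencyMonitor:
    def __init__(self, window: int = 60, threshold: float = 3.0):
        self.samples = deque(maxlen=window)  # recent latency samples (ms)
        self.threshold = threshold           # z-score considered anomalous

    def observe(self, latency_ms: float) -> bool:
        anomalous = False
        if len(self.samples) >= 10:
            mean = sum(self.samples) / len(self.samples)
            var = sum((x - mean) ** 2 for x in self.samples) / len(self.samples)
            std = math.sqrt(var) or 1e-9
            anomalous = abs(latency_ms - mean) / std > self.threshold
        self.samples.append(latency_ms)
        return anomalous

monitor = LatencyMonitor()
for ms in [20, 22, 19, 21, 20, 23, 21, 20, 22, 21, 250]:
    if monitor.observe(ms):
        print(f"Possible issue: latency spiked to {ms} ms")
```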



Quote for the day:


"As a leader, you set the tone for your entire team. If you have a positive attitude, your team will achieve much more." -- Colin Powell

