Daily Tech Digest - June 07, 2018

Microsoft drops data center into the sea: 'It will keep working for five years'

According to Naval Group, the French defense naval-systems contractor that built Microsoft's pod, the data center has a payload of 12 racks containing 864 servers, along with a cooling system. After assembly, it was moved by truck to Scotland, from where it was dragged out to sea on a raft and then carefully lowered 117 feet (35.6 meters) to a rock slab on the seabed. Although the data center is built to last five years, it will remain on the seabed for at least a year as Microsoft observes how it fares. The pod is attached by a cable leading back to the Orkney Islands electricity grid, which supplies 100 percent renewable wind and solar energy to about 10,000 residents. The data center itself needs a quarter of a megawatt. The Natick team also explained the pod's cooling system and how it uses ocean water to cool liquids inside the system. "The interior of the data-center pod consists of standard computer racks with attached heat exchangers, which transfer the heat from the air to some liquid, likely ordinary water," they said.
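A quick back-of-envelope calculation from the figures quoted above (a quarter of a megawatt spread across 12 racks and 864 servers) puts the deployment in perspective. The sketch below, in Python, is purely illustrative arithmetic, not data published by Microsoft.

```python
# Rough power budget implied by the figures above:
# 0.25 MW total, 12 racks, 864 servers.
TOTAL_POWER_W = 250_000
RACKS = 12
SERVERS = 864

print(f"Power per rack:   {TOTAL_POWER_W / RACKS / 1000:.1f} kW")  # ~20.8 kW
print(f"Power per server: {TOTAL_POWER_W / SERVERS:.0f} W")        # ~289 W
```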



How to think like a programmer — lessons in problem solving

Do not try to solve one big problem. You will cry. Instead, break it into sub-problems. These sub-problems are much easier to solve. Then, solve each sub-problem one by one. Begin with the simplest. Simplest means you know the answer (or are closer to that answer). After that, simplest means the sub-problem whose solution doesn’t depend on any other sub-problem being solved. Once you’ve solved every sub-problem, connect the dots. ... “If I could teach every beginning programmer one problem-solving skill, it would be the ‘reduce the problem technique.’ For example, suppose you’re a new programmer and you’re asked to write a program that reads ten numbers and figures out which number is the third highest. For a brand-new programmer, that can be a tough assignment, even though it only requires basic programming syntax. If you’re stuck, you should reduce the problem to something simpler. Instead of the third-highest number, what about finding the highest overall? Still too tough? What about finding the largest of just three numbers? Or the larger of two? Reduce the problem to the point where you know how to solve it and write the solution. Then expand the problem slightly and rewrite the solution to match, and keep going until you are back where you started.”
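To make the "reduce the problem" progression concrete, here is a minimal sketch (in Python, an assumption since the excerpt names no language) that starts with the larger of two numbers and works back up to the third-highest of ten.

```python
# Step 1: the reduced problem -- the larger of two numbers.
def larger_of_two(a, b):
    return a if a > b else b

# Step 2: expand slightly -- the largest of a whole list.
def largest(numbers):
    best = numbers[0]
    for n in numbers[1:]:
        if n > best:
            best = n
    return best

# Step 3: back to the original problem -- the third-highest of ten numbers.
# Keep only the three highest values seen so far.
def third_highest(numbers):
    top = sorted(numbers[:3], reverse=True)   # seed with the first three values
    for n in numbers[3:]:
        if n > top[2]:
            top[2] = n
            top.sort(reverse=True)
    return top[2]

print(third_highest([4, 9, 2, 7, 5, 1, 8, 3, 6, 0]))  # -> 7
```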


What is pervasive engineering?

Pervasive engineering is physical product (and software) development designed to harness information streams from digitally tracked (typically Internet of Things-centric) assets, using smart sensors connected to a hub for data analytics and management. Pervasive simulation (through the use of digital twins and supporting data analytics) allows physical-product and software engineers to explore design and product development using real-world conditions. Prototypes can be created to ‘fork’ concepts that skew existing products (or services) while those existing assets remain in working operation, in their core pervasive state. The state of all machine assets is therefore developed continuously, iteratively and pervasively. You can read more here from Ansys on how it positions its approach to this element of design; the above definition is drawn from those pages. The firm’s SAP partnership embeds Ansys’ pervasive simulation software for digital twins into SAP’s digital supply chain, manufacturing and asset management portfolio. The partnership’s first result, called SAP Predictive Engineering Insights enabled by Ansys, has been built to run on the SAP Cloud Platform.
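As a loose illustration of the pattern described above (sensor streams from a tracked asset feeding a digital twin that is compared against simulated expectations), here is a minimal, hypothetical Python sketch; the class and field names are invented for illustration and are not Ansys or SAP APIs.

```python
# Hypothetical sketch: a digital twin that consumes readings from a tracked
# asset and flags drift from its simulated expected behaviour.
from dataclasses import dataclass

@dataclass
class SensorReading:
    asset_id: str
    temperature_c: float
    vibration_mm_s: float

class DigitalTwin:
    def __init__(self, asset_id, expected_temp_c, expected_vib_mm_s, tolerance=0.15):
        self.asset_id = asset_id
        self.expected = {"temperature_c": expected_temp_c,
                         "vibration_mm_s": expected_vib_mm_s}
        self.tolerance = tolerance  # allowed relative deviation from the simulated value

    def ingest(self, reading: SensorReading):
        """Compare a live reading against the simulated expectation and report drift."""
        alerts = []
        for field, expected in self.expected.items():
            actual = getattr(reading, field)
            if abs(actual - expected) / expected > self.tolerance:
                alerts.append(f"{field}: expected ~{expected}, got {actual}")
        return alerts

twin = DigitalTwin("pump-07", expected_temp_c=60.0, expected_vib_mm_s=2.0)
print(twin.ingest(SensorReading("pump-07", temperature_c=71.5, vibration_mm_s=2.1)))
# -> ['temperature_c: expected ~60.0, got 71.5']
```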


You’re probably doing your IIoT implementation wrong

One of the great misconceptions about the IIoT is that it’s a brand-new concept – factory floors, utility stations and other major infrastructure have all been automated to one degree or another for decades. What’s different, however, is the newly interconnected nature of this technology. Steve Hanna, senior principal at Infineon Technologies, said that the security risks of IIoT have grown rapidly of late, thanks to a growing awareness of IIoT attack vectors. A factory that was never designed to be connected to the Internet, with plenty of sensitive legacy equipment that can be 30 years old or older and designed to work via serial cable, can find itself suddenly exposed to the full broadside of remote bad actors, from Anonymous to national governments. “There’s a tool called Shodan that allows you to scan the Internet for connected industrial equipment, and you’d be surprised at the number of positive results that are found with that tool, things like dams and water and sewer systems,” he said. The most common oversights, according to Hanna, are a lack of two-factor authentication, which allows hackers to compromise equipment they find via tools like Shodan, and direct interconnections between an operational equipment network and the Internet.
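As a small illustration of the exposure Hanna describes, the sketch below uses only Python's standard socket module to check whether common industrial-protocol ports are reachable from a given vantage point; the host address is a placeholder, and this is in no way a substitute for a tool like Shodan.

```python
# Minimal sketch: probe common industrial-protocol TCP ports from this
# machine's network position. The target address is a placeholder; only
# point this at equipment you own and are authorised to test.
import socket

INDUSTRIAL_PORTS = {502: "Modbus/TCP", 102: "Siemens S7", 20000: "DNP3"}

def exposed_ports(host, timeout=2.0):
    found = []
    for port, protocol in INDUSTRIAL_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the TCP connect succeeded
                found.append((port, protocol))
    return found

print(exposed_ports("192.0.2.10"))   # placeholder address
```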


Why Microsoft's GitHub Deal Isn't a Sign of the Apocalypse

Despite loud protests and much rending of garments by many in the Minecraft community, the video game sandbox remains as popular as ever. What many GitHub developers fail to realize is that their friendly community was going to be acquired by someone anyway — either that or face eventual liquidation. The private company, CEO-less and still feeling the effects of a workplace discrimination charge, was burning through money with no immediate prospects of additional venture capital funding or launching an IPO. It's just as well that Microsoft stepped forward with piles of cash to make things better. Would GitHub developers feel any more loved in the hands of an Apple, Google or Oracle? Really? Remember, too, that even if Microsoft actually does revert to its bumbling old days and somehow manages to run GitHub into the ground, the open source coder community is not without viable alternatives. GitLab, a GitHub rival, recently boasted that it has seen a 10-fold increase in the number of developers moving their repositories over to its service. And if GitLab somehow drops the open source ball, the young stallions will inevitably find yet another place to hang their backpacks.


What's the difference between low-code and no-code platforms?

The difference between no-code and low-code platforms principally comes down to approachability, ease of use, and the level of technical knowledge the user is assumed to have. With a no-code platform like Quick Base, a majority of our customers have no programming skills whatsoever, and they're able to use Quick Base to burn down their backlogs, streamline workflows, and get their work done very quickly. Low-code platforms, on the other hand, while also very useful and important, do assume some level of technical sophistication and technical skill in their users, and they're principally aimed at giving IT developers a very productive platform on which to build and deliver projects quickly. No-code platforms in particular can help companies drive their digital transformation by providing the power of software to many more people in their organization. At Quick Base, what we've found time and time again with our customers is that their IT and developer groups are working very hard on the big-rock priorities within their organization, and what Quick Base can really help them do is move forward tons and tons of little rocks: the little efforts that stack up in a backlog of priorities that central IT never gets to.


Is explainability enough? Why we need understandable AI

In order to create this human-centric ‘understandable’ AI, a person must be empowered through the AI to make a decision on the algorithmically ambiguous cases. This means that the AI making the initial decisions about the veracity of a transaction also has to be built in such a way that a human reviewing a specific issue can help resolve it – without being a data scientist or “algorithmically literate”. Using a non-black-box model, the data scientist can identify confidence parameters and inform the UI/UX designer what the nature of these parameters is. The UI/UX designer then creates an AI output that is descriptive rather than prescriptive. That is, it would clearly explain the confidence parameters and enable an end user to provide reasoning for a decision. Do we simply want machines to take over and make decisions for us? Likely not – we’ll instead want to take a more collaborative approach with machines, where they augment us to make better decisions but allow us to manage inputs and set guidelines. Therefore, we need not only transparency and explainability, but also understanding.
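One hypothetical way to realize the "descriptive rather than prescriptive" output described above is to use an inherently interpretable model whose confidence and per-feature contributions can be surfaced in the UI. The sketch below uses scikit-learn's logistic regression on synthetic, made-up transaction features purely for illustration; it is not the article's system.

```python
# Sketch of a "non-black-box" scorer for a UI: a logistic regression whose
# probability (confidence) and per-feature contributions can be shown to a
# human reviewer. Feature names and data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

features = ["amount_vs_typical", "new_merchant", "foreign_country"]
X = np.array([[0.1, 0, 0], [0.2, 1, 0], [3.5, 1, 1], [4.0, 0, 1],
              [0.3, 0, 0], [2.8, 1, 1], [0.5, 1, 0], [3.9, 1, 1]])
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])   # 1 = flagged as suspicious

model = LogisticRegression().fit(X, y)

def explain(transaction):
    prob = model.predict_proba([transaction])[0, 1]
    # Per-feature contribution to the log-odds: coefficient * feature value.
    contributions = model.coef_[0] * np.asarray(transaction)
    print(f"Confidence this is suspicious: {prob:.0%}")
    for name, c in sorted(zip(features, contributions), key=lambda t: -abs(t[1])):
        print(f"  {name}: {'raises' if c > 0 else 'lowers'} the score by {abs(c):.2f} (log-odds)")

explain([3.2, 1, 1])
```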


The game-changing potential of smartphones that can smell

How can a smartphone smell? According to KIT, "the electronic nose is only a few centimeters in size. The nose consists of a sensor chip equipped with nanowires made of tin dioxide on many individual sensors. The chip calculates specific signal patterns from the resistance changes of the individual sensors. These depend on the molecules in ambient air, differ for the different scents and, hence, are characteristic and recognizable. If a specific pattern has been taught to the chip before, the sensor can identify the scent within seconds. To start the process, the researchers use a light-emitting diode that is integrated in the sensor housing and irradiates the nanowires with UV light. As a result, the initially very high electrical resistance of tin dioxide decreases, such that changes of resistance caused by the molecules responsible for the smell and attached to the tin dioxide surface can be detected." Clearly, this research has a long way to go before handset manufacturers will be open to including such a mechanism, but who nose? After all, in the ultra-tightly packed innards of today's smartphone designs, "a few centimeters" is hardly trivial.
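The "teach a pattern, then recognize it" step KIT describes can be pictured as nearest-neighbour matching over vectors of per-sensor resistance changes. The toy Python sketch below illustrates the idea; the scent patterns and threshold are invented and have nothing to do with KIT's actual firmware.

```python
# Toy illustration: each scent is stored as a vector of per-sensor resistance
# changes, and a new measurement is matched to the closest taught pattern.
import math

taught_patterns = {                    # relative resistance change per nanowire sensor
    "coffee":  [0.82, 0.31, 0.05, 0.44],
    "smoke":   [0.10, 0.76, 0.65, 0.12],
    "solvent": [0.25, 0.20, 0.90, 0.70],
}

def identify(measurement, max_distance=0.5):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(taught_patterns, key=lambda scent: dist(measurement, taught_patterns[scent]))
    return best if dist(measurement, taught_patterns[best]) <= max_distance else "unknown"

print(identify([0.80, 0.35, 0.10, 0.40]))   # -> coffee
```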


10 ways the enterprise could put blockchain to work

The distributed ledger technology promises to make many operations more efficient and enable new business initiatives. But limitations in the technology itself as well as the business issues that arise with its implementation have curtailed mass adoption, said David Furlonger, vice president and fellow at Gartner. "All firms can do right now is experiment," Furlonger said. "They have to look at multiple offerings in the marketplace and understand the different governance models, data management architectures, security levels, and how it impacts their business." A large number of companies are now inquiring about the technology, said Martha Bennett, principal analyst at Forrester. "Many of the firms I speak with have projects going on, but not always with a firm view on whether to operationalize them," Bennett said. "There are a few highly ambitious projects under way, but these haven't gone live yet." Putting blockchain to work depends on the use case, Bennett said. "If there is a use case that calls for multi-party collaboration around shared trusted data, with or without an added element of automation, then that's worth pursuing if the existing system is error-prone, full of friction, or otherwise deficient," Bennett said.


How to protect physical infrastructure from cyberattacks

How do you know the unknown? So really, it's not about identifying who these future or current threat actors might be; it's about understanding the types of attacks that we might be vulnerable to, the types of attacks that are emerging. We see, for example, evolutions in AI technology coming on very, very fast here. We realize, well, AI has the potential to be extremely good for our quality of life and the products that we build, but ultimately this technology is going to be turned against us. So as cyber professionals, it's our job to start to anticipate this technology and how these technologies are going to be applied to the attack vectors, attacking our devices looking for openings, and how we can then build the fences against what we anticipate. It's very much a game of understanding our product, understanding the attack surface, and building the fences for these types of attack vectors. ... So when you start looking at the world of cybersecurity today and you look at the type of markets we're dealing with, some of the challenges come right down to geopolitical attacks, as Andy was mentioning just a few seconds ago.



Quote for the day:


"Intuition becomes increasingly valuable in the new information society precisely because there is so much data." -- John Naisbitt

