We are on the cusp of a new era – the convergence of IT, OT and the Internet of Things (IoT). While IoT is relatively new, the biggest challenge for security and risk professionals is figuring out how to bring OT, traditionally managed by engineers, into a broader security management program. These roles are expanding and becoming more complex. Security has historically been about confidentiality, integrity and availability, but cybersecurity – where IT, OT and IoT come into play – is bringing safety to the forefront as the fourth element. As digital blurs with physical, it becomes possible for digital means to effect kinetic changes – for the technology and automation of devices, people and physical environments to be used to cause injury or loss.
IoT devices aren’t just passive data generators relaying information out to Big Data analytics engines. Control systems are some of the oldest examples of the Internet of Things. For example, 33 years ago in 1982, CMU students built the first Internet Coke machine, so students could order sodas from their desktops, charge the cost, and then go pick them up. At the 1989 Interop conference, Dan Lynch and others created the first Internet ... The value here is in automation and distributed control. Security still needs much more attention when connecting devices over the network, as the recent Wired story on a car being hacked while it was driven makes clear.
"Developers are the new heroes of the idea economy," said Mahony. "Through our Haven and Haven OnDemand platforms, we are empowering these heroes to transform their business through data, by allowing them to harness the value of all forms of information, rapidly connect and apply open source, and quickly access the tools they need to build winning businesses." Also addressing the keynote audience was recent Turing Award winner Mike Stonebraker, CTO and co-founder of Tamr, who said that the development of the column store database was the most disruptive thing he ever did. "It transformed the market," he said, and led to the Vertica big data platform that HP acquired in 2011.
Digital technology’s impact is visible in a big way thanks to the widespread adoption of smartphones, tablets, and social apps. These offer great convenience to customers, who can use digital channels to interact with financial institutions anywhere, anytime. The transformational potential of digital technology has undoubtedly eased the customer connection, and customer convenience is most evident in the smart use of digital technology, as in online, mobile and now social banking. However, the real challenge is to offer a reliable, secure, and superior customer experience through these new channels, and software testing has a major role to play in ensuring these goals.
Currently available technical debt quantification tools focus on only a few dimensions, such as code debt and, to some extent, design debt and test debt. Such tools do not provide comprehensive support for detecting issues in other dimensions, such as architecture debt or documentation debt. In fact, the comprehensiveness of even the supported dimensions is questionable: how many design debt issues (or design smells) do such tools actually identify and report? Although such tools support a set of design rules (which may lead to design smell detection), those rules are just a handful. Further, dealing with the false positives (i.e., false alarms) generated by the underlying analysis tools is inherently difficult.
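To make "design rules" concrete, here is a toy sketch (not any particular tool's implementation) of the kind of single rule such tools apply – flagging "god classes" by method count. The threshold is a hypothetical cutoff chosen for illustration; real tools ship only a handful of rules like this, which is why their design debt coverage is so thin.

```python
# Toy design-rule check: flag "god classes" whose method count exceeds
# a threshold. Illustrative only; real debt tools use their own rules.
import ast

GOD_CLASS_METHOD_THRESHOLD = 20  # hypothetical cutoff; tools vary

def find_god_classes(source: str, threshold: int = GOD_CLASS_METHOD_THRESHOLD):
    """Return (class name, method count) for classes over the threshold."""
    tree = ast.parse(source)
    smells = []
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            methods = [n for n in node.body if isinstance(n, ast.FunctionDef)]
            if len(methods) > threshold:
                smells.append((node.name, len(methods)))
    return smells

# A synthetic class with 25 methods trips the rule; a small class does not.
big = "class Big:\n" + "".join(f"    def m{i}(self): pass\n" for i in range(25))
print(find_god_classes(big))  # → [('Big', 25)]
```

Note how brittle even this rule is: a data-transfer class with many trivial accessors would be flagged too, which is exactly the false-positive problem described above.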
When purchasing storage, there are two main areas of risk: financial and technological. To mitigate financial risk, service providers should ask the vendor about its capacity management and scale model; purchasing too much capacity up front, for example, can threaten a provider’s profitability, so it is critical that the vendor allows capacity to scale up and down as needed. To reduce technological risk, service providers should consider whether the vendor forces migrations and the redevelopment of automation, orchestration and integration when moving from one version to another.
In the next few years, expect to see science fiction become retail fact, as augmented reality enhances trying on and buying everything from clothes, cars and furniture to books, movies, and video games. Expect concerns over privacy (though important) to be offset by the convenience of highly personalized services and customized information. IKEA lets you paint, style and place virtual furniture anywhere you set down its product catalogue, viewed through your smartphone or tablet. Lego lets you see and rotate a fully constructed and animated Lego set on top of the box, at a kiosk or through your device.
The initial design of a device can take months, along with the time needed to create working prototypes. Hunting for the best manufacturing partners can be challenging, and locating the best materials—at the best price point—is key to production success. Straightforward design and development costs can start in the hundreds of thousands of dollars. The materials available for the creation of wearable devices, from sharp leather bands to precision-cut stainless steel, form an area ripe for misunderstandings. “You might see some of these materials on an Apple Watch, but remember that Apple is getting a volume discount and leveraging its supply chain,” Patel said. “Startups obviously don’t have that advantage, so it’s going to cost more.”
The main thing to know is that the chip in the card communicates with the network behind the terminal to enhance security, instead of just forwarding your card number and related data to the network as the magnetic stripe approach does. ... The chip can communicate a unique encrypted token (an alias) to the network instead of your actual credit card number. That way, neither the network nor even the store knows your card number. When the token reaches your bank, it is decrypted so the bank can verify your account and then authorize payment. All of this happens in a matter of seconds. As to whether the security is necessary, the answer is again yes – especially for banks, though not necessarily for card users.
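The flow described above can be sketched in a few lines. This is a deliberately simplified, hypothetical model – the class, the token scheme, and the one-time-use policy are illustrative assumptions, not the actual EMV or payment-network protocol – but it shows the key property: the merchant and network only ever handle an opaque token, and only the issuing bank can map it back to the real card number (the PAN).

```python
# Hypothetical sketch of payment tokenization (not the real EMV protocol):
# the terminal routes an opaque token; only the bank's vault recovers the PAN.
import secrets

class BankTokenVault:
    """Issuer-side vault mapping one-time tokens to real card numbers."""
    def __init__(self):
        self._vault = {}

    def issue_token(self, pan: str) -> str:
        token = secrets.token_hex(8)   # fresh random alias per transaction
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._vault.pop(token)  # one-time use: token is consumed

vault = BankTokenVault()
token = vault.issue_token("4111111111111111")
print(token != "4111111111111111")  # terminal never sees the PAN
print(vault.detokenize(token))      # bank recovers the PAN to authorize
```

Making the token single-use (the `pop` above) is what defeats replay: a token skimmed at the store is worthless for a second purchase, which is the advantage over the static magnetic-stripe data.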
Serving as a replacement for MapReduce, Dataflow was designed to analyze pipelines over arbitrarily large datasets, crunching information in either streaming or batch mode. After pushing Dataflow out as an alpha release, Google later added an open-sourced SDK for Java to make it easier for developers to integrate with Google's managed service and to port Dataflow to other development languages and environments. Dataflow finally made its way into beta by this April as the ... As for Cloud Pub/Sub, designed for integrating apps and services and then analyzing their data streams in real time, Google Cloud product managers touted in a blog post on Wednesday that this release follows a "decade of internal innovation."
Quote for the day:
“Only by binding together as a single force will we remain strong and unconquerable.” -- Chris Bradford