Showing posts with label mobility. Show all posts

Daily Tech Digest - August 21, 2019

Handing Over the (Digital) Keys: Should You Trust a Smart Lock?


Inherent security flaws that lead to hacks aren’t the only avenue third parties can use to eye your data. Sometimes, it hits a little closer to home. If you have access to the app that controls a smart lock, you can probably see when someone leaves and enters for the day, which can be helpful for knowing your significant other made it home safely. But it could also inform someone of your whereabouts. Technically, if you don’t own the lock, the owner might be able to see your information, too. “If a lock is connected to the internet, then there is always the danger that it could be hacked,” Ray Walsh, digital privacy expert for ProPrivacy.com, said in an email to Reviews.com. “Of course, an internet-connected smart lock may be able to feed its owner additional information – such as an alert when someone unlocks it. This data certainly has its merits, but may only be so useful in the end,” Walsh said. For example, although the privacy policy has since changed, Gizmodo found in an archived version of the policy from May 8th that smart lock company Latch stated GPS information could be stored and shared with owners and any subsequent owners.


Don’t get woken up for something a computer can do for you; computers will do it better anyway. The best thing to come our way in terms of automation is all the cloud tooling and approaches we now have. Whether you love serverless or containers, both give you a scale of automation that we previously would have had to hand-roll. Kubernetes monitors the health checks of your services and restarts them on demand; it will also move your services when "compute" becomes unavailable. Serverless will retry requests and hook seamlessly into your cloud provider’s alerting system. These platforms have come a long way, but they are still only as good as the applications we write. We need to code with an understanding of how they will be run, and how they can be automatically recovered. ... There are also techniques for dealing with situations when an outage is greater than one service, or when the scale of the outage is not yet known. One such technique is to run your platform in more than one region, so if you see issues in one region, you can fail over to another.
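The retry-and-regional-failover pattern described above can be sketched in a few lines. The region names and the `call_service` stub below are purely illustrative stand-ins, not any real cloud provider API:

```python
# A minimal sketch of retry-plus-regional-failover; REGIONS and
# call_service are illustrative, not a real cloud API.
REGIONS = ["us-east", "eu-west"]

def call_service(region, healthy_regions):
    """Simulated request: succeeds only if the region is healthy."""
    if region not in healthy_regions:
        raise ConnectionError(f"{region} unavailable")
    return f"ok from {region}"

def request_with_failover(healthy_regions, retries=3):
    """Retry in the primary region, then fail over to the next one."""
    for region in REGIONS:
        for _ in range(retries):
            try:
                return call_service(region, healthy_regions)
            except ConnectionError:
                continue  # a real client would back off before retrying
    raise RuntimeError("all regions down")

# Primary region down: traffic fails over to eu-west.
print(request_with_failover({"eu-west"}))  # ok from eu-west
```

Platforms like Kubernetes and serverless runtimes handle the retry loop for you; the point of the sketch is that the application still has to be written so a retried or relocated request remains safe to repeat.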


In July, Reuters reported that as part of an effort to combat money laundering, Japan’s government is “leading a global push” to set up for cryptocurrency exchanges a system like SWIFT, the international messaging protocol that banks use for bank-to-bank payments. Last week, a report from Nikkei suggested that 15 governments are planning to create a system for collecting and sharing personal data on cryptocurrency users.  But several people familiar with the FATF-led international discussions around cryptocurrency regulation told MIT Technology Review that these reports don’t have it quite right. There doesn’t appear to be a government-led global cryptocurrency surveillance system in the works—at least not yet. And it’s likely that whatever does eventually emerge won’t look much like SWIFT. Exchanges are still early in the process of figuring out what systems and technologies to use to securely handle sensitive data, Spiro says, and how to do it in a way that complies with a range of local privacy rules. “There are a lot of balls in the air,” he says.


Security concerns blocking UK digital transformation


“Protection and prevention are still paramount yet, to stay ahead of these evolving trends, organisations need to start thinking differently about cyber security. Business leaders need to make the leap from seeing cyber security as only a protective measure, to it also being a strategic value driver,” he said. The report also shows that across many organisations, chief information officers (CIOs) and wider board member views around cyber security are not yet aligned. Business leaders such as the CEO, CFO and COO tend to be less confident about their organisation’s cyber security than those with direct responsibility for IT and technology such as the CIO and chief information security officer (CISO). In addition, technology leaders are more likely to believe it is important for competitive advantage to have a cyber-secure brand (82%), compared with only 68% of business leaders.


Use of Facial Recognition Stirs Controversy

Over the past several years, the use of facial recognition - along with other technologies such as machine learning, artificial intelligence and big data - has stoked global invasion of privacy fears. In the U.S., the American Civil Liberties Union has taken aim at Amazon's Rekognition product, which uses a number of technologies to enable its users to rapidly run searches against facial databases. The ACLU's Nicole Ozer last year called for guarding against supercharged surveillance before it's used to track protesters, target immigrants and spy on entire neighborhoods. More recently, city officials in San Francisco and Oakland have banned police from using facial recognition technology. The debate over facial recognition technology has also been addressed by several U.S. presidential candidates. On Monday, Democratic hopeful Bernie Sanders became the first presidential candidate to call for a ban on the use of facial recognition by law enforcement. This is one part of a larger criminal justice reform package that the Vermont senator's campaign calls "Justice and Safety for All."


Extreme Programming in Agile – A Practical Guide for Project Managers

The XP lifecycle can be explained in terms of the Weekly Cycle and the Quarterly Cycle. To begin with, the customer defines the set of stories. The team estimates the size of each story, which, along with the relative benefit as estimated by the customer, indicates the relative value used to prioritize the stories. If some stories cannot be estimated by the team because of unclear technical considerations, they can introduce a spike. Spikes are short, time-boxed research efforts that may occur before regular iterations start or alongside ongoing iterations. Next comes the release plan: the release plan covers the stories that will be delivered in a particular quarter or release. At this point, the weekly cycles begin. Each weekly cycle starts with the team and the customer meeting to decide the set of stories to be realized that week. Those stories are then broken into tasks to be completed within the week. The week ends with a review of the progress to date between the team and the customer, leading to a decision on whether the project should continue or whether sufficient value has already been delivered.
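The prioritization step above (ranking stories by customer-estimated benefit against team-estimated size) can be illustrated with a toy sketch; the story names and numbers are invented for the example:

```python
# Hypothetical stories: (name, customer-estimated benefit, team-estimated size).
stories = [
    ("login page", 8, 2),
    ("report export", 5, 5),
    ("search filter", 9, 3),
]

# Relative value = benefit / size; highest value first, as in XP release planning.
ranked = sorted(stories, key=lambda s: s[1] / s[2], reverse=True)
print([name for name, _benefit, _size in ranked])
```

Real XP teams would of course negotiate these numbers in the planning meeting rather than compute them mechanically; the ratio simply makes the trade-off between benefit and cost explicit.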


Breakthroughs bring a quantum Internet closer

The TUM quantum-electronics breakthrough is just one of several announced in the last few weeks. Scientists at Osaka University say they’ve figured out a way to get information encoded in a laser beam to translate to a spin state of an electron in a quantum dot. They explain, in their release, that this addresses the problem that entangled states can be extremely fragile, in other words, petering out before lasting the required length of transmission. Roughly, their invention allows electron spins in distant, terminus computers to interact better with the quantum-data-carrying light signals. “The achievement represents a major step towards a ‘quantum internet,’” the university says. “There are those who think all computers, and other electronics, will eventually be run on light and forms of photons, and that we will see a shift to all-light,” I wrote earlier this year. That movement is not slowing. Unrelated to the aforementioned quantum-based light developments, we’re also seeing a light-based thrust that could be used in regular electronics too. Engineers may soon be designing with small photon diodes that would allow light to flow in one direction only, says Stanford University in a press release.


Automated machine learning or AutoML explained

Automated machine learning, or AutoML, aims to reduce or eliminate the need for skilled data scientists to build machine learning and deep learning models. Instead, an AutoML system allows you to provide the labeled training data as input and receive an optimized model as output. There are several ways of going about this. One approach is for the software to simply train every kind of model on the data and pick the one that works best. A refinement of this would be for it to build one or more ensemble models that combine the other models, which sometimes (but not always) gives better results. A second technique is to optimize the hyperparameters of the best model or models to train an even better model. Feature engineering is a valuable addition to any model training. One way of de-skilling deep learning is to use transfer learning, essentially customizing a well-trained general model for specific data. Transfer learning is sometimes called custom machine learning, and sometimes called AutoML (mostly by Google). Rather than starting from scratch when training models from your data, Google Cloud AutoML implements automatic deep transfer learning and neural architecture search for language pair translation, natural language classification, and image classification.
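The first approach described, training several candidate models and keeping the one that scores best, can be shown in miniature. The toy dataset and the three "models" below are illustrative stand-ins for the real model families an AutoML system would sweep over:

```python
# Toy 1-D dataset: (x, y) pairs with a roughly linear trend.
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]

# Candidate "models": each maps x to a prediction. Real AutoML would
# train families like trees, linear models, and neural nets instead.
models = {
    "mean": lambda x: sum(y for _, y in data) / len(data),
    "double": lambda x: 2 * x,
    "triple": lambda x: 3 * x,
}

def mse(predict):
    """Mean squared error of a candidate on the training data."""
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

# The train-and-compare loop: score every candidate, keep the best.
best = min(models, key=lambda name: mse(models[name]))
print(best)  # double
```

Hyperparameter search and ensembling extend the same idea: instead of a handful of fixed candidates, the system generates many configurations and combinations, then keeps whatever validates best.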


Considerations for choosing enterprise mobility tools


One option is to use an open source enterprise mobility management (EMM) platform. If the organization is willing to invest in the resources, open source EMM offers the flexibility to customize and extend the source code to match specific needs. IT pros should be aware of challenges that can come with maintaining their own open source EMM, such as hidden costs of deployment and lack of support. A few options for open source EMM include WSO2 Enterprise Mobility Manager or Teclib's Flyve MDM. WSO2's offering includes enterprise mobility tools such as mobile application management and mobile identity management. It also includes open source support for IoT devices, such as enrollment and application management, through IoT Server. Organizations looking for more established enterprise mobility tools can look to UEM platforms including Citrix Workspace, VMware Workspace One, IBM MaaS360, BlackBerry Unified Endpoint Manager, MobileIron UEM or Microsoft Enterprise Mobility + Security, which includes Intune.


The Future Enterprise Architect


Archie II understands the needs of decision makers throughout the organization, including the need to provide timely, if not on-demand, decision support based on solid information and analysis. Archie II also understands that he must not only support the decision-making processes in the organization but also enable those decisions by providing guidance. Archie II is proactive and is often ready with answers before the questions arrive. Archie II uses or adapts existing architectures, and/or creates new architectural patterns and models, to support the analysis he performs in order to make the recommendations needed as value chains or value streams progress. Archie II collects just enough information, resulting in just enough architecture, to support the decisions at hand and match the cadence of the business. Yet Archie II is continuously listening, evolving, and analyzing his models of the enterprise as new information becomes available. He proactively connects with those necessary, when necessary. His calls are always returned, as he has the reputation of “when Archie II speaks, we need to listen!”



Quote for the day:


"Your greatest area of leadership often comes out of your greatest area of pain and weakness." -- Wayde Goodall


Daily Tech Digest - June 09, 2018


The old school adopters who firmly believed that all tokens and offerings should be based on a “game-changing” software-based utopian utility platform creating borderless financial technologies and instruments are now bearing witness to more than 50% of those projects being in ruins or reported on as blatant money grabs. It’s only funny money, right? Until you lose it. Slick websites, vaporware prototypes, and teams that either had no proven track record or were often fake promised great returns, but without an understandable or executable plan for monetization. Internet deja vu all over again, a la 2000, anyone? And just like in 2000, the industry is beginning to shift to real business, with real business plans and real, proven, profitable business models backed by verifiable hard assets, offered as Security Token Offerings (STOs). For the market to continue to mature into an alternative digital economic model for the future, this old school “utility only” way of thinking will need to evolve to be more accepting of asset-backed tokens and offerings. With regulations on the rise and token sales from 2017 literally in shambles, both existing and new investors entering the crypto space are becoming more risk-averse.



The State of Enterprise Mobility in 2018: Five Key Trends


Samsung recently commissioned Oxford Economics to conduct an in-depth study into the state of enterprise mobility in 2018, focusing in particular on the differences between organizations that have adopted Bring Your Own Device (BYOD) policies and those that have chosen to provide mobile devices to their workforce. The research, based on a survey of 500 senior IT and business leaders and in-depth interviews, also quantifies the total investment in mobile enablement for the enterprise, highlighting areas for cost efficiency and the strategies that deliver the greatest return on investment. The study turned up some thought-provoking findings, and I’d encourage business leaders involved in setting mobile policies to download the full report or register for the webinar with Oxford Economics to learn more. ... Unsurprisingly, most companies that have opted for a BYOD approach have done so because of the perceived cost savings. These savings can be significant when employees pay for their own mobile service plans, but the survey revealed that increasingly enterprises are providing employees a hefty stipend to compensate for personal mobile usage. In many cases, this stipend wipes out the savings achieved.


The Advent of a New Synergy: the Blockchain & Cloud

A blockchain-based distributed cloud will guarantee trust, security and transparency, speed up processes, and keep accurate records that can be accessed by the relevant stakeholders via the cloud. This will promote demand for cloud-based services in industries such as real estate, healthcare and banking, among others. A blockchain-based cloud can help real estate agents and homeowners store property information in a central place, so that anyone interested in buying or selling property can easily access it. This will cut hours of phone calls and paper pushing, prevent fraud, and eliminate middlemen. It can also help address medical data integrity and security in terms of patient information, reduction of errors and fraud, and promote transparency. It can have a significant impact in tracking and protecting personal healthcare information. It can also address the privacy issue of medical billing logs from a financial angle. Thus, it will not only protect clinical data but also reveal transaction costs, making it harder for healthcare institutions and insurance companies to commit fraud or make errors.


Understanding the Varieties of .NET


.NET Standard 2.0, the latest version, has very broad API coverage, but numerous APIs are still missing. It pretty much covers .NET Core but leaves out a fair amount of .NET Framework. Of course, there’s nothing keeping you from targeting those missing APIs, but then you’re targeting .NET Framework, not .NET Standard, and you’re locked into it until you get rid of those API calls. If you’re writing a new set of libraries, I would recommend trying to target .NET Standard. If you do, your libraries will run on .NET Framework, .NET Core, or Xamarin with no additional effort. You will of course have to create apps targeted at the specific .NET variants, but if you keep the apps small, confining them to the GUI-based classes that aren’t supported in .NET Standard, and put most of the functionality in the shared libraries, then you should get the benefits of cross-platform support for as much of your code as possible. Migrating existing .NET Framework code, again, may be more involved due to the lack of certain APIs, but even there you may be able to move as much code as possible into .NET Standard libraries and keep the platform-specific code isolated.



Why Does Artificial Intelligence Scare Us So Much?

Negative feelings about AI can generally be divided into two categories: the idea that AI will become conscious and seek to destroy us, and the notion that immoral people will use AI for evil purposes, Kilian Weinberger, an associate professor in the Department of Computer Science at Cornell University, told Live Science. "One thing that people are afraid of, is that if super-intelligent AI — more intelligent than us — becomes conscious, it could treat us like lower beings, like we treat monkeys," he said. "That would certainly be undesirable." However, fears that AI will develop awareness and overthrow humanity are grounded in misconceptions of what AI is, Weinberger noted. AI operates under very specific limitations defined by the algorithms that dictate its behavior. Some types of problems map well to AI's skill sets, making certain tasks relatively easy for AI to complete. "But most things do not map to that, and they're not applicable," he said. This means that, while AI might be capable of impressive feats within carefully delineated boundaries — playing a master-level chess game or rapidly identifying objects in images, for example — that's where its abilities end. 


Traditional ERP Falls into the Arms of Cloud

"Enterprises have been embarking on a journey of digital transformation for many years," Microsoft Vice President for Azure Girish Bablani wrote in a blog post listing the benefits of SAP's offerings on Microsoft's cloud platform Azure. "For many enterprises, this journey cannot start or gain momentum until core SAP Enterprise Resource Planning landscapes are transformed." Microsoft wants to be the cloud choice of those SAP customers. Bablani noted that SAP customers including Penti, Malaysia Airlines, Guyana Goldfields, Rio Tinto, Co-op, and Coats have all migrated to the cloud on Azure. For its part, Microsoft right now looks like the success story of an early technology company that has made the transition to the cloud and the new digital economy. Under a new cloud-minded CEO (who is not a company founder), Microsoft has made some progressive acquisitions, including the R analytics company Revolution Analytics, the careers social network LinkedIn, and now GitHub. It also operates a successful public cloud and has transitioned its popular Office software into a cloud service, Office 365.


Could blockchain have solved the mystery of the romaine lettuce E. coli outbreak?

Not long ago, Yiannas, who guards the integrity of food in Walmart’s $280-billion grocery empire, would have brushed off the notion of an instantly “knowable” and verifiable food chain as fantasy. He heard about it two years ago, when Walmart was about to open a food safety institute in China, where 10 years ago a baby formula adulteration scandal sickened 54,000 babies. “Up until that point I only knew that it was the technology behind bitcoin,” Yiannas said. “I will tell you I was a bit of a skeptic, just like many people are about the technology.” Blockchain, for all its cloak-and-dagger associations, is basically a democratized accounting system made possible by advances in data encryption. Rather than storing proprietary data behind traditional security walls, companies contribute encrypted blocks of data to a “distributed” ledger that can be monitored and verified by each farmer, packer, shipper, distributor, wholesaler and retailer of produce. No one can make a change without everyone knowing, and agreeing to it.
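The "no one can make a change without everyone knowing" property comes from hash-linking: each block commits to the hash of the previous one, so altering any record breaks every later link. A minimal sketch with invented supply-chain entries (real systems add signatures, consensus, and encryption on top):

```python
import hashlib
import json

def make_block(data, prev_hash):
    """A block commits to its data and to the previous block's hash."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Tampering with any block breaks its link to the next one."""
    return all(block["prev"] == prev["hash"]
               for prev, block in zip(chain, chain[1:]))

# Hypothetical produce supply chain: each party appends a block.
chain = [make_block("farm: lettuce harvested", "0" * 64)]
chain.append(make_block("packer: lot 42 boxed", chain[-1]["hash"]))
chain.append(make_block("retailer: received", chain[-1]["hash"]))

print(verify(chain))            # True
chain[0]["hash"] = "tampered"   # a quietly edited record...
print(verify(chain))            # False: the chain no longer verifies
```

This is why a distributed ledger can pinpoint a contaminated lot quickly: every handoff is a verifiable block, so the provenance of any item can be walked back link by link.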


The Future Computed: Artificial Intelligence and its role in society

Beyond our personal lives, AI will enable breakthrough advances in areas like healthcare, agriculture, education and transportation. It’s already happening in impressive ways. But as we’ve witnessed over the past 20 years, new technology also inevitably raises complex questions and broad societal concerns. As we look to a future powered by a partnership between computers and humans, it’s important that we address these challenges head on. How do we ensure that AI is designed and used responsibly? How do we establish ethical principles to protect people? How should we govern its use? And how will AI impact employment and jobs? To answer these tough questions, technologists will need to work closely with government, academia, business, civil society and other stakeholders. At Microsoft, we’ve identified six ethical principles – fairness, reliability and safety, privacy and security, inclusivity, transparency, and accountability – to guide the cross-disciplinary development and use of artificial intelligence. The better we understand these or similar issues — and the more technology developers and users can share best practices to address them — the better served the world will be as we contemplate societal rules to govern AI.


Forward Secrecy Configuration
Forward Secrecy’s day has come – for most. The cryptographic technique (sometimes called Perfect Forward Secrecy, or PFS) adds an additional layer of confidentiality to an encrypted session, ensuring that only the two endpoints can decrypt the traffic. With forward secrecy, even if a third party were to record an encrypted session and later gain access to the server’s private key, they could not use that key to decrypt a session protected by forward secrecy. Neat, huh? Forward secrecy thwarts large-scale passive surveillance (such as might be conducted by a snooping nation state or other well-resourced threat actor), so it is seen as a tool that helps preserve freedom of speech, privacy, and other rights of the citizenry. It is supported and preferred by every major browser, most mobile browsers and applications, and nearly 90% of TLS hosts on the internet, according to a recent TLS Telemetry report (PDF). The crypto community applauds forward secrecy’s broad acceptance today. While forward secrecy foils passive surveillance, it also complicates inspection for nearly every SSL security device currently in existence.
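The mechanism behind forward secrecy is an ephemeral key exchange such as Diffie-Hellman: each session derives its key from throwaway secrets that are never transmitted and are discarded afterwards, so a later theft of the server's long-term key reveals nothing about past sessions. A toy sketch, using deliberately tiny, insecure numbers for illustration (real TLS uses large standardized groups or elliptic curves):

```python
import secrets

# Toy Diffie-Hellman group parameters; far too small to be secure,
# chosen only so the arithmetic is easy to follow.
P, G = 23, 5

def ephemeral_keypair():
    """Fresh per-session secret. Discarding it after the session is
    what gives forward secrecy: recorded traffic can't be decrypted
    later, even with the server's long-term key."""
    priv = secrets.randbelow(P - 2) + 1   # secret exponent
    return priv, pow(G, priv, P)          # (private, public) pair

a_priv, a_pub = ephemeral_keypair()       # client side
b_priv, b_pub = ephemeral_keypair()       # server side

# Both sides derive the same session key; it never crosses the wire.
print(pow(b_pub, a_priv, P) == pow(a_pub, b_priv, P))  # True
```

An eavesdropper sees only the public values; recovering the session key from them is the discrete-logarithm problem, which is infeasible at real key sizes.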


Like any digital transformation technology, the biggest driver will come from the need to remain secure and competitive. Blockchain is not only about security, it's also about transparency and eliminating mediators, which translates to cost effectiveness and higher return on investment for both vendors and end users. Those service providers that ignore the technology — and overlook fortifying, enhancing, and accelerating their services with it — will be left behind. ... Tech giants such as SAP and IBM are testing the water, especially the latter with public- and finance-sector pilot projects. However, blockchain brings a huge opportunity to startups in the CEE region, where government support and advanced skills can offer fertile ground for things to really happen. DX is about fast progress and agility — the tech giants’ size and legacy are not an advantage here! ... Blockchain is often confused with the volatility and craziness of cryptocurrency, even among technology professionals and enthusiasts. This is related to the nature of trading and mining those currencies as a product, and not to blockchain as a technology. Development and adoption levels for blockchain vary across the CEE region, from initiatives and discussion ...




Quote for the day:


"If you don't demonstrate leadership character, your skills and your results will be discounted, if not dismissed." -- Mark Miller


Daily Tech Digest - May 16, 2018


In terms of the legal implications of AI, vicarious liability and agency cannot be applied to AI in the same way as they would for employee liability. Due to the black box nature of AI and the lack of transparency in its reasoning, it is difficult to attribute liability. The Fairchild principles of a 'material increase to risk' could be applied in future to determine liability, but without legislative clarification, the position is not entirely clear. Furthermore, AI can monitor price changes within a market and react very quickly, thereby potentially stifling competition by creating a form of collusion in the market. The European Commission is currently taking the threat of AI in competition seriously and exploring solutions to resolve these types of issues. From an intellectual property perspective, legislation has not been updated to cover the ownership of AI-generated intellectual property. Companies will need to ensure ownership of any materials or intellectual property created by AI vests or is transferred to them. In terms of ethics, the law cannot cover every moral scenario. AI is already creating unintended gender, race and socio-economic bias based on the data it works with.



Force multipliers in cybersecurity: Augmenting your security workforce

Organizations are employing security automation and orchestration technologies to make sure that the right person, with the right data, is there at the right time to make decisions, he said. In cybersecurity, it is important that the organization is clear about what actions must be taken after an incident occurs. Automation technologies can make changes right away to contain the issue, he added, but just relying on technology isn't enough to prepare for today's advanced threats. Organizations should also practice breach preparedness drills to test their response, he stressed. Implementing these security orchestration and automation practices also relies on strong leadership that develops a team atmosphere and teaches team members to work together during a crisis, he said. It will be important to exhibit these strong cultural traits during a breach, especially because cybersecurity playbooks can crack under pressure, he added. "People want to practice what it's like to go through a breach," he said. "Security orchestration gives you the technology to respond fast and encourages you to practice it so that [when things go wrong] you're ready."


What is predictive analytics? Transforming data into future insights

Organizations use predictive analytics to sift through current and historical data to detect trends and forecast events and conditions that should occur at a specific time, based on supplied parameters. With predictive analytics, organizations can find and exploit patterns contained within data in order to detect risks and opportunities. Models can be designed, for instance, to discover relationships between various behavior factors. Such models enable the assessment of either the promise or risk presented by a particular set of conditions, guiding informed decision-making across various categories of supply chain and procurement events. ... While getting started in predictive analytics isn't exactly a snap, it's a task that virtually any business can handle as long as one remains committed to the approach and is willing to invest the time and funds necessary to get the project moving. Beginning with a limited-scale pilot project in a critical business area is an excellent way to cap start-up costs while minimizing the time before financial rewards begin rolling in. Once a model is put into action, it generally requires little upkeep as it continues to grind out actionable insights for many years.
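At its simplest, a predictive model is a trend fitted to historical data and extrapolated forward to forecast a future value. A minimal sketch with an invented monthly sales history (real projects would use far richer features and tooling):

```python
# Hypothetical monthly sales history: (month index, units sold).
history = [(1, 100), (2, 110), (3, 120), (4, 130)]

# Ordinary least-squares fit of y = a + b*x, the simplest predictive model.
n = len(history)
sx = sum(x for x, _ in history)
sy = sum(y for _, y in history)
sxx = sum(x * x for x, _ in history)
sxy = sum(x * y for x, y in history)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope: trend per month
a = (sy - b * sx) / n                           # intercept

# Forecast month 5 from the detected trend.
print(a + b * 5)  # 140.0
```

The "exploit patterns to assess risk and opportunity" step in practice means fitting models like this (and much richer ones) to many behavior factors at once, then acting on the forecasts before the events occur.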


Successful IoT deployment: The Rolls-Royce approach

"The IoT is useful when you know you can derive business benefit by making unknown processes visible," she says. "If you try and use sensors everywhere, you will get nowhere because it's too expensive and it's too imprecise. Rolls-Royce picks the places where its IoT solutions can make data visible, and which will create significant operational benefits. That, for me, is the key to a successful IoT deployment." Gorski advises other digital chiefs to analyse their business operations and understand where a lack of data transparency creates a headache. She has seen big-bang instrumentation projects happen and, for the most part, these are difficult to justify. "They end up being expensive to implement," says Gorski. "It's costly to transmit data and the business ends up with a patchwork quilt of information. It's important to remember there isn't a single solution for IoT instrumentation and you must bootstrap technology together from lots of different suppliers. All that bootstrapping adds costs and creates complexity."


The NHS is failing to deliver 'basic IT', says Matthew Swindells


“We are investing millions of pounds in technology, yet we’ve got six organisations that still can’t tell us what their waiting lists are. It’s not acceptable,” he said. Barts Health NHS Trust, for instance, hasn’t submitted a referral to treatment report to NHS England for nearly four years. “We walk around most hospitals and we’ve not known how many beds we have and how many patients are lying in them,” said Swindells. “We need to at the very least get the data that we capture back out. If we can’t do the basics, me going cap in hand to the treasury for another £10bn to sort IT out just sounds like fool’s money.”  He highlighted e-rostering as another example of failing to use data properly, saying most hospitals use an e-rostering system, which he described as a “glorified spreadsheet” and “expensive pieces of technology that are not enabling better rostering: not enabling the matching of staffing to clinical need, not enabling staff to be flexible about when they work and therefore making more available”.  “We have to make this stuff work well,” said Swindells.


Here’s what the big four U.S. mobile ISPs are doing with IoT

The playing field might not remain in its current state for long, with the main issue being the proposed $26.5 billion merger between T-Mobile and Sprint. Partridge said that would be a game-changer for carrier-based IoT in the U.S. “In the consumer business, T-Mobile’s going to be in charge of that, they’ve been wildly successful – but I think in IoT, Sprint will have every opportunity to take the lead,” he said. The idea, after the combination, would be to make acquisitions aimed at strengthening the new company’s position on the enterprise side of service provisioning in general, and focused on IoT particularly, though there are a number of tactical options for pursuing such a strategy. The new company could get into fleet management, a la Verizon and AT&T, snap up IoT software companies and package their offerings into new branded services, move heavily into surveillance and security, or even hardware. “The playbook is fairly open in terms of that, but the goal is to get away from connectivity-only value, because that’s not the place to be,” according to Partridge.


GDPR: Less Than One Month Out, the Top 3 Struggles

Instead of a collective sigh, May 25 might create more of a collective grunt. Most privacy professionals know that although a lot of work has been done in the run-up to D-day, GDPR compliance will require a constant focus. It is a journey, not a final destination. Those organizations that treat May 25 as the endpoint of their compliance drive, will be proven wrong. Another distinction between organizations will be their levels of ambition. Some organizations will look at GDPR as a mere checklist approach, I call it the "lawyer" approach (with all due respect to the lawyers amongst you, including myself). Legal compliance is core, but an organization's ambition should aim to go beyond and create a true cultural change. I truly believe that these privacy leaders ultimately will be rewarded in the market, banking on what I call a "trust dividend", reaping the benefits of constant investments in this space. Even though there is a broad spectrum amongst organizations around GDPR compliance, there are also some common themes and questions. In my role as CA Technologies Chief Privacy Strategist, I have had the opportunity to discuss GDPR with organizations, both public and private. 


Location-based services move beyond mobile and into enterprise apps

The battle for LBS relevance is moving from companies that only supply increasingly commoditized location data, which they license (e.g., mapping data for GPS), to those that can offer enhanced and supplemental services. Previously seen as an old-style GPS/mapping data company, the largest LBS company, HERE, is moving away from the old model, although not totally. It is changing from being just a database to being a value-added supplier of a full range of LBS with its Open Location Platform. HERE has partnerships with auto companies (Audi, BMW) and others (Intel, Oracle, Amazon Web Services, Microsoft) to add platform capabilities beyond its extensive mapping database. Those capabilities include value-added services such as tracking, traffic, safety services, and HD maps. HERE's main cloud-based LBS platform competitor, Mapbox, offers similar services but does not include its own mapping database, instead allowing clients to link to their preferred mapping data. HERE and Mapbox have some distinct strategy differences: Mapbox relies on others' data sets and can connect as needed and by user preference, while HERE has its own data sets and is looking to add value on top.


Threat analytics: Keeping companies ahead of emerging application threats

Applications that can be downloaded are particularly vulnerable to cyber criminals, as they can be isolated from the network and attacked indefinitely until their defences are broken. Because so many people use their personal mobile devices for work purposes, a compromised app will not only attack the individual or the business that published it but could also grant attackers access to enterprise networks. Any application on an app store can be downloaded by anyone, and that includes bad actors. If an app lacks protection, once downloaded a bad actor can reverse engineer it, leaving it vulnerable to wide-scale tampering, IP/PII theft, or API attack. With the code left so exposed, the threat is very likely to turn into a widespread attack resulting in lost customers, brand damage, lost revenue, and lost jobs. With a threat analytics solution in place from the start, by contrast, apps can provide valuable insights to the business the moment they are downloaded from an app store, thereby closing the loop.
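The tamper detection alluded to above can be approximated at its simplest as an integrity check: compare a release artifact's hash against the digest recorded when it was published. A minimal sketch (file paths and digests here are hypothetical, and real mobile app-shielding products do far more than this):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_tampered(path: str, expected_digest: str) -> bool:
    """True if the artifact no longer matches the digest recorded at release."""
    return sha256_of(path) != expected_digest
```

In practice a check like this only helps if the expected digest is stored somewhere an attacker cannot also modify, which is why threat analytics platforms report back to a server rather than trusting the device.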


Optimizing an artificial intelligence architecture: The race is on


Today, most AI workloads use a preconfigured database optimized for a specific hardware architecture. The market is moving toward software-enabled hardware that will allow organizations to intelligently allocate processing across GPUs and CPUs depending on the task at hand, said Chad Meley, vice president of analytic products and solutions at Teradata. Part of the challenge is that enterprises use multiple compute engines to access multiple storage options. Large enterprises tend to store frequently accessed, high-value data such as customer, financial, supply chain, and product data in high-performing, high-I/O environments, while less frequently accessed big data sets such as sensor readings, web data, and rich media are stored in cheaper cloud object storage. One of the goals of composable computing is to use containerization to spin up compute instances such as SQL engines, graph engines, machine learning engines, and deep learning engines that can access data spread across these different storage options.



Quote for the day:


"The task of leadership is not to put greatness into humanity, but to elicit it, for the greatness is already there." -- John Buchan


Daily Tech Digest - March 16, 2018

The future of IT: Snapshot of a modern multi-cloud data center

Multi-cloud Data Centers Are Emerging as a Hedge Against the Major Commercial Clouds
The idea of cloud computing remains simplicity itself, which is a key element of its appeal: move the cost and complexity of procuring, provisioning, operating, and supporting an endless array of hardware, software, and enabling services for your business out to a third party, which does it all for you, yet more securely and with much greater economies of scale. Writ large across virtually all industries, a comprehensive shift to the cloud thus continues to be a top objective of CIOs in many organizations this year. The objective persists despite misgivings that we're really just going back to the monolithic IT vendor world again. Not surprisingly, enabling such a strategic move is also the top business goal of the leading commercial cloud vendors, namely Amazon, Microsoft, and Google, who continue to vie vigorously for market share, technical leadership, and -- some would say -- the most interesting and valuable part of the market itself ... Hosting companies like Rackspace used to be able to provide a hedge that IT departments could use for such purposes, through services like co-location. However, most such providers have not been able to keep up with the overall capacity race or compete in the bruising cost-efficiency battles that the top cloud providers can afford to wage.



The Containerization of Artificial Intelligence

While AI is more hype than reality today, machine intelligence — also referred to as predictive machine learning — driven by a meta-analysis of large data sets that uses correlations and statistics, provides practical measures to reduce the need for human interference in policy decision-making. A typical by-product of such applications is the creation of behavior models that can be shared across policy stores for baselining or policy modification. ... Adoption of AI can be disruptive to organizational processes and must sometimes be weighed in the context of dismantling analytics and rule-based models. The application of AI must be built on the principle of shared security responsibility: under this model, both technologists and organizational leaders accept joint responsibility for securing data and corporate assets, because security is no longer strictly the domain of specialists and affects both operational and business fundamentals.


AI & Blockchain: 3 Major Benefits Of Combining These Two Mega-Trends


AI, as the term is most often used today, is, simply put, the theory and practice of building machines capable of performing tasks that seem to require intelligence. Currently, cutting-edge technologies striving to make this a reality include machine learning, artificial neural networks, and deep learning. Meanwhile, blockchain is essentially a new filing system for digital information, which stores data in an encrypted, distributed ledger format. Because data is encrypted and distributed across many different computers, it enables the creation of tamper-proof, highly robust databases that can be read and updated only by those with permission. Although much has been written from an academic perspective on the potential of combining these ground-breaking technologies, real-world applications are sparse at the moment. However, I expect this situation to change in the near future. So here are three ways in which AI and blockchain are made for each other.
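The "tamper-proof" property mentioned above comes from each block committing to the hash of its predecessor, so any retroactive edit breaks the chain. A toy sketch, purely illustrative and far simpler than any production blockchain (no consensus, no encryption, no network):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Verify every block still points at the true hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )
```

Editing any historical block changes its hash, so every later block's `prev_hash` stops matching and `is_valid` fails, which is the tamper-evidence the excerpt describes.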


Cyber criminals using complex financial system, study shows


The findings on cyber criminal money-laundering and cashing-out methods are part of a study into the macroeconomics of cyber crime and how the various elements link together, led by Michael McGuire, senior lecturer in criminology at Surrey University. “This is the first time the financial flows of cyber criminals have been put together into a composite picture,” said McGuire, who will present the full findings of the nine-month Web of Profit study at the RSA Conference in San Francisco from 17-19 April. “Law enforcement and cyber security professionals can use the study to understand how revenue generation is feeding into laundering, and how laundering is feeding into more traditional methods of money-laundering and the way cyber criminals are spending their money, so that they look at the intersections between the various networks more carefully,” he told Computer Weekly.


To OSPF Or Not? Which Routing Protocol To Use

OSPF with a multipoint MAN is a classic DR/BDR LAN situation, reducing the amount of peer-to-peer flooding. I haven’t run into this at large scale in a design setting yet. Would having such a MAN provide a pretty good reason to run OSPF overall? How would one damp instability in such a network? Large failure domain? What number of peers is “too big” for a full mesh MAN? The other problem I’m still mulling over is the OSPF WAN to dual datacenters design. In one case, a customer was running more than 250 VLANs (one per area) over DWDM, and more recently over OTV between datacenters, with more than 4000 GRE over IPsec tunnels. Dual hub DMVPN and BGP route reflectors look very attractive compared to that. “Totally stubby EIGRP” — hubs that advertise only 0/0 or corporate default to remote sites — could also work well. By the way, if you are using EIGRP, note Cisco’s clever recent stub-site feature, which was probably built to simplify IWAN.


5 Applications Of Smart Contracts

Due to a lack of automated administration, it can take months for an insurance claim to be processed and paid. This is as problematic for insurance companies as it is for their customers, leading to admin costs, gluts, and inefficiency. Smart contracts can simplify and streamline the process by automatically triggering a claim when certain events occur. For example, if you lived in an area that was hit by a natural disaster and your house sustained damage, the smart contract would recognise this and begin the claim. Specific details (such as the extent of damage) could be recorded on the blockchain in order to determine the exact amount of compensation. ... The terms of a mortgage agreement, for example, are based on an assessment of the mortgagee’s income, outgoings, credit score and other circumstances. The need to carry out these checks, often through third parties, can make the process lengthy and complicated for both the lender and the mortgagee. Cut out the middle men, however, and parties could deal directly with each other (as well as access all the relevant details in one location).
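The automatic-claim logic described above can be illustrated with a deliberately simple, off-chain Python sketch. This is not smart-contract code: a real contract would execute on a blockchain platform and receive damage data from a trusted oracle, and every name and figure below is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Policy:
    holder: str
    payout_per_unit: float  # compensation per recorded unit of damage
    max_payout: float       # cap written into the contract terms

def settle_claim(policy: Policy, damage_units: float) -> float:
    """Compute the payout the moment a damage event is recorded.

    On a real platform this rule would run on-chain, with damage_units
    supplied by an external data feed (e.g. a disaster-severity oracle),
    so neither party can delay or dispute the trigger.
    """
    if damage_units <= 0:
        return 0.0
    return min(damage_units * policy.payout_per_unit, policy.max_payout)
```

The point of the sketch is that once the trigger condition and payout schedule are agreed up front, settlement requires no manual claims handling, which is where the months of administration disappear.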


China wants to shape the global future of artificial intelligence

“[The Chinese government] sees standardization not only as a way to provide competitiveness for their companies, but also as a way to go from being a follower to setting the pace,” says Jeffrey Ding, a student at Oxford University’s Future of Humanity Institute who studies China’s nascent AI industry, and who translated the report. The government’s plan cites the way US standards bodies have influenced the development of the internet, expressing a desire to avoid having the same thing happen with AI. China’s booming AI industry and massive government investment in the technology have raised fears in the US and elsewhere that the nation will overtake international rivals in a fundamentally important technology. In truth, it may be possible for both the US and the Chinese economies to benefit from AI. But there may be more rivalry when it comes to influencing the spread of the technology worldwide. “I think this is the first technology area where China has a real chance to set the rules of the game,” says Ding.


Open Source & Smart Mobility In The Transportation Industry


Open source projects in the big data space move their development and feature sets along quickly to harness the latest enhancements in technology, performance, and scalability. New best practices get baked into data platform solutions very quickly, and a huge community of data scientists, scripters, and programmers all works toward the same goal, making best-of-breed technology available to anyone. At the foundational level, innovation occurs so rapidly that it is unrealistic to expect a vendor to encapsulate all these new developments in anything but a proprietary solution layered on top. Selecting an open source platform for data projects removes any risk of vendor lock-in. When it comes to the data space, like most things, putting all your eggs in one basket is inadvisable. Much of the innovation that is occurring in the open source data space is directly attributable to the best and brightest minds’ aversion to being tied down to a single vendor, making a shared effort much more attractive.


Transforming Bank Compliance with Smart Technologies

Digitization, the final stage in the transformation process, has the potential to create a step change in compliance operations. The catalyst is the emergence of smart technologies, which offer significant performance improvements and the ability to mimic human capabilities such as learning, language use, and decision making. Smart technologies have multiple potential applications in the context of compliance, from support for relatively routine tasks in client onboarding to analysis of unstructured data sets—for example, in relation to money laundering. Across the board, these technologies offer a route to significant efficiency gains and can help employees work more effectively. The starting point in building a cutting-edge compliance framework is to establish a taxonomy that describes and classifies key areas of risk. Such a taxonomy is also a prerequisite for defining the scope of a target operating model. The six most relevant types of compliance risks relate to financial crime and conduct.


IBM sets up no-fee Data Science Elite team to speed up AI adoption


Big Blue is calling the latter “Cloud Private for Data”, based on an in-memory database. It adds up to a platform for doing data science, data engineering and app building. IBM said the aim is to “enable users to build and exploit event-driven applications capable of analysing data from things like IoT [internet of things] sensors, online commerce, mobile devices, and more”. ... IBM is also announcing a “Data Science Elite Team”, described as a “no-charge consultancy dedicated to solving clients’ real-world data science problems and assisting them in their journey to AI”. Patricia Maqetuka, chief data officer at South African bank Nedbank, has used the team. She said: “Nedbank has a long tradition of using analytics on internal, structured data. Thanks to IBM Analytics University Live, we were exposed to the guidance and counsel of IBM’s Elite team. This team helped us to unlock new paradigms for how we think about our analytics and change the way we look at use cases to unlock business value.”



Quote for the day:


"Don't waste words on people who deserve your silence. Sometimes the most powerful thing you can say is nothing at all." -- Joubert Botha


Daily Tech Digest - November 05, 2017

The end of the cloud is coming

An internet powered largely by client-server protocols (like HTTP) and security based on trust in a central authority (like TLS), is flawed and causes problems that are fundamentally either really hard or impossible to solve. It’s time to look for something better — a model where nobody else is storing your personal data, large media files are spread across the entire network, and the whole system is entirely peer-to-peer and serverless (and I don’t mean “serverless” in the cloud-hosted sense here, I mean literally no servers). ... Peer-to-peer web technologies are aiming to replace the building blocks of the web we know with protocols and strategies that solve most of the problems I’ve outlined above. Their goal is a completely distributed, permanent, redundant data storage, where each participating client in the network is storing copies of some of the data available in it.
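The redundant, distributed storage these peer-to-peer systems rely on is usually content-addressed: data is keyed by its own hash, so any peer holding a copy can serve it and corruption is detectable on retrieval. A toy illustration of the idea, not how any particular P2P protocol is actually implemented:

```python
import hashlib

class ContentStore:
    """A toy content-addressed store.

    Data is retrieved by the hash of its contents, so the same bytes get
    the same address on every peer, and a tampered chunk can be rejected
    because its hash no longer matches its key.
    """

    def __init__(self):
        self._chunks = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self._chunks[key] = data
        return key

    def get(self, key: str) -> bytes:
        data = self._chunks[key]
        if hashlib.sha256(data).hexdigest() != key:
            raise ValueError("chunk failed integrity check")
        return data
```

Because addresses are derived from content rather than from a server's location, it no longer matters which peer serves a chunk, which is what makes a serverless, redundant network possible.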

Europe’s businesses leaving workers behind in the technology skills race
Human-centred interactions between people and machines have profound implications for the design of products and services. No longer do consumers need to command machines through a graphical interface: voice interfaces such as Alexa, Siri, and Cortana have changed that. Next, the emphasis will shift from understanding meaning to interpreting intent. For example, in Toyota's Concept-i car, rather than the driver commanding its virtual AI assistant, Yui, to turn up the AC, Yui will be able to understand intent in statements like "I'm feeling a bit cold." It isn't necessary to look into the future to see this trend. Already, data-driven products are taking on board the emotional reactions of their users. For that reason, the best data-driven services don't exhaust the user with endless data-gathering questions: Apple Music asks new users to "Tell us what you're into" and presents a few bubbles containing genres to select.


Blockchain Aims to Foster Payer, Provider Trust for Value-Based Care

Value-based care has accelerated the need for seamless data sharing in an environment that is both transparent and unquestionably trustworthy – one that can bring payers and providers together to improve quality, reduce costs, and enhance the patient experience. While stakeholders have offered up plenty of potential solutions for creating a free-flowing data environment that can support the complex environment of pay-for-performance reimbursements, blockchain may be the methodology that ticks the most boxes with a relatively low amount of effort. At Hashed Health, an industry consortium dedicated to applying blockchain to real-world use cases, CEO John Bass believes that the distributed ledger approach offers a number of promising improvements to the way providers, payers, and patients collaborate in a value-based world.


Future of Digital Currency May Not Involve Blockchains

The problem with cryptocurrencies conceived before Bitcoin was their centralized structure. Without blockchain technology, there was no “decentralized, immutable, transparent” ledger in which transactions could be recorded, leading to centralization. Yet it looks like blockchain may not be the be-all, end-all of digital currency technologies. Recently, a new form of cryptocurrency has emerged that leverages the Directed Acyclic Graph (DAG) organizational model for the structure of its decentralized ledger, allowing old problems to be solved and new features to be added. Today, we’re going to take a look at the technology that could potentially replace the blockchain itself, along with some of its current implementations. Although the implementations we are going to discuss today are new, the concept is not.
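The core DAG idea is that each new transaction directly approves one or two earlier, not-yet-approved transactions ("tips") instead of waiting to be grouped into a block. A toy sketch of that attachment rule, not an implementation of any particular DAG currency (real systems add proof-of-work, weighted tip selection, and conflict resolution):

```python
import random

def attach_tx(dag: dict, tx_id: str) -> None:
    """Attach a transaction that approves up to two current tips.

    dag maps tx_id -> list of approved (parent) tx_ids. Tips are
    transactions that no later transaction has approved yet.
    """
    approved = {p for parents in dag.values() for p in parents}
    tips = [t for t in dag if t not in approved]
    dag[tx_id] = random.sample(tips, min(2, len(tips)))
```

Because every participant validates earlier transactions as a precondition of adding their own, the ledger grows without miners or blocks, which is the structural departure from blockchain the excerpt refers to.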


Do More With Machine Learning Thanks To These 6 Open Source Tools

One problem the industry is seeing, however, is a severe lack of developers and new talent. It’s a problem for the entire development and programming industry, not just machine learning. Many companies and brands are vying for new employees, leaving startups and newer names in a bit of hot water. Luckily, this can be offset by adopting open-source development practices. More importantly, you can open your projects — future and present — to an even broader development community and audience by making them open source. Open-source tools allow anyone to contribute to a project and work on fixes for bugs, new features, and new builds. You can curate separate versions, selecting the content and elements you want in an official release. This way, even though there’s a development community behind the project, you still retain a great deal of control over the central project path.


The road to artificial intelligence in mobility—smart moves required

This overall interest in what AI could enable in automotive and mobility technology leads to a considerable willingness to pay for those features. Of the consumers who indicated high interest in AD features (24 percent of those surveyed), 46 percent are willing to pay more than $4,000 for autonomous-driving features on their next car. And AD features are so important to consumers that 65 percent would switch OEMs for better AD functionality; that figure exceeds 90 percent for young consumers and those living in large cities. Expectations are high, though, and may need to be tempered. On average, consumers expect full autonomy to be widespread in about five years—a tight timeline for any player, and for regulators. Machine learning will have a significant impact on the automotive and mobility industry, since it will unlock entirely new products and value pools and not just lead to productivity improvements.


Blockchain’s explosive growth pushes job skills demand to No. 2 spot

It's not hard to imagine blockchain as a "disruptive skill" that is both fast-growing and hard to find, according to Burning Glass Technologies. While the technology and hiring patterns are in their early stages, it might be a good idea for employers to start figuring out where they will find blockchain talent, "even as they are still considering how the technology will change their business." "Because of its connection with 'cryptocurrencies,' blockchain is associated with finance, and major banks like Liberty Mutual, Capital One and Bank of America have posted openings," Burning Glass Technologies said in its blog. "There are also companies devoted to building blockchain applications, like ConsenSys. But the demand for blockchain is much broader, including major consulting firms like Accenture and Deloitte and technology companies like IBM and SAP. ..."


There's a Lot More to AI Than Just Chatbots

Options where the AI uses data to create a model but does not integrate with the DMP are okay and will deliver enhanced business results, but they will never be as powerful as a truly integrated system. Artificial intelligence perceives its environment and makes decisions that maximize its chance of success at a given goal. This could range from optimizing profit margin to maximizing stock efficiency. For example, a supermarket will want to ensure it always has enough salad in stock to supply its customers while minimizing wastage and unsold product. A good AI system can take that supermarket's typical sales into account, but it should also be linked to weather information, so that if there is a freak heatwave in October, the weather, and not just October's average salad sales, will be considered.
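The salad example can be made concrete with a deliberately tiny model: scale the historical average by a weather signal. The baseline temperature and sensitivity below are made-up illustrative values; a real system would learn them from the integrated DMP and weather data rather than hard-coding them.

```python
def forecast_salad_demand(monthly_avg: float, temp_c: float,
                          baseline_temp: float = 18.0,
                          sensitivity: float = 0.04) -> float:
    """Adjust the historical sales average by a weather signal.

    sensitivity is the assumed fractional demand change per degree
    above or below the seasonal baseline (a hypothetical figure here;
    a trained model would estimate it from data).
    """
    return monthly_avg * (1 + sensitivity * (temp_c - baseline_temp))
```

With these assumed parameters, a 10-degree October heatwave lifts the forecast 40% above the monthly average, which is exactly the adjustment a history-only model would miss.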


Microservices Interaction and Governance model - Orchestration v Choreography


In order to understand the options for managing microservice interaction, we should first study its history. Let’s look back to a time almost a decade before microservices really took off. In the early 2000s, the book Enterprise Integration Patterns was published, and the corresponding EIP website remains an important reference for service interaction to this day. Workflow engines were a popular option back in the days of Service Oriented Architecture, Business Process Management, and the Enterprise Service Bus. The promise at the time was that you could create orchestration-style APIs without needing to employ a fully trained engineer. They are still around, but there isn’t much interest in this option for microservice interaction anymore, primarily because they could not deliver on that promise.
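The two interaction styles named in the heading can be sketched in a few lines: an orchestrator calls each service in sequence and owns the overall flow, while choreographed services simply react to events on a bus with no central coordinator. The service names below are hypothetical:

```python
# Orchestration: one coordinator invokes each service in a fixed sequence.
def place_order_orchestrated(services: dict, order: dict) -> dict:
    order = services["payment"](order)
    order = services["inventory"](order)
    order = services["shipping"](order)
    return order

# Choreography: services subscribe to events and react independently;
# no single component knows the end-to-end flow.
class EventBus:
    def __init__(self):
        self._handlers = {}

    def subscribe(self, event: str, handler) -> None:
        self._handlers.setdefault(event, []).append(handler)

    def publish(self, event: str, payload: dict) -> None:
        for handler in self._handlers.get(event, []):
            handler(payload)
```

The trade-off the governance debate turns on is visible even at this scale: the orchestrator makes the workflow easy to read and change in one place, while the event bus removes the central dependency at the cost of the flow being implicit.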


The Fear of Disruption Can Be More Damaging than Actual Disruption


The automotive industry is at the start of just such a period. Massive changes appear to be inevitable: connected cars, autonomous vehicles, battery breakthroughs, and the like. But these changes will probably take decades to be fully adopted. The vehicles themselves have been in development for years now, and their potential impact has been analyzed extensively through computer models. Many critical factors will slow down their adoption. These include technical factors, such as the difficulty of designing vehicles for a wide variety of terrains and climate conditions. Incumbent automakers have built up fundamental advantages in design, manufacturing, distribution, sales, and financing, making it hard for new entrants to compete. All manufacturers, old and new, will need time to ramp up so they can produce the necessary technologies at scale. The transition will also require new types of auto repair shops, new fleet-management companies with new sources of capital for financing them, new forms of auto insurance, and new traffic and safety regulations.



Quote for the day:


"Don't wait. The time will never be just right." -- Napoleon Hill


Daily Tech Digest - October 10, 2017

IT spending increases for software-defined storage, on-demand services
SDS is gaining popularity because of its versatility in a modern data center. Enterprise storage has been moving away from hardware-defined arrays as data centers migrate to virtualization and cloud-based infrastructure. SDS products run on commodity hardware and deliver all functionality, such as provisioning and de-duplication, in software. This adds automation, and thus speed, to storage networks. "For IT organizations undergoing digital transformation, SDS provides a good match for the capabilities needed — flexible IT agility; easier, more intuitive administration driven by the characteristics of autonomous storage management; and lower capital costs due to the use of commodity and off-the-shelf hardware," said Eric Burgener, research director at IDC, in a statement.


Rise in Insider Threats Drives Shift to Training, Data-Level Security

With an insider threat, the culprit is already inside the network. Securing the perimeter around the network — which has long been the focus for enterprise security — does not do the job against this kind of a threat, whether it is malicious or unintentional. Nor is focusing on securing the perimeter the best strategy against many external threats. That's because data-smart companies want to be able to safely give partners, suppliers, and customers access to their networks in order to increase business opportunities. As a result of this shift, security needs to rest with the data itself, not just at the network level. The move to the cloud elevates the need for data-level protection. To reduce the risk of insider threats, companies and organizations need to focus on three areas


Understanding Cloud Native Infrastructure: Interview with Justin Garrison and Kris Nova

A major benefit of public cloud comes from process rather than performance. The people-hours you can save by becoming an infrastructure consumer rather than an infrastructure builder will be very difficult to calculate, but they will likely enable a new method of working that far outweighs the technical limitations of a public cloud. Not to mention that some of the best infrastructure builders and maintainers in the world work at public cloud providers, and the companies behind them spend billions every year on infrastructure build-out, R&D, and new features. The biggest consideration when building your own cloud is not what it will cost you to build the private cloud, but what it will cost you to maintain it, and what happens when you fall behind public cloud offerings.


Make Cybersecurity A Priority in a Small Business’ Early Stages


The need for strong passwords is crucial for cybersecurity, no matter how often we groan about having to change (and remember) a new one. Shubhomita Bose writes about this and data from Headway Capital for smallbiztrends.com. The Headway infographic emphasizes having a company policy to avoid “weak” passwords, to change passwords on a regular basis, and to incorporate “two-factor authentication” — as some businesses are now doing with an additional text-message step in the password process. This is an increasingly significant threat to cybersecurity. ... As Anita Campbell, CEO of Small Business Trends, writes for Inc.com, “The ransom is displayed on the screen with a message stating you must pay a fine or fee in order to access your own system. Ransoms have ranged from hundreds of dollars to tens of thousands of dollars.”
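The two-factor text-message step mentioned above is one delivery channel for one-time codes; authenticator apps instead use the standardized TOTP algorithm (RFC 6238), which fits in a few lines of standard-library Python. This is an illustrative sketch, not a hardened implementation:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password per RFC 6238 (SHA-1 variant)."""
    if for_time is None:
        for_time = int(time.time())
    # The moving factor is the number of 30-second steps since the epoch.
    counter = struct.pack(">Q", for_time // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the server and the employee's device derive the same code from a shared secret and the current time, a stolen password alone no longer grants access, which is the point of the policy recommendation.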


Leaving employees to manage their own password security is a mistake

“Far too many organizations are leaving the responsibility for password management to their employees and don’t have the automated password management technology in place to identify when things are going wrong.” “In many cases, an organization’s password management practices are overly reliant on manual processes and far too often place an excessive level of trust in employees to use safe password practices,” said Matt Kaplan, GM of LastPass. “The threat posed by human behavior coupled with the absence of technology to underpin policy is leaving companies unnecessarily at risk from weak or shared passwords. Organizations need to focus on solving for both obstacles in order to significantly improve their overall security.”


How IPv6 deployment affects the security of IoT devices


As a result of their vast address space, IPv6 devices are provisioned with at least one unique global address and, thus, NATs are doomed to disappear. Therefore, a NAT's enforcement of the filtering policy to only allow outgoing communications is also likely to disappear, meaning communication between internal and external systems may no longer be policed by the network. In fact, the distinction between internal and external networks may disappear altogether if a filtering policy is not enforced at the network border. While this could have potential benefits -- for example, for peer-to-peer applications, in which unsolicited inbound communications are common -- this clearly comes at the expense of increased attack exposure.
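The address-scope distinctions that a border filter must now make explicitly (where NAT used to make them implicitly) can be explored with Python's standard ipaddress module. The addresses below are illustrative examples of each scope:

```python
import ipaddress

# One example address per scope a border filtering policy must reason about.
addresses = {
    "global unicast": "2600:1f16::1",       # reachable from the internet
    "unique local":   "fd12:3456:789a::1",  # site-internal, like RFC 1918
    "link local":     "fe80::1",            # never routed off-link
}

for scope, text in addresses.items():
    ip = ipaddress.ip_address(text)
    print(f"{scope}: global={ip.is_global} link_local={ip.is_link_local}")
```

A device holding a global address is directly addressable from outside, so the default-deny behavior a NAT provided for free has to be recreated as an explicit firewall policy at the network border.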


Organizational Culture Needs To Change So That Security And DevOps Can Exist In Tandem

Cloud adoption often came to be called "shadow IT" or "bypass IT" because it frequently occurred outside of mainstream IT and mainstream IT security groups. So in a sense, IT and IT security are still playing catch-up to the original adoption of cloud, even if they have now been given responsibility for it. And we have started to see that change. Two years ago, even in the US, we were often working with those "shadow IT" projects. Now the responsibility is moving into IT and IT security, so they're bringing the traditional mindset. I think the remaining roadblock is that the developer pipeline is moving at a much faster pace than it did historically, when application introduction used to occur in months.


Intel plans hybrid CPU-FPGA chips

“The advantage for FPGA is GPUs play in some areas but not all, and if you look at the use model of inline vs. offload, they are limited to offload mostly. So, there’s a broader application space you can cover with FPGA,” he said. The integrated solution provides tight coupling between CPU and FPGA with very high bandwidth, while the external PCI Express card is not as tightly coupled. For ultra-low latency and high-bandwidth applications, integrated is a great fit, Friebe said. “Most of the differentiation [between integrated and discrete] is due to system architecture and data movement. In a data center environment where [you] run many different workloads, you don’t want to tie it to a particular app,” he said. The more you do specialization, the more performance you can squeeze out of the accelerator, said Friebe.


The future of mobility: Are we asking the right questions?

One of the categories requiring the sharpest questions about the future is mobility. The mobile present has many moving parts and is very complex, but base patterns are discernible. I believe every human on this planet needs at least to attempt to comprehend the current point to which the mobile revolution has brought us. Furthermore, I believe modern executives have a fiduciary responsibility to think long and hard about where the mobile revolution is taking us.  The most rapidly adopted consumer technology in the history of mankind, mobile technology has had a huge economic impact — more than $1 trillion — and has changed the corporate competitive landscape as well as how people live their daily lives. Some go so far as to argue that mobile technologies have changed what it is to be human.


Detecting and Analyzing Redundant Code

A typical analysis would involve running the tool repeatedly to prune back the source tree as brutally as possible. This was then followed by several cycles of reverting changes so as to get successful builds and passing tests. The reasons for failure were that the tool had behaved incorrectly or there was a known limitation, examples of the latter being reflection or the existence of a code contract. The tool was trained on various GitHub repositories for C# projects, chosen on the basis that I had used them and thus wanted to contribute back. Ultimately, a pull request was submitted to the community asking for discussion of the changes in my branch. As the tool is brutal and I was engaging online with people for the first time, this was where diplomacy was required, and hopefully I didn't offend too many people.



Quote for the day:


"When you're around someone good, your own standards are raised." -- Ritchie Blackmore