Daily Tech Digest - August 27, 2017

Applying Enterprise Architecture Rightly

This article uses TOGAF as its frame of reference. In practicing TOGAF, it is always implied that TOGAF is a suggestive approach, NOT a prescriptive one. Xoriant has recently forayed into TOGAF with the help of its newly certified TOGAF architects and is committed to providing client solutions built on proven success stories. Often, when interacting with business houses, requirements start with a single line or even a single clause, e.g., "We need ERP." This is where a business canvas is provisioned and plans are made to evaluate the opportunity and provide information on it. Considering the entire lifecycle of a business opportunity, it is very important to take the right first step so that the service provider does not incur losses.


Issues necessitate Change - Evolution of Enterprise Architecture

The drive for continuous improvement and Information Technology alignment demanded further improvements in the way we thought about architecture – moving ever deeper into the task of understanding an organization's business issues. Other issues drove changes as well, such as the following (no chronology implied). Corporate espionage – whether stealing an organization's secrets or stealing information about its customers – exploited corporate vulnerabilities and drove the need to look at those vulnerabilities holistically, that is, looking at people, processes, and technology. Architects and builders changed to look beyond just IT to address this area.


Where Enterprise Architecture and Project Management Intersect

Enterprise Architecture is not about writing or specifying software code or systems – it is about architecting optimized business processes and systems based on management's strategies for the organization. While software development certainly has its place in organizations, it is an IT function best left to IT and to more IT-specific solution, data, and information architects. Since EA is evolving away from IT-centricity and IT project teams, it doesn't make much sense to make EAs members of IT project teams. However, it may make sense for IT project teams to use EAs as subject matter experts – advisors who give guidance to specific projects but are neither members of those teams nor responsible for any of their scope, timeline, or deliverables.


OpenJDK may tackle Java security gaps with secretive group

The vulnerability group and Oracle’s internal security teams would work together, and the group may occasionally need to work with external security organizations. The group would be unusual in several respects, and thus requires an exemption from OpenJDK bylaws. Due to the sensitive nature of its work, membership in the group would be more selective, there would be a strict communication policy, and members or their employers would need to sign both a nondisclosure and a license agreement, said Mark Reinhold, chief architect of the Java platform group at Oracle. “These requirements do, strictly speaking, violate the OpenJDK bylaws,” Reinhold said. “The governing board has discussed this, however, and I expect that the board will approve the creation of this group with these exceptional requirements.”


First Robocop to Join Dubai Police Force

That’s right, the time has come for Robocop to leave sci-fi land and enter reality… or sort of. A couple of years ago, Dubai’s police promised that by 2017 they would have enlisted their first robot police officer, and they have delivered. Built by the Spanish robotics company PAL Robotics, the police officer, known as REEM, was introduced to the public during the Gulf Information Security and Expo Conference. ... Although this is fascinating news, it’s still reality, and reality is never as fun as fiction, so this Robocop is a little duller. It’s not a half-human, half-robot cyborg; it’s just a robot with batteries. It cannot chase criminals – as mentioned, it has wheels, not legs – and it’s unarmed. However, even if REEM doesn’t have the cool design and features that we are used to seeing on the big screen, it represents a tremendous achievement for science.


IoT: Penetrating the Possibilities of a Data Driven Economy

With manufacturing units now able to communicate with each other through the deployment of IoT system solutions, analysis of facility performance metrics can be performed in real time. Management executives, if they want, can also drill performance monitoring down to shop-floor level, which helps provide revealing manufacturing insights. Serving as an example is the manufacturing giant Caterpillar. The company has deployed the SAP Leonardo system, an IIoT technology, across all its operating facilities. The system furnishes real-time information about manufacturing data, energy utilization, machine performance, and production consumables. Combined, this gives company executives a 360-degree view of the manufacturing processes, which leads to better tactical decision making.


Why are the stats on women in tech actually getting worse?

Girls look up to their role models. There are a number of women role models in entrepreneurship and other fields, but there is a significant lack of women role models in technology. Only 23.5% of computer science degrees were awarded to women last year at one of the biggest universities in the States. Some seniors found that they knew of few female computer scientists working in the professional world, which may be one reason why girls aren’t as interested in this field. This can be changed: female representation should not be lacking in any field, especially the ever-growing area of technology. The seniors at Stanford have started an organization dedicated to telling the stories of women who work in programming; SHE++ aims to inspire women to enter the technology field and to connect with the women already in it.


Microsoft is making a blockchain that’s fit for business

Note that Coco is a framework, not a ledger; in fact, it uses other ledgers. Ethereum is already working, Intel and J.P. Morgan Chase are porting their ledgers to Coco, and other blockchain ledgers will also integrate with it. You can also choose which algorithm you want to use to achieve consensus. In one test using the Ethereum ledger in Coco, the network delivered 1,500-1,600 transactions per second with latency of 100-200 milliseconds – far faster than Ethereum itself running on the same hardware. Russinovich says Coco will also scale to networks with hundreds of thousands of participants. Because each transaction is calculated only once, time-sensitive or restricted data isn’t a problem either.


What Is Blockchain? A Primer For Finance Professionals

Now imagine that every time you send the monthly living allowance, you laid down a “block” with the transaction information carved into it. Both you and your child can see the block, confirming that the money was sent and received. ... Together, they create a record of all transactions with your future college graduate. When you get old and infirm, you can point to the chain, show your kid how much money you paid for college, and demand that they invest a similar amount in a high-quality nursing home. This is, more or less, how blockchain works. Each block is a record of a monetary transaction. The chain is a shared accounting ledger that is visible to all parties across multiple networks, or “nodes.” Every new transaction is verified by all nodes and, if valid, added to all copies of the ledger—in other words, a new “block” is added to the “chain.”
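The allowance ledger described above can be sketched in a few lines of Python. This is a toy illustration, not a real distributed ledger – the parties and amounts are invented – but it shows the core mechanism: each block stores a transaction plus the hash of the previous block, so altering any past entry breaks every later link and is visible to anyone holding a copy of the chain.

```python
import hashlib
import json
import time

def make_block(transaction, prev_hash):
    """Create a block recording one transaction, chained to the previous block."""
    block = {
        "timestamp": time.time(),
        "transaction": transaction,
        "prev_hash": prev_hash,
    }
    # The block's identity is a hash of its own contents, including the
    # previous block's hash -- this is what links blocks into a chain.
    block["hash"] = hashlib.sha256(
        json.dumps({k: block[k] for k in ("timestamp", "transaction", "prev_hash")},
                   sort_keys=True).encode()
    ).hexdigest()
    return block

def chain_is_valid(chain):
    """Any node can re-verify the ledger: recompute each hash and check the links."""
    for i, block in enumerate(chain):
        expected = hashlib.sha256(
            json.dumps({k: block[k] for k in ("timestamp", "transaction", "prev_hash")},
                       sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False          # block contents were tampered with
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False          # chain link is broken
    return True

# Each monthly allowance payment becomes a new block on the shared chain.
chain = [make_block({"from": "parent", "to": "student", "amount": 500}, prev_hash="0")]
chain.append(make_block({"from": "parent", "to": "student", "amount": 500},
                        prev_hash=chain[-1]["hash"]))

print(chain_is_valid(chain))              # True
# Quietly rewriting a past transaction invalidates the chain for everyone.
chain[0]["transaction"]["amount"] = 5
print(chain_is_valid(chain))              # False
```

In a real blockchain, each node holds its own copy and new blocks are only appended after the network reaches consensus, but the tamper-evidence comes from exactly this hash linking.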


5 essentials for building the perfect Internet of Things beast

An IoT platform has more elements, and therefore is more complex, than a typical technology platform many are used to, they observe. These new platforms need to reach out to all the devices, sensors and applications and their underlying technology as well. "Look at the whole technology environment, not just the applications," the McKinsey authors advise. "Use fungible/off-the-shelf technology for the things that are less critical." Remember, too, that IoT is a different beast for every industry, or for every company for that matter. For a sportswear company, it may mean sensor-loaded sneakers. For a manufacturer, it means embedding sensors into production-floor tools. For an insurance company, it means planting telematics sensors in policyholders' cars.



Quote for the day:


"Sometimes life takes an unexpected wrong turn in the right direction." -- Unknown


Daily Tech Digest - August 26, 2017

Disaster Recovery Vs. Security Recovery Plans: Why You Need Separate Strategies

A security recovery plan is designed to stop, learn, and then correct the incident. "A disaster recovery plan may follow similar steps, but nomenclature would not likely use 'detection' to describe a fire or flood event, nor would there be much in the way of analytics," says Peter Fortunato, a manager in the risk and business advisory practice at New England-based accounting firm Baker Newman Noyes. "Further, not many disasters require the collection of evidence." Another risk in merging plans is the possibility of gaining unwanted public attention. "For instance, invoking a disaster recovery plan often requires large-scale notifications going out to key stakeholders," Merino says.


The Future of Public Cloud Storage for Big Data

Just as happened with Moore’s law when silicon chips met transistors, public cloud and big data are creating exponential effects. Recent research predicts that public cloud prices for big data processing and storage will halve every few years while processing power doubles. Public cloud skeptics predict that the cost of big data storage in the public cloud will be the same, or slightly higher, in 10 years; however, this is not true, as costs are expected to decrease significantly. On the other hand, the costs of upgrading big data software will increase. In a few years, Hadoop data lakes will need to be upgraded. Right now, data scientists prefer multiple versions of Spark, indicating the beginning of on-premise headaches. This can only get worse as Google, Amazon Web Services (AWS) and Microsoft pursue a serverless strategy.


5 Industries AI Will Disrupt in the Next 10 Years

It's difficult to talk about AI without evaluating its place in the ecosystem. Loosely speaking, it starts with the Internet of Things, in which objects are connected to the internet and used to gather data. Once enough data has been gathered, it passes the arbitrary threshold and becomes "Big Data", which AI is used to interpret. When there are so many data points that no human could ever process them all, artificial intelligence becomes the only real alternative. But AI doesn't always know what it's looking for, which is where machine learning comes in. Loosely speaking, that's the process of using AI to analyze data in such a way that it 'teaches' itself to interpret it. AI disruption, then, is largely going to come in the form of new ways of processing and interpreting data that have never before been available. Here are just five of the industries that AI is set to disrupt.


3 Amazing Ways AI is Going to Wipe Out Cyber Crime!

Ultimately it comes down to the machine learning experts, who are the main players behind "educating" the machines. Digital signatures, authentication, IP hiding, identity masking, encryption, firewalls, etc. have already been implemented by various firms... but what new strategies can machine learning discover? Deep machine learning can apply various algorithms to identify malicious activity taking place on the network, detecting it by finding unusual patterns in interactions with the system infrastructure. You might have seen Google asking you to verify that you are not a robot: you click on some of the images and then it lets you browse. This commonly occurs when you browse Google for hours on end.
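The "finding unusual patterns" idea can be sketched in its simplest statistical form – a hedged toy baseline, not a deep learning model, with invented traffic numbers: flag any observation that sits far outside the normal spread of the data. Production systems learn much richer baselines, but the principle is the same.

```python
import statistics

def flag_anomalies(samples, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []          # perfectly steady signal: nothing unusual to report
    return [x for x in samples if abs(x - mean) / stdev > threshold]

# Requests per minute from one host: steady traffic, then a sudden burst.
traffic = [52, 48, 50, 49, 51, 47, 53, 50, 48, 900]
print(flag_anomalies(traffic))   # [900]
```

A machine learning system generalizes this by learning what "normal" looks like across many dimensions at once (time of day, source, protocol), rather than a single hand-picked threshold.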


Security professionals name top causes of breaches

While this finding underlines the importance of user education and training, the respondents said human error is exacerbated by understaffed security teams and a flood of alerts and false positives. This highlights the negative impact of companies struggling to recruit cyber security teams in the face of a worldwide shortage of people with information security skills and the need for greater staff support. This shortage of cyber defenders with the right skills is further underlined by the fact that 43% of respondents said technology detected the attack but the security team took no action, while another 41% said a combination of technology and human error was to blame. Respondents also blamed a lack of information resources to understand and mitigate attacks, with 42% saying they are left to figure them out themselves.


Leveraging Advanced Analytics to Power Digital Transformation

Recent conversations with Walker Stemple of Intel’s @intelAI organization got me thinking about where and how organizations can leverage “advanced analytics” to power their business models. Now “advanced analytics” is a broad definition, but I have included the following analytics in that definition: Regression, Clustering, Neural Networks, Machine Learning, Deep Learning, Artificial Intelligence and Cognitive Computing. And while these “classifications” seem to change on a regular basis (sometimes due to us getting smarter; sometimes due to non-value-add marketing hype), it is critical that tomorrow’s business leaders understand where and how to apply these advanced analytics to power their business models.


Ukraine Central Bank Detects Massive Attack Preparation

The National Bank of Ukraine - the country's central bank - declined to share a copy of the letter with Information Security Media Group, but confirmed that it had alerted banks to a new, potentially major attack. "In order to prevent cyber attacks, the National Bank of Ukraine consistently cooperates with banking sector participants, the State Service of Special Communication and Information Protection of Ukraine (SSCIPU), as well as relevant units of the Security Service of Ukraine and the National Police of Ukraine," a spokesman for the National Bank of Ukraine tells ISMG. "On August 11, the NBU promptly informed banks about new malicious code, its characteristics, indicators of compromise and the need to take preventive measures to prevent the networks from being attacked by malicious codes."


Looking beyond the hype of robotic process automation

RPA is particularly appealing for companies juggling millions or even billions of transactions a day. With such an overwhelming volume to deal with, they often struggle to effectively manage important tasks like addressing customer requests, processing files, moving information between different systems, allocating work and making decisions. But RPA promises to help some organizations alleviate the challenge and operate more efficiently by automating, and thus accelerating, transaction processing. They can then provide better customer service, which inspires continued loyalty and has a direct, positive impact on a business’ bottom line. ... The promise of RPA is creating massive hype in the market, which RPA vendors are leveraging to position their products as the “silver bullet” for any company looking to streamline and optimize operations.


What Is Data Mining? How Analytics Uncovers Insights

Data mining comes with its share of risks and challenges. As with any technology that involves the use of potentially sensitive or personally identifiable information, security and privacy are among the biggest concerns. At a fundamental level, the data being mined needs to be complete, accurate, and reliable; after all, you’re using it to make significant business decisions and often to interact with the public, regulators, investors, and business partners. Modern forms of data also require new kinds of technologies, such as for bringing together data sets from a variety of distributed computing environments (aka big data integration) and for more complex data, such as images and video, temporal data, and spatial data.


What is Rust? Safe, fast, and easy software development

Rust started as a Mozilla research project partly meant to reimplement key components of the Firefox browser. A few key reasons drove that decision: Firefox deserved to make better use of modern, multicore processors, and the sheer ubiquity of web browsers means they need to be safe to use. But those benefits are needed by all software, not just browsers, which is why Rust evolved from a browser project into a language project. Rust accomplishes its safety, speed, and ease of use through the following characteristics: Rust satisfies the need for speed. Rust code compiles to native machine code across multiple platforms. Binaries are self-contained, with no runtime, and the generated code is meant to perform as well as comparable code written in C or C++.



Quote for the day:


"No great manager or leader ever fell from heaven; it's learned, not inherited." -- Tom Northup


Daily Tech Digest - August 25, 2017

How Python makes programming simple

Because Python is easy and fast to write, that saves developer time, although this typically comes at the cost of execution time. The same programs in other languages—like C, C++, and Java—may take longer to put together, but they typically run many times faster than a Python app. But Python can also run fast when it needs to, because many third-party libraries for Python are written in faster languages like C. All Python has to do is plug into such a library, and it can run at or close to the speed of those languages when performance matters. Mastering new things in IT is always tricky, whether it’s containerization, devops, or extracting a little meaning from a lot of data. Python is designed to give you a leg up on getting all those things done, both now and into the future.
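The "plug into a faster language" point can be seen with nothing but the standard library: Python's built-in sum() is implemented in C, while an equivalent hand-written loop runs in the interpreter. Timings vary by machine, so this is an illustration rather than a benchmark.

```python
import timeit

def py_sum(values):
    """A pure-Python summation loop, executed step by step by the interpreter."""
    total = 0
    for v in values:
        total += v
    return total

values = list(range(1_000_000))

# Both produce the same result...
assert py_sum(values) == sum(values)

# ...but the built-in sum() loops in C, so the per-element cost is paid at native speed.
interpreted = timeit.timeit(lambda: py_sum(values), number=5)
native = timeit.timeit(lambda: sum(values), number=5)
print(f"pure Python: {interpreted:.3f}s, C-backed sum(): {native:.3f}s")
```

Third-party libraries like NumPy take the same idea much further, pushing whole array computations into compiled code while the program that drives them stays in Python.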


The current state of government cybersecurity is 'grim,' report says

When it comes to cybersecurity readiness, US government organizations aren't doing so hot. In a recent report from SecurityScorecard comparing the security practices of 18 industries, government ranked no. 16. "In the midst of investigations into a potential 2016 election hacking, regular major malware events, and an overall increase in the number of sophisticated cyberattacks, the report highlights that the government sector is lagging compared to almost every other industry," the report said. For the report, SecurityScorecard analyzed 552 local, state, and federal organizations to see how their security practices stacked up across 10 key categories. The only two industries that ranked lower were telecom (no. 17) and education (no. 18). For government, that's actually an improvement over last year, the report said, when it ranked dead last.


Indian CIOs to benefit from emerging and maturing technologies: Gartner

“This year’s Hype Cycle demonstrates the keen interest Indian organizations are taking in both emerging and maturing technologies,” said Pankaj Prasad, Principal Research Analyst - Gartner. “The market is witnessing the entry of local vendors in emerging, as well as mature, technology segments, including areas such as IoT, robotic process automation offerings and machine-learning-based technologies,” added Prasad. ... Some technologies, such as mobile money, social analytics and robotic process automation offerings, will support new ways of doing business across industries. Technologies such as machine learning, IoT and smart city frameworks are of a transformational nature, which will result in a significant transformation within the industry dynamics and in the creation of a new ecosystem.


Susanne Kaiser on Microservices Journey from a Startup Perspective

Microservices come with complexities like multiple independent services, operational & communication complexity, partitioned data, and the complexity of eventual consistency. This comes with challenges of transformation to microservices, such as the need for different skills & tools, and untangling the core functionality; the team still has to take care of the existing system, and the transformation takes longer than anticipated. Kaiser said the monolith to microservices journey in reality is evolutionary. ... The key concept of modeling microservices is loose coupling between the services and high cohesion within a service. The team also identified bounded contexts for microservices with well defined business functions.


Distributed data centers boost resiliency, but IT hurdles remain

Organizations should create an "assurance construct" to address those questions and show CIOs and CTOs how data traverses the network and how failover works, Traver said. That way, the entire business understands the level of resiliency that its infrastructure delivers. At this point, only major public cloud players, such as Google Cloud Platform, have the resources to establish true cloud-based resiliency with complete consistency across all data centers in the network, Lawrence said. "It's probably not something that enterprises will be able to aspire to -- perhaps not at least for the next decade and a half -- but perhaps when they have enough sites and different colos, it might be possible," he said. Ultimately, the kind of resiliency an organization pursues should depend on its applications.


15 noob mistakes even experienced developers still make

If you’re not writing C or C++, Make is probably not your friend. Make launches another compiler process for each file. Most modern languages are not designed to have a separate process launched for each file. Also resolving dependencies in a language like Java using Make is nearly impossible. I once worked at a large network equipment company and shortened its build process from three hours to like 20 seconds by converting its build to Ant. A shell script is also usually a bad move in the end. I recently wrote a shell build for a lab because I didn’t want everyone to have to download a whole Java tool set to run one little lab. I thought it was a good move, but it was a noob mistake (as always) because the next version of the software it depended on broke everything (as always).


.NET Standard 2.0 Is Finalized for Consistent API Usage

The .NET Standard project for Visual Studio 2017, hosted on a GitHub site belonging to the .NET Foundation, was announced last September. Microsoft said .NET Standard will replace Portable Class Libraries (PCLs) as the de-facto tooling story used by developers for building multi-platform .NET libraries. ".NET Standard solves the code sharing problem for .NET developers across all platforms by bringing all the APIs that you expect and love across the environments that you need: desktop applications, mobile apps & games, and cloud services," Microsoft said in a huge blog post (with nearly 200 comments) explaining the standard in detail. Apparently facing developer confusion about exactly what .NET Standard is for, Microsoft has devoted some guidance to explaining it, even pointing to an analogy written by David Fowler.


Why cybercriminals like AI as much as cyberdefenders do

“AI is a hammer that can be used for good or bad,” said Jim Fox, a partner, principal and cybersecurity and privacy assurance leader at PwC. “And if your adversaries have a hammer, you'd better have one, too.” In the right hands, this mighty hammer can do a lot of good. Artificial intelligence software can monitor all network activity and quickly discern odd patterns that could indicate foul play, even if such patterns haven’t been flagged before. It can learn over time to discern truly suspicious behavior from normal patterns. Last year's Petya malware attack made decisions “at machine speed,” says one cybersecurity expert. “Nobody was guiding that malware. They wrote an intelligent program to do all that.” At the New York-based investment bank Greenhill & Co., Chief Information Officer John Shaffer sought a better way to deal with zero-day attacks.


What do macOS and Android have in common? Both are booming malware markets

macOS hasn't been doing well on the malware front lately. Q2 2017, Malwarebytes says, was bigger for macOS malware than the entirety of 2016. Add to that the discovery of more new macOS malware families in 2017 than any year on record and you have a clear indicator of the vulnerability of Apple computers. The threats facing macOS are different than Android or Windows, which is somewhat good news. Rather than ransomware and malware, which the report says is the smallest concern for macOS, PUPs and adware dominate. Many popular macOS apps have been found to contain threats—even those on the App Store. Popular websites for downloading software, such as Softonic and Macupdate.com, have also been found to contain malicious installers.


Handset makers may need to reboot data collection rules

“The dangers to privacy in an age of information can originate not only from the state but from non-state actors as well. We commend to the Union Government the need to examine and put into place a robust regime for data protection,” the Supreme Court judges said in their ruling. Even before the court’s verdict on Thursday, the government directed 30 handset makers including Apple, Samsung, Micromax and Xiaomi to share the procedures and processes used by them to ensure the security of mobile phones sold in the country by August 28. Handset makers insist they’re already protecting user data on the phones they sell. “We have always stood for securing the user data. User data on all our devices are fully secure, in compliance with the necessary laws and regulations,” said a spokesperson from Oppo.



Quote for the day:


"Continuous improvement is better than delayed perfection" -- Mark Twain


Daily Tech Digest - August 23, 2017

It's Time To Think Beyond Cloud Computing

Cloud computing giants haven’t ignored the lag problem. In May, Microsoft announced the testing of its new Azure IoT Edge service, intended to push some cloud computing functions onto developers’ own devices. Barely a month later, Amazon Web Services opened up general access to AWS Greengrass software that similarly extends some cloud-style services to devices running on local networks. Still, these services require customers to operate hardware on their own. Customers who are used to handing that whole business off to a cloud provider may view that as a backwards step. Edge computing’s vision of having “thousands of small, regional and micro-regional data centers that are integrated into the last mile networks” is actually a “natural extension of today’s centralized cloud,” Crawford says.


Quantum Computing, Artificial Intelligence (AI) and Solving the Impossible

Quantum computing differs from traditional binary computing in that it takes advantage of the strange ability of subatomic particles to exist in more than one state at any time (it’s like your children, where you can both love and hate them at the same time). In classical digital computing, a bit is a single piece of information that can exist in two states – 1 or 0. Quantum computing uses quantum bits, or ‘qubits’ instead. ... One important area where quantum computing is expected to have a dramatic impact is in improving the ability for reinforcement learning to process an exponentially-wider range of operating variables in real-time, which is vital in automated cars and smart entities like factories and hospitals. As an example, Google has built a quantum computer which is 100 million times faster than any of today’s machines.
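The "more than one state at once" idea can be mimicked with a toy classical simulation – illustrative only, as real quantum hardware is nothing like this few-line model. A qubit's state is a pair of amplitudes for 0 and 1, and measuring it collapses the superposition to a single classical bit with probabilities given by the squared amplitudes.

```python
import math
import random

def make_qubit(alpha, beta):
    """Normalize a pair of amplitudes for |0> and |1> into a valid qubit state."""
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

def probabilities(qubit):
    """Born rule: measuring yields 0 or 1 with probability |amplitude|^2."""
    alpha, beta = qubit
    return abs(alpha) ** 2, abs(beta) ** 2

def measure(qubit, rng=random):
    """Measurement collapses the superposition to a single classical bit."""
    p0, _ = probabilities(qubit)
    return 0 if rng.random() < p0 else 1

# An equal superposition: 50/50 between 0 and 1 until it is measured.
q = make_qubit(1, 1)
p0, p1 = probabilities(q)
print(round(p0, 3), round(p1, 3))   # 0.5 0.5
```

What this toy model cannot capture is where quantum speedups actually come from: entanglement and interference across many qubits, which let 2^n amplitudes be manipulated at once – exactly what makes a classical simulation blow up exponentially.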


How To Bridge IT's Growing Generation Gap

The challenge for IT leaders is to manage those changes and balance differing priorities and expectations from the three age groups working in IT — millennials, baby boomers and the Generation X cohort stuck between them. It will require deft management skills, a lot of empathy and the ability to drive needed changes to keep organizations competitive in a fast-moving world. ... The push to accommodate millennials begins with the much-discussed need for companies to embrace "digital transformation" or become "digital organizations." IT experts may differ on exactly what that means, but there's widespread agreement about whom it applies to. "When it comes to the new skills required to accelerate digital transformation, a lot of existing staff who are baby boomers and Gen X-ers have legacy skills and need to be reskilled," says Lily Mok, an analyst at Gartner.


The winding road to GDPR compliance

With under a year to go, many businesses have not started preparations, and will need to develop and implement a strategy for compliance. Every organisation that processes the personal data of EU citizens will require a tailored strategy depending on, among other factors, company size, the types and amount of data it processes, and its current security and privacy measures. It is highly recommended that businesses seek legal advice to determine what may be required in their specific situation. However, there are common requirements that will affect all businesses – even the very smallest – that handle personal data. The first step on the road to GDPR compliance is to understand how personal data is stored, processed, shared and used within your organisation.


SDN: Technology to cut costs, speed new services

The past few years have seen the mainstream network vendors jump into SDN with both feet. They still offer feature-rich, turnkey switches with all the support and services mainstream enterprises have come to rely on but with a vendor-provided SDN controller. Many of the mainstream vendors also offer support for third-party controllers. The primary value is in reduction of operational expenses through the automation of configuration and management tasks instead of focusing on hardware costs. In actuality, network hardware accounts for less than 10% of overall data-center spend, while personnel costs can be well over half of a data center’s total cost of ownership. A small reduction in operational costs can pay significant dividends for the business.


The Hyper-Connected Economy: An Interview With Ken Sakai, MD Of Telehouse Europe

For companies that need secure, reliable access to one or more leading public cloud services, Telehouse Cloud Link, a multi-cloud connectivity exchange, delivers a private connection with predictable and scalable bandwidth between their network and cloud services. Telehouse’s collaboration with Microsoft allows enterprises and their IT infrastructure partners to seamlessly provision and manage private connections to Microsoft Azure and Microsoft Office 365 using a dedicated and predictable connection. ... Additional cloud service providers are expected to be added to Cloud Link in the near future and this ability to connect to multiple clouds through a single source removes the complexity of traditional network procurement.


Mimecast’s newly discovered email exploit isn’t a vulnerability, it’s a feature

Mimecast says that their newly discovered exploit undermines "the security and non-repudiation of email; even for those that use SMIME or PGP for signing…" That sounds frightening, but the reality is completely different. This isn't an exploit, or vulnerability. It isn't even a bug. What Mimecast describes in their advisory is a feature, and one that isn't even widely supported. Outlook.com and Gmail for example, block external calls to CSS using the LINK attribute. Mimecast makes mention of using EMBED, OBJECT, FRAME, or IFRAME, even SVGs as alternate modes of exploitation. Again, these are all known attack methods, and once more, many of the mainstream email providers block them. In fact, Mimecast themselves admit that Gmail, Outlook.com, and iCloud.com were not affected by their discovery.


Software-based networking brings new automation perks, challenges

Network automation gives IT organizations deploying complex applications the ability to control the rapid provisioning of network resources. It provides the ability to centrally manage the network and reduce operational costs by shifting the challenges of configuration from people to technology. Software-based networks can select appropriate network services based on parameters, such as application type, quality of service and security requirements. ... Network professionals spend significant time and resources adapting the physical and virtual network to changes in applications, compute and storage resources, and device location. Software-based networking tools can automate change management by associating specific network and security policies with applications and devices that can "follow" them as they migrate physically and virtually.
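The parameter-driven service selection described above can be sketched as a simple policy lookup. The policy names, VLANs and DSCP values below are invented for illustration and do not come from any vendor's API:

```python
# Hypothetical sketch of how a software-based networking tool might map
# application parameters to network and security policies. All names and
# values here are illustrative assumptions.

POLICIES = {
    # (app_type, qos_class) -> network/security policy
    ("voip", "realtime"): {"vlan": 100, "dscp": 46, "firewall": "voice-strict"},
    ("web", "besteffort"): {"vlan": 200, "dscp": 0, "firewall": "web-default"},
    ("database", "bulk"): {"vlan": 300, "dscp": 10, "firewall": "db-internal"},
}

def select_policy(app_type: str, qos_class: str) -> dict:
    """Return the network policy for an application, or a safe default."""
    return POLICIES.get((app_type, qos_class),
                        {"vlan": 999, "dscp": 0, "firewall": "quarantine"})

print(select_policy("voip", "realtime"))  # voice traffic gets the realtime profile
print(select_policy("unknown", "x"))      # unrecognized apps fall back to quarantine
```

Because the policy travels with the application's parameters rather than with a physical port, the same lookup can be reapplied automatically when a workload migrates, which is the "follow" behavior the article describes.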


The future will be fuelled by data

Overlaid with a cognitive platform, the process would not only be streamlined, it would be far more valuable to all the participants. "There is a lot of data we are generating here: identifying the tenant, landlord, specific lease, specific property and specific segment of that property," says Dobson. Add in the data the banks already hold about tenants and their supply chains, and, Dobson believes, "There is a profound opportunity to use advanced analytics and predictive modelling to give insights to the tenant, the landlord and the bank. If you expand the scope of the network to other documents or instruments, you are probably scaling exponentially the data that you could observe and the conclusions you might draw." Financial services is just one area that blockchain will impact; supply chains, Internet of Things (IoT), risk management, digital rights management and healthcare are poised for dramatic change using blockchain networks.


Serverless computing may kill Google Cloud Platform

According to Allamaraju, serverless computing is outpacing industry darlings like Kubernetes. His own Expedia “did over 2.3 billion lambda calls per month” back in late 2016, a number that has climbed since then. Nor is Expedia alone in discovering the productivity gains to be found with serverless computing: Coca-Cola, Nordstrom, Reuters, and others have jumped in. Yet Google has pinned its cloud hopes on Kubernetes and on AI and machine learning. Focused on its Kubernetes-to-GCP and machine learning plays, Google has not built out the array of serverless services that its competitors have. As Mytton notes, “Once your core runtime requirements are met, the differences between the [serverless vendors’] services aren’t particularly important. … What does count is the availability of services to consume from within the cloud provider ecosystem.”



Quote for the day:


"Leadership is the art of influencing people to execute your strategic thinking" -- Nabil Khalil Basma


Daily Tech Digest - August 22, 2017

How Google is speeding up the Internet

BBR is not the first effort to speed up TCP. Researchers at North Carolina State University are credited with developing one of the most popular loss-based congestion control algorithms used in TCP today, named binary increase congestion control (BIC) and subsequently, CUBIC. At a high level, these also record measurements to estimate the optimal speed at which to send data when congestion is detected. Another congestion control algorithm that has become popular is named Reno. These all use packet loss to determine congestion, though Jacobson, the Google engineer who developed BBR, says that to his knowledge BBR is the only TCP algorithm that actually estimates the speed of traffic to determine the best way to send it, regardless of whether packets have been lost.
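The distinction Jacobson draws can be illustrated with a toy estimator: rather than reacting to packet loss, a BBR-style sender derives its pacing rate from measured delivery rates. This is a rough sketch of the idea only, not Google's actual algorithm:

```python
def estimate_bottleneck_bw(samples):
    """samples: (bytes_delivered, interval_seconds) pairs.
    A BBR-style estimator keeps a windowed maximum of observed delivery rates."""
    return max(delivered / interval for delivered, interval in samples)

def pacing_rate(samples, gain=1.0):
    # Send at the estimated bottleneck bandwidth scaled by a pacing gain,
    # regardless of whether any packets were lost along the way.
    return gain * estimate_bottleneck_bw(samples)

samples = [(150_000, 0.1), (120_000, 0.1), (180_000, 0.1)]  # bytes per interval
print(pacing_rate(samples))  # → 1800000.0 bytes/second
```

A loss-based algorithm such as CUBIC would instead grow its congestion window until a drop occurs and then back off; the sketch above never consults loss at all, which is the core of the contrast the article draws.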


Artificial intelligence will let us outsource tedious tasks to our phones

This week marks the debut of Essential’s first gadget. The Essential Phone is an anomaly: a sleek, premium smartphone not designed by Apple, Samsung or a discount Chinese brand. It has a mirrored ceramic back, titanium edges, a display that covers most of the phone’s front and a magnetic connector for a new world of accessories and hardware upgrades that, Rubin says, will let people hang onto their phones longer. Rubin recognizes that Essential confronts formidable competition, especially from Apple and Samsung. But while he applauds the former’s brand power and the latter’s vertical integration, he said “every saturated market needs a disruption. When there’s a duopoly, that’s the time to do it.”


Doing things right: Cloud and SecOps adoption

The goal of SecOps is to help companies deliver software more efficiently and more securely, while reducing risk for the organization over time. The reality is that, due to the new operating model in cloud environments, security and operations teams must work together: the security team identifies risks and then works with operations to remediate them. “No matter what resources you do or do not have at hand, including personnel, budget, or tools, SecOps is both critical and achievable,” he believes. But one thing crucial to its implementation is leadership buy-in – the people in charge must realize that security is on equal footing with availability and performance. “If the e-retail boom taught suppliers that they must invest in site availability like they would to ensure their brick-and-mortar has its lights on, they must also invest in security like they would to ensure that the alarms work and doors lock.”


New York University Abu Dhabi researchers develop 'unhackable' computer chip

The chip has a secret key that makes it virtually impossible to access, and it would only function for authorised users. “Without the secret key, the chips cannot be made functional,” he said. “The functionality of the chip - what it does, how it does it - can only be known if the secret key is known.” A patent application has been filed at the US Patent Office. The researchers are creating a web-based platform to make information about the chip available to the public. An extensive research paper by NYUAD’s Design for Excellence team will be presented in November at the ACM Conference on Computer and Communications Security in the US. “These are all theoretically proven points and we will present this at a top cyber security conference, but we need to test our claims practically as well," said Mr Sinanoglu.


Calls for UK boards to be better educated on cyber threats

One of the most worrying aspects is the lack of understanding of the serious nature that ignorance brings, said Simmonds. This ignorance has led to a lack of basic cyber hygiene, with companies typically lacking basic security controls and processes, and failing to train employees at all levels from the board down on how to deal with cyber threats. “This has been a consistent theme of Verizon’s annual Data breach investigations report over the past 10 years,” said Laurance Dine, managing principal of investigative response at Verizon. “We’ve seen that the majority of data breaches could so easily have been prevented if basic measures and protocols had been in place. For example, we often see that around two-thirds of breaches are traced back to weak, stolen or lost passwords, which could easily be prevented using two-factor authentication.”


How To Choose The Right Enterprise Mobility Management Tool

A key to choosing the best EMM solution is aligning the features and capabilities of the platform to your organization’s requirements. This includes such factors as what types of business apps users typically work with, what security and regulatory compliance requirements the company has, what sort of network and service management features it needs, which mobile operating systems are in use, what level of reporting capabilities is needed, and so on. Selecting the right platform isn’t just a matter of getting the most features, but acquiring the features that best meet the organization's requirements. “Organizational needs relating to mobility differ considerably, as do the infrastructure environments into which mobility solutions will be implemented,” Holtby says.


Are you ready for state-sponsored zombie malware attacks?

Zombie malware combines the most deadly aspects of malware and zombie computers into one horrible mess. Typically, malware gets into a computing device via phishing or an email attachment, which limits the scale of the attack. In contrast, zombie malware autonomously hunts for vulnerable systems across LAN, WiFi and VPN connections. Once zombie malware finds a system to infect, it uses the new host to scan for other systems, which can be anywhere on the globe. Another key aspect of zombie malware is the lack of a control channel to manage its destructive path (unlike the zombie computers used in DDoS attacks). As a result, zombie malware just destroys anything it can connect to. For example, NotPetya started on Ukraine government systems but then quickly spread around the globe.


How to get Android 8.0 Oreo on your Pixel or Nexus right now

While Google's own Pixel and Nexus devices are almost always first in line for a fresh Android rollout, this year's dessert-themed delight isn't actually quite ready to be served to everyone just yet. Google says it's in the midst of "carrier testing" with the Pixel, Nexus 5X and Nexus 6P Oreo builds and expects to start sending updates out to those devices soon. ... Realistically, the wait for Pixel and Nexus owners to get Oreo as an official over-the-air update likely won't be long. But we tech enthusiasts are a notoriously impatient bunch, and when something new is available, gosh darn it, we must have it. Well, not to fear, my fellow shiny-new-software fanatics: If you own a Pixel, Nexus 5X or Nexus 6P, you can actually get Android 8.0 Oreo on your phone this very minute — with the help of a handy little hack.


The cloud could drive open source out of the enterprise

First of all, open source’s no-cost attribute means less in the cloud. Public cloud providers will charge you for the time you use their cloud to access open source software—or any software. Thus, it doesn’t really matter if you use AWS Linux, Red Hat Linux, or closed-source platforms from Microsoft, because they are all “free” yet cost the same in cloud time charges for access. The same is true with databases; there’s not much difference in your monthly cloud bill if you use open source databases versus closed source, or those that are native to a specific cloud, such as AWS Redshift. If there is not a dramatic cost advantage, most enterprises won’t care about the platforms that they use in the long run, and that takes away one of open source’s historic strengths.


How to set up an all open-source IT infrastructure from scratch

Not choosing Microsoft Windows is the first obvious decision here. The cost is too high (both in terms of up-front monetary investment and the recurring costs associated with securing a closed platform). MacOS is, for the same reason, off the table. Which specific platform I'd choose, at that point, comes down to my specific needs within the organization. Chances are I would select a Linux-based platform (either a free Linux distribution – Debian, openSUSE, Fedora, etc. – or a similar system with paid support). Support is the main reason to consider a paid, closed system anyway, so you might as well get all the benefits with none of the drawbacks of a system like Windows. Save money, increase security. No brainer. For applications, I'd also standardize on LibreOffice for the office suite and one of the several open-source web browsers (such as Firefox).



Quote for the day:


"Knowledge Management is the art of creating value from intangible assets." -- Karl-Erik Sveiby


Daily Tech Digest - August 21, 2017

Industry 4.0: How the Internet of Things is Revolutionizing Manufacturing

"Unlike traditional relationships where feedback on products and services takes time to gather, the automated closed-feedback loop is an inherent component of Industry 4.0," Ramaswami said. "The seamless record-keeping enabled by digital systems will speed traceability, while limiting liabilities, warranty costs and recalls." Despite these advantages, the shift is still in the early stage. According to research from Capgemini, only 6 percent of manufacturers are considered "digital masters," or those that have reached an advanced stage in digitizing the production process. That means competitive advantage is still up for grabs, rather than implementation becoming an imperative to merely remain competitive. Still, the movement is real; Capgemini estimates that 76 percent of manufacturers already have a smart factory initiative in the works or currently under formulation.


The importance of building ethics into artificial intelligence

Companies deal with team changes regularly. Issues arise tied to trust, accountability and personnel behavior that goes against the values of a company – or society, in general. In the tech industry alone, sexism, racial bias and other serious, but eradicable trends persist from the C-suite down to the entry-level.  Consequently, the industry should focus on efforts to develop and grow a diverse talent pool that can build AI technologies to enhance business operations and address specific sets of workplace issues, while ensuring that it is accountable.  Employers need to recruit people who understand the importance of applying strict human resources guidelines to AI performing tasks alongside human employees across industries and geographies. AI, for its part, needs to learn how to conduct itself in a work environment and be rewarded for expected behavior to reinforce good habits.


Top 3 Breakthroughs in Combating Financial Crime

In times of political and economic change, financial crime and corruption tend to grow fast. The shock of Brexit, terrorist attacks, the revolution in the Islamic world and other factors create an environment that demands change. AI- and analytics-driven solutions have been widely adopted across different industries for various purposes. However, only a handful of banks around the world are working with advanced analytics and artificial intelligence technologies to improve their risk and compliance activities. As the world enters an era of high uncertainty, the upcoming years will see financial institutions adopt and deploy best-in-class analytics-powered tools as part of their efforts to remain fully compliant and to combat financial crime. With that in mind, here are the top three trends that will power the compliance revolution.


Building security into IoT devices: the new potential for security integration

IoT devices are vulnerable by virtue of their networked operation. A connected wristband monitoring a patient’s heartbeat and blood oxygen levels, for instance, might continually send sensitive private data over a wireless link to a medical application hosted by a cloud service provider. It is useful to think of the vulnerability in this type of device – and therefore the protection that is required – in terms of layers. For example, one layer is the personal area network connection, typically a Bluetooth Low Energy radio link to a smartphone or tablet with which the wristband is paired. An extension of this layer might be the Wi-Fi link provided by the smartphone or tablet to a home router or gateway. The second layer might be the cloud platform, such as Microsoft’s Azure or Amazon’s AWS; and the third is the application itself running in the cloud.


Predictive marketing: taking the guesswork out of adverts

Predictive marketing moves away from stats, stereotypes or the constraints of age and gender towards informed messaging decisions that amplify the customer journey. This is done through AI-driven propensity models based on billions of moments. These models learn about a customer’s future behaviour, based on their previous interactions, such as browsing behaviour, past purchases and interests, as well as metadata about their devices and thousands of other variants. All these aspects combined paint a holistic picture of the entire customer journey and are delivered at scale in real time. All the data in the world – even your own – is worthless unless it can be converted into intelligence and applied to your business, giving you better insights into your own customers and prospects than ever before.
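A propensity model of the kind described can be sketched as a simple logistic scorer. The features, weights and bias below are invented for illustration; real systems learn them from billions of interaction events:

```python
import math

# Toy propensity model: score how likely a customer is to purchase.
# Feature names and weights are illustrative assumptions only.
WEIGHTS = {"pages_viewed": 0.08, "past_purchases": 0.9, "days_since_visit": -0.05}
BIAS = -2.0

def propensity(customer: dict) -> float:
    """Logistic score in (0, 1): the modeled chance of a future purchase."""
    z = BIAS + sum(w * customer.get(k, 0.0) for k, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

engaged = {"pages_viewed": 30, "past_purchases": 2, "days_since_visit": 1}
lapsed = {"pages_viewed": 1, "days_since_visit": 90}
print(propensity(engaged) > propensity(lapsed))  # → True
```

The "holistic picture" the article mentions corresponds to widening the feature dictionary with device metadata and other signals; the scoring mechanism itself stays the same.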


Seven Keys to Strengthen Your Cybersecurity Culture

A lot has been written about benchmarking and following best practices in cybersecurity. One important question is whether you know where you are heading. What is the vision of what success looks like for your security and technology teams? Consider visiting your industry peers and learning from other public and private sector organizations that are doing cybersecurity culture well. Look at the National Association of State CIOs (NASCIO) award-winners, NGA best practices and state and local partners in your region. Consider a road trip to learn from others and benchmark your progress. For example, back in 2011-2012, Stu Davis, the Ohio CIO, brought a team up to Michigan to see how we built our security architectures and governance. Ohio's state government used that visit and follow-on conversations to build an excellent cybersecurity program.


Your failure to apply critical cybersecurity updates is putting your company at risk

Despite the impact of WannaCry, a month later it seems that many organisations hadn't bothered to apply the correct patches, as Petya used the same exploit to spread itself across infected networks. It claimed a number of high-profile victims -- many of which are still dealing with the post-infection fallout. "Something we don't talk about often enough is the opportunity everyone has to limit bad consequences by employing consistent and effective cybersecurity hygiene," said Phil Quade, chief information security officer at Fortinet. "Cybercriminals aren't breaking into systems using new zero-day attacks, they are primarily exploiting already discovered vulnerabilities." Researchers say lessons must be learned and that if security patches are released, then they need to be applied.


What’s new with WebAssembly portable code

A key goal of WebAssembly is enabling code written in languages besides JavaScript to run in the browser. The technology serves as a compile target for other languages. Right now, C++ is the preferred language for use with WebAssembly. It is technically possible now to use other languages with WebAssembly, and there have been experimental implementations to work with the format. However, these languages cannot currently achieve the ideal performance, memory utilization, or DOM integration, Wagner said. As a result, WebAssembly will likely be enhanced to support languages that rely on garbage collection, such as Java, C#, and Python. “We’ve been discussing adding direct support for WebAssembly in a way that plugs into the garbage collector that’s already in the browser,” Wagner said.


Debunking the myths around agile development

By all indications, agile is helping enterprises around the world succeed. For the past five years, the top three cited benefits of agile have been: managing changing priorities (cited by 87 percent), team productivity (cited by 85 percent), and project visibility (cited by 84 percent). Even so, projects still have interdependent tasks, and the percentage complete must still be tracked and reported to completion. A project is still a project, a deliverable is still a deliverable, and as such project management principles still apply. So myth one debunked: Agile does not mean you don’t project manage. Agile means you project manage constantly, to the very heartbeat of the development teams. Keep your basic project management practices as your guiding principles. And stop thinking that the scale or complexity is new or unique.


UCaaS vs. CPaaS: Which supports external communications better?

Any cloud communications-as-a-service system -- whether it's a UCaaS or CPaaS platform -- will usually be better than on-premises options for external communications, because the cloud itself resides outside corporate premises. The cloud-based system naturally communicates across network boundaries. At times, though, cloud-based systems may falter for internal communications, especially if the network or IT is highly restrictive. If this is the case, an on-premises approach is probably a better option. But, with on-premises networks, external communications can be challenging. In general, internal communications can work better with UC services, whether it's a UCaaS platform or on premises. UC is designed, developed and deployed to handle internal communications.



Quote for the day:


"He that is overcautious will accomplish little." -- Friedrich Schiller


Daily Tech Digest - August 20, 2017

How to Prepare for the Next Cloud Outage

Preparing for a cloud service outage isn't much different than getting ready for any system failure, according to HyTrust's Krishnan. No matter the nature of the network, there will always be three pinch points, or "vectors of control," that managers need to master. The first is scope, which is the number of objects each admin or script is authorized to act upon at a particular time. Using the Microsoft outage as an example, a deployment task's scope would limit the number of containers it could operate on at one time. The second control vector is privilege, which controls what type of action an admin or script (task) can take on an object. An example of a privilege restriction would be a task that is allowed to launch a container but not to destroy one. 
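The two control vectors described so far, scope and privilege, can be sketched as a simple authorization check. The class and field names here are illustrative, not taken from HyTrust's product:

```python
# Minimal sketch of the two control vectors: every task is bounded by
# scope (how many objects it may touch at once) and privilege (which
# actions it may take). All names are illustrative assumptions.

class TaskPolicy:
    def __init__(self, max_scope: int, allowed_actions: set):
        self.max_scope = max_scope
        self.allowed_actions = allowed_actions

    def authorize(self, action: str, targets: list) -> bool:
        if action not in self.allowed_actions:
            return False  # privilege check: e.g. may launch, but not destroy
        return len(targets) <= self.max_scope  # scope check: blast-radius limit

# A deployment task may launch at most 10 containers and may never destroy one.
deploy = TaskPolicy(max_scope=10, allowed_actions={"launch"})
print(deploy.authorize("launch", ["c1", "c2"]))                  # → True
print(deploy.authorize("destroy", ["c1"]))                       # → False
print(deploy.authorize("launch", [f"c{i}" for i in range(50)]))  # → False
```

Bounding both vectors means a misbehaving script can damage at most `max_scope` objects in ways it is explicitly permitted to, which is what limits the blast radius of an outage like the one described.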


A former Marine cyber warrior explains how hackers will transform the face of modern combat

Cyber warfare is already used for such things as disabling air defense systems, but these attacks will grow dramatically in range and capability in the coming years. Thanks to the rise of Internet of Things technologies, which are now being adapted into everything from dams and power grids to commercial trucks, US cyber warfare teams will have an abundance of targets at their disposal.  It’s not hard to imagine future scenarios in which US forces use cyber warfare tactics to sabotage power plants, telecommunications infrastructure and other critical facilities, either through coordinated remote attacks or on-site Special Forces teams with embedded cyber warriors. We’ve already seen this to some extent with Stuxnet, Flame and other malware which were designed to disrupt the nuclear capabilities of adversarial states.


Merging big data and AI is the next step

Businesses can now process massive volumes of data, which was not possible before due to technical limitations. Previously, they had to buy powerful and expensive hardware and software. The widespread availability of data is the most important paradigm shift that has fostered a culture of innovation in the industry. The availability of massive datasets has corresponded with remarkable breakthroughs in machine learning, mainly due to the emergence of better, more sophisticated AI algorithms. ... Previously, chatbots had trouble identifying certain phrases or regional accents, dialects or nuances. In fact, most chatbots get stumped by the simplest of words and expressions, such as mistaking “Queue” for “Q” and so on. With the union of big data and AI, however, we can see new breakthroughs in the way virtual agents can self-learn.


Artificial intelligence is coming to medicine — don’t be afraid

Artificial intelligence (AI) is bringing us to the precipice of an enormous societal shift. We are collectively worrying about what it will mean for people. As a doctor, I’m naturally drawn to thinking about AI’s impact on the practice of medicine. I’ve decided to welcome the coming revolution, believing that it offers a wonderful opportunity for increases in productivity that will transform health care to benefit everyone. Groundbreaking AI models have bested humans in complex reasoning games, like the recent victory of Google’s AlphaGo AI over the human Go champ. What does that mean for medicine? To date, most AI solutions have solved minor human issues — playing a game or helping order a box of detergent. The innovations need to matter more. The true breakthroughs and potential of AI lie in real advancements in human productivity.


How Do You Get Data into Your Company DNA?

It would be nice if sound data management required nothing more than hiring great data scientists or having the right data tools. Unfortunately, it’s more complicated than that. Sure, having data experts on your team and a great data management toolset in your organization’s portfolio of IT resources forms the foundation for leveraging value from your data. But making the very most of your data requires help from everyone in your organization. That doesn’t mean every employee needs to get a stats Ph.D. It does, however, require you to implement some organization-wide policies and cultural values in order to brick smart data practices into your entire organization.


Gartner Predicts Information Security Spending To Reach $93 Billion In 2018

The Gartner report suggests that security services will continue to be the fastest growing segment – especially IT outsourcing, consulting and implementation services. However, hardware support services will see growth slowing, due to the adoption of virtual appliances, public cloud and software as a service (SaaS) editions of security solutions, which reduces the need for attached hardware support overall. ... “If you look at the continuous and almost unstoppable acceleration in breaches, I think these estimates are vastly underestimated. If you take a look at the aggregate losses due to data breaches in the last five years and project those forward, the growth rate would be at least an order of magnitude above what the spend estimates are to stop these breaches. ...”


How to make agile work for the C-suite

At the enterprise level, think of all of your corporate initiatives as a backlog, just as software developers think of future product features as a backlog. See your leadership team as employing an agile software-development framework that prioritizes the backlog based on importance, then tackles each task in sequence until they’re all completed. Reprioritize your enterprise backlog when new initiatives are added and supplement the traditional annual strategic-planning cycle with real-time, issue-based planning, so resources can be allocated more dynamically. Continuous planning can ensure that resources are being directed toward evolving priorities and away from initiatives that have grown less important.
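The backlog mechanics described above map naturally onto a priority queue: new initiatives are folded in as they arrive, and work is always pulled in priority order. A minimal sketch, with invented initiatives:

```python
import heapq

# Enterprise backlog as a min-heap keyed on priority (1 = most important).
# The initiatives and priorities are invented for illustration.
backlog = []
for priority, initiative in [(2, "CRM migration"), (1, "security audit"),
                             (3, "office refresh")]:
    heapq.heappush(backlog, (priority, initiative))

# Real-time reprioritization: a new urgent initiative jumps the queue
# without waiting for the annual planning cycle.
heapq.heappush(backlog, (1, "GDPR compliance"))

order = []
while backlog:
    order.append(heapq.heappop(backlog))  # tackled strictly in priority order
print([name for _, name in order])
```

The point of the analogy is the reprioritization step: pushing a new item is cheap, and the pull order always reflects current priorities rather than the plan as it stood at the start of the year.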


How IT became business problem solvers at Level 3 Communications

It’s important that we as an IT organization understand that entire journey, since unlike any other organization at Level 3, we are involved from quote to cash. Each step in the journey may be unique, but the reality is that they all build upon each other. In IT, we see the steel thread and the levers we can use to improve the experience. There used to be a time in IT when we would say to our business partners: “Don’t tell us the how; tell us the what.” We are beyond that now; we don’t wait for the “what.” Our job is to bring the “what” to our business partners. For example, members of my team were looking at how the business was processing orders. They found a way to aggregate multiple orders, which made the process much simpler. They came up with that themselves, and brought it to the operations group, where the solution was well received.


Australia Aims to Regulate Bitcoin Exchanges

When we speak about casinos and betting in Australia, it is important to note that the country is now in the process of changing its gambling regulation. The results of the amended regulation have already been felt: quite a few gambling companies have exited the market, and more are expected to stop operating in the country within the coming weeks. Consequently, more severe regulations in the bitcoin space will assist Australian officials in keeping the industry compliant. Next to this, it is common knowledge that bitcoin is a main currency for various illegal activities: ransoms, drug deals, weapons and more. Once the exchanges are regulated, illegal transactions will not be as frequent, at least on the territory of Australia. However, the above use cases are not the main rationale behind regulating the bitcoin exchanges in Australia.


Q&A on the Book Stupidity Paradox

Many organisations encourage people to think of themselves as inspirational leaders. But this often alienates their followers and means they ignore the nuts and bolts of getting a task done. The second is an attachment to branding. We witnessed military organisations which were more keen on running rebranding exercises than running military exercises. The third driver of functional stupidity is mindless imitation. Often large organisations copy others for no better reason than they want to keep up with the latest fashion. This leads firms to implement new initiatives which are inappropriate for them. The fourth is pointless policies and procedures which are thoughtlessly followed. Many professionals spend more time ticking off boxes than actually doing their job.



Quote for the day:


"If you want people to think, give them intent, not instruction." -- L. David Marquet


Daily Tech Digest - August 19, 2017

Oracle doesn't want Java EE any more

Oracle plans to explore its desire to offload Java EE with the open source community, licensees, and candidate foundations. Although Oracle has not named possible candidates, the Apache Software Foundation and the Eclipse Foundation are likely possibilities. Oracle has already donated the OpenOffice productivity suite and the NetBeans IDE to Apache, and the Hudson integration server to Eclipse. Like Java, all three technologies—OpenOffice, NetBeans, and Hudson—were acquired in Oracle’s 2010 acquisition of Sun Microsystems. Eclipse is ready to take on Java EE if chosen. “We believe that moving Java EE to a vendor-neutral open source foundation would be great for both the platform and the community,” said Eclipse Executive Director Mike Milinkovich. “If asked to do so, the Eclipse Foundation would be pleased to serve as the host organization.”


Next step in the content evolution

A recent ASG-commissioned technology adoption profile study, “Today’s Enterprise Content Demands a Modern Approach” by Forrester Consulting found 95% were using more than one system to manage enterprise content, including 31% using five or more systems. This leads to disjointed information and difficult access. Lack of flexibility is therefore one clear shortcoming of existing approaches to ECM. Organisations want to invest in systems and technology that allow them to grow and adapt to changing markets but traditional ECM often hinders their progress. Further, 82% of respondents reported an increase in unstructured data in the form of business content, like office documents, presentations, spreadsheets, and rich media. They are also managing transactional content from outside the organisation. Traditional ECM systems struggle to cope with this level of growth due to another key shortcoming – their inability to scale.


How Blockchain Technology Is 'Disrupting' The Art Economy As We Know It

Blockchain is the technology that supports Bitcoin and other cryptocurrencies, and it is now being used to decentralize other industries as well. Given that the blockchain is a distributed ledger, completely secure and transparent, users are able to connect to each other without the centralized hub of a corporation. Simply put, management has been replaced by machines. In this new decentralized world, art has been one of the first and greatest use cases. Artists who otherwise would have been forced to use a large-scale centralized company to distribute their work are now able to distribute it in a decentralized way, and to receive rewards for their creations without profit-skimming corporate structures in place. And there are entities seeking to disrupt matters, although whether they can succeed in their endeavours is another matter.


How a data cache can solve your JavaScript performance problems

Service workers can be unpredictable. They can generate their own responses, and their response mechanism is not baked into the browser. "There are no caching semantics baked into service workers, unless the developer adds them in," Weiss said. If a service worker is not able to create a response, it uses the fetch API to look further up the stack. At the network layer, the application then checks the HTTP cache, which uses very strict caching semantics. HTTP cache is also persistent, which allows it to save resources to disk for later use. However, it is considerably slower than MemoryCache, which operates at RAM speeds. If data is not found in HTTP cache, the browser makes one last check for the Push Cache available as part of HTTP/2. But this is more complicated, since different browsers have different rules for managing Push Cache.
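The lookup order described above – service worker cache, then HTTP cache, then the HTTP/2 push cache, then the network – can be sketched as a simple fall-through. This is an illustrative Python model of the logic, not actual browser code; the function and parameter names are invented for the example.

```python
def fetch_resource(url, sw_cache, http_cache, push_cache, network):
    """Return (layer_name, response) by falling through the cache layers
    in the order a browser checks them, hitting the network last."""
    for layer_name, layer in (("service-worker", sw_cache),
                              ("http", http_cache),
                              ("push", push_cache)):
        response = layer.get(url)
        if response is not None:
            return layer_name, response
    # Nothing cached at any layer: go to the network and
    # populate the (persistent) HTTP cache for next time.
    response = network(url)
    http_cache[url] = response
    return "network", response
```

With plain dicts standing in for the caches, a request found in the HTTP cache never reaches the push cache or the network, mirroring the strict ordering the article describes.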


Demystifying AI, Machine Learning and Deep Learning

Deep learning is the name for multilayered neural networks, which are networks composed of several “hidden layers” of nodes between the input and output. There are many variations of neural networks, which you can learn more about on this neural network cheat sheet. Improved algorithms, GPUs, and massively parallel processing (MPP) have given rise to networks with thousands of layers. Each node takes input data and a weight and outputs a confidence score to the nodes in the next layer, until the output layer is reached, where the error of the score is calculated. With backpropagation, inside a process called gradient descent, the errors are sent back through the network and the weights are adjusted, improving the model. This process is repeated thousands of times, adjusting the model’s weights in response to the error it produces, until the error can’t be reduced any more.
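The forward pass, error calculation, and weight adjustment described above can be shown at their smallest scale: a single sigmoid neuron (no hidden layers) learning logical OR by gradient descent, in plain Python. The learning rate, seed, and iteration count are arbitrary choices for the sketch.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny training set: learn logical OR.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

random.seed(0)
w = [random.uniform(-1, 1) for _ in range(2)]  # weights
b = 0.0                                        # bias
lr = 0.5                                       # gradient-descent step size

for _ in range(5000):                          # repeated thousands of times
    for (x1, x2), target in data:
        out = sigmoid(w[0] * x1 + w[1] * x2 + b)  # forward pass
        err = out - target                        # error at the output
        grad = err * out * (1 - out)              # gradient sent back
        w[0] -= lr * grad * x1                    # adjust weights in
        w[1] -= lr * grad * x2                    # response to the error
        b -= lr * grad
```

A deep network repeats exactly this update at every layer, with the chain rule carrying the error backward through each weight.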


Pentagon eyes bitcoin blockchain technology as cybersecurity shield

The key to blockchain’s security: Any changes made to the database are immediately sent to all users to create a secure, established record. With copies of the data in all users’ hands — even if some users are hacked — the overall database remains safe. This tamper-proof, decentralized feature has made blockchain increasingly popular beyond its original function of supporting bitcoin digital transactions. Many cutting-edge finance firms, for instance, have used blockchain to expedite processes and cut costs without compromising security. In Estonia, home of the video phone pioneer Skype, officials have reported using blockchain to track national health records. In Russia, experiments are underway to integrate blockchain into the general payment economy.
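The tamper-evidence behind that security comes from each block embedding a hash of its predecessor, so editing any record breaks every later link. A minimal illustrative hash chain (not any production blockchain) makes the mechanism concrete:

```python
import hashlib

GENESIS = "0" * 64  # fixed predecessor for the first block

def block_hash(prev_hash, payload):
    """Hash a block's payload together with the previous block's hash."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_chain(payloads):
    chain, prev = [], GENESIS
    for p in payloads:
        h = block_hash(prev, p)
        chain.append({"payload": p, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    """Recompute every link; any edited payload breaks the chain."""
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(prev, block["payload"]):
            return False
        prev = block["hash"]
    return True
```

Because every participant holds a copy of the chain and can run `verify` independently, a hacked copy with an altered record is immediately detectable, which is the property the Pentagon is interested in.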


Tech breakthroughs megatrend

Collectively, those driving factors are forcing big questions to the surface - questions that C-suite executives themselves are asking. To help provide answers, we tracked more than 150 discrete technologies, and have developed a methodology to identify the most pertinent of those technologies;  ... The specific technologies most impactful to a company can - and likely will - vary, of course, but when we analysed for technologies with the most cross-industry and global impact over the coming years, eight technologies emerged. They are at varying degrees of maturity; some have been around for years but are finally hitting their stride, while others are maturing rapidly. None will be surprising to CEOs; they are regular subjects of often breathless coverage in the popular press.


Hacker claims to have decrypted Apple's Secure Enclave

"Apple's job is to make [SEP] as secure as possible," xerub said. "It's a continuous process ... there's no actual point at which you can say 'right now it's 100% secure.'" Decrypting the SEP's firmware is huge for both security analysts and hackers. It could be possible, though xerub says it's very hard, to watch the SEP do its work and reverse engineer its process, gain access to passwords and fingerprint data, and go even further toward rendering any security relying on the SEP completely ineffective. "Decrypting the firmware itself does not equate to decrypting user data," xerub said. There's a lot of additional work that would need to go into exploiting decrypted firmware; in short, it's probably not going to have a massive impact. An Apple spokesperson, who wished to remain unidentified, stated that the release of the SEP key doesn't directly compromise customer data.


Businesses need to talk about the cloud

Performance issues are a commonly cited bugbear following a cloud migration – with research finding organisations experience a problem at least once every five days. If the application in question is business critical, this could be a serious detriment to the organisation. From high network latency to application processing delays – poor cloud performance costs businesses both time and money, and greatly affects the end-user experience. But for many organisations, simply understanding where a performance issue occurs in the first place is a challenge. In the ‘old days’ of on-premise IT infrastructure, life was simpler. Organisations could, for example, quickly identify a misbehaving server in their data centre and initiate a fix. Today, the picture is not that straightforward, particularly with the increased uptake of public cloud services, because ‘your’ server is now in someone else’s data centre.


All ‘things’ connected, the ‘I’ in the IoT – a closer look. Part three

Which technology or network type will prevail in the future is (very) hard to predict. In fact, there’s no real reason why they should be mutually exclusive; they don’t have to be. The fact that LTE networks have such broad global reach, and that they can also be used to provide NB-IoT and LTE-M networks with relative ease, could pose a threat to LPWAN networks. Especially when companies like Verizon and AT&T are the ones pushing the technology. Though the same can be said for LoRa as well: companies such as IBM and Cisco are showing immense interest, as are CSPs like Swisscom and KPN. On the other hand, with the LTE/cellular companies focussing on the high-end market, so to speak, and the LPWAN providers focussing on the lower to mid-market range, mainly in the form of sensor-based data transport, there could be room for both.



Quote for the day:


"The desire of knowledge, like the thirst for riches, increases ever with the acquisition of it." -- Laurence Sterne