Daily Tech Digest - August 27, 2017

Applying Enterprise Architecture Rightly

This article takes TOGAF as its reference point. In practice, TOGAF is understood to be a suggestive approach, NOT a prescriptive one. Xoriant has recently forayed into TOGAF with the help of newly certified TOGAF architects and is committed to providing solutions to clients based on proven success stories. Often, when interacting with business houses, requirements start with a single line or even a clause, e.g., “We need ERP.” This is where a business canvas is provisioned and plans are made to evaluate the opportunity and provide information on it. Considering the entire lifecycle of a business opportunity, it is very important to take the right first step so that the service provider does not incur losses.


Issues necessitate Change - Evolution of Enterprise Architecture

The drive for continuous improvement and Information Technology alignment demanded further improvements in the way we thought about architecture – moving ever deeper into the task of understanding an organization's business issues. Other issues drove changes as well – no chronology implied. Corporate espionage, whether stealing an organization's secrets or information about its customers, took advantage of corporate vulnerabilities and drove the need to look at those vulnerabilities holistically – that is, looking at people, processes, and technology. Architects and builders changed to look beyond just IT to address this area.


Where Enterprise Architecture and Project Management Intersect

Enterprise Architecture is not about writing or specifying software code or systems - it is about architecting optimized business processes and systems based on management's strategies for the organization. While software development certainly has its place in organizations, that is an IT function best left to IT and more IT-specific solution, data, and information architects. Since EA is evolving away from IT-centricity and IT project teams, it doesn't make much sense to make them members of IT project teams. However, it may make sense for IT project teams to utilize EAs as subject matter experts - who are advisors and give guidance to specific projects, but are not members of the specific teams OR responsible for any of their scope, timeline, or deliverables.


OpenJDK may tackle Java security gaps with secretive group

The vulnerability group and Oracle’s internal security teams would work together, and the group may occasionally need to work with external security organizations. The group would be unusual in several respects, and thus requires an exemption from OpenJDK bylaws. Due to the sensitive nature of its work, membership in the group would be more selective, there would be a strict communication policy, and members or their employers would need to sign both a nondisclosure and a license agreement, said Mark Reinhold, chief architect of the Java platform group at Oracle. “These requirements do, strictly speaking, violate the OpenJDK bylaws,” Reinhold said. “The governing board has discussed this, however, and I expect that the board will approve the creation of this group with these exceptional requirements.”


First Robocop to Join Dubai Police Force

That’s right, the time has come for Robocop to leave sci-fi land and enter reality… or sort of. A couple of years ago, Dubai’s police promised that by 2017 they would have enlisted their first robot police officer, and they have delivered. Built by the Spanish robotics company PAL Robotics, the police officer, known as REEM, was introduced to the public during the Gulf Information Security and Expo Conference. ... Although this is fascinating news, it’s still reality, and reality is never as fun as fiction, so this Robocop is a little duller. It’s not a half-human, half-robot cyborg; it’s just a robot with batteries. It cannot chase criminals – as mentioned, it has wheels, not legs – and it’s unarmed. However, even if REEM doesn’t have the cool design and features we are used to seeing on the big screen, it represents a tremendous achievement for science.


IoT: Penetrating the Possibilities of a Data Driven Economy

With manufacturing units now able to communicate with each other through the deployment of IoT system solutions, analysis of facility performance metrics can be performed in real time. Management executives can also drill performance monitoring down to the shop-floor level, which helps provide revealing manufacturing insights. Serving as an example is the manufacturing giant Caterpillar. The company has deployed the SAP Leonardo system, an IIoT technology, across all its operating facilities. The system furnishes real-time information about manufacturing data, energy utilization, machine performance, and production consumables. Taken together, this gives company executives a 360-degree view of the manufacturing processes, which leads to better tactical decision making.


Why are the stats on women in tech actually getting worse?

Girls look up to role models. We have a number of women role models in entrepreneurship and other fields, but there is a significant lack of women role models in technology. Only 23.5% of computer science degrees were awarded to women last year at one of the biggest universities in the States. Some seniors found that they knew of few female computer scientists working in the professional world, which may be one reason girls aren’t as interested in this field. This can be changed as well: female representation should not be lacking in any field, especially the ever-growing area of technology. Seniors at Stanford have come up with an organization, SHE++, which dedicates itself to telling the stories of women who work in programming; it inspires women to take up the technology field and to connect with other women already working in it.


Microsoft is making a blockchain that’s fit for business

Note that Coco is a framework, not a ledger; in fact, it uses other ledgers. Ethereum is already working, and Intel and J.P. Morgan Chase are porting their ledgers to Coco – plus other blockchain ledgers will also integrate with it. You can also choose which algorithm you want to use to achieve consensus. In one test using the Ethereum ledger in Coco, the network delivered 1,500-1,600 transactions per second with latency of 100-200 milliseconds – far faster than Ethereum itself running on the same hardware. Russinovich says Coco will also scale to networks with hundreds of thousands of participants. Because each transaction is only calculated once, time-sensitive or restricted data isn’t a problem either.


What Is Blockchain? A Primer For Finance Professionals

Now imagine that every time you send the monthly living allowance, you laid down a “block” with the transaction information carved into it. Both you and your child can see the block, confirming that the money was sent and received. ... Together, they create a record of all transactions with your future college graduate. When you get old and infirm, you can point to the chain, show your kid how much money you paid for college, and demand that they invest a similar amount in a high-quality nursing home. This is, more or less, how blockchain works. Each block is a record of a monetary transaction. The chain is a shared accounting ledger that is visible to all parties across multiple networks, or “nodes.” Every new transaction is verified by all nodes and, if valid, added to all copies of the ledger—in other words, a new “block” is added to the “chain.”
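The hash-linked “blocks” the primer describes can be sketched in a few lines of Python. This is a toy illustration, not a real blockchain (there is no consensus protocol or network of nodes here), and the field names are invented for the example:

```python
import hashlib
import json

def block_hash(transaction, prev_hash):
    """Hash the transaction together with the previous block's hash,
    so each block is cryptographically chained to the one before it."""
    payload = json.dumps({"transaction": transaction, "prev_hash": prev_hash},
                         sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(transaction, prev_hash):
    """Lay down a 'block' carrying the transaction information."""
    return {"transaction": transaction,
            "prev_hash": prev_hash,
            "hash": block_hash(transaction, prev_hash)}

def chain_is_valid(chain):
    """Any party can recompute the hashes; tampering with any block
    breaks the chain and is immediately visible."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != prev["hash"]:
            return False
    return all(b["hash"] == block_hash(b["transaction"], b["prev_hash"])
               for b in chain)

# Two monthly allowance payments, chained together.
chain = [make_block({"from": "parent", "to": "student", "amount": 500}, "0" * 64)]
chain.append(make_block({"from": "parent", "to": "student", "amount": 500},
                        chain[-1]["hash"]))
```

If either party later edits an old block (say, inflating an amount), `chain_is_valid` returns `False`, which is why the shared ledger serves as a trustworthy record.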


5 essentials for building the perfect Internet of Things beast

An IoT platform has more elements, and therefore is more complex, than a typical technology platform many are used to, they observe. These new platforms need to reach out to all the devices, sensors and applications and their underlying technology as well. "Look at the whole technology environment, not just the applications," the McKinsey authors advise. "Use fungible/off-the-shelf technology for the things that are less critical." Remember, too, that IoT is a different beast for every industry, or for every company for that matter. For a sportswear company, it may mean sensor-loaded sneakers. For a manufacturer, it means embedding sensors into production-floor tools. For an insurance company, it means planting telematics sensors in policyholders' cars.



Quote for the day:


"Sometimes life takes an unexpected wrong turn in the right direction." -- Unknown


Daily Tech Digest - August 26, 2017

Disaster Recovery Vs. Security Recovery Plans: Why You Need Separate Strategies

A security recovery plan is designed to stop, learn, and then correct the incident. "A disaster recovery plan may follow similar steps, but nomenclature would not likely use 'detection' to describe a fire or flood event, nor would there be much in the way of analytics," says Peter Fortunato, a manager in the risk and business advisory practice at New England-based accounting firm Baker Newman Noyes. "Further, not many disasters require the collection of evidence." Another risk in merging plans is the possibility of gaining unwanted public attention. "For instance, invoking a disaster recovery plan often requires large-scale notifications going out to key stakeholders," Merino says.


The Future of Public Cloud Storage for Big Data

Just as happened with Moore’s law when silicon chips met transistors, public cloud and big data are creating exponential effects. Recent research predicts that public cloud prices for big data processing and storage will decrease by half every few years while processing power will double. Public cloud skeptics predict that costs of big data storage in the public cloud will be the same or increase slightly in 10 years. However, this is not true, as the costs are expected to decrease significantly. On the other hand, the costs of upgrading big data software will increase. In a few years, Hadoop data lakes will need to be upgraded. Right now, data scientists prefer multiple versions of Spark, indicating the beginning of on-premise headaches. This can only get worse as Google, Amazon Web Services (AWS) and Microsoft pursue a serverless strategy.
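The "decrease by half every few years" claim is easy to turn into a back-of-the-envelope projection. The three-year halving period below is an illustrative assumption, not a figure from the research:

```python
def projected_cost(cost_today, years, halving_period_years=3.0):
    """Cost after `years`, assuming prices halve every `halving_period_years`
    (exponential decay, the mirror image of Moore's-law doubling)."""
    return cost_today * 0.5 ** (years / halving_period_years)

# A workload costing 100 (in any currency unit) per month today would cost
# roughly 10 per month in a decade under a three-year halving period --
# an order-of-magnitude drop, not the flat costs the skeptics predict.
ten_year_cost = projected_cost(100.0, 10.0)
```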


5 Industries AI Will Disrupt in the Next 10 Years

It's difficult to talk about AI without evaluating its place in the ecosystem. Loosely speaking, it starts with the Internet of Things, in which objects are connected to the internet and used to gather data. Once enough data has been gathered, it passes the arbitrary threshold and becomes "Big Data", which AI is used to interpret. When there are so many data points that no human could ever process them all, artificial intelligence becomes the only real alternative. But AI doesn't always know what it's looking for, which is where machine learning comes in. Loosely speaking, that's the process of using AI to analyze data in such a way that it 'teaches' itself to interpret it. AI disruption, then, is largely going to come in the form of new ways of processing and interpreting data that have never before been available. Here are just five of the industries that AI is set to disrupt.


3 Amazing Ways AI is Going to Wipe Out Cyber Crime!

Ultimately it comes down to the machine learning experts, who are the main players behind "educating" the machines. Digital signatures, authentication, IP hiding, identity masking, encryption, firewalls, etc. have already been implemented by various firms... but what new strategies can machine learning discover? Deep machine learning can apply various algorithms to identify malicious activity taking place on the network, detecting it by finding unusual patterns in interactions with the system infrastructure. You might have seen Google asking you to verify that you are not a robot: you click on some of the images, and then it lets you browse. When you browse Google for long hours on end, this is a common occurrence.
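The "unusual patterns" idea can be sketched with nothing more than basic statistics. Real systems use far richer features and learned models; the request counts and the two-standard-deviation threshold below are made up for illustration:

```python
from statistics import mean, stdev

def flag_anomalies(counts, threshold=2.0):
    """Flag indexes whose value lies more than `threshold` standard
    deviations from the mean -- a toy stand-in for the anomaly
    detection the article describes."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts) if abs(c - mu) > threshold * sigma]

# Hourly login attempts against one account; hour 5 is a brute-force spike.
logins = [12, 9, 11, 10, 13, 480, 11, 10]
suspicious_hours = flag_anomalies(logins)
```

A production system would use robust statistics (an outlier this large inflates the standard deviation itself) or a trained model, but the principle is the same: learn what "normal" looks like, then flag deviations.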


Security professionals name top causes of breaches

While this finding underlines the importance of user education and training, the respondents said human error is exacerbated by understaffed security teams and a flood of alerts and false positives. This highlights the negative impact of companies struggling to recruit cyber security teams in the face of a worldwide shortage of people with information security skills and the need for greater staff support. This shortage of cyber defenders with the right skills is further underlined by the fact that 43% of respondents said technology detected the attack, but the security team took no action, while another 41% said a combination of technology and human error was to blame. Respondents also blamed a lack of information resources to understand and mitigate attacks, with 42% saying they are left to figure them out themselves.


Leveraging Advanced Analytics to Power Digital Transformation

Recent conversations with Walker Stemple of Intel’s @intelAI organization got me thinking about where and how organizations can leverage “advanced analytics” to power their business models. Now “advanced analytics” is a broad definition, but I have included the following analytics in that definition: Regression, Clustering, Neural Networks, Machine Learning, Deep Learning, Artificial Intelligence and Cognitive Computing. And while these “classifications” seem to change on a regular basis (sometimes due to us getting smarter; sometimes due to non-value-add marketing hype), it is critical that tomorrow’s business leaders understand where and how to apply these advanced analytics to power their business models.


Ukraine Central Bank Detects Massive Attack Preparation

The National Bank of Ukraine - the country's central bank - declined to share a copy of the letter with Information Security Media Group, but confirmed that it had alerted banks to a new, potentially major attack. "In order to prevent cyber attacks, the National Bank of Ukraine consistently cooperates with banking sector participants, the State Service of Special Communication and Information Protection of Ukraine (SSCIPU), as well as relevant units of the Security Service of Ukraine and the National Police of Ukraine," a spokesman for the National Bank of Ukraine tells ISMG. "On August 11, the NBU promptly informed banks about new malicious code, its characteristics, indicators of compromise and the need to take preventive measures to prevent the networks from being attacked by malicious codes."


Looking beyond the hype of robotic process automation

RPA is particularly appealing for companies that are juggling millions or even billions of transactions a day. With such an overwhelming amount to deal with, they often struggle to effectively manage important tasks like addressing customer requests, processing files, moving information between different systems, allocating work and making decisions. But RPA promises to help some organizations alleviate the challenge and operate more efficiently by automating, and thus accelerating, transaction processing. They can then provide better customer service, which inspires continued loyalty and has a direct, positive impact on a business’ bottom line. ... The promise of RPA is creating massive hype in the market, which is being leveraged by RPA vendors to position their products as the “silver bullet” for any company looking to streamline and optimize operations.


What Is Data Mining? How Analytics Uncovers Insights

Data mining comes with its share of risks and challenges. As with any technology that involves the use of potentially sensitive or personally identifiable information, security and privacy are among the biggest concerns. At a fundamental level, the data being mined needs to be complete, accurate, and reliable; after all, you’re using it to make significant business decisions and often to interact with the public, regulators, investors, and business partners. Modern forms of data also require new kinds of technologies, such as for bringing together data sets from a variety of distributed computing environments (aka big data integration) and for more complex data, such as images and video, temporal data, and spatial data.


What is Rust? Safe, fast, and easy software development

Rust started as a Mozilla research project partly meant to reimplement key components of the Firefox browser. A few key reasons drove that decision: Firefox deserved to make better use of modern, multicore processors; and the sheer ubiquity of web browsers means they need to be safe to use. But those benefits are needed by all software, not just browsers, which is why Rust evolved into a language project from a browser project. Rust accomplishes its safety, speed, and ease of use through the following characteristics: Rust satisfies the need for speed. Rust code compiles to native machine code across multiple platforms. Binaries are self-contained, with no runtime, and the generated code is meant to perform as well as comparable code written in C or C++.



Quote for the day:


"No great manager or leader ever fell from heaven, its learned not inherited." -- Tom Northup


Daily Tech Digest - August 25, 2017

How Python makes programming simple

Because Python is easy and fast to write, that saves developer time, although this typically comes at the cost of execution time. The same programs in other languages—like C, C++, and Java—may take longer to put together, but they typically run many times faster than a Python app. But Python can also run fast when it needs to, because many third-party libraries for Python are written in faster languages like C. All Python has to do is plug into such a library, and it can run at or close to the speed of those languages when performance matters. Mastering new things in IT is always tricky, whether it’s containerization, devops, or extracting a little meaning from a lot of data. Python is designed to give you a leg up on getting all those things done, both now and into the future.
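The "plug into a faster library" point can be seen without any third-party packages: the built-in `sum` is implemented in C, so it beats an equivalent pure-Python loop on the same data (exact timings vary by machine):

```python
from timeit import timeit

data = list(range(1_000_000))

def python_sum(xs):
    """Summation written as an interpreted Python loop."""
    total = 0
    for x in xs:
        total += x
    return total

# Same result, very different speed: the built-in runs its loop in C.
loop_time = timeit(lambda: python_sum(data), number=10)
c_time = timeit(lambda: sum(data), number=10)
```

Libraries like NumPy take this much further, pushing whole array computations into compiled code, which is how Python stays practical for data-heavy work despite being an interpreted language.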


The current state of government cybersecurity is 'grim,' report says

When it comes to cybersecurity readiness, US government organizations aren't doing so hot. In a recent report from SecurityScorecard comparing the security practices of 18 industries, government ranked no. 16. "In the midst of investigations into a potential 2016 election hacking, regular major malware events, and an overall increase in the number of sophisticated cyberattacks, the report highlights that the government sector is lagging compared to almost every other industry," the report said. For the report, SecurityScorecard analyzed 552 local, state, and federal organizations to see how their security practices stacked up across 10 key categories. The only two industries that ranked lower were telecom (no. 17) and education (no. 18). For government, that's actually an improvement over last year, the report said, when it ranked dead last.


Indian CIOs to benefit from emerging and maturing technologies: Gartner

“This year’s Hype Cycle demonstrates the keen interest Indian organizations are taking in both emerging and maturing technologies,” said Pankaj Prasad, Principal Research Analyst - Gartner. “The market is witnessing the entry of local vendors in emerging, as well as mature, technology segments, including areas such as IoT, robotic process automation offerings and machine-learning-based technologies,” added Prasad. ... Some technologies, such as mobile money, social analytics and robotic process automation offerings, will support new ways of doing business across industries. Technologies such as machine learning, IoT and smart city frameworks are of a transformational nature, which will result in a significant transformation within the industry dynamics and in the creation of a new ecosystem.


Susanne Kaiser on Microservices Journey from a Startup Perspective

Microservices come with complexities like multiple independent services, operational & communication complexity, partitioned data, and the complexity of eventual consistency. This comes with challenges of transformation to microservices, such as the need for different skills & tools, and untangling the core functionality; the team still has to take care of the existing system, and the transformation takes longer than anticipated. Kaiser said the monolith to microservices journey in reality is evolutionary. ... The key concept of modeling microservices is loose coupling between the services and high cohesion within a service. The team also identified bounded contexts for microservices with well defined business functions.


Distributed data centers boost resiliency, but IT hurdles remain

Organizations should create an "assurance construct" to address those questions and show CIOs and CTOs how data traverses the network and how failover works, Traver said. That way, the entire business understands the level of resiliency that its infrastructure delivers. At this point, only major public cloud players, such as Google Cloud Platform, have the resources to establish true cloud-based resiliency with complete consistency across all data centers in the network, Lawrence said. "It's probably not something that enterprises will be able to aspire to -- perhaps not at least for the next decade and a half -- but perhaps when they have enough sites and different colos, it might be possible," he said. Ultimately, the kind of resiliency an organization pursues should depend on its applications.


15 noob mistakes even experienced developers still make

If you’re not writing C or C++, Make is probably not your friend. Make launches another compiler process for each file. Most modern languages are not designed to have a separate process launched for each file. Also resolving dependencies in a language like Java using Make is nearly impossible. I once worked at a large network equipment company and shortened its build process from three hours to like 20 seconds by converting its build to Ant. A shell script is also usually a bad move in the end. I recently wrote a shell build for a lab because I didn’t want everyone to have to download a whole Java tool set to run one little lab. I thought it was a good move, but it was a noob mistake (as always) because the next version of the software it depended on broke everything (as always).


.NET Standard 2.0 Is Finalized for Consistent API Usage

The .NET Standard project for Visual Studio 2017, hosted on a GitHub site belonging to the .NET Foundation, was announced last September. Microsoft said .NET Standard will replace Portable Class Libraries (PCLs) as the de facto tooling story used by developers for building multi-platform .NET libraries. ".NET Standard solves the code sharing problem for .NET developers across all platforms by bringing all the APIs that you expect and love across the environments that you need: desktop applications, mobile apps & games, and cloud services," Microsoft said in a huge blog post (with nearly 200 comments) explaining the standard in detail. Apparently facing developer confusion about exactly what .NET Standard is for, Microsoft has devoted some guidance to explaining it, even pointing to an analogy written by David Fowler.


Why cybercriminals like AI as much as cyberdefenders do

“AI is a hammer that can be used for good or bad,” said Jim Fox, a partner, principal and cybersecurity and privacy assurance leader at PwC. “And if your adversaries have a hammer, you'd better have one, too.” In the right hands, this mighty hammer can do a lot of good. Artificial intelligence software can monitor all network activity and quickly discern odd patterns that could indicate foul play, even if such patterns haven’t been flagged before. It can learn over time to discern truly suspicious behavior from normal patterns. Last year's Petya malware attack made decisions “at machine speed,” says one cybersecurity expert. “Nobody was guiding that malware. They wrote an intelligent program to do all that.” At the New York-based investment bank Greenhill & Co., Chief Information Officer John Shaffer sought a better way to deal with zero-day attacks.


What do macOS and Android have in common? Both are booming malware markets

macOS hasn't been doing well on the malware front lately. Q2 2017, Malwarebytes says, was bigger for macOS malware than the entirety of 2016. Add to that the discovery of more new macOS malware families in 2017 than any year on record and you have a clear indicator of the vulnerability of Apple computers. The threats facing macOS are different than Android or Windows, which is somewhat good news. Rather than ransomware and malware, which the report says is the smallest concern for macOS, PUPs and adware dominate. Many popular macOS apps have been found to contain threats—even those on the App Store. Popular websites for downloading software, such as Softonic and Macupdate.com, have also been found to contain malicious installers.


Handset makers may need to reboot data collection rules

“The dangers to privacy in an age of information can originate not only from the state but from non-state actors as well. We commend to the Union Government the need to examine and put into place a robust regime for data protection,” the Supreme Court judges said in their ruling. Even before the court’s verdict on Thursday, the government directed 30 handset makers including Apple, Samsung, Micromax and Xiaomi to share the procedures and processes used by them to ensure the security of mobile phones sold in the country by August 28. Handset makers insist they’re already protecting user data on the phones they sell. “We have always stood for securing the user data. User data on all our devices are fully secure, in compliance with the necessary laws and regulations,” said a spokesperson from Oppo.



Quote for the day:


"Continuous improvement is better than delayed perfection" -- Mark Twain


Daily Tech Digest - August 23, 2017

It's Time To Think Beyond Cloud Computing

Cloud computing giants haven’t ignored the lag problem. In May, Microsoft announced the testing of its new Azure IoT Edge service, intended to push some cloud computing functions onto developers’ own devices. Barely a month later, Amazon Web Services opened up general access to AWS Greengrass software that similarly extends some cloud-style services to devices running on local networks. Still, these services require customers to operate hardware on their own. Customers who are used to handing that whole business off to a cloud provider may view that as a backwards step. Edge computing’s vision of having “thousands of small, regional and micro-regional data centers that are integrated into the last mile networks” is actually a “natural extension of today’s centralized cloud,” Crawford says.


Quantum Computing, Artificial Intelligence (AI) and Solving the Impossible

Quantum computing differs from traditional binary computing in that it takes advantage of the strange ability of subatomic particles to exist in more than one state at any time (it’s like your children, where you can both love and hate them at the same time). In classical digital computing, a bit is a single piece of information that can exist in two states – 1 or 0. Quantum computing uses quantum bits, or ‘qubits’, instead. ... One important area where quantum computing is expected to have a dramatic impact is in improving the ability for reinforcement learning to process an exponentially-wider range of operating variables in real-time, which is vital in automated cars and smart entities like factories and hospitals. As an example, Google has built a quantum computer which is 100 million times faster than any of today’s machines.
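The bit-versus-qubit distinction becomes concrete in a tiny single-qubit simulation: a qubit's state is a pair of complex amplitudes, and the squared magnitudes give the probabilities of measuring 0 or 1. This is standard textbook quantum mechanics, not anything specific to Google's hardware:

```python
import math

def measure_probabilities(state):
    """Born rule: the probability of each measurement outcome is the
    squared magnitude of its amplitude."""
    a0, a1 = state
    return abs(a0) ** 2, abs(a1) ** 2

def hadamard(state):
    """The Hadamard gate; applied to |0> it produces an equal
    superposition -- the qubit is 'in both states at once' until measured."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

zero = (1.0, 0.0)            # a classical-looking bit: definitely 0
plus = hadamard(zero)        # superposition: 50/50 chance of 0 or 1
```

A classical bit holds one of the two rows of this state vector; n qubits hold amplitudes over all 2^n basis states simultaneously, which is where the exponential capacity the article mentions comes from.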


How To Bridge IT's Growing Generation Gap

The challenge for IT leaders is to manage those changes and balance differing priorities and expectations from the three age groups working in IT — millennials, baby boomers and the Generation X cohort stuck between them. It will require deft management skills, a lot of empathy and the ability to drive needed changes to keep organizations competitive in a fast-moving world. ... The push to accommodate millennials begins with the much-discussed need for companies to embrace "digital transformation" or become "digital organizations." IT experts may differ on exactly what that means, but there's widespread agreement about whom it applies to. "When it comes to the new skills required to accelerate digital transformation, a lot of existing staff who are baby boomers and Gen X-ers have legacy skills and need to be reskilled," says Lily Mok, an analyst at Gartner.


The winding road to GDPR compliance

With under a year to go, many businesses have not started preparations, and will need to develop and implement a strategy for compliance. Every organisation that processes the personal data of EU citizens will require a tailored strategy depending on, among other factors, company size, the types and amount of data it processes, and its current security and privacy measures. It is highly recommended that businesses seek legal advice to determine what may be required in their specific situation. However, there are common requirements that will affect all businesses – even the very smallest – that handle personal data. The first step on the road to GDPR compliance is to understand how personal data is stored, processed, shared and used within your organisation.


SDN: Technology to cut costs, speed new services

The past few years have seen the mainstream network vendors jump into SDN with both feet. They still offer feature-rich, turnkey switches with all the support and services mainstream enterprises have come to rely on but with a vendor-provided SDN controller. Many of the mainstream vendors also offer support for third-party controllers. The primary value is in reduction of operational expenses through the automation of configuration and management tasks instead of focusing on hardware costs. In actuality, network hardware accounts for less than 10% of overall data-center spend, while personnel costs can be well over half of a data center’s total cost of ownership. A small reduction in operational costs can pay significant dividends for the business.


The Hyper-Connected Economy: An Interview With Ken Sakai, MD Of Telehouse Europe

For companies that need secure, reliable access to one or more leading public cloud services, Telehouse Cloud Link, a multi-cloud connectivity exchange, delivers a private connection with predictable and scalable bandwidth between their network and cloud services. Telehouse’s collaboration with Microsoft allows enterprises and their IT infrastructure partners to seamlessly provision and manage private connections to Microsoft Azure and Microsoft Office 365 using a dedicated and predictable connection. ... Additional cloud service providers are expected to be added to Cloud Link in the near future and this ability to connect to multiple clouds through a single source removes the complexity of traditional network procurement.


Mimecast’s newly discovered email exploit isn’t a vulnerability, it’s a feature

Mimecast says that their newly discovered exploit undermines "the security and non-repudiation of email; even for those that use SMIME or PGP for signing…" That sounds frightening, but the reality is completely different. This isn't an exploit, or vulnerability. It isn't even a bug. What Mimecast describes in their advisory is a feature, and one that isn't even widely supported. Outlook.com and Gmail for example, block external calls to CSS using the LINK attribute. Mimecast makes mention of using EMBED, OBJECT, FRAME, or IFRAME, even SVGs as alternate modes of exploitation. Again, these are all known attack methods, and once more, many of the mainstream email providers block them. In fact, Mimecast themselves admit that Gmail, Outlook.com, and iCloud.com were not affected by their discovery.


Software-based networking brings new automation perks, challenges

Network automation gives IT organizations deploying complex applications the ability to control the rapid provisioning of network resources. It lets them centrally manage the network and reduce operational costs by shifting the burden of configuration from people to technology. Software-based networks can select appropriate network services based on parameters such as application type, quality of service and security requirements. ... Network professionals spend significant time and resources adapting the physical and virtual network to changes in applications, compute and storage resources, and device location. Software-based networking tools can automate change management by associating specific network and security policies with applications and devices, so the policies "follow" them as they migrate physically and virtually.
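The "policies follow the workload" idea can be sketched with a toy controller model (all names here are hypothetical, not any vendor's API): policy is attached to the workload object rather than to a switch port, so a migration reprograms the new host automatically:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Policy:
    qos_class: str         # e.g. "gold" for latency-sensitive traffic
    allowed_ports: tuple   # security policy: permitted TCP ports

@dataclass
class Workload:
    name: str
    policy: Policy
    host: str = "host-a"

class Fabric:
    """Toy controller: tracks which policy must be programmed on which host."""
    def __init__(self):
        self.host_rules = {}  # host -> {workload name: policy}

    def place(self, wl: Workload, host: str):
        # Remove rules from the old host, then program them on the new one,
        # so the policy travels with the workload rather than the port.
        self.host_rules.setdefault(wl.host, {}).pop(wl.name, None)
        wl.host = host
        self.host_rules.setdefault(host, {})[wl.name] = wl.policy

fabric = Fabric()
web = Workload("web-1", Policy(qos_class="gold", allowed_ports=(443,)))
fabric.place(web, "host-a")
fabric.place(web, "host-b")  # live migration
print(fabric.host_rules["host-b"]["web-1"].qos_class)  # gold
print(fabric.host_rules["host-a"])                     # {}
```

The manual alternative, reconfiguring ACLs and QoS on each switch a VM lands behind, is exactly the change-management toil the paragraph describes.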


The future will be fuelled by data

Overlaid with a cognitive platform, the process would not only be streamlined, it would be far more valuable to all the participants. "There is a lot of data we are generating here: identifying the tenant, landlord, specific lease, specific property and specific segment of that property," says Dobson. Add in the data the banks already hold about tenants and their supply chains, Dobson believes, and "There is a profound opportunity to use advanced analytics and predictive modelling to give insights to the tenant, the landlord and the bank. If you expand the scope of the network to other documents or instruments, you are probably scaling exponentially the data that you could observe and the conclusions you might draw." Financial services is just one area that blockchain will impact; supply chains, Internet of Things (IoT), risk management, digital rights management and healthcare are poised for dramatic change using blockchain networks.


Serverless computing may kill Google Cloud Platform

According to Allamaraju, serverless computing is outpacing industry darlings like Kubernetes. His own company, Expedia, “did over 2.3 billion lambda calls per month” back in late 2016, a number that has climbed since then. Nor is Expedia alone in discovering the productivity gains to be found with serverless computing: Coca-Cola, Nordstrom, Reuters, and others have jumped in. Yet it’s been technologies like Kubernetes and machine learning that Google has pinned its cloud hopes on. Focused on its Kubernetes-to-GCP and machine learning plays, Google has not built out the array of serverless services that its competitors have. As Mytton notes, “Once your core runtime requirements are met, the differences between the [serverless vendors’] services aren’t particularly important. … What does count is the availability of services to consume from within the cloud provider ecosystem.”



Quote for the day:


"Leadership is the art of influencing people to execute your strategic thinking" -- Nabil Khalil Basma


Daily Tech Digest - August 22, 2017

How Google is speeding up the Internet

BBR is not the first effort to speed up TCP. Researchers at North Carolina State University are credited with developing one of the most popular loss-based congestion control algorithms used in TCP today, named binary increase congestion control (BIC), and its successor, CUBIC. At a high level, these also record measurements to estimate the optimal speed at which to send data when congestion is detected. Another congestion control algorithm that has become popular is Reno. All of these use packet loss to detect congestion, though Van Jacobson, the Google engineer who co-developed BBR, says that to his knowledge BBR is the only TCP algorithm that actually estimates the speed of traffic to determine the best way to send it, regardless of whether packets have been lost.
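The distinction is easiest to see in BBR's core arithmetic: it continuously estimates the bottleneck bandwidth and the round-trip propagation delay, then sizes the amount of in-flight data around their product rather than waiting for loss. A simplified sketch of that calculation (real BBR adds windowed max/min filters and pacing-gain cycling):

```python
def bdp_bytes(bottleneck_bw_bps: float, rtprop_s: float) -> float:
    """Bandwidth-delay product: the bytes in flight that exactly fill the pipe."""
    return bottleneck_bw_bps / 8 * rtprop_s  # bits/s -> bytes/s, times RTT

# Example path: 100 Mbit/s bottleneck, 40 ms round-trip propagation delay.
bdp = bdp_bytes(100e6, 0.040)  # bytes needed to keep the link busy
cwnd = 2 * bdp                 # BBR caps in-flight data near 2x BDP

print(bdp)   # 500000.0
print(cwnd)  # 1000000.0
```

A loss-based algorithm on the same path would keep growing its window until a queue overflows and packets drop; BBR instead stops near the BDP, keeping the link full without building deep queues.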


Artificial intelligence will let us outsource tedious tasks to our phones

This week marks the debut of Essential’s first gadget. The Essential Phone is an anomaly: a sleek, premium smartphone not designed by Apple, Samsung or a discount Chinese brand. It has a mirrored ceramic back, titanium edges, a display that covers most of the phone’s front, and a magnetic connector for a new world of accessories and hardware upgrades that founder Andy Rubin says will let people hang onto their phones longer. Rubin recognizes that Essential confronts formidable competition, especially from Apple and Samsung. But while he applauds the former’s brand power and the latter’s vertical integration, he said “every saturated market needs a disruption. When there’s a duopoly, that’s the time to do it.”


Doing things right: Cloud and SecOps adoption

The goal of SecOps is to help companies deliver software more efficiently and more securely, while reducing risk for the organization over time. The reality is that, due to the new operating model in cloud environments, security and operations teams must work together: the security team identifies risks and then works with operations to remediate them. “No matter what resources you do or do not have at hand, including personnel, budget, or tools, SecOps is both critical and achievable,” he believes. But one thing crucial to its implementation is leadership buy-in – the people in charge must realize that security is on equal footing with availability and performance. “If the e-retail boom taught suppliers that they must invest in site availability like they would to ensure their brick-and-mortar has its lights on, they must also invest in security like they would to ensure that the alarms work and doors lock.”


New York University Abu Dhabi researchers develop 'unhackable' computer chip

The chip has a secret key that makes it virtually impossible to access, so it would function only for authorised users. “Without the secret key, the chips cannot be made functional,” he said. “The functionality of the chip - what it does and how it does it - can only be known if the secret key is known.” A patent application has been filed at the US Patent Office. The researchers are creating a web-based platform to make information about the chip available to the public. An extensive research paper by NYUAD’s Design for Excellence team will be presented in November at the ACM Conference on Computer and Communications Security in the US. “These are all theoretically proven points and we will present this at a top cyber security conference, but we need to test our claims practically as well," said Mr Sinanoglu.


Calls for UK boards to be better educated on cyber threats

One of the most worrying aspects is a failure to understand how serious the consequences of this ignorance are, said Simmonds. That ignorance has led to a lack of basic cyber hygiene, with companies typically lacking basic security controls and processes, and failing to train employees at all levels, from the board down, in how to deal with cyber threats. “This has been a consistent theme of Verizon’s annual Data Breach Investigations Report over the past 10 years,” said Laurance Dine, managing principal of investigative response at Verizon. “We’ve seen that the majority of data breaches could so easily have been prevented if basic measures and protocols had been in place. For example, we often see that around two-thirds of breaches are traced back to weak, stolen or lost passwords, which could easily be prevented using two-factor authentication.”
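Two-factor codes from authenticator apps are typically time-based one-time passwords (TOTP, RFC 6238): an HMAC over the current 30-second time step, truncated to a few digits, so a stolen static password alone is not enough to log in. A minimal sketch, checked against the RFC's published test vector:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: float, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    counter = int(for_time) // step                 # which 30-second window we're in
    msg = struct.pack(">Q", counter)                # counter as 8-byte big-endian
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 Appendix B test vector: SHA-1, 8 digits, Unix time = 59 seconds.
print(totp(b"12345678901234567890", 59, digits=8))  # 94287082
```

Because the code changes every 30 seconds and is derived from a shared secret the attacker does not hold, a leaked or reused password fails on its own, which is why Dine singles out two-factor authentication as the easy fix.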


How To Choose The Right Enterprise Mobility Management Tool

A key to choosing the best EMM solution is aligning the features and capabilities of the platform to your organization’s requirements. This includes such factors as what types of business apps users typically work with, what security and regulatory compliance requirements the company has, what sort of network and service management features it needs, which mobile operating systems are in use, what level of reporting capabilities is needed, and so on. Selecting the right platform isn’t just a matter of getting the most features, but acquiring the features that best meet the organization's requirements. “Organizational needs relating to mobility differ considerably, as do the infrastructure environments into which mobility solutions will be implemented,” Holtby says.


Are you ready for state-sponsored zombie malware attacks?

Zombie malware combines the most deadly aspects of malware and zombie computers into one horrible mess. Typically, malware gets into a computing device via phishing or an email attachment, which limits the scale of the attack. In contrast, zombie malware autonomously hunts for vulnerable systems across LAN, WiFi and VPN connections. Once zombie malware finds a system to infect, it uses the new host to scan for other systems, which can be anywhere on the globe. Another key aspect of zombie malware is the lack of a control channel to manage its destructive path (unlike the zombie computers used in DDoS attacks). Consequently, zombie malware simply destroys anything it can reach. For example, the NotPetya outbreak started on Ukraine government systems but then quickly spread around the globe.


How to get Android 8.0 Oreo on your Pixel or Nexus right now

While Google's own Pixel and Nexus devices are almost always first in line for a fresh Android rollout, this year's dessert-themed delight isn't actually quite ready to be served to everyone just yet. Google says it's in the midst of "carrier testing" with the Pixel, Nexus 5X and Nexus 6P Oreo builds and expects to start sending updates out to those devices soon. ... Realistically, the wait for Pixel and Nexus owners to get Oreo as an official over-the-air update likely won't be long. But we tech enthusiasts are a notoriously impatient bunch, and when something new is available, gosh darn it, we must have it. Well, not to fear, my fellow shiny-new-software fanatics: If you own a Pixel, Nexus 5X or Nexus 6P, you can actually get Android 8.0 Oreo on your phone this very minute — with the help of a handy little hack.


The cloud could drive open source out of the enterprise

First of all, open source’s no-cost attribute means less in the cloud. Public cloud providers will charge you for the time you use their cloud to access open source software—or any software. Thus, it doesn’t really matter whether you use Amazon Linux, Red Hat Linux, or closed-source platforms from Microsoft, because they are all “free” yet cost the same in cloud time charges for access. The same is true with databases: there’s not much difference in your monthly cloud bill whether you use open source databases, closed-source ones, or those that are native to a specific cloud, such as Amazon Redshift. If there is no dramatic cost advantage, most enterprises won’t care about the platforms they use in the long run, and that takes away one of open source’s historic strengths.


How to set up an all open-source IT infrastructure from scratch

Not choosing Microsoft Windows is the first obvious decision here. The cost is too high (both in terms of up-front monetary investment and the recurring costs of securing a closed platform). MacOS is, for the same reason, off the table. Which specific platform I'd choose, at that point, comes down to my specific needs within the organization. Chances are I would select a Linux-based platform (either a free Linux distribution – Debian, openSUSE, Fedora, etc. – or a similar system with paid support). Support is the main reason to consider a paid, closed system anyway, so you might as well get all the benefits with none of the drawbacks of a system like Windows. Save money, increase security. No brainer. For applications, I'd also standardize on LibreOffice for the office suite and one of the several open-source web browsers (such as Firefox).



Quote for the day:


"Knowledge Management is the art of creating value from intangible assets." -- Karl-Erik Sveiby