Daily Tech Digest - September 30, 2017

Securing Applications: Why ECC and PFS Matter

Many of us are familiar with Hypertext Transfer Protocol Secure (HTTPS), which uses a cryptographic protocol commonly referred to as Transport Layer Security (TLS) to secure our communication on the Internet. In simple terms, there are two keys: one available to everyone via a certificate, called a public key, and the other available only to the recipient of the communication, called a private key. When you want to send encrypted communication to someone, you use the receiver’s public key to secure that communication channel. ... The benefit of securing our communication to prevent snooping of sensitive data is obvious; however, encrypting the communication has its downside: it is computationally expensive and requires a lot of CPU processing to enable, and encrypted channels may also be used in malicious ways to send proprietary information.
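
To make the ECC and PFS ideas in the title concrete, here is a minimal sketch of an ephemeral elliptic-curve Diffie-Hellman (ECDHE) exchange, the mechanism behind perfect forward secrecy in TLS. It assumes the third-party Python cryptography package; the curve and key-derivation parameters are illustrative only.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each side generates a throwaway (ephemeral) EC key pair per session,
    # so a later compromise of a long-term key cannot decrypt old traffic.
    client_priv = ec.generate_private_key(ec.SECP256R1())
    server_priv = ec.generate_private_key(ec.SECP256R1())

    # Both sides derive the same shared secret from their own private key
    # and the peer's public key.
    client_secret = client_priv.exchange(ec.ECDH(), server_priv.public_key())
    server_secret = server_priv.exchange(ec.ECDH(), client_priv.public_key())
    assert client_secret == server_secret

    # A symmetric session key is then derived for the actual encryption.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"handshake demo").derive(client_secret)
    print(session_key.hex())

Because the key pairs are discarded after the session, recorded ciphertext stays unreadable even if the server's certificate key leaks later; that is exactly the forward-secrecy property.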


DNSSEC key signing key rollover: Are you ready?

“There may be multiple reasons why operators do not have the new key installed in their systems: some may not have their resolver software properly configured, and a recently discovered issue in one widely used resolver program appears to not be automatically updating the key as it should, for reasons that are still being explored,” ICANN says. It could also be an awareness issue: some operators may simply not have been aware of the deployment process. “ICANN is on schedule to begin using the private portion [for signing domains] shortly,” Vixie says. The most challenging part of this multistep, multi-year process was overseeing the plan’s development, seeking broad review and approval, and obtaining approvals from multiple internet governance organizations to execute the plan, Vixie says.
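
Operators who want to verify their own view of the root can do so directly. The following is a minimal sketch assuming the third-party dnspython package: it fetches the root DNSKEY RRset through the locally configured resolver and prints each key's tag; the new KSK-2017 has key tag 20326, so its presence indicates the new key is being served.

    import dns.dnssec
    import dns.resolver

    # Query the root zone's DNSKEY records via the local resolver.
    answer = dns.resolver.resolve(".", "DNSKEY")
    for key in answer:
        tag = dns.dnssec.key_id(key)
        kind = "KSK" if key.flags & 0x0001 else "ZSK"  # SEP bit marks signing keys
        print(f"key tag {tag} ({kind})")
    # Expect to see key tag 20326 (KSK-2017) once the new key is published.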


Finally, a Driverless Car with Some Common Sense

A lack of commonsense knowledge has certainly caused some problems for autonomous driving systems. An accident involving a Tesla driving in semi-autonomous mode in Florida last year, for instance, occurred when the car’s sensors were temporarily confused as a truck crossed the highway. A human driver would likely have figured out what was going on quickly and safely. Zhao and Debbie Yu, one of his cofounders, show a clip of an accident involving a Tesla in China, in which the car drove straight into a street-cleaning truck. “The system is trained on Israel or Europe, and they don’t have this kind of truck,” Zhao says. “It’s only based on detection; it doesn’t really understand what’s going on,” he says. iSee is built on efforts to understand how humans make sense of the world, and to design machines that mimic this.


Banking on machine learning

Machine learning refers to the use of mathematical and statistical models to teach machines about new phenomena. It involves ingesting raw information in large datasets, understanding patterns and correlations and drawing inferences. While this may seem similar to how humans learn, machine learning algorithms ‘learn’ at much faster speeds with the ability to adapt from mistakes and course-correct. Needless to say, there are numerous applications of ML in any banking field that requires repetitive work, high-accuracy tasks or even informed decision-making. Take data security, which is a key concern for banks. Deep Instinct, a cyber security company that leverages deep learning for enterprise security, states that new malware often contains code that is similar to previous versions.
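
That code-similarity observation is easy to illustrate. Below is a toy sketch with made-up data, assuming scikit-learn: binaries are represented as byte n-gram counts, and a new sample is flagged when it sits unusually close to a known malware sample. Real systems such as Deep Instinct's use deep networks over far richer features, but the intuition is the same.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.neighbors import NearestNeighbors

    # Hypothetical hex byte sequences from known malware and a new file.
    known_malware = ["55 8b ec 83 ec 20", "55 8b ec 6a ff 68"]
    new_sample = ["55 8b ec 83 ec 18"]

    # Represent each file as counts of short byte n-grams.
    vec = CountVectorizer(analyzer="word", ngram_range=(2, 3))
    X = vec.fit_transform(known_malware)

    # Flag the new file if its nearest known-malware neighbor is very close.
    nn = NearestNeighbors(n_neighbors=1, metric="cosine").fit(X)
    dist, idx = nn.kneighbors(vec.transform(new_sample))
    print(f"closest known sample: {idx[0][0]}, cosine distance: {dist[0][0]:.2f}")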


The business case for digital supply networks in life sciences


Unlike traditional supply chains, which are linear and siloed, digital supply networks are dynamic, interconnected systems that can more readily incorporate ecosystem partners and evolve over time. This shift from linear, sequential supply chain operations to an interconnected, open system of supply operations could lay the foundation for how life sciences companies compete in the future. Digital supply networks in life sciences can address challenges with optimal management of inventories, reliability, and visibility of products moving across the supply chain, or operations efficiencies and product yields. In view of the forces affecting life sciences—pricing pressures, the emergence of value-based and personalized medicine, and the expectations of customers and regulators—creating a life sciences digital supply network can be a logical new opportunity to deliver value.


6 ways to make sure AI creates jobs for all and not the few

Whenever I talk to people about the potential impact of artificial intelligence (AI) and robotics, it’s clear there is a lot of anxiety surrounding these developments. And no wonder: these technologies already have a huge impact on the world of work, from AI-powered algorithms that recommend optimal routes to maximize Lyft and Uber drivers’ earnings; to machine learning systems that help optimize lists of customer leads so salespeople can be more effective. We’re on the verge of tremendous transformations to work. Millions of jobs will be affected and the nature of work itself may change profoundly. We have an obligation to shape this future — the good news is that we can. It’s easier to see the jobs that will disappear than to imagine the jobs that will be created in the future but are as yet unknown.


Free ebook: Data Science with Microsoft SQL Server 2016


SQL Server 2016 was built for this new world, and to help businesses get ahead of today’s disruptions. It supports hybrid transactional/analytical processing, advanced analytics and machine learning, mobile BI, data integration, Always Encrypted query processing capabilities and in-memory transactions with persistence. It integrates advanced analytics into the database, providing revolutionary capabilities to build intelligent, high-performance transactional applications. Imagine a core enterprise application built with a database such as SQL Server. What if you could embed intelligence, i.e. advanced analytics algorithms plus data transformations, within the database itself, to make every transaction intelligent in real time? That’s now possible for the first time with R and machine learning built into SQL Server 2016.
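
sp_execute_external_script is the documented entry point for running R inside SQL Server 2016. The sketch below, assuming the pyodbc package and an instance with R Services enabled, scores rows in-database rather than exporting them first; the connection string, table and column names are hypothetical.

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=Sales;Trusted_Connection=yes"
    )
    cursor = conn.cursor()
    # The R script runs next to the data; only results leave the database.
    cursor.execute("""
        EXEC sp_execute_external_script
            @language = N'R',
            @script = N'OutputDataSet <- data.frame(score = InputDataSet$amount * 0.5)',
            @input_data_1 = N'SELECT amount FROM dbo.Transactions'
    """)
    for row in cursor.fetchall():
        print(row)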


Cloud Computing Security: Provider & Consumer Responsibilities

The first step cloud service providers take is to secure the data center where they host the IT hardware for the cloud. This is to secure the data center against unauthorized access, interference, theft, fires, floods and so on. The data center is also secured to ensure redundancy in essential supplies (for example, power backup and air conditioning) to minimize the possibility of service disruption. In most cases, providers offer cloud applications from ‘world-class’ data centers. The cloud provider ensures that its infrastructure and services comply with critical protection laws and standards such as data protection laws, the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), Criminal Justice Information Services (CJIS) requirements, the Sarbanes-Oxley Act, the Federal Information Security Management Act of 2002 (FISMA) and so on.


Want to be a better security leader? Embrace your red team

Successful business leaders understand the power of disruption as a pathway to anticipating unstated future customer needs. The concept of disruption as a force for innovation is powerful in the field of cybersecurity and often pushes business leaders to problem solve in new or unexpected ways. Proactively simulating attacks on your own organization is an excellent example.  With now-broad acceptance that attackers will get in and that compromise is expected, there are distinct advantages to being “productively paranoid.” Security leaders who are productively paranoid fully embrace the idea that the best way to play defense is to start playing offense. This doesn’t mean companies should “attack back,” but they need to understand the mindset and pathways attackers take to infiltrate organizations.


The digital workplace: 8 steps to greater agility, productivity

What is the digital workplace? It is a business strategy aimed at boosting employee engagement and agility through consumerization of the work environment, Rozwell says. Think of your one-size-fits-all ERP or expense management applications and imagine the opposite user experience. Your digital workplace should help individuals and teams work more productively without compromising operations. It should include computers, mobile devices, and productivity and collaboration applications that are web-based and sync in real time. Such tools should, for example, mimic the ease of use of Uber and Airbnb and the social aspects of Facebook and Instagram. IBM, for one, has undertaken a massive transformation of its workplace to lure new tech talent.



Quote for the day:


"The most effective debugging tool is still careful thought, coupled with judiciously placed print statements." -- Brian Kernighan


Daily Tech Digest - September 29, 2017

10 Critical Security Skills Every IT Team Needs

As hackers become more sophisticated and attacks more frequent, it’s no longer a matter of if your organization becomes a target, but when. That reality has forced many organizations to reassess how they address security efforts, and how best to allocate scarce resources toward mitigating the damage as quickly as possible. Here, having the right mix of security skills on board is key. “For a lot of our clients, they’re starting to realize that while they certainly want to hope for the best, they absolutely have to prepare for the worst,” says Stephen Zafarino, senior director of recruiting for IT recruiting and staffing firm Mondo. “Earlier this year, with the Chase and Home Depot breaches and the ransomware attacks on Britain’s NHS top of mind, everyone’s trying to figure out how to fortify defenses,” Zafarino says.


Why Data Governance Is Foundational for Data-Driven Success

Analytics governance ensures that all digital assets and activities that generate insights and information using analytics methods actually enable smarter business activities. Policies related to information relevance, security, visualization, data literacy, analytics model calibration and lifecycle management are key areas of focus. Data governance is focused on the data building blocks. Effective data governance brings together diverse groups and departments to enable the data-driven capabilities needed to achieve success. Data governance defines the accountabilities, policies and responsibilities needed to ensure that data sets are managed as true corporate assets. This implies that governed data sets are identified, described, cataloged, secured and provisioned to support all appropriate analytics and information use cases.


It’s hangover time for enterprise cloud computing

We’re in the hangover stage of cloud computing, with IT pros comparing their giddy expectations with the reality on the ground. What I find most interesting about the 451 Research study is that enterprises see the value of the cloud and are willing to pay more for services that meet their expectations. But the cloud technology providers aren’t meeting those expectations, particularly around customer service. This expectation gap has a historical cause: enterprises are accustomed to large enterprise vendors with account executives who provide a “single throat to choke.” But cloud technology providers just began to answer their phones a few years ago, so this customer service stuff is still new to them. I’m also not surprised by the frustrations around cloud migration.


Perspective on Architectural Fitness of Microservices

Domain-Driven Design (DDD) is the latest methodology available to software professionals for designing a piece of software that matches the mental model of a problem domain. In other words, Domain-Driven Design advocates modeling based on the practical use cases of the actual business. In its simplest form, DDD consists of decomposing a business domain into smaller functional chunks, possibly at either the business function or business process level, so that the complexity of both a business and problem domain can be better apprehended and resolved through technology. To this effect, figure 2 illustrates how the elements of the earlier business architecture meta-model collaborate to form two business domains. Because of the many documented implementation failures of service-oriented architecture (SOA) ...
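
As a minimal sketch of that decomposition idea (all names hypothetical), the fragment below models the same real-world concept separately inside two bounded contexts; the contexts share only an identifier, and a small translation function stands in for the anti-corruption layer at their boundary.

    from dataclasses import dataclass

    # Billing context: models only what billing needs to know.
    @dataclass
    class BillingCustomer:
        customer_id: str
        payment_terms_days: int

    # Shipping context: models only what shipping needs to know.
    @dataclass
    class ShippingCustomer:
        customer_id: str
        delivery_address: str

    # Translation at the boundary keeps each context's model independent;
    # the contexts share an identifier, never a whole model.
    def to_shipping(billing: BillingCustomer, address: str) -> ShippingCustomer:
        return ShippingCustomer(customer_id=billing.customer_id,
                                delivery_address=address)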


Why E-waste Should be at the Forefront of a Company’s Cybersecurity Plan


Some electronic devices, such as mobile devices, computers, and other items with storage ability, can hold valuable information that may be accessed by unauthorized individuals during the end-of-life process. That may pose a real cyber-security threat if such confidential information is stumbled upon by a cybercriminal. ... The fear of having their security breached via e-waste that is not properly handled has led to increasing concern among electronics users about potential exposure to cyber-attacks. Of course, that makes everybody a potential victim. We all use one electronic product or another, whether at home or in the office. Therefore, we are always apprehensive of losing vital information such as credit card details, social security numbers, or other confidential and sensitive information to cyber-attacks.


Google Cloud IoT Core hits public beta, offers management for millions of devices

One of the biggest new features is the ability to bring your own certificate. Users can now bring their own Certificate Authority (CA) for device keys, and Google Cloud IoT Core will verify the key in the authentication process. According to the release, this "enables device manufacturers to provision their devices offline in bulk with their CA-issued certificate, and then register the CA certificates and the device public keys with Cloud IoT Core." While the service will continue to support the MQTT protocol, it will now support HTTP connections as well. By doing so, the release said, it will make it easier to inject data into GCP at scale. Additionally, the release noted, the service will now feature logical device representation for use cases where a business might need to retrieve the last state of a particular IoT device.
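
The device side of this scheme is straightforward. Below is a minimal sketch assuming the paho-mqtt (1.x-style client construction) and PyJWT packages: the MQTT password is a short-lived JWT signed with the device's private key, whose public half (or the CA that issued its certificate) was registered with Cloud IoT Core. The project, registry and device names are placeholders.

    import datetime

    import jwt  # PyJWT
    import paho.mqtt.client as mqtt

    project, region = "my-project", "us-central1"
    registry, device = "my-registry", "my-device"

    # Cloud IoT Core authenticates devices with a JWT signed by the
    # device's private key, not with a conventional password.
    token = jwt.encode(
        {"aud": project,
         "iat": datetime.datetime.utcnow(),
         "exp": datetime.datetime.utcnow() + datetime.timedelta(minutes=60)},
        open("rsa_private.pem").read(),
        algorithm="RS256")

    client = mqtt.Client(client_id=f"projects/{project}/locations/{region}"
                                   f"/registries/{registry}/devices/{device}")
    client.username_pw_set(username="unused", password=token)
    client.tls_set()
    client.connect("mqtt.googleapis.com", 8883)
    client.publish(f"/devices/{device}/events", b"telemetry payload")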


How Your Company Can Close The Cybersecurity Skills Gap

"Looking at the other areas within your organization, you probably can... leverage some of that talent and create a rotation program, into a cyber team for three to six months," Worley said. “[Put] them with the right talent to help them, just like you would with an intern.” She said creating your own talent pools isn’t just useful to close the skills gap, it can can be extremely useful for when a crisis happens. While no one wants to hear that a crisis is a good thing, Worley said the Equifax and SEC breaches do "raise the awareness of employees, because they've not been touched by this thing. It's another thing when ... your identity may be at risk. It become very personal at that point. Maybe we now have an opportunity to have that dialogue.” Another additional area Worley said companies can help improve their cyber security gap, seems like a simple one: make sure all employees know the best security practices.



Most companies operate within the descriptive and diagnostic stages, using basic data warehousing and BI approaches to get quick views on what HAS happened. Predictive analytics is when organizations project what WILL happen … graduating from rearview mirror to human intervention combined with the automation of repetitive patterns through the application of predictive machine learning (ML) models. So why are most companies not further along the analytics progression? Frankly, most enterprises are drowning in an abundance of data types and sources - many of which contradict each other as data size and ingestion rates are also on different levels. Moreover, many organizations are not taking advantage of new technologies that can unlock and manipulate data.


Cyber Attacks Demand a New Approach to Education

First and foremost is the need for a better educated cyber workforce. More needs to be done to lay a foundation of technical literacy through STEM (science, technology, engineering and math) education. Strengthening the quality of STEM education is vital, and the effort must go beyond simply meeting benchmarks such as proficiency on standardized tests. A more holistic approach to STEM should explore the practical relationships between these disciplines and daily life, thus nourishing in the next generation a technical curiosity that begins in early childhood and spans long careers. Such an approach will ensure that innovation and adaptability become second nature in our approach to cyber technology.


When disasters strike, edge computing must kick in

We've seen how mobile network operators (MNOs) are taking advantage of edge computing themselves. It’s used to reduce latency. Those phone companies are increasingly using local computing boxes (often inside their many buildings, left over from the days of copper-requiring phone switches, and on their towers) to store and process data rather than centralizing it. “This ability will give a huge advantage to first responders,” Georgia Tech says of its idea. The team of researchers published a paper (pdf) where they describe their “fog-enabled social sensing services” API. In the paper, the researchers describe how Docker-friendly fog nodes connect or relay the distributed social sensors — the smartphone-carrying civilians, in other words — to hardened routers that can perform edge data processing and be pinged locally.



Quote for the day:


"When we have belief the hard work follows naturally." -- Gordon Tredgold


Daily Tech Digest - September 28, 2017

Professor Harish Bhaskaran of Oxford, who led the team, said: “The development of computers that work more like the human brain has been a holy grail of scientists for decades. Via a network of neurons and synapses the brain can process and store vast amounts of information simultaneously, using only a few tens of watts of power. Conventional computers can’t come close to this sort of performance.” C. David Wright, a co-author from the Exeter team, added that “Electronic computers are relatively slow, and the faster we make them the more power they consume. Conventional computers are also pretty ‘dumb,’ with none of the in-built learning and parallel processing capabilities of the human brain. We tackle both of these issues here — not only by developing new brain-like computer architectures, but also by working in the optical domain to leverage the huge speed and power advantages of the upcoming silicon photonics revolution.”


Before you deploy OpenStack, address cost, hybrid cloud issues

Training can become an indirect OpenStack cost. IT and developer staff may not have the requisite skill sets needed to tackle an OpenStack deployment. You may need to find more OpenStack-savvy staff to handle the job, spend the money to train up existing staff as Certified OpenStack Administrators, hire consultants to jump-start the work or some combination of these tactics. Consider the implications of OpenStack support. Organizations can certainly adopt a canned OpenStack distribution and associated support from vendors like Red Hat or Rackspace. As open source software acquired directly, however, there is no official support. If you choose to deploy OpenStack, assemble a suite of support resources to address inevitable questions or to resolve problems. Some resources are free, while other resources will incur added costs.


To combat phishing, you must change your approach

The threat surface is growing, and cybercriminals are becoming more sophisticated. They’re utilizing threat tactics that have made it increasingly difficult for organizations to protect themselves at scale. Cybercriminals are putting pressure on businesses by increasing the volume of these kinds of targeted attacks, dramatically outpacing even the world’s largest security teams’ ability to keep up. Visibility is sadly lacking within most of today’s organizations, and it’s unrealistic for security teams to secure something they can’t see. There’s no tool or widget that can totally fix this and make everything safe. But we can get to a point where we have the ability to construct a security program that reduces risk in a demonstrable way. We can establish metrics for where your risk profile is today.


Fintech’s future is in the back end

Fear that their money would ultimately be spent on on-premise, and therefore nonscalable, technology has been another reason investors have shied away from the opportunity. This fear arises from the tendency of institutions to want to keep a new technology “in the institution” because of security concerns. However, technology has matured enough to meet the reasonably strict security requirements banks impose on partners and vendors. Just six years ago, only 64% of global financial firms had adopted a cloud application, according to research from Temenos. But now, security has dramatically improved in cloud applications and banks are willing to adopt the technology at scale. This is evidenced in both cloud solution adoption and also the industry’s growing willingness to embrace an open banking framework.


WannaCry an example of pseudo-ransomware, says McAfee

WannaCry may have been a proof of concept, but the true purpose, he said, was to cause disruption, which is consistent with what researchers are learning when going undercover as ransomware victims on ransomware support forums. “When one of our researchers asked why a particular ransom was so low, the ransomware support representative told her that those operating the ransomware had already been paid by someone to create and run the ransomware campaign to disrupt a competitor’s business,” said Samani. “The game has changed. The reality is that any organisation can hire someone to disrupt a competitor’s business operations for less than the price of a cup of coffee.” In the face of this reality, Samani said the security industry and society as a whole have to “draw a line in the sand”.


The Digital Intelligence Of The World's Leading Asset Managers 2017

Where once the asset management sector was a digital desert, websites and social media channels abound. Whilst this represents genuine progress, the content and functionality within them leaves a lot to be desired in most cases. Quality search functionality is hard to find, websites resemble glorified CVs and blogs read like technical manuals. As for thought leadership, well there’s little thought and no leadership. Social media, especially Twitter and LinkedIn, are swamped with relentless HR tweets and duplicate updates. It’s clear that asset managers are missing an opportunity to create content that resonates with FAIs and can build lasting two-way relationships. Over the following pages we present our findings in detail and take a closer look at the digital successes and failures within the world’s leading asset managers.


Heads in the cloud: banks inch closer to cloud take-up

On the one hand, cloud providers – such as the leader of the pack, Amazon Web Services – are likely to have security processes and technology that are at least as advanced as those of their banking clients, thanks to their technical expertise and economies of scale. On the other hand, providers can pass on a bank’s data or system management to yet another contractor, increasing security risks present in traditional outsourcing. The EU’s General Data Protection Regulation, coming into force next year, will up the ante on data security. The new rules require, among other things, that bank customers are able to request that the personal data held on them is deleted. One practical outcome, say lawyers, is that banks will have to clarify to cloud providers exactly how they should handle ...


Inside the fight for the soul of Infosys


Murthy criticized Sikka's pay and his use of private jets, and claimed that corporate governance standards had eroded during his tenure. Saying he could no longer run the company amid such criticism from a company founder, Sikka resigned as chief executive on Aug. 18 and left the board six days later. Three other directors followed him out the door, including the former chairman, R. Seshasayee. Murthy's criticisms haven't let up since Sikka's resignation. Speaking to shareholders on Aug. 29, he detailed his "concerns as a shareholder" over how the company's board members approved a severance package worth roughly 170 million rupees ($2.65 million) for former Chief Financial Officer Rajiv Bansal, who left the company in October 2015.


Should CISOs join CEOs in the C-suite?

A working partnership between the CIO and the CISO is clearly a successful formula, regardless of who reports to whom. “CISOs should report to the CEO with further exposure and responsibility to the board of directors,” says Alp Hug, founder and COO at Zenedge, a DDoS and malware protection vendor. “The time has come for boardrooms to consider cybersecurity a key requirement of every organization's core infrastructure along with a financial system, HRMS, CRM, etc., necessary to ensure the livelihood and continuity of the business.” If a board of directors says defending their organization against cyber crime and cyber warfare is a top priority, then they’ll demonstrate it by inviting their CISO into the boardroom. “Of course CISOs and equivalents will say they should report to the CEO,” says John Daniels ...


The ins and outs of NoSQL data modelling

Data modelling is critical to understanding data, its interrelationships, and its rules. A data model is not just documentation, because it can be forward-engineered into a physical database. In short, data modelling solves one of the biggest challenges when adopting NoSQL technology: harnessing the power and flexibility of dynamic schemas without falling into the traps that a lack of design structure can create for teams. It eases the on-boarding of NoSQL databases and legitimises the adoption in the enterprise roadmap, corporate IT architecture, and organisational data governance requirements. More specifically, it allows us to define and marry all the various contexts, ontologies, taxonomies, relationships, graphs, and models into one overarching data model.
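
A small sketch makes the forward-engineering point concrete. The two documents below (all names hypothetical) realize the same logical "order" concept in different physical shapes: embedding favors read-heavy access at the cost of duplication, while referencing keeps updates normalized at the cost of extra lookups.

    # Embedded design: one read returns everything, but customer data
    # is duplicated across orders and must be kept in sync.
    order_embedded = {
        "_id": "order-1001",
        "customer": {"id": "cust-42", "name": "Acme Ltd"},
        "lines": [{"sku": "A-1", "qty": 3}, {"sku": "B-7", "qty": 1}],
    }

    # Referenced design: updates touch one place, but reads must resolve
    # the references in application code.
    order_referenced = {
        "_id": "order-1001",
        "customer_id": "cust-42",
        "line_ids": ["line-1", "line-2"],
    }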



Quote for the day:


"If you realize you aren't so wise today as you thought you were yesterday, you're wiser today." -- Olin Miller


Daily Tech Digest - September 27, 2017

Google's Pixel phone has basically been a critical success from the start. Even a year after its debut, the phone has remained a standard of comparison to which all other Android devices are held – and generally not to their benefit. That's mostly due to the Pixel's prowess in areas where other Android device-makers can't (or won't) compete. Significant as that is, though, ask an average non-tech-obsessed smartphone user what they think about the Pixel – and all you're likely to get in response is a glossy-eyed stare. Google may be positioning the Pixel as a mainstream device and even marketing it as such, to a degree, but it hasn't yet managed to break through that Samsung-scented wall and make its phone impactful in any broad and practical sense.
The scan would be a type of authentication known as biometric security. Smartphone fingerprint scanners have long led the way in this market as one of the most popular methods. However, the inclusion of facial scanning in the iPhone X and some Android phones has furthered the conversation on what physical characteristics can be used to secure a computing device. In addition to being used for smartphones, the heart scan technology could be used in airport security screenings, the release said. And, for those who may be worried about potential health effects of the scans, Xu mentioned in the release that the strength of the signal "is much less than Wi-Fi," and doesn't pose a health concern. "We are living in a Wi-Fi surrounding environment every day, and the new system is as safe as those Wi-Fi devices," Xu said in the release.


Rethinking security when everyone's Social Security number is public

"The assumption that we previously held, which was that Social Security numbers and driver's license numbers are relatively private ... that's now gone," he said. "Beyond how Equifax changes credit scoring, there's a big question about how Equifax changes identity validation." This is a distinctly separate issue from fraud detection, Perret said. Bank accounts and card numbers can be shut down and reissued, but banks can't do the same for Social Security numbers and other identity factors. "On the fraud side, there's a ton of work we can do, including multifactor authentication," he said, but "the KYC requirements are pretty explicit ... so that needs to be updated." Indeed, a lot of the security practices being used today are done more out of tradition than out of effectiveness.


Another experiment with currency? RBI is looking at its own Bitcoin

The success of Bitcoin, a popular cryptocurrency, may have encouraged the central bank to consider its own cryptocurrency since it is not comfortable with this non-fiat cryptocurrency, as stated by RBI executive director Sudarshan Sen a few days ago. Despite RBI's call for caution to people against the use of virtual currencies, a domestic Bitcoin exchange said it was adding over 2,500 users a day and had reached five lakh downloads. The company, launched in 2015, said the increasing downloads highlighted the "growing acceptance of Bitcoins as one of the most popular emerging asset class."  A group of experts at RBI is examining the possibility of a fiat cryptocurrency which would become an alternative to the Indian rupee for digital transactions.  According to a media report, RBI's own Bitcoin can be named Lakshmi, after the Hindu goddess of wealth.


IT and Future Employment - Data Scientist

A tongue-in-cheek definition is, a computer scientist who knows more statistics than his or her colleagues, or a statistician who knows more computer science than his or her colleagues. Time will tell if data science will become a new discipline, or if it will remain a cross-disciplinary field between these two (and perhaps other) fields. The statistician David Donoho published a paper in 2015 with the provocative title “Fifty Years of Data Science”. He was referencing the statistician John Tukey’s call, more than 50 years ago, for statistics to expand into what we now call data science. Donoho’s paper is well worth reading. I’ll list the six subfields of data science that he identifies and make several comments about each one ... A growing demand for people trained in data science has caused the shortage of these people to balloon. Moreover, only limited opportunities to obtain such training exist today


AI is changing the skills needed of tomorrow's data scientists

AI is changing the nuts and bolts of data management, relieving data teams of a lot of tedious, manual dirty work so that they can focus their time on creating business outcomes, and allowing data scientists to work at a speed and scale that is impossible today. The data scientist of tomorrow must be prepared to work with the AI revolution, optimizing processes without losing the human ability to think creatively and apply data-driven insights to real-world problems. The next generation of data scientists will be even more necessary for helping to apply models and algorithms to problems and processes across the enterprise. For data science students, it’s not only crucial to understand the data and the technology; it’s equally valuable to learn how to function in teams, collaborate and teach.


4 Job Skills That Can Boost Networking Salaries

The report dissects the salaries of more than 75 tech positions, including eight networking and telecommunications roles and two network-specific security roles. Among the 10 network and telecommunications roles, network architects will be paid the most in the coming year, Robert Half Technology says. ... By comparison, network architects in the 75th percentile can expect to see starting salaries of $160,750, the 50th percentile can expect $134,000, and the 25th percentile will earn $112,750, according to the guide. This is the first year that Robert Half Technology is breaking down compensation ranges by percentile in its annual salary guide. The categories are designed to help hiring managers weigh a candidate's skills, experience level, and the complexity of the role when making an offer.


Critical Network of Things: Why you must rethink your IoT security strategy

Having made, lost and then re-made my fortune in and around the industry over the past 20-plus years, I cannot help but smile over the level of hype — and, as a certain US President would call it, “fake news” — surrounding the current world of IoT. In spite of what the media and investors would like to think, IoT is not new. I can recall building all sorts of systems including AMR (Automatic Meter Reading/Smart Metering) networks covering whole cities, pharmaceutical storage monitoring, online pest/rodent trap systems, truck and trailer tracking, foodstuff refrigeration monitoring and land subsidence monitoring, just to name a few examples. They all followed the same basic architecture as we see with today’s IoT offerings, but under the label M2M (Machine to Machine).


5 fundamental differences between Windows 10 and Linux

Although in the early years hardware support was a serious issue and the command line was a requirement, in the last five years there have been only rare occasions when I've run into a problem that couldn't be overcome. I cannot say the same thing about Windows. No matter the iteration, I've always managed to find troubling issues with the Microsoft platform. Generally speaking, those issues can be managed. The latest iteration of Windows is no exception. Coming from Windows 7, I skipped 8 and headed directly to 10. I've found going from Windows 7 to 10 akin to making the leap from GNOME 2.x to 3.x. The metaphor was quite different and took a bit of acclimation. Even though they go about it very differently, in the end, both platforms have the same goal—helping users get their work done.


The Five Steps to Building a Successful Private Cloud

Like every engineering project, setting wrong expectations and unrealistic goals will lead to a poor outcome. It doesn’t have to be that way. Once you have a clear understanding of what problems you need to solve for stakeholders, you must define clear goals and requirements. For example, look at the existing pain points for your developers and how your private cloud solution will solve or mitigate those problems. Improving the developer experience ensures faster adoption and long-term success. Making the move to a private cloud requires focus, perseverance, motivation, accountability, and strong communication. You must have a good understanding of your existing service costs by doing a thorough total cost of ownership (TCO) analysis. What do day-to-day operations look like when supporting private infrastructure?
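
As a toy illustration of that TCO step (all figures hypothetical), the sketch below compares three-year costs of the status quo against a private cloud build-out that carries a one-time migration charge.

    years = 3
    current = {"hardware": 120_000, "licenses": 45_000, "ops_staff": 300_000}
    private_cloud = {"hardware": 90_000, "licenses": 60_000,
                     "ops_staff": 220_000, "migration": 50_000}  # migration is one-time

    current_tco = sum(current.values()) * years
    recurring = sum(v for k, v in private_cloud.items() if k != "migration")
    cloud_tco = recurring * years + private_cloud["migration"]
    print(f"current: ${current_tco:,}  private cloud: ${cloud_tco:,}")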



Quote for the day:


"If you only read the books that everyone else is reading, you can only think what everyone else is thinking." - Haruki Murakami


Daily Tech Digest - September 26, 2017


Time to embrace a security management plane in the cloud


There’s an old saying: Change is the enemy of security. To avoid disruptive changes, many cybersecurity professionals strive for tight control of their environment, and this control extends to the management of security technologies. Experienced cybersecurity professionals often opt to install management servers and software on their own networks so that management and staff “own” their technologies and control everything they can. This type of control has long been thought of as a security best practice, so many CISOs continue to eschew an alternative model: a cloud-based security management control plane. Given the history of cybersecurity, this behavior is certainly understandable — I control what happens on my own network, but I have almost no oversight of what takes place on Amazon Web Services (AWS), Microsoft Azure or Google Cloud Platform.


Canada's Tough New Breach Reporting Regulations

Previously in Canada, entities experiencing a breach were required to identify what kind of breach occurred and to notify regulators. "Contacting affected individuals [about the breach] would be something you would delegate to the regulators to get advice and guidance on," he says. But that all changes under the Digital Privacy Act of 2015, which amended certain Canadian privacy regulations in three key ways and will likely go into effect by the end of 2017, Ahmad says. Those changes include mandatory breach notification to affected individuals; keeping a record log for two years of any types of data breaches that occur; and imposing sanctions of up to $100,000 for each violation of the new law, he says. Those amendments provide "a bit more teeth" to Canadian data breach legal requirements, he notes.




From buzzword to boardroom – what’s next for machine learning?

You only need to think of the allocation of payments to invoices, the selection of applicants in the HR area, the evaluation of marketing ROI, or forecasts of customer behaviour in e-commerce transactions. Machine learning offers great potential for companies from the big-data environment, provided they muster the necessary developer capacities to integrate machine learning into their applications. As AI moves from the future into the present, organisations not only want to gain insight into their own processes via classical process mining, they are also looking for practical support for the decision-making process, such as guidance on how to further optimise single process steps or efficiently eliminate any hurdles that still exist. By doing so, they can understand which influencing factors would be worthwhile tackling first.


Firms look to security analytics to keep pace with cyber threats

Implementing security analytics can take time and money, especially if a business is using outdated hardware and software. Gene Stevens, co-founder and CTO of enterprise security supplier ProtectWise, says many CISOs are finding it difficult to retain forensics for an extended period of time in a way that is cost-effective and easy to manage. However, his company has come up with an intelligent, analytics-oriented platform to tackle this problem. “With a memory of network activity, security teams can go back and identify whether they were compromised by an attack once it is discovered – and assess the extent of its impact,” says Stevens. However, traditional approaches are costly to scale and laborious to deploy.


New managed private cloud services hoist VMware workloads

CenturyLink's new VMware managed private cloud, CenturyLink Dedicated Cloud Compute Foundation, rearchitects its flagship private cloud onto Hewlett Packard Enterprise (HPE) hardware. It is cheaper and 50% faster to provision than its predecessor, which required multiple integration points across network, compute and virtualization from five vendors. That's typical with many private clouds that require users to coordinate technologies, either within OpenStack or earlier versions of VMware, said David Shacochis, vice president of product management at CenturyLink. VMware Cloud Foundation serves up an integrated stack with vSphere, NSX and vSAN, which means fewer moving pieces, improved self-service features and security control, Shacochis said.


Data storage in Azure: Everything you need to know

On Azure, things are different. Instead of having to manage data at an operating-system level, Azure’s object file system leaves everything up to your code. After all, you’re storing and managing the data that’s needed by only one app, so the management task is much simpler. That’s where Azure’s blob storage comes in. Blobs are binary large objects, any unstructured data you want to store. With a RESTful interface, Azure’s blob store hides much of the underlying complexity of handling files, and the Azure platform ensures that the same object is available across multiple storage replicas, using strong consistency to ensure that all versions of a write are correct before objects can be read. 
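
In practice that object model is a couple of calls. Here is a minimal sketch assuming the azure-storage-blob Python package; the container, blob name and connection string are placeholders.

    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection string>")
    blob = service.get_blob_client(container="reports", blob="2017/q3.json")

    # Uploads and downloads are REST calls against the object store;
    # there is no file system or operating system to manage.
    blob.upload_blob(b'{"revenue": 42}', overwrite=True)
    print(blob.download_blob().readall())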




What’s your problem? Survey uncovers top sources of IT pain

Cost-cutting has always been a focus for enterprise IT, but as the diversity and volume of data increase, complexity and storage sprawl are straining budgets. Indeed, budget constraints were a top challenge for 35% of IT pros surveyed at VMworld 2017. A metadata engine can cut costs in several ways. First, it can automatically and transparently move warm and cold data off high-performance resources to dramatically optimize the use of an organization’s most expensive infrastructure. Second, it enables organizations to dramatically increase the utilization of their existing resources. With global visibility and control, organizations can view data location and alignment against SLAs, as well as available performance and capacity resources. This allows IT to instantly observe resources that might be nearing thresholds, and subscribe to alerts and notifications for those thresholds.


FBI's Freese Shares Risk Management Tips

Confusion over the definitions of "threat" and "risk" exist when IT security teams talk to members of the executive suite. One strategy security professionals may consider is approaching the discussion from a business perspective, instead of leading with fear, says Don Freese, deputy assistant director with the FBI's information technology branch. Freese, who served as a keynote speaker Monday at the ISC(2) Security Congress convention in Austin, Texas, noted that risks are measurable, providing that companies practice good security hygiene, such as logging network activity and taking inventory of the data that the enterprise possesses. In addition to those best practices, Freese also advises IT security leaders to consider the industry that they operate in and the type of data that would be desired by cybercriminals, or nation states.


8 Features of Fintech Apps that Appeal to Millennials

A new wave of fintech apps is set to hit smart devices over the coming years. A number of new startups, such as the digital-only bank Atom and the international money transfer service TransferWise, are going to revolutionize the financial industry. Fintech investments have grown exponentially over the last three years, and there are many opportunities for developers, investors and executives of “legacy companies” to ride on this wave. At the same time, of all the consumer demographics, Millennials are expected to be pivotal in driving changes. As a consequence, fintech developers are aggressively targeting them, tailoring apps to solve key pain points. Let’s look at some of the common features that software developers are currently including in their apps, along with those that are expected to become widespread, in order to appeal to Millennials.




What’s Different about Google’s New Cloud Connectivity Service

Dedicated Interconnect is only one of a set of cloud connectivity options from Google; it’s designed for handling workloads at scale, with significantly high-bandwidth traffic of more than 2Gbps. You can also use it to link your corporate network directly to GCP’s Virtual Private Cloud private IP addresses. Taking your cloud traffic off the public internet and onto your own network range gives you more options for taking advantage of cloud services. “Networking technologies are enabling applications and data to be located in their best execution venue for that workload,” Traver noted. Like its competitors, Google Cloud Platform requires you to connect to one of several global peering locations, so as well as Google’s charges, you’re also going to need to pay your network service provider to reach Google’s peering points.



Quote for the day:

"The two most powerful warriors are patience and time." -- Tolstoy

Daily Tech Digest - September 25, 2017

Deloitte hit by cyber-attack revealing clients’ secret emails

The Guardian understands Deloitte clients across all of these sectors had material in the company email system that was breached. The companies include household names as well as US government departments. So far, six of Deloitte’s clients have been told their information was “impacted” by the hack. Deloitte’s internal review into the incident is ongoing. The Guardian understands Deloitte discovered the hack in March this year, but it is believed the attackers may have had access to its systems since October or November 2016. The hacker compromised the firm’s global email server through an “administrator’s account” that, in theory, gave them privileged, unrestricted “access to all areas”. The account required only a single password and did not have “two-step” verification, sources said.


Let’s Not Get Physical: Get Logical

In the ideal future, there would be no programmers responsible for data movement. Instead, the data infrastructure would provide the illusion that all data is almost instantly available at the physical point of its need. Data consumers, including data analysts, would log on to a data catalog, shop for, and request the data they needed. That data would be described at a high level, with its business meaning clearly spelled out for both the human and the machine. (We call that computable meaning.) When a user requested data to be delivered to a certain point (perhaps a virtual point in the cloud), the data infrastructure would start copying the data from its origin, using replication techniques—meaning no potentially deforming transformations would be built into the data movement.



How to Survive Wall Street’s Robot Revolution

Consider the junior investment banker, who spends much of his or her time collecting and analyzing data and then creating reports. Consulting firm Kognetics found that investment-banking analysts spend upwards of 16 hours in the office a day, and almost half of that is spent on tasks like modeling and updating charts for pitch books. Machine learning, and natural language processing techniques, are already very good at this. Workers in compliance and regulation have a different worry: Over the last five years, their ranks have doubled, while overall headcount at banks declined 10 percent, according to research by Citigroup. Automating those activities — so-called regtech — could be good news for financial institutions looking to control the rising cost of compliance, and bad news for people looking to keep their jobs.


Data Governance: Just Because You Can, Doesn't Mean You Should

The impact of data use by businesses and government organizations on individuals, communities, and the environment is under constant scrutiny around the world. We are starting to see this formalized with security and privacy regulations such as the EU’s General Data Protection Regulation (GDPR) and the Privacy by Design approach for data systems. But even adhering to legal requirements and compliance regulations will not be enough to protect the business when it comes to ethical data use. Why? Ethical concerns precede legal and compliance requirements. And the stakes are large. Brand reputation is at risk. One wrong move could cause a significant loss, if not the whole business to fail.


Transforming processes with big data: Refining company turns to SAP Process Mining

A key component of the effort to improve process management is SAP Process Mining by Celonis 4.2.0, a process mining software that uses "digital traces of IT-supported processes" to reconstruct what happens in a company. The application shows all of the process variants, and it provides visualization of all currently running processes. The technology is expected to play a critical role in the effort to enhance processes, providing full transparency and analysis so the company can observe business processes directly from the vast data present in IT infrastructure systems such as its SAP enterprise resource planning (ERP) platform. Based on the analytical findings and process key performance indicators (KPIs), the company will be able to identify process improvement opportunities, Rajatora said.


From accounting to code: one woman’s journey to a career in tech

The pressure to find that first role can feel overwhelming, and often people take the first semi-suitable job they find, at the expense of their actual passions. Getting that first experience may well open the doors to something better, but it could also colour your experience of this new industry, for better or worse. As far as I was concerned, I’d had a lot of experience working with traditional banks in my previous role, and spent at least four or five hours each day attempting to complete straightforward tasks across seven banks in five different countries. This meant that fintech and its potential to transform the banking landscape felt like a very attractive prospect to me, and that Starling Bank’s mission was something I felt strongly about.


The Battle for the Cloud Has Not Even Started Yet

The real war will break out when solutions, offered via the cloud, can support business innovation and business differentiation: When cloud solutions drive business benefit directly and not benefits to IT. For that to happen we need to talk about what a business does (its business processes and decisions) and how a business operates, not what IT does and how IT operates. This might seem like a small point but in the overall scheme of things, in the overall war, I think this is a massive point. If I am lucky I might even be around long enough to be proven right (or wrong). So this is where my little framework starts to be useful. Yes, IaaS is a well-known battle field and the armies are out there fighting it out. Of the next battle fronts, PaaS and SaaS will form up. In fact they are forming up already though they are not seen as important yet by many.


Digital is a Strategic Vehicle for Business Disruption

According to the research findings, the top three success factors for customer experience transformation are: 1. a customer-centric culture, 2. management/leadership buy-in, and 3. visibility into and understanding of the end customer experience. The research also revealed that customer experience (CX) leaders are more likely to be using emerging technologies and creating personalized and omni-channel experiences. CX leaders are also more likely to use data to predict and anticipate consumer needs, understand lifetime value, and track customer advocacy. CX leaders also have a much higher sense of urgency - they believe there is no time to waste in transforming to deliver a superior customer experience. Data is at the heart of meeting the elevated expectations of today’s connected customers.


CISOs' Salaries Expected to Edge Above $240,000 in 2018

A candidate's skills, experience, and the complexity of the role all need to be taken into consideration when assessing which salary percentile is appropriate. "The midpoint salary is a good indicator of someone who meets the requirements of an open role," Reed says. The midpoint range for CISOs and information systems security managers has improved over the past couple of years. For example, the Dark Reading 2016 Security Salary Survey found the median annual salary of IT security management was $127,000. But fast-forward to 2018: the Robert Half Technology survey expects information systems security managers to earn as much as $194,250 if in the 95th percentile salary range, followed by $164,250 for the 75th percentile, $137,000 at the midpoint, and $115,250 at the 25th percentile, according to the report.


Facebook Relents to Developer Pressure, Relicenses React

"We won't be changing our default license or React's license at this time," said Wolff, who apologized "for the amount of thrash, confusion, and uncertainty this has caused the React and open source communities." Furthermore, he said, "We know this is painful, especially for teams that feel like they're going to need to rewrite large parts of their project to remove React or other dependencies." One developer in that camp is Matt Mullenweg -- the main guy behind the popular WordPress platform -- who threatened to redo project Gutenberg, a "block editor" from the WordPress community designed "to make adding rich content to WordPress simple and enjoyable." "The Gutenberg team is going to take a step back and rewrite Gutenberg using a different library," Mullenweg said in a Sept. 14 post.



Quote for the day:


"No plan survives contact with the future. No security is future proof. That's the joy and terror of cyber security." -- J Wolfgang Goerlich‏


Daily Tech Digest - September 24, 2017

How to Get One Trillion Devices Online

I think it’s easy to paint the optimistic picture of what, if we get all of this right, it could mean for our future. One trillion devices isn’t an absurd number. But these types of new technology can be very fragile. It’s interesting comparing CRISPR [the gene-editing technology] to genetically modified crops: GM crops had some bad publicity early on, and that essentially killed the area for a while, whereas CRISPR has had lots of positive publicity: it’s cured cancer in children. IoT will be similar. If there are missteps early on, people will lose faith, so we have to crack those problems, at least to a point where the good vastly outweighs the bad.


The developers vs enterprise architects showdown

Planning out and managing microservices seems like another area where EAs have a strong role for both initial leadership and ongoing governance. Sure, you want to try your best to adopt this hype-y practice of modularising all those little services your organisation uses, but sooner or later you’ll end up with a ball of services that might be duplicative to the point of being confusing. It’s all well and good for developer teams to have more freedom in defining the services they use and which ones they choose to use, but you probably don’t want, for example, to have five different ways to do single sign-on. Each individual team likely shouldn’t be relied upon to do this cross-portfolio hygiene work and would benefit from an EA-like role instead, someone minding the big ball of microservices.




Human Brain Gets Connected to the Internet for the First Time

“Brainternet is a new frontier in brain-computer interface systems,” said Adam Pantanowitz, ... According to him, we’re presently lacking in easily-comprehensible data about the mechanics of the human brain and how it processes information. The Brainternet project aims “to simplify a person’s understanding of their own brain and the brains of others.” “Ultimately, we’re aiming to enable interactivity between the user and their brain so that the user can provide a stimulus and see the response,” added Pantanowitz, noting that “Brainternet can be further improved to classify recordings through a smart phone app that will provide data for a machine-learning algorithm. In future, there could be information transferred in both directions – inputs and outputs to the brain.”


Impact of Cyber Security Trends on Test Equipment

The trend of applying cyber security practices to test systems makes sense for several reasons, most notably the increase in cyber-security incidents that exploit unmonitored network devices. The second reason this trend makes sense is that security practices and technology for general-purpose IT systems are more mature. However, this trend does not make sense categorically, for at least two reasons. Primarily, IT-enabled test systems are less tolerant of even small configuration changes. Users of IT systems can tolerate downtime and may not even perceive application performance differences, but special-purpose test systems (especially those used in production) often cannot tolerate them. Second, test systems often have security needs that are unique. They typically run specialized test software not used on other organization computers.


This Is What Happens When a Robot Assassin Goes to Therapy

In an email to Singularity Hub, series creator EJ Kavounas said, “With everyone from Elon Musk to Stephen Hawking making dire predictions about the possible dangers of machine intelligence, we felt the character could inject black comedy while discussing real issues of consciousness and humanity’s relationship with the unknown.” Nina starts with Alastair Reynolds, a psychiatrist. During their meeting she explains her past to him, and after watching a recording in which she detonated a missile to kill someone, she breaks into tears. So we know she has feelings—or at the very least, she’s good at faking them. “The biggest thing I try to keep in mind when playing Nina is that everything she does and says was specifically programmed to mimic human behavior and language,” according to actor Lana McKissack, who plays Nina.


What is cellular IoT and what are its use cases

While LoRa offers the benefit of addressing ultra-low-power requirements for a range of low-bit-rate IoT connectivity, it faces a range limitation and must piggyback on an intermediary gateway before data can be aggregated and sent to a central server. The cost of deploying multiple gateways for a range of different IoT scenarios would defeat the very economic purpose of using an arguably low-cost solution like LoRa. Moreover, solutions like LoRa are not suited for the wide range of IoT applications where HD and ultra-HD streaming is a prerequisite. 5G would potentially address a range of both low-bit-rate and ultra-HD IoT connectivity requirements, while also obviating the need for an intermediary gateway, thus leading to additional cost savings. Moreover, 5G would have the potential to cover as many as one million IoT devices per square kilometer.


Gel-soaked conductive ‘fabric’ has potential for energy storage

As electric power becomes more important for everything from ubiquitous computing to transport, researchers are increasingly looking for ways to avoid some of the drawbacks of current electricity storage devices. Whether they are batteries, which release a steady stream of electric current, or supercapacitors, which release a sharper burst of charge, storage devices depend on conductive electrolyte fluids to carry charge between their electrodes. Susceptible to leakage and often flammable, these fluids have been behind many of the reported problems with batteries in recent years, including fires on board aircraft and exploding tablet computers (the latter caused by short-circuiting inside miniaturised batteries).


Lambda vs. EC2

Unlike its predecessors, the underlying Lambda infrastructure is entirely unavailable to sysadmins or developers. Scale is not configurable; instead, Lambda reacts to usage and scales up automatically. Lambdas run on ECS rather than EC2, and the containers are not available for modification. In place of a load balancer, or an endpoint provided by Amazon, if you want to make Lambdas accessible to the web it must be done through an API Gateway, which acts as a URL router to Lambda functions. ... One of the major advantages touted by Amazon for Lambda is reduced cost. The cost model of Lambda is time-based: you’re charged for requests and for request duration. You’re allotted a certain number of seconds of use that varies with the amount of memory you require. Likewise, the price per millisecond varies with the amount of memory you require.
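
To make that pricing model concrete, here is a minimal Python sketch of time-and-memory billing. The rates and the example workload are illustrative placeholders, not actual AWS prices, and real Lambda billing also rounds duration up to a billing increment.

```python
# Rough sketch of a per-request plus per-duration cost model.
# Rates are assumed placeholders, not current AWS pricing.

PRICE_PER_MILLION_REQUESTS = 0.20   # assumed per-request charge (USD)
PRICE_PER_GB_SECOND = 0.0000166667  # assumed duration charge (USD)

def lambda_monthly_cost(invocations, avg_duration_ms, memory_mb):
    """Estimate monthly cost: a flat per-request fee plus a duration
    fee that scales with the memory allocated to the function."""
    request_cost = invocations / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    gb_seconds = invocations * (avg_duration_ms / 1000) * (memory_mb / 1024)
    return request_cost + gb_seconds * PRICE_PER_GB_SECOND

# Example: 5M invocations a month, 120 ms average, 512 MB allocated.
print(f"${lambda_monthly_cost(5_000_000, 120, 512):.2f}")  # $6.00
```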


Artificial Intelligence: The Gap between Promise and Practice

The majority of companies underestimate the importance of rich and diverse data sets to train algorithms, and especially the value of “negative data” associated with failure to successfully execute a task. Talent shortages and unequal access to data engineers and AI experts compound matters. Privacy and other regulations as well as consumer mistrust also temper progress. Whereas such barriers may be expected to decrease over time, there are also more subtle barriers to AI’s adoption that will need to be overcome to unlock its full potential. Algorithmic prowess is often deployed locally, on discrete tasks; but improved learning and execution for one step of a process does not usually improve the effectiveness of the entire process.


Researchers Develop Solar Cells That Can Be Sewn Onto Clothing

“The ideal wearable portable solar cell would be a piece of textile. That exists in the lab but is not a sellable product.” This new research from the RIKEN and Tokyo teams has taken that textile a big step forward from lab curiosity to actual product. What they have done is create a cell so small and flexible that it could, in time, be seamlessly woven into our clothing, rather than awkwardly placed on the outside of a jacket. These solar cells are phenomenally thin, measuring just three millionths of a meter in thickness. Given a special coating that can let light in while keeping water and air out, the cell was able to keep efficiently gathering solar energy even after being soaked in water or bent completely out of its original shape.



Quote for the day:


"Change is the end result of all true learning." -- Leo Buscaglia


Daily Tech Digest - September 23, 2017

Domain-Driven Design Even More Relevant Now

Compromises and trade-offs in software are unavoidable, and Evans encouraged everyone to accept that "Not all of a large system is going to be well designed." Just as "good fences make good neighbors", bounded contexts shield good parts of the system from the bad. It therefore stands to reason that not all development will be within well-defined bounded contexts, nor will every project follow DDD. While developers often lament working on legacy systems, Evans places high value on legacy systems, as they are often the money makers for companies. His encouragement for developers to "hope that someday, your system is going to be a legacy system" was met with applause.
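
As a rough illustration of the "good fences" point, here is a small Python sketch (all names hypothetical) of an anti-corruption layer: the legacy system keeps its own representation, and a single translation function shields the new bounded context from its quirks.

```python
# Sketch of a bounded context shielded from a legacy system.
from dataclasses import dataclass

# Legacy context: the money-making system, warts and all.
legacy_record = {"CUST_NM": "Ada Lovelace", "STAT_CD": "A"}

# New bounded context: a clean model for its own domain.
@dataclass
class Customer:
    name: str
    active: bool

def from_legacy(record):
    """Anti-corruption layer: the only place legacy codes are decoded."""
    return Customer(name=record["CUST_NM"], active=record["STAT_CD"] == "A")

print(from_legacy(legacy_record))  # Customer(name='Ada Lovelace', active=True)
```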


At its most basic level, the monetary system is built around the idea of storing and transferring value. Banks are not going to disappear; there are still high-level efficiencies and advantages to having banks aggregate stored value and deploy it at a targeted rate of return. For example, a bank can write thousands of mortgages and then securitize a portion of said mortgages; this is never going to be a process suitable for the crowdfunding model. Blockchain technology creates numerous benefits across industries and applications, especially in regard to value-transfer. Banks can realize extraordinary efficiencies, streamline their back-office functions and reduce risk in the process. Smart contracts introduce the added dynamic of constraints and conditional operations for transferring or storing value only when certain conditions have been met and verified.
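
A toy sketch of that conditional-transfer idea, written in plain Python for illustration rather than a real contract language (real smart contracts run on-chain in languages such as Solidity); all names and values here are made up.

```python
# Escrow-style conditional transfer: funds move only once every
# declared condition has been verified against observed state.

class EscrowContract:
    def __init__(self, ledger, payer, payee, amount, conditions):
        self.ledger = ledger            # shared balances: name -> funds
        self.payer, self.payee, self.amount = payer, payee, amount
        self.conditions = conditions    # predicates over observed state
        self.released = False

    def try_release(self, observed):
        """Transfer once, and only if every condition holds."""
        if not self.released and all(c(observed) for c in self.conditions):
            self.ledger[self.payer] -= self.amount
            self.ledger[self.payee] += self.amount
            self.released = True
        return self.released

ledger = {"buyer": 100, "seller": 0}
contract = EscrowContract(ledger, "buyer", "seller", 40,
                          [lambda s: s["goods_delivered"],
                           lambda s: s["inspection_passed"]])
contract.try_release({"goods_delivered": True, "inspection_passed": False})
contract.try_release({"goods_delivered": True, "inspection_passed": True})
print(ledger)  # {'buyer': 60, 'seller': 40}
```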


The Digital Twin: Key Component of IoT

In the real world, this might be a machine going into different fault and run states, where the effect of an input on the machine's state depends on the state the machine is in at the time. If I go far enough back in time, I realize that my system did receive an input "A", and so by the rules of my system, the later "B" results in my model producing the output "X". However, if I don't go back far enough, I will think that I only got a "B", and the output should be "Y". But how far back is "far enough"? The input "A" might have arrived 100 milliseconds ago, or it might have arrived yesterday, or just before the weekend. This means that I cannot just pick up and run my model over a selected time period any time I want an answer -- quite apart from the sheer impracticality of crunching the numbers while the user waits for one.
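
A minimal Python sketch of that history-dependence problem, with a toy model in place of a real digital twin: the answer for a "B" depends on whether an "A" ever arrived, so replaying too narrow a time window gives the wrong output.

```python
# Toy digital twin: "B" yields "X" if an "A" was seen earlier, else "Y".
from typing import Optional

class TwinModel:
    def __init__(self):
        self.armed = False  # state set by input "A"

    def step(self, event: str) -> Optional[str]:
        if event == "A":
            self.armed = True
            return None
        if event == "B":
            return "X" if self.armed else "Y"
        return None

def replay(events):
    """Re-run the model over an event window, collecting outputs."""
    model = TwinModel()
    return [out for e in events if (out := model.step(e)) is not None]

print(replay(["A", "B"]))  # ['X'] -- the window includes yesterday's "A"
print(replay(["B"]))       # ['Y'] -- truncated window, wrong answer
```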


How Startups Can Source Data To Build Machine Intelligence

Data is the fuel of the new AI-based economy. Companies, consumers and web-connected devices create terabytes of data that fuel AI research and innovation. Some companies, like Google and Facebook, acquire data thanks to their users, who provide ratings, clicks and search queries. For other companies, data acquisition may be a complicated process, especially if they need an enterprise solution for a limited number of members instead of a one-size-fits-all solution for millions of users. Luckily, the emerging AI markets offer a broad range of options for companies to kickstart their AI strategies. As a venture studio partner, I see startups struggling with sourcing the initial data sets for their business problems. That's why I've listed the most popular ways young companies can source data for their AI businesses.


The Challenges of Developing Connected Devices

Many startups can afford to be scrappy at the start and keep only a few employees while gaining momentum; when your product is a connected device, it is more difficult to build a small team with the range of skills needed to launch a successful product. Luckily, there are plenty of external resources available to these companies that can help. If a founding team is strong on hardware, they can use an agency to get their first software suite built. There are also services they can leverage to help with the build and distribution chain. Any place where work can be offloaded in order to focus on value increases their chances of success. Once they have traction, they can start building out an in-house team to save money.



New alliance advocates the blockchain to improve IoT security, trust

The alliance says that the groups' open-source tools and property will help the enterprise register IoT devices and create event logs on decentralized systems, which in turn will lead to a trusted IoT ecosystem which links cryptographic registration, "thing" identities, and metadata ... "The world is beginning to recognize the potential of blockchain technology to fundamentally reshape the way business is done globally - and we're still just scratching the surface," said Ryan Orr, CEO of Chronicled. "At this early stage we think it's vitally important to establish an inclusive framework that ensures openness, trust, and interoperability among the many parties, in both the public and private sectors, that we believe will begin to adopt blockchain technology over the next several years."
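
A rough Python sketch of the hash-chained event log idea the alliance describes: each device event commits to the previous entry's hash, so tampering with history breaks the chain. Field names are assumptions, and a real deployment would anchor these hashes on a blockchain rather than a local list.

```python
import hashlib
import json
import time

def append_event(log, device_id, payload):
    """Append an event that commits to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"device": device_id, "payload": payload,
             "ts": time.time(), "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

def verify(log):
    """Recompute every hash; any tampering breaks the chain."""
    prev = "0" * 64
    for e in log:
        body = {k: v for k, v in e.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

log = []
append_event(log, "sensor-1", {"temp": 21.5})
append_event(log, "sensor-2", {"temp": 19.0})
print(verify(log))               # True
log[0]["payload"]["temp"] = 99   # tamper with history
print(verify(log))               # False
```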


Ethereum’s inventor on how “initial coin offerings” are a new way of funding the internet

In general, you know that when you have public goods, public goods are going to be in very many cases underfunded. So the interesting thing with a lot of these blockchain protocols is that for the first time you have a way to create protocols and have protocols that actually manage to fund themselves in some way. If this kind of approach takes off, potentially, it could end up drastically increasing the quality of bottom-level protocols that we use to interact with each other in various ways. So ethereum is obviously one example of that, we had the ether sale, and we got about $8 to $9 million by, I guess, basically selling off a huge block of ether. If you look at lots of cryptocurrencies, lots of layer-two kind of projects on top of ethereum, a lot of them tend to use a similar model.


New Theory Cracks Open the Black Box of Deep Learning

It remains to be seen whether the information bottleneck governs all deep-learning regimes, or whether there are other routes to generalization besides compression. Some AI experts see Tishby’s idea as one of many important theoretical insights about deep learning to have emerged recently. Andrew Saxe, an AI researcher and theoretical neuroscientist at Harvard University, noted that certain very large deep neural networks don’t seem to need a drawn-out compression phase in order to generalize well. Instead, researchers program in something called early stopping, which cuts training short to prevent the network from encoding too many correlations in the first place.
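
Early stopping itself is simple to express. Here is a framework-agnostic Python sketch of the rule described above, where train_one_epoch and validation_loss are hypothetical stand-ins for whatever training framework is in use.

```python
# Halt training once validation loss has not improved for
# `patience` consecutive epochs.

def train_with_early_stopping(model, train_one_epoch, validation_loss,
                              max_epochs=100, patience=5):
    best_loss = float("inf")
    epochs_since_best = 0
    for epoch in range(max_epochs):
        train_one_epoch(model)
        loss = validation_loss(model)
        if loss < best_loss:
            best_loss = loss
            epochs_since_best = 0
        else:
            epochs_since_best += 1
        if epochs_since_best >= patience:
            break  # cut training short before over-fitting sets in
    return model
```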


How to Measure Continuous Delivery

Continuous delivery is all about improving the stability and speed of your release process, so unsurprisingly you should measure stability and speed! Those are intangibles, but they’re not hard to measure. In How To Measure Anything, Douglas Hubbard shows how to use clarification chains to measure intangibles - you create tangible, related metrics that represent the same thing. Luckily, the measures have already been identified for us. In the annual State Of DevOps Report, Nicole Forsgren, Jez Humble, et al. have measured how stability and throughput improve when organisations adopt continuous delivery practices. They measure stability with Failure Rate and Failure Recovery Time, and they measure throughput with Lead Time and Frequency. I’ve been a big fan of Nicole and Jez’s work since 2013.
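
A minimal Python sketch of those four measures, computed from a few hypothetical release records; the field names and data are assumptions, not part of the report's method.

```python
from datetime import datetime, timedelta

releases = [
    {"committed": datetime(2017, 9, 1, 9),  "deployed": datetime(2017, 9, 1, 15),
     "failed": False, "recovered": None},
    {"committed": datetime(2017, 9, 4, 10), "deployed": datetime(2017, 9, 5, 11),
     "failed": True,  "recovered": datetime(2017, 9, 5, 12)},
    {"committed": datetime(2017, 9, 6, 8),  "deployed": datetime(2017, 9, 6, 9),
     "failed": False, "recovered": None},
]

# Stability: how often releases fail, and how quickly we recover.
failure_rate = sum(r["failed"] for r in releases) / len(releases)
recoveries = [r["recovered"] - r["deployed"] for r in releases if r["failed"]]
mean_recovery = sum(recoveries, timedelta()) / len(recoveries)

# Throughput: how long changes take to reach production, and how often.
lead_times = [r["deployed"] - r["committed"] for r in releases]
mean_lead = sum(lead_times, timedelta()) / len(lead_times)
span_days = max((releases[-1]["deployed"] - releases[0]["deployed"]).days, 1)
frequency = len(releases) / span_days  # deployments per day

print(failure_rate, mean_recovery, mean_lead, frequency)
```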


The Decline of the Enterprise Architect

No matter their place in a lumbering bureaucracy or how many eye-rolls they may inspire among developers, these people are smart, competent, and valuable to their organizations. So my opinions and criticisms have nothing to do with the humans involved. That said, I think this role is on the decline, and I think that’s good. This role exists in the space between many large software groups. In the old days, those groups coordinated in elaborate, mutually dependent waterfall dances. These days, they “go agile” with methodologies like SAFe, which help them give their waterfall process cooler, more modern-sounding names, like “hardening sprint” instead of “testing phase.” In both cases, the enterprise architect has a home, attending committee-like meetings about how to orchestrate the collaboration among these groups.



Quote for the day:


"Your excuses are nothing more than the lies your fears have sold you." -- Robin Sharma