Daily Tech Digest - January 05, 2019

"AI becomes the UI, meaning that the pull-based/request-response model of using apps and services gradually disappears," Agarwal wrote. "Smartphones are still low IQ, because for the most part you have to pick them up, launch them and ask something, and then get a response back. In better-designed apps, however, the app initiates interactions via push notifications. Let's take this a step further so that an app, bot, or a virtual personal assistant using artificial intelligence will know what to do, when, why, where and how. And just do it." ... When it comes to machine learning, Agarwal wrote, the advances we'll see in 2019 are part of a logical evolution of that technology. "The most valuable data comes with context," he said, "what you've done before, what questions you've asked, what other people are doing, what's normal versus odd activity. And the best understanding comes from the depth of data in domain-specific use cases, such as manufacturing, marketing campaigns, e-commerce sites, or IT operations center.



Email trustworthiness: Here’s how to avoid looking like spam

At the same time SPF was being published, a second standard was in the works: DKIM (DomainKeys Identified Mail), which was a cryptographic solution for ensuring that content wasn’t tampered with during message transport. Creating standards around where a message originates and what’s in the message when it’s received versus when it was sent greatly helps with establishing the trustworthiness of a given email and its sender. But again, this was not a complete solution to the global epidemic of spam. DKIM, along with SPF, became the foundation for DMARC (Domain-based Message Authentication, Reporting and Conformance) in 2011. DMARC allows the sender of an email to create a set of instructions for the receiving domain on what to do if the message fails an SPF or DKIM check. This policy makes it very difficult to spoof brands and deliver fraudulent messages to unsuspecting recipients, or hijack pieces of content to fool filters.
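To make the policy mechanism concrete (an illustration, not from the article, with a made-up domain and report address): a DMARC policy is published as a DNS TXT record at _dmarc.<domain>, and its "p" tag carries the instruction for handling messages that fail SPF or DKIM checks. A minimal Python sketch that parses such a record:

# Hypothetical example: parse a DMARC TXT record into its policy tags.
SAMPLE_DMARC = "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com; pct=100"

def parse_dmarc(record):
    """Split a DMARC record into its tag=value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

policy = parse_dmarc(SAMPLE_DMARC)
print(policy["p"])        # 'none' (monitor only), 'quarantine' or 'reject' on failed checks
print(policy.get("rua"))  # address that receives aggregate DMARC reports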


While advances in recognition algorithms are important, improvements are more pressing on the sensor side to provide higher-quality input for the algorithms to analyze. In an interview with Bloomberg last month, Sony's sensor head Satoshi Yoshihara indicated that 3D camera sensors with advanced depth sensing are coming in 2019. Sony's depth sensing method relies on measuring the time it takes for invisible laser pulses to travel to the target and back to the handset. ... Despite these advances, legal frameworks for biometric security are still inadequate, with little apparent interest or desire among policymakers to address the problems. While legal protections exist against forcing suspects to disclose passwords or unlock devices for the convenience of law enforcement, biometric authentication can be exploited by anyone with physical hardware access. In 2018, police in Ohio unlocked an iPhone X by forcing a suspect to put their face in front of the phone.


Microsoft Releases Surface Diagnostic Toolkit for Business


The Surface Diagnostic Toolkit for Business has two "modes," a desktop mode and a command-line mode, according to Microsoft's documentation. The desktop mode, which has a graphical user interface, is used to assist end users in help-desk fashion or it can be used to create a "distributable .MSI package" for deployment on Surface devices, where the end users are the ones to carry out the tests. Using the toolkit's command-line mode, IT pros can collect details about a Surface device's system information. They also can gather Surface device health indicators via built-in Best Practice Analyzer capabilities. The toolkit will show information about any missing drivers or firmware updates. It'll also report on the warranty status of a Surface device. The tests carried out by the toolkit's Best Practice Analyzer segment will check the state of a Surface device's BitLocker encryption and Trusted Platform Module, and whether or not Secure Boot protection has been enabled on the device's processor.


HHS Publishes Guide to Cybersecurity Best Practices

The goal of the guidance is to aid healthcare entities - regardless of their current level of cyber sophistication - in bolstering their preparedness to deal with the ever-evolving cyber threat landscape. "I spend a lot of time in healthcare providers that run the gamut in size and security maturity and still the top two questions are either: 'Where do I start?' or 'What do I do next, now that this part is done,'" says former healthcare CIO David Finn, an executive vice president at security consulting firm CynergisTek. "The days of small providers not knowing what to do or large providers thinking they've done all they need to do are over," adds Finn.  HHS notes in a statement that the "Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients" document is the culmination of a two-year effort involving more than 150 cybersecurity and healthcare experts from industry and the government under the Healthcare and Public Health Sector Critical Infrastructure Security and Resilience Public-Private Partnership.


Three Ways Legacy WAFs Fail

At the time, a drop-in web application security filter seemed like a good idea. Sure, it sometimes led to blocking legitimate traffic, but such is life. It provided at least some level of protection at the application layer — a place where compliance regimes were desperate for solutions. Then PCI (Payment Card Industry) regulations got involved, and the whole landscape changed. ... Most people weren’t installing WAFs due to their security value — they just wanted to pass their mandatory PCI certification. It’s fair to say that PCI singlehandedly grew the legacy WAF market from an interesting idea to the behemoth that it is today. And the legacy WAF continues to hang around, an outdated technology propped up by legalese rather than actual utility, providing a false sense of security without doing much to ensure it. If that isn’t enough for you to show your legacy WAF the door, here are three more reasons why legacy WAFs should be replaced.


Poor data-center configuration leads to severe waste problem

The EPA estimates e-waste (discarded electronics) now accounts for 2 percent of all solid waste and 70 percent of toxic waste, thanks to the use of chemicals such as lead, mercury, cadmium and beryllium, as well as hazardous chemicals such as brominated flame retardants. A lot of that is old servers and components. And much of that is due to poor configuration and management, according to a study from server vendor Supermicro. In a survey of people who purchase and administer data-center hardware, only 59 percent of the 361 respondents consider energy efficiency important when building or leasing a new data center. It's fourth on the priorities list behind security, performance, and connectivity when managing existing data centers. The result? About 58 percent of respondents did not know their data-center Power Usage Effectiveness (PUE). PUE compares the total power a facility draws with the power that actually reaches the IT equipment; cooling is typically the biggest contributor to the overhead.
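For reference, PUE is simply the ratio of total facility power to the power delivered to IT equipment, so a value of 1.0 would mean zero overhead for cooling, lighting and power distribution. A tiny illustration with made-up numbers (not from the survey):

def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,800 kW drawn overall, 1,200 kW reaching the IT gear.
print(round(pue(1800, 1200), 2))  # 1.5, i.e. half a watt of overhead per watt of compute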


Now is the time to get serious about your cloud strategy

The move away from enterprise data centers has been less aggressive than predicted. According to enterprise IT, it seems that many applications and data sets can’t live anywhere else, and while cloud computing is an option, IT views it as a tactical solution. The fact is that cloud computing is no bed of roses. Costs are typically higher than expected, migration is typically costlier and more complex than expected, and operations are much more laborious than expected. However, cloud computing keeps you out of the hardware and software procurement and operations weeds, letting you move faster. And, if you’re smart in its usage, cloud computing can make things much cheaper and lower risk. Generally speaking, cloud computing makes you more agile and, most of the time, cheaper. So why has the movement to cloud computing, and the shutdown of enterprise data centers, been so slow?


Reviewing 2018, predicting 2019

Software as we know it has fundamentally been a set of rules, or processes, encoded as algorithms. Of course, over time its complexity has been increasing. APIs enabled modular software development and integration, meaning isolated pieces of software could be combined and/or repurposed. This increased the value of software, but at the cost of also increasing complexity, as it made tracing dependencies and interactions non-trivial. But what happens when we deploy software based on machine learning approaches is different. Rather than encoding a set of rules, we train models on datasets, and release them in the wild. When situations occur that are not sufficiently represented in the training data, results can be unpredictable. Models will have to be re-trained and validated, and software engineering and operations need to evolve to deal with this new reality. Machine learning is also shaping the evolution of hardware. For a long time, hardware architecture has been more or less fixed, with CPUs as its focal point.
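The retrain-and-validate loop described above can be sketched in a few lines of scikit-learn; this is a hypothetical illustration (the data, threshold and model choice are made up, and the article itself contains no code): train a model, monitor its accuracy on newly arriving data, and retrain when performance drops.

# Hypothetical sketch: retrain a deployed model when live accuracy degrades.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 4))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
model = LogisticRegression().fit(X_train, y_train)

def monitor_and_retrain(model, X_new, y_new, threshold=0.8):
    """Retrain when accuracy on fresh data falls below the acceptance threshold."""
    acc = accuracy_score(y_new, model.predict(X_new))
    if acc < threshold:
        # The new situations were under-represented in the original training data:
        # fold them in, refit, and re-validate before redeploying.
        model = LogisticRegression().fit(np.vstack([X_train, X_new]),
                                         np.concatenate([y_train, y_new]))
    return model, acc

# Simulate data the original training set never saw.
X_new = rng.normal(loc=3.0, size=(200, 4))
y_new = (X_new[:, 2] > 3.0).astype(int)
model, acc = monitor_and_retrain(model, X_new, y_new)
print(f"accuracy before retraining decision: {acc:.2f}")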


Will greater clarity on regulation 'considerably expand' .. crypto market?

Self-regulation will be necessary, because global regulatory bodies move incredibly slowly, especially in such a complex space as the world of digital currencies. “In 2019, the cryptocurrency market is set to radically evolve,” confirms Green. “We can expect considerable expansion of the sector largely due to inflows of institutional investors.” “Major corporations, financial institutions, governments and their agencies, prestigious universities and household-name investing legends are all going to bring their institutional capital and institutional expertise to the crypto market.” “The direction of travel has already been on this path, but there is a growing sense that institutional investors are preparing to move off the sidelines in 2019.” This prediction is optimistic, and if anything is certain in the crypto space, it is uncertainty itself.



Quote for the day:



"The leader has to be practical and a realist, yet must talk the language of the visionary and the idealist." -- Eric Hoffer


Daily Tech Digest - January 01, 2019

6G will achieve terabits-per-second speeds

The school has been a major research partner in millimeter-wave 5G development, alongside Nokia, and is now starting work on 6Genesis, its 6G development program. 6G is also sometimes called 5G Long Term Evolution. The University of Oulu has been promised funding for the program equivalent to U.S. $290 million, to be supplied by the Finnish government’s Academy of Finland and other sources, including partners. Collaborators in the eight-year program will include Nokia, BusinessOulu (my host, which paid some of my travel expenses to the UArctic Congress conference last week), and other universities. “Millisecond latency [found in 5G] is simply not sufficient,” Pouttu said. It’s “too slow.” One of the problems that will be encountered in 5G overall is related to required scalability, he said. The issue is that the entire network stack is going to be run on non-traditional, software-defined radio. That method inherently introduces network slowdowns. Each orchestration, connection or process decelerates the communication.



The solution to dysfunctional cybersecurity and network teams

The answer appears to be allowing the cybersecurity team complete access to the network. "The percentage of survey participants reporting a high level of trust between teams more than doubles at organizations providing complete visibility to cybersecurity staff," the report mentions. "Similarly, when the cybersecurity team has complete visibility, organizations have a higher level of confidence that they are well equipped to protect the network from future cybersecurity attacks." Besides resolving trust issues and promoting collaboration, there are the following additional benefits: both teams have greater confidence that team members understand what's happening on the network; each team's activity will complement, not overlap or interfere with, the other team's efforts; and respondents (55%) believe integrating the teams will allow a faster, more-efficient response to security events.


Threat of the month: Android master key vulnerability
Information pilfered includes “emails, retainer agreements, non-disclosure agreements, settlements, litigation strategies, liability analysis, defence formations, collection of expert witness testimonies, testimonies, communications with government officials in countries all over the world, voice mails, dealings with the FBI, USDOJ, DOD, and more, confidential communications, and so much more,” the group wrote, explaining that the law firm paid the initial ransom demand but then breached the terms of the agreement by reporting to law enforcement. The group, which threatened to “bury” the company unless a second ransom demand was paid in bitcoin, said it would escalate the release of the law firm’s internal files, noting “each time a Layer is opened, a new wave of liability will fall upon you.” The hackers referred to Hiscox as one “of the biggest insurers on the planet,” referencing the World Trade Center, following up with a tweet promising to provide “many answers about 9.11 conspiracies through our 18.000 secret documents leak.”



Machine Learning in Excel

This article is written for readers who are curious about the mathematics behind neural networks (NNs). It might also be useful if you are trying to develop your own NN. It is a cell-by-cell walk-through of a three-layer NN with two neurons in each layer. Excel is used for the implementation. ... We are both curious about Machine Learning and Neural Networks. There are several frameworks and free APIs in this area, and it might be smarter to use them than to reinvent something that is already there. But on the other hand, it does not hurt to know how machine learning works in depth. And we also think it is a lot more fun to dig down into things, don't we? My journey into machine learning has perhaps just started. I started by googling and reading a lot of great stuff on the internet. I saw a few good YouTube videos also. But I found it hard to gain enough knowledge to start coding my own AI. Finally I found this article, which suited me and on which the rest of this text is based.
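As a rough illustration of what such a spreadsheet computes cell by cell, here is a minimal forward pass of a three-layer network with two neurons per layer in plain Python/NumPy. The weights, biases and inputs below are made up; the article itself uses Excel formulas rather than code.

# Minimal sketch (not the article's spreadsheet): forward pass of a small
# fully connected network with two neurons in each of three layers.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -0.2])                    # two input values (hypothetical)

W = [np.array([[0.1, 0.4], [-0.3, 0.2]]),    # one 2x2 weight matrix per layer
     np.array([[0.25, -0.15], [0.5, 0.05]]),
     np.array([[0.2, 0.3], [-0.4, 0.1]])]
b = [np.zeros(2), np.zeros(2), np.zeros(2)]  # one bias vector per layer

a = x
for W_l, b_l in zip(W, b):
    a = sigmoid(W_l @ a + b_l)               # each layer: weighted sum, then activation

print(a)                                     # activations of the two output neurons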


An Introduction to CSS Shapes

Until the introduction of CSS Shapes, it was nearly impossible to design a magazine-esque layout with free flowing text for the web. On the contrary, web design layouts have traditionally been shaped with grids, boxes, and straight lines. CSS Shapes allow us to define geometric shapes that text can flow around. These shapes can be circles, ellipses, simple or complex polygons, and even images and gradients. A few practical design applications of Shapes might be displaying circular text around a circular avatar, displaying text over the simple part of a full-width background image, and displaying text flowing around drop caps in an article. Now that CSS Shapes have gained widespread support across modern browsers, it’s worth taking a look into the flexibility and functionality they provide to see if they might make sense in your next design project. The current implementation of CSS Shapes is CSS Shapes Module Level 1, which mostly revolves around the shape-outside property. shape-outside defines a shape that text can flow around.


Data Ingestion Best Practices

In the good old days, when data was small and resided in a few-dozen tables at most, data ingestion could be performed manually. A human being defined a global schema and then assigned a programmer to each local data source to understand how it should be mapped into the global schema. Individual programmers wrote mapping and cleansing routines in their favorite scripting languages and then ran them accordingly. Today, data has gotten too large, both in size and variety, to be curated manually. You need to develop tools that automate the ingestion process wherever possible. For example, rather than manually defining a table’s metadata, e.g., its schema or rules about minimum and maximum valid values, a user should be able to define this information in a spreadsheet, which is then read by a tool that enforces the specified metadata.
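A minimal sketch of that idea follows (the file name, columns and rules are hypothetical, not from the article): per-column minimum and maximum valid values live in a small metadata table, and a validation routine enforces them against each incoming record.

# Hypothetical sketch: enforce per-column min/max rules defined outside the code.
import csv

def load_rules(path="metadata.csv"):
    """Read rules from a spreadsheet-style CSV with columns: column,min,max."""
    with open(path, newline="") as f:
        return {row["column"]: (float(row["min"]), float(row["max"]))
                for row in csv.DictReader(f)}

def validate(record, rules):
    """Return human-readable violations for one ingested record."""
    errors = []
    for column, (lo, hi) in rules.items():
        value = float(record[column])
        if not lo <= value <= hi:
            errors.append(f"{column}={value} outside [{lo}, {hi}]")
    return errors

# Example with made-up rules and data:
rules = {"temperature_c": (-40.0, 60.0), "humidity_pct": (0.0, 100.0)}
print(validate({"temperature_c": "72.5", "humidity_pct": "41"}, rules))
# ['temperature_c=72.5 outside [-40.0, 60.0]']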


Experian exec says biometrics won’t save you from mobile hacks

"There are a number of ways every security system, not limited to biometrics, can be duped. And most of it, as we have found in post breach research, is due to some form of human error. Biometrics themselves may be very strong, just like malware protection or device security, but the hackers look for a [human] weakness. For example, biometrics may have different levels of sensitivity, and if the person setting up the biometrics doesn't turn up the sensitivity high enough, more people are easily able to get in. If you turn it up too high, you have too many people rejected. "Point I'm making is 80% to 85% of all breaches we service have a root cause in employees not doing the right thing, making a mistake, doing stupid stuff. It's not necessarily that the hackers are so smart that they have all these different attack vectors that are so much better than the company's security; they're looking for the weakest link, and generally employees are the weakest link."


Artificial Intelligence is an engineering problem, not magic!

From an engineering point of view, with no serious mathematics background, it's very encouraging to see how accessible this field can be for folk like myself, dealing with applied technology solutions on a daily basis. This is the first of a series of articles I intend to write on the subject: a brief introduction. The aim is to build up knowledge of different AI areas and give just enough background to enable you to understand how things work, and how to implement them on a practical level. If you have a reasonable grasp of the fundamentals, there is no reason why you cannot get quickly to a position where you will: know how to approach different engineering problems with AI solutions; identify which category of AI will be most suitable for a given problem; and know what libraries to use, and what you need to chain together to build out a solid professional solution. Before we get stuck in, let's draw a line in the sand regarding AI: the type of AI that we have nowadays, which does, we must admit, some wonderful (yet limited) things, is referred to as 'Narrow AI'.


Cloud computing gets a second look from health execs

While it’s certainly possible to manage data from disparate sources with on-premises solutions, the services developed by cloud vendors—including the liberal use of APIs—are already available for this purpose. “This is not a nice to have anymore. It is very quickly becoming an institutional imperative,” says George Gardner-Serra, partner at Clarity Insights, a consulting firm specializing in data analytics. “The leading organizations are moving very quickly in that respect.” Vendors, eager to ink contracts in the healthcare sector, are working to address providers’ needs. First, the major cloud vendors—such as Amazon Web Services, Microsoft Azure, and Google Cloud—have invested heavily in developing solutions that address security and privacy issues. “They are all willing to sign business associate agreements and maintain HIPAA compliant structures,” notes Jeff Becker, a senior analyst at Forrester Research.


Debunking Low-Code Myths to Empower App Modernization

Using a low-code platform, citizen developers can develop very simple applications that offer basic functionalities. Power builders can build applications with more functionalities than those offered by citizen developers. Professional developers, on the other hand, can deliver complex applications with multiple functionalities and automation processes. A low-code platform lets a professional developer build applications swiftly by reducing the amount of manual coding required. In short, a low-code platform enhances the capabilities of all types of developers by letting them do more than what they are capable of in app development. ... “Low-Code and No-Code terminology itself is misleading, as the distinction isn’t about whether people need to code or not. The distinction is more about the types of people using these platforms to build applications.” This sums up the required differentiation between low-code and no-code platforms.



Quote for the day:



"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg


Daily Tech Digest - December 29, 2018

Facebook's and social media's fight against fake news may get tougher

Filippo Menczer, a professor of informatics and computer science at Indiana University who's studied how automated Twitter accounts spread misinformation, said that because of the lack of available data, it's hard to tell if fake news is being spread through ephemeral content.  "Even the platforms themselves don't want to look inside that data because they're making promises to their customers that it's private," Menczer said. "By the time someone realizes that there's some terrible misinformation that's causing a genocide, it may be too late." Snapchat, which started the whole ephemeral content craze, appears to have kept itself mostly free of fake news and election meddling. The company separates news in a public section called Discover. Snapchat's editors vet and curate what shows up in that section, making it difficult for misinformation to go viral on the platform.


Google's E-Money License And The 8 Reasons Why Bankers Are Relaxed

Thinking of the bank as a provider of products makes it seem like a very big deal that the e-money providers can't offer loans or interest on balances, but in effect, when you think of the endless possibilities of contextual MoneyMoments, it is only payments and transfers that offer them, and those are firmly possible without a full banking license. ... "But that's not their core business" is one of the most thrown-around phrases of soothing consolation when it comes to discussing any big technology giant entering the financial services arena. It just seems to be firmly outside the realm of possibility that they would be interested in anything other than search, Prime delivery or spying on our private conversations, but it's a healthy exercise to at times recall that the "core business" purpose of any of these companies is, as it is for the banks themselves, turning a profit.


Microsoft’s ML.NET: A blend of machine learning and .NET


The ultimate tech giant, Microsoft, recently announced a top-tier open source and cross-platform framework. ML.NET is built to support model-based machine learning for .NET developers across the globe, and it can also be used for academic and research purposes. And that isn’t even the best part: you can also integrate Infer.NET as part of ML.NET, where it serves as the foundation for statistical modeling and online learning. This famous machine learning engine – used in Office, Xbox and Azure – is available on GitHub under the permissive MIT license, free to use even in commercial applications. Infer.NET enables a model-based approach to machine learning, which lets you incorporate domain knowledge into the model. The framework is designed to build a bespoke machine learning algorithm directly from that model. That means, instead of having to map your problem onto a pre-existing learning algorithm, Infer.NET actually constructs a learning algorithm based on the model you have provided.


An Intro to Data Mining, and How it Uncovers Patterns and Trends

Data mining is essential for finding relationships within large amounts and varieties of big data. This is why everything from business intelligence software to big data analytics programs utilizes some form of data mining. Because big data is a seemingly random pool of facts and details, a variety of data mining techniques are required to reveal different insights. Our example from earlier explains how data mining can segment customers, but data mining can also determine customer loyalty, identify risks, build predictive models, and much more. One data mining technique is called clustering analysis, which essentially groups large amounts of data together based on their similarities. Data that is sporadically laid out on a chart can be grouped in strategic ways through clustering analysis.
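A hedged sketch of what such a clustering analysis looks like in code, using scikit-learn's k-means (the customer data and the choice of two segments are made up for illustration):

# Hypothetical k-means clustering of customers by spend and visit frequency.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
customers = np.vstack([
    rng.normal([200, 2], [30, 0.5], size=(50, 2)),    # occasional, low-spend shoppers
    rng.normal([1500, 12], [200, 2], size=(50, 2)),   # frequent, high-spend shoppers
])  # columns: annual spend, visits per month

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print(kmeans.cluster_centers_)   # one centroid per discovered segment
print(kmeans.labels_[:5])        # segment assignment for the first few customers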


Microsoft Announces a Public Preview of Python Support for Azure Functions

According to Asavari Tayal, program manager of the Azure Functions team at Microsoft, the preview release will support bindings to HTTP requests, timer events, Azure Storage, Cosmos DB, Service Bus, Event Hubs, and Event Grid. Once configured, developers can quickly retrieve data from these bindings or write back using the method attributes of the entry point function. Developers familiar with Python do not have to learn any new tooling; they can debug and test functions locally using a Mac, Linux, or Windows machine. With the Azure Functions Core Tools (CLI), developers can get started quickly using trigger templates and publish directly to Azure, while the Azure platform will handle the build and configuration. Furthermore, developers can also use the Azure Functions extension for Visual Studio Code, including a Python extension, to benefit from auto-complete, IntelliSense, linting, and debugging for Python development, on any platform.
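For orientation, a minimal HTTP-triggered function in the preview Python programming model looks roughly like the sketch below. The greeting logic is made up, and the function.json binding configuration that accompanies the code is omitted here.

# Minimal sketch of an HTTP-triggered Azure Function written in Python.
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name", "world")   # query-string parameter, if supplied
    return func.HttpResponse(f"Hello, {name}!", status_code=200)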


This type of vulnerability, known as a side-channel attack, isn't new, but it has primarily been used for recovering cleartext information from encrypted communications. However, this new side-channel attack variation focuses on the shared CPU memory where graphics libraries handle rendering of the operating system user interface (UI). In a research paper shared with ZDNet, which will be presented at a tech conference next year, a team of academics has put together a proof-of-concept side-channel attack aimed at graphics libraries. They say that through a malicious process running on the OS they can observe these leaks and guess with high accuracy what text a user might be typing. Sure, some readers might point out that keyloggers (a type of malware) can do the same thing, but the researchers' code has the advantage that it doesn't require admin/root or other special privileges to work.


Top 10 overlooked cybersecurity risks in 2018

Most cyber attacks injure either the confidentiality or availability of data. That is to say, they are either spying on or disabling some system. But there is of course another option: attacks on integrity. What if you found out your bank records were, even in some small way, remotely altered, say, 18 months ago? How would that change your perception of the safety of keeping your money in the bank? What if 1 percent of the bottles of some over-the-counter medication had the formula altered to change efficacy; how would that affect your trust in the medical system? These operations are subtle: hard to detect, harder to prove, and they leave a lasting stigma of distrust and conspiracy even if caught. Already we see some criminal groups engaging in this sort of activity to modify gift cards and other forms of petty cyber larceny, which means that more sophisticated operations and nation-state challenges won’t be far behind.


You’ve Heard of IoT and AI, but What is Digital Twin Technology?


A digital twin is a highly advanced simulation that’s used in computer-aided engineering (CAE). It’s a digital duplicate that represents a physical object or process, but it is not intended to replace a physical object; it is merely to inform its optimization. Other terms used to refer to digital twin technology include virtual prototyping, hybrid twin technology, and digital asset management, but digital twin is quickly winning out as the most popular name. Both NASA and the United States Air Force are planning on using digital twin technology to create future generations of lightweight vehicles that are sturdy and able to haul more than their current counterparts. Goldman Sachs recently examined digital twin technology in their series “The Outsiders,” which seeks to identify “emerging ecosystems on the edge of today’s investable universe.” 


10 Social Media Predictions for 2019

Storytelling emerged in 2018 as a core technique for engaging consumers. But up until now a lot of storytelling was stored on blogs and websites and then shared to social media. I see 2019 being the year when storytelling combined with augmented reality is hosted on the main social media platforms. I also see 2019 as the year when brands align their storytelling with enacting positive social change. Studies show that 92% of consumers have a more positive image of a company when it supports a social or environmental issue. And almost two-thirds of millennials and Gen Z express a preference for brands that stand for something. Nike nailed social media storytelling even before the emergence of sophisticated AR technologies. In its Equality campaign it focuses on social change and inspires people to act. The message: by wearing Nikes or even interacting with them on social media, you are supporting the movement.


China is racing ahead in 5G. Here’s what that means.

China sees 5G as its first chance to lead wireless technology development on a global scale. European countries adopted 2G before other regions, in the 1990s; Japan pioneered 3G in the early 2000s; and the US dominated the launch of 4G, in 2011. But this time China is leading in telecommunications rather than playing catch-up. In a TV interview, Jianzhou Wang, the former chairman of China Mobile, China’s largest mobile operator, described the development of China’s mobile communication industry from 1G to 5G as “a process of from nothing to something, from small to big, and from weak to strong.” Money is another good reason. The Chinese government views 5G as crucial to the country’s tech sector and economy. After years of making copycat products, Chinese tech companies want to become the next Apple or Microsoft—innovative global giants worth nearly a trillion dollars.



Quote for the day:


"What great leaders have in common is that each truly knows his or her strengths - and can call on the right strength at the right time." -- Tom Rath


Daily Tech Digest - December 28, 2018

Two years after its unveiling, Microsoft Teams has firmly established itself as a real rival to Slack. In the past 12 months, Teams has essentially replaced Skype for Business Online as Microsoft's central communications tool, and a free version is now available - a clear swipe at Slack, which has had sustained success with its freemium model. The app is now used by 329,000 organizations worldwide, Microsoft said during its 2018 Ignite conference, up from 125,000 a year earlier. Thanks to Teams’ inclusion within the Office 365 product suite, it is likely a matter of time before Teams is the most widely used team chat app – though just how often it is actually used remains a cause of debate. Unlike Slack, Microsoft doesn't break out daily active user figures. A recent Spiceworks report claims that Teams is set for the fastest growth of all business chat apps over the next two years. The survey indicates that 41% of respondents expect to use Teams by 2020, compared to 18% for Slack.



Why have we become desensitised to cyber attacks?

Of course, security is a broad term – it can encompass anything from continuity and recovery to governance. Yet regardless of the area, the trend remains: even when people are clearly literate on the topic, they aren’t putting the safeguards in place required in this day and age. Alternatively, they introduce adequate security practices when it’s too late and the damage has already been done. Given the sensitivity of the data they carry, banks and building societies are the anomaly here. However, organisations spanning all other sectors are exhibiting an often shambolic approach to security at best. Some companies are failing to patch their systems, while some employees store important passwords in easily accessible files. I’ve even seen healthcare organisations operating on legacy systems, having no option but to isolate their systems when faced with a breach. Whether it’s cyber criminals or a state-sponsored attack, many businesses aren’t even monitoring their environments for attacks – and this could have serious repercussions.


Establish a configuration management strategy to guide transition


Address knowledge gaps with internal training sessions (Carfax established quarterly courses) in addition to the vendor's materials. Make sure staff members can actually do tasks when they leave a workshop, and give them supporting documentation. "You can't manage the systems if you don't know how to use the tool," Woods said. Mix formal and informal training with a focus on teamwork and team organization as part of the overall configuration management strategy. Set a clear agenda, but adapt as tool use matures across the organization. "Standardize early, or expect chaos," Woods said. On the flip side, learn where to pull back and enable creativity. Start the configuration management push with weekly meetings, but space them out more as it becomes a normalized part of the teams' jobs, she advised.


What's the big deal with application integration architecture?

Application integration also continues to change because other aspects of application development evolve. Agile operations have created a need for a new set of tools, and those tools are already evolving into more complicated orchestration tools to deploy and link applications and components running on pools of resources. These tools, as they evolve and improve, are absorbing some of the functions that were traditionally part of application integration. All of these trends influence application integration architecture and the use of databases and information flows to link the IT support components for business processes. The most important trend in application integration today is the fact that, in the traditional sense, it's no longer the only problem or even the most significant one. If you ask CIOs today what their greatest challenge is, application integration is unlikely to figure prominently, but all three of the new factors probably will.


New Intel Architectures and Technologies Target Expanded Market Opportunities

The workloads associated with this computing landscape are changing. No longer do consumers or enterprise customers have simple applications that can be addressed with straightforward scalar architectures alone. Instead we see programs that are solving problems faster by integrating additional architectures from graphics processors to artificial intelligence accelerators to image processors and even adaptable designs like FPGAs powered by new memory technologies. We will combine computing and architecture innovations through high-speed interconnects with new models for software development that simplify APIs for developers and allow more performance and efficiency to be unlocked from Intel computing architectures. ... The message of Moore’s Law is about more than transistors alone, with the combination of transistors, architectural research, connectivity advancements, faster memory systems, and software working together to drive it forward.


Defense contractor: IT must embrace ‘radical transparency and culture change’

The one thing that's permanent in life is change. Once you have a company culture that embraces it, challenges it, and says we can never rest on our laurels, we can never stick with status quo -- we always have to be improving, challenge ourselves to be better, and also thinking about what's the new thing that's going to obsolete. Whatever our customers are relying on or that we think we're good at, we have to be ahead of everybody. You have to embrace innovation. You have to make that part of your corporate culture. You have to encourage risk-taking because that's a necessary, and frequently not spoken about enough, element of innovation, which is the willingness to take risks, the willingness to be bold, put yourself out there, and be courageous. One of the things that I talk about from a strategy is, I tell people, "I want you to be unreasonable. Don't give me anything reasonable. I'm totally bored with that. I want to see your daring, courageous, bold things that have never been heard of before."


5 trends that will impact digital transformation initiatives


As enterprises continue to map the physical world to an intelligence-rich digital one, smart “things” will become a driving force for implementing next-generation platforms in 2019. This advance will enable large quantities of industry-specific data from the internet of things (IoT) to be analyzed, uncovering novel, hyper-dimensional correlations that provide fresh insights, enhance decision-making capabilities and improve business outcomes. Organizations will benefit greatly from leveraging digital platforms to interpret mass amounts of data and enable correlations that wouldn’t otherwise be possible because there are simply too many factors for the human brain to process. For example, collecting data from heart monitors, fitness watches, the human genome and more can lead to the best possible diagnosis of a condition and, therefore, the most effective treatment plan.


7 best practices for combating cybersecurity risks

Cyber risk reports often focus on technical details and technological risks. Yet, leaders, CEOs and board members should view cyberattacks as business risks and think about the holistic impacts that cyber breaches can have on business reputation, company culture, and profitability. Leaders must also pay special attention to their organization’s extended enterprise and the security flaws these partners could expose. Deloitte recently found that a majority of CEOs fail to hold their extended enterprise to the same risk standards as their own organizations, and that leaders see IT providers as the third parties that pose the greatest threat. These third parties expose the organization to significant cyber threats. But because these providers are external, they’re beyond management’s direct control. It’s critical that IT vendors are effectively managed and that the entire enterprise is held to strong security standards in 2019.


Building a VPC with CloudFormation - Part 2

AWS has made it easy and inexpensive to take advantage of multiple Availability Zones (AZs) within a given region. For an overly simplistic explanation, you can think of an Availability Zone as a huge independent datacenter. AZs within a region are connected to each other by high-speed, low-latency, privately operated links. They are close enough to each other to support synchronous communications, but far enough apart to mitigate the effect of natural disasters, power outages, etc. Exactly how far apart is not disclosed, and not really relevant. Two AZs is a good number to achieve basic high availability at minimum cost. But sometimes a single AZ is better for simple cases like demos or POC’s. Other times three is desired for marginally improved high availability, or to make better use of the spot market. So let’s adjust the template to make the number of AZs variable. Using the template from article 1, add the following section above the “Resources” section.
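The snippet the author refers to is not reproduced in this excerpt. As a rough, hypothetical stand-in (not the article's actual template), a parameter controlling the AZ count could look like the fragment below; since CloudFormation accepts JSON as well as YAML, it is shown here generated from Python.

# Hypothetical sketch: a CloudFormation "Parameters" fragment that makes the
# number of Availability Zones selectable. Not the article's actual snippet.
import json

parameters_section = {
    "Parameters": {
        "NumberOfAZs": {
            "Type": "Number",
            "AllowedValues": [1, 2, 3],
            "Default": 2,
            "Description": "How many Availability Zones should the VPC span?"
        }
    }
}

print(json.dumps(parameters_section, indent=2))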


India to lead hybrid cloud adoption globally in the next two years: Report

Interestingly, the study also reveals that cost is not a driving factor anymore in the Indian market in the adoption of cloud technology. Hybrid cloud is a computing environment that uses a mix of on-premises, private cloud and third-party, public cloud services with orchestration between the two platforms. Hybrid cloud also provides an array of other benefits including workload flexibility, simplicity in processing big data, broader use of cross platform IT services, enhanced data security and compliance, dramatic cost reduction and business growth and RoI, adds the report. Talking to ETtech, Sankalp Saxena, SVP and Managing Director – Operations, India, Nutanix, said that while BFSI is still leading in the adoption of the technology, traditional brick-and-mortar businesses are now running on cloud, along with other late adopters, like healthcare, beefing up their technology back-ends. Nutanix is now also exploring IoT and blockchain, along with other areas which can be integrated with cloud.




Quote for the day:



"Leadership is the other side of the coin of loneliness, and he who is a leader must always act alone. And acting alone, accept everything alone." -- Ferdinand Marcos


Daily Tech Digest - December 27, 2018

Doxxing: What It Is and How You Can Avoid It

Doxxing means publishing private information about someone online to harass or intimidate them. It has ruined reputations and caused untold distress to hundreds of people. On occasion, doxxing has resulted in injury or even death. Being doxxed can have serious consequences for your safety and privacy. How can you prevent it? Doxxing and cyberbullying often go hand in hand, although doxxing has also been used — controversially — by journalists in pursuit of public interest stories. It’s a relatively new phenomenon grown out of early internet subculture, but it’s gaining both popularity and efficacy, driven partly by social media. Information obtained in doxxing attacks is generally gathered from public or semi-public sources: website logs, WHOIS records, social media profiles, and simple Google searches or directories. In some cases, it’s harvested by more sinister means like hacking or social engineering.




The temptation to measure everything is understandable, but that can be the road to ruin. "Pick something that you don't like," Wallgren said. "Pick something that drives you nuts. Pick something that takes too long. Pick something that fails too often. Just pick something and then figure out a way to measure that and drive a better outcome for that thing. And then move on to the next thing." If you continue to find ways to get better, be it with mean time to recovery, release frequency or any number of other DevOps metrics, you should be able to deliver better software and keep your customers happy. Adopting DevOps metrics does not mean you should count the lines of code produced. While that may be an objective measurement, it's not in any way relevant to outcomes. Concentrate on a few things that help you make better decisions, experts insist, even if those items don't seem like they make an enormous difference to IT overall.
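As a small, hypothetical illustration of "pick something and measure it" (not from the article), mean time to recovery can be computed directly from incident open/close timestamps:

# Hypothetical sketch: mean time to recovery (MTTR) from incident records.
from datetime import datetime, timedelta

incidents = [  # (detected, resolved) pairs; the data is made up
    (datetime(2018, 12, 1, 9, 0), datetime(2018, 12, 1, 10, 30)),
    (datetime(2018, 12, 7, 22, 15), datetime(2018, 12, 8, 1, 0)),
    (datetime(2018, 12, 19, 14, 5), datetime(2018, 12, 19, 14, 50)),
]

recovery_times = [resolved - detected for detected, resolved in incidents]
mttr = sum(recovery_times, timedelta()) / len(recovery_times)
print(f"MTTR: {mttr}")  # measure it, then drive it down release after release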



Q&A on the Book Digital Transformation at Scale

For many organisations, agile teams represent a very new way of working. It isn’t really possible to learn that in a classroom, or even be coached towards it. To really establish agile within an organisation, you need to bring in the full team, not just ones and twos (‘the unit of delivery is the team’ was a GDS mantra). That team should be given the conditions that allow them to deliver quickly, work in the open, and become a visible and tangible demonstration of what an agile team is. Some of that is intensely practical - having a decent workspace for them to all sit together, for example. Some of it is more challenging for institutions - moving to governance that is based on show and tells rather than steering boards is a big culture shock for many. Without that team showing what it means for real, agile is just words on a page for people, and not very clear ones at that.


6 Ways to Anger Attackers on Your Network

"Make no mistake: It is happening. Companies are hacking back," he explains, and much of their activity is arguably in violation of the CFAA. That said, he isn't aware of any prosecutions under CFAA against organizations engaged in what is often called "active defense activities." Legal trouble aside, getting into a back-and-forth with attackers is dangerous, Straight cautions. "Even if you're really, really good and know what you're doing, the best in the business … will tell you it's very hard to avoid causing collateral damage," he explains. Chances are good your adversaries will see your "hack back" and launch a more dangerous attack in response. The worst thing you can do is go after the wrong party, the wrong network, or the wrong machines, he continues. Most hackers aren't using their own equipment when they attack. "There are times when I have really wanted to strike back, but you can't and you don't," says Gene Fredriksen, chief information security strategy for PCSU.



According to Veracode’s State of Software Security (SOSS) report, 87.5 percent of Java applications, 92 percent of C++ applications, and 85.7 percent of .NET applications contain at least one vulnerability. In addition, over 13 percent of applications contain at least one critical vulnerability. “Our annual SOSS data puts hard evidence on the table to explain why so many security professionals experience anxiety when they think about application security (AppSec),” the report stated. “There is no way to sugar coat it: the sheer volume of flaws and percentage of vulnerable apps remain staggeringly high.” Among the vulnerabilities, SQL injection flaws and cross-site scripting (XSS) remained most common, which is consistent with previous years. SQL injection flaws were found in about one in three applications, while XSS vulnerabilities were present in about half of the applications.


Best tools and methods for designing RESTful APIs


API visualization is one of the fundamental steps in design, because it frames a graphical view of the API for users and enables users to interact with services that use a type of generalized API GUI. Most interactive development environments have visualization tools available, but these tools only offer basic capabilities. Swagger UI is a popular API visualization example that makes the in/out data structure of an API visible; it also exhibits simulated responses to given API caller requests. An API catalog is the central element of any API design strategy. Catalogs hold API definitions and make them available to developers. In some cases, catalogs may also drive API management processes, like access control or load balancing. Most API management suites will include a catalog, and separate API catalog tools are available from companies like Swagger, Oracle and IBM, as well as in open source form, like ReDoc.


Three key trends that will change cybersecurity strategies in 2019

Traditional VM tools identify thousands of vulnerabilities at any given time for a large enterprise, making it near impossible for security teams to know which vulnerabilities to prioritize and address first. As Gartner pointed out, advanced risk-based VM tools take into consideration the impact to the business of each vulnerability if exploited, and produce a clear, prioritized list of actions for the security team to take. As devastating breaches at organizations large and small, public and private, continue to make headlines, companies will gravitate toward risk-based tools to more effectively and efficiently avoid getting breached. Cybercriminals are constantly evolving their attack methods, and in response, security teams must advance their approaches to protecting their data. This means rethinking antiquated processes and tools. 2019 is sure to bring new challenges, but companies will also be taking steps in the right direction to properly secure data and proactively prevent breaches.
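The risk-based prioritization described at the start of this item can be illustrated with a small, hypothetical sketch (the vulnerabilities, scores and weighting are made up; real tools use far richer business context): rank findings by severity weighted by the criticality of the affected asset.

# Hypothetical sketch of risk-based vulnerability prioritization.
findings = [  # (finding, CVSS base score, business criticality of the asset, 1-5)
    ("CVE-A on public web server", 7.5, 5),
    ("CVE-B on internal wiki", 9.8, 2),
    ("CVE-C on payment gateway", 6.4, 5),
]

def risk_score(cvss, criticality):
    """Illustrative weighting only; not a standard formula."""
    return cvss * criticality

for name, cvss, crit in sorted(findings, key=lambda f: risk_score(f[1], f[2]), reverse=True):
    print(f"{risk_score(cvss, crit):5.1f}  {name}")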


As Bitcoin sinks, industry startups are forced to cut back

The latest victim is Bitmain, a provider of bitcoin mining hardware that very recently submitted its IPO prospectus to the Stock Exchange of Hong Kong. The company confirmed to CoinDesk this week that cutbacks would begin imminently: “There has been some adjustment to our staff this year as we continue to build a long-term, sustainable and scalable business,” a spokesperson for Bitmain told CoinDesk. “A part of that is having to really focus on things that are core to that mission and not things that are auxiliary.” Beijing-based Bitmain hasn’t clarified just how many of its employees will be impacted, though rumors — which Bitmain has since denied — on Maimai, a Chinese LinkedIn-like platform, suggest as many as 50 percent of the company’s headcount could be laid off. This news comes after the crypto mining giant confirmed it had shuttered its Israeli development center, Bitmaintech Israel, laying off 23 employees in the process. Bitmain employs at least 2,000 people, up from 250 in 2016, according to PitchBook, as the company’s growth has skyrocketed.


Tracking Analytics with Artificial Intelligence

As we head into 2019, it’s hard to find an industry that has been untouched by the data revolution. Even segments known for the hands-on nature of its work, like construction, are being reimagined with 3D-printed buildings, augmented reality and robots. The three industries below stand somewhere between those most and least affected by digital transformation. ... The worst kind of usage for a car is when a vehicle is driven mostly in traffic while the best is when it’s mostly highway driving, with less stopping and starting. An automaker in that case can offer everyone the same warranty for the first year but then can offer a different warranty package in the next year based on usage. Though no automakers have done so yet, this type of warranty package (similar to how the car insurance industry uses in-car tracking devices) can save automakers a lot of money and reward drivers who are gentler on their cars. For example, one of our auto clients was able to reduce warranty costs by 35% using sensor data.


When quantum computing threats strike, we won't know it


If a country was able to develop and successfully implement quantum computing for the purpose of breaking RSA encryption, they're not going to tell anyone. At some point, academia or the private sector will make advances that might show that it's plausible. But I think we have to be realistic and understand that the largest investors in this area are doing so such that it is highly unlikely that we will actually be aware when they are successful. ... You can assume that none of the people with access to the data were insider threats, but can you be 99.99% sure? Could that actually be the way the data was leaked? Or could it be flaws in the implementation of existing algorithms? It's not just good enough to have strong algorithms, we need strong implementations of the algorithm. If data all of a sudden is leaked, was it because the algorithm was cracked or one of these other government agencies identified a vulnerability that they chose not to disclose?



Quote for the day:


"When you practice leadership,The evidence of quality of your leadership, is known from the type of leaders that emerge out of your leadership" -- Sujit Lalwani