Daily Tech Digest - July 16, 2019

Best tools for single sign-on (SSO)

Interestingly, most SSO products also cost about $8 per user per month but will require more IT manpower to implement. (Ping’s solution offers a lot of bang for its $3 per month price point, however.) Let’s talk a bit about using MFA, because it is an important motivation behind going the SSO route. The idea of using MFA used to be mostly for the ultra-paranoid. Now MFA is the minimum for enterprise security, especially considering the number and increasing sophistication of spear-phishing attacks. Sadly, the deployment of MFA is far from universal: a recent survey from Symantec (Adapting to the New Realities of Cloud Threats) found that two-thirds of the respondents still don’t deploy any MFA tools to protect their cloud infrastructures. Certainly, having SSO can help ease the pain and move toward broader MFA acceptance. Besides MFA, there is another reason to up your authentication game: the need for adaptive or risk-based authentication. This means moving away from issuing your users an “all-day access pass” when they begin work by logging into their laptops, and instead re-evaluating risk with each access request.
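To make "adaptive" concrete, here is a minimal sketch of a risk-based authentication step. The signals, weights, and thresholds are illustrative assumptions, not any vendor's actual scoring model:

```python
# Minimal sketch of adaptive (risk-based) authentication. The signals,
# weights, and thresholds are illustrative assumptions, not a real product's.

def risk_score(signals: dict) -> int:
    """Accumulate risk points from contextual login signals."""
    score = 0
    if signals.get("new_device"):         score += 40
    if signals.get("unfamiliar_network"): score += 30
    if signals.get("impossible_travel"):  score += 60
    return score

def required_auth(signals: dict) -> str:
    """Decide per request, instead of granting an all-day pass at login."""
    score = risk_score(signals)
    if score >= 60:
        return "deny-and-review"
    if score >= 30:
        return "password+mfa"   # step up to a second factor
    return "password"

print(required_auth({"new_device": True}))  # -> password+mfa
```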



Trump’s hostile view of Bitcoin and crypto could chill industry

Trump tweeted that Facebook Libra's "virtual currency" will have little standing or dependability. "If Facebook and other companies want to become a bank, they must seek a new Banking Charter and become subject to all Banking Regulations, just like other Banks, both National," Trump wrote. Those comments came one day after he criticized both Facebook and Twitter for what he called bias against his supporters. Like other cryptocurrencies backed by fiat currency, Facebook's digital money would be purchased through a typical financial network and then stored in the Calibra digital wallet application for making purchases via ads on the social media platform. A user could also do the same thing through Facebook's most popular communication platforms: WhatsApp and Messenger. Facebook did not respond to questions from Computerworld about whether the president's comments would affect its plans to issue a cryptocurrency. Avivah Litan, a vice president of research at Gartner, said while it's "very difficult" to analyze Trump's intentions from his tweets, "it sounds to me like he is gearing up to clamp down on cryptocurrency adoption by Americans."


How to deal with cloud complexity

Many popular approaches that deal with architectural complexity tell you to practice architectural discipline so your systems won’t be complex in the first place. In practice, though, cloud systems are built and migrated in short, disconnected sprints with little regard for standard platforms such as storage, compute, security, and governance. Most migrations and net-new developments are done in silos without considering the architectural commonality that would drive less complexity. More complexity becomes inevitable. Although many are surprised when they experience complexity, it’s not always bad. In most cases, we see excessive heterogeneity because those who pick different cloud services make best of breed a high priority. Complexity is the natural result. A good rule of thumb is to look at cloud operations, or cloudops. If you’re staying on budget, and there are few or no outages and no breaches, then it’s likely that your complexity is under control. Revisit these metrics every quarter or so. If all continues to be well, you’re fine. You are one of the lucky few who deal with a less complex cloud implementation—for now.
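The quarterly check the author describes can be as simple as a few tracked numbers. A minimal sketch, with metric names and thresholds as assumptions:

```python
# Minimal sketch of the quarterly "is complexity under control?" review:
# on budget, few or no outages, no breaches. Thresholds are assumptions.

def complexity_under_control(budget_variance_pct: float,
                             outages: int, breaches: int) -> bool:
    return budget_variance_pct <= 0 and outages <= 1 and breaches == 0

# Example quarterly review: 2.5% under budget, no outages, no breaches.
print(complexity_under_control(-2.5, outages=0, breaches=0))  # True
```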


Single Sign-Ons To Accelerate Growth Of Digital Identity: Study

A wide variety of countries have recently planned, or are planning, to bring digital identity to many citizens. This will have an effect on the kinds of digital identity security available to consumers, as many of these initiatives are intended to bring identity verification to those who have never had official identification before. That being the case, these schemes need to be accessible to those with low levels of digital access, and are likely to be SIM-based, rather than relying on an online presence as such. These initiatives will also be more likely to have a physical card than other forms of digital identity. This impacts a range of use cases and allows a more consistent application of identity verification than in the case of identities that do not connect to a physical asset. This is frequently because the core documentation on which the foundation of the identity is built contains a photograph as the core verification method. Other methods (such as fingerprint sensors) require additional infrastructure and do not eliminate the chance of presenting false data at the point of on-boarding.


How Suse is taking open source deeper into the enterprise


What a company like Suse is doing is to help enterprises such as banks, healthcare providers and retail companies match what they’re trying to do with what’s available in the open source world. We select the projects and make sure they can work together with enterprise IT infrastructure, and are stable, secure and supported over time. We’ve started doing that with Linux, OpenStack, Cloud Foundry and Kubernetes. Now, you mentioned Asia. The challenges I mentioned are common to everybody, but what we see in Asia, like in Europe, is that Asia is not a single, homogeneous market. Different countries are in different stages of adopting open source. I spend quite a lot of time in Japan, China, Hong Kong, Singapore, all of which are very different markets. Typically in Japan, enterprises are more conservative so we have a lot of customers like banks that are running Linux on mainframes. Singapore is more innovative, so we see OpenStack being used by the public sector and manufacturing companies.


Understanding the role of governance in data lakes and warehouses

Having data well organized and consistently aggregated allows for the creation of performance and operational metrics – reporting that drives business and allows leaders to make informed decisions. Inclusion of both historical and current information organized in a consistent manner within the data warehouse increases the quality of the viewed data, thus increasing decision-making quality. ... Although they are different, the key to successful data lakes and data warehouses with useful, quality data is the same – governance. Data governance allows for an understanding of not only what is stored where and its source, but also the relative quality of the data, and makes it possible to ascertain that quality consistently. Aside from clarity and structure, governance also allows control. With such control, the organization knows how the data is being used and whether or not it’s meeting its intended purpose. Say the data has been manipulated to meet a set of determined requirements; without data governance, someone else could come along and pull the data – not knowing it had been previously manipulated – resulting in an inaccurate data analysis.
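The "previously manipulated data" problem is essentially a lineage problem. A minimal sketch of metadata that travels with a dataset; the field names are illustrative assumptions, not a particular governance product's schema:

```python
# Minimal sketch of lineage metadata travelling with a dataset, so a later
# consumer can see it was already transformed. Field names are assumptions.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    dataset: str
    source: str
    transformations: list = field(default_factory=list)

    def record(self, step: str, by: str):
        self.transformations.append(
            {"step": step, "by": by,
             "at": datetime.now(timezone.utc).isoformat()})

sales = LineageRecord(dataset="q2_sales", source="erp_extract")
sales.record("filtered to EMEA, nulls imputed", by="analytics_team")

# A second consumer checks the history before reusing the data:
if sales.transformations:
    print("Dataset already manipulated:", sales.transformations)
```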


Cybersecurity: Is your boss leaving your organisation vulnerable to hackers?


CEOs and other senior board-level executives are exposing their organisations to cyberattacks and hackers because of a lack of awareness around cybersecurity, a new study has warned. Cybersecurity company RedSeal surveyed hundreds of senior IT and security professionals and found that many of them believe there's a disconnect between the CEO and the information security team, which could be putting organisations at risk. ... "CEOs have wide access to their organisation's network resources, the authority to look into most areas, and frequently see themselves as exempt from the inconvenient rules applied to others. This makes them ideal targets," he added. However, despite some having fears around security at the very top of the organisation, on the whole, businesses appear to be taking cybersecurity seriously. Two-thirds of businesses say their cyber-incident response plan is well defined and well tested – either via real breaches, or simulation tests. Three-quarters of firms also report they have cyber insurance, suggesting there's an awareness around preparing for the aftermath of an incident, should one occur.


To pay or not pay a hacker’s ransomware demand? It comes down to cyber hygiene

According to the FBI and most cybersecurity experts, no one should ever pay ransomware attackers. Giving in to the attackers’ demands only rewards them for their malicious deeds and breeds more attacks, they say. “The FBI encourages victims to not pay a hacker’s extortion demands,” the FBI says in an email to CSO. “The payment of extortion demands encourages continued criminal activity, leads to other victimizations, and can be used to facilitate additional serious crimes.” Jim Trainor, who formerly led the Cyber Division at FBI Headquarters and is now a senior vice president in the Cyber Solutions Group at risk management and insurance brokerage firm Aon, agrees. Trainor, who spent a fair amount of time dealing with ransomware attacks while he was in the Bureau, said his position has not changed. “I would recommend that people not pay the ransom. It’s extremely problematic,” he tells CSO. He conceded that making the determination to pay or not pay the attackers is ultimately a business decision, one that almost always hinges on whether the victim has access to adequate backups.


Government must ‘stop choosing ignorance’ around data


“The National Data Strategy must go beyond public services. Government’s role is broader than the delivery of public services; it can help shape how data is used across the whole of society through interventions such as research funding, procurement rules, regulatory activities and legislation,” the letter stated. “The strategy must recognise this and describe how government will make data work for everyone in the UK,” it added. However, the strategy “must deliver transformative, rather than incremental, change”, the letter stated, adding that the national data plan must be a long-term endeavour for government, with a vision for at least the next decade along with practical steps to turn any future vision into reality. Such ambitions may be unfulfilled if there is a lack of sustained strategic leadership on data, the letter warned. This is an issue that had been previously outlined in a recent report by the National Audit Office (NAO). Echoing the NAO’s concerns, the organisations stated the government must “get leadership from the very top if it is to get a grip on data”.


How digital and marketing executives are taking charge of digital transformation


Brahin says the key to success has been the marketing team's hybrid approach to digital transformation at UBS. Content is at the heart of this approach, where a centralised marketing organisation is helping line-of-business functions to transform the online experiences of clients. "Everything that concerns content delivery into the website and marketing channels is through a single approach, while business units still have control of their products and services. We partner with them to deliver marketing content into their service areas," she says. "It's an approach that has allowed us to create a solid foundation with a powerful content-delivery hub, where we can pump content to individual areas from a single hub. That's worked pretty well for us." The firm has analysed website analytics and used this insight to help deliver "modern, mobile experiences". McBain says the focus recently has been around optimisation and extending its content across new channels, including a recently launched website for the main brand.



Quote for the day:


"Strategy is not really a solo sport, even if you're the CEO." -- Max McKeown


Daily Tech Digest - July 15, 2019

Most Common Security Fails 

The most common security failure is not having a process. In addition, there’s also a disconnect between the security and compliance regulations that executives focus on, one being HIPAA’s cybersecurity requirement below: 164.306(a)(1) ensure the confidentiality, integrity, and availability of all electronic protected health information the covered entity creates, receives, maintains, or transmits. The above is such a broad and generic statement, and it’s just one statement out of a document that has almost a hundred statements, so it’s no longer meaningful. From the perspective of security failures — which ties into the state of security management — there’s an absolute disconnect between high-level frameworks like ISO, COBIT, and HIPAA and how you actually implement them. Many companies today feel they have a framework that they follow, but that’s not a security program; that’s a document that gives guidance, one that doesn’t even give you detail on how to implement said guidance. For example, you start out with a security framework like HIPAA, and then you use something like the CIS Controls to implement the guidance within HIPAA; that’s the second phase.
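That "second phase" of mapping a broad regulatory statement to concrete, checkable controls can be captured as simple structured data. A minimal sketch; the control selections are illustrative assumptions, not an official crosswalk:

```python
# Mapping one broad HIPAA statement to concrete CIS-style controls.
# The selections are illustrative assumptions, not an official mapping.

framework_to_controls = {
    "HIPAA 164.306(a)(1): confidentiality, integrity, availability of ePHI": [
        "CIS Control 3: Data Protection (encrypt ePHI at rest and in transit)",
        "CIS Control 5: Account Management (least-privilege access to ePHI)",
        "CIS Control 11: Data Recovery (maintain tested backups of ePHI)",
    ],
}

for requirement, controls in framework_to_controls.items():
    print(requirement)
    for control in controls:
        print("  ->", control)
```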



Why is it so hard to see IoT devices on the network?

Most IoT devices are known for their low CPU power, minuscule memory and unique operating systems (which often need to be studied from scratch). Many IoT devices are “protected” by factory-default usernames and passwords that are rarely changed. Furthermore, these devices are designed to connect to the wireless network, and most won’t function at all without a connection. These challenges make discovering and managing the devices a significant challenge, especially if they aren’t being accounted for as part of IT inventory. To track their presence on the network, IT teams need dedicated visibility tools with a price point that outweighs the relatively low cost of adopting the IoT devices themselves. As a result, many IoT devices are given free rein over the network and can’t be seen in regular endpoint or vulnerability scans. You may be thinking that the answer to this challenge lies with the device manufacturers. Indeed, this thinking is correct, but due to a lack of regulation on IoT security, manufacturers are only now starting to realize that a lack of security presents a barrier to implementation.
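Basic discovery does not always need a dedicated product. A minimal sketch of an ARP sweep with the scapy package (pip install scapy), which surfaces devices that answer on the local subnet so they can be compared against the IT inventory; the subnet is an assumption, and scanning requires root privileges and authorization to probe the network:

```python
# ARP sweep to surface devices missing from the asset inventory.
# Requires scapy (pip install scapy), root privileges, and authorization.

from scapy.all import ARP, Ether, srp

def discover(subnet: str = "192.168.1.0/24"):  # subnet is an assumption
    """Broadcast ARP requests and return (ip, mac) pairs that answer."""
    answered, _ = srp(Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=subnet),
                      timeout=2, verbose=False)
    return [(rcv.psrc, rcv.hwsrc) for _, rcv in answered]

for ip, mac in discover():
    print(f"{ip:15} {mac}")   # compare against the known-device list
```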


Leak Confirms Google Speakers Often Record Without Warning

Responding to the VRT NWS report, Google says that building technology that can work well with the world's many different languages, accents and dialects is challenging, and notes that it devotes significant resources to refining this capability. "This enables products like the Google Assistant to understand your request, whether you're speaking English or Hindi," David Monsees, Google's product manager for search, says in a Thursday blog post. Google says it reviews about 0.2 percent of all audio snippets that it captures. The company declined to quantify how many audio snippets that represents on an annualized basis. "As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language," Monsees says. "These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant."


Network visibility challenges in modern networks

Organizations should strive for an end-to-end view of the health and operational status of their networks for several reasons. For one, visibility can enhance your ability to troubleshoot problems as they arise. Everything from a downed network and interfaces to operational, yet degraded links can be more quickly identified when monitoring and baselining data flows as they pass through the local area network (LAN), wide area network (WAN) and even out to the internet edge. Another reason for network visibility is to validate performance-based configurations. Visibility can help network managers better understand how network issues affect data on a per-application basis. If specific applications are business-critical, a manager can use configuration techniques, such as quality of service and traffic policing and shaping, to optimize these important data flows. Visibility can then validate that the performance modifications are working or identify when further configuration adjustments are needed.
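Baselining is the key idea. A minimal sketch using only the standard library: measure TCP connect round trips, keep a median baseline, and flag links that are up but degraded (the target host and the 2x threshold are assumptions):

```python
# Baseline per-destination latency so degraded-but-up links stand out.
# Standard library only; target host and alert threshold are assumptions.

import socket, statistics, time

def tcp_rtt(host: str, port: int = 443) -> float:
    """One TCP connect round trip, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=3):
        pass
    return (time.perf_counter() - start) * 1000

def baseline(host: str, samples: int = 5) -> float:
    return statistics.median(tcp_rtt(host) for _ in range(samples))

base = baseline("example.com")
now = tcp_rtt("example.com")
if now > base * 2:   # alert when latency doubles against the baseline
    print(f"Degraded path: {now:.0f} ms vs baseline {base:.0f} ms")
```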


Billion-dollar privacy penalties put CEOs on notice


The unprecedented penalties imposed on Facebook, Marriott and British Airways should serve as a warning for company leaders, according to Tom Turner, CEO of cyber security ratings firm BitSight. “CEOs around the globe are on notice that they are accountable for cyber security performance management just the same way they are accountable for managing the business,” he said. Commenting on the FTC settlement, Nuala O’Connor, president and CEO of the Center for Democracy & Technology (CDT), said: “The record-breaking settlement highlights the importance of data stewardship in the digital age. “The FTC has put all companies on notice that they must safeguard personal information,” she said, adding that privacy regulation in the US is “broken”. While large after-the-fact fines matter, O’Connor said strong, clear rules to protect consumers are more important, and called on the US Congress to pass a comprehensive federal privacy law in 2019.


How Not to Get Eaten by Technology

Effective and smart learning techniques and strategies are required. By effective learning techniques, I mean methods that will help you identify hot markets, hot technologies and trends, learn how to focus on what matters, learn things quickly, and so on. Specialization has become more valuable as platforms grow more sophisticated, so being a “Jack of all trades” is no longer acceptable for many companies, since mastering a particular track is a non-trivial investment of time and effort. (Well, this is subject to debate!) The software engineering field is well paid, in particular for renowned experts, since it is not easy to become a well-versed engineer. Soft skills such as negotiation, requirements engineering, time planning, and public speaking are timeless, valuable skills that will boost career opportunities. Domain knowledge is always valuable; it's worth spending time understanding the business rules, domain language, and concepts of the specific business area you are working in, such as health, HR, banking, etc.


How organizations are bridging the cyber-risk management gap

OK, so there’s a cyber-risk management gap at most organizations. What are they going to do about it? The research indicates that: 34% will increase the frequency of cyber-risk communications between the CISO and executive management. Now, more communication is a good thing, but CISOs must make sure they have the right data and metrics, and this has always been a problem. I see a lot of innovation around some type of CISO cyber-risk management dashboard from vendors such as Kenna Security, RiskLens (supporting the Factor Analysis of Information Risk (FAIR) standard), and Tenable Networks. Over time, cyber-risk analytics will become a critical component of a security operations and analytics platform architecture, so look for vendors such as Exabeam, IBM, LogRhythm, MicroFocus, Splunk, and SumoLogic to make investments in this area. 32% will initiate a project for sensitive data discovery, classification, and security controls. Gaining greater control of sensitive data is always a good idea, yet many organizations never seem to get around to this.
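For readers unfamiliar with FAIR, its top-level arithmetic is simple; the scenario numbers below are illustrative assumptions:

```python
# FAIR's top-level decomposition: annualized loss exposure equals loss
# event frequency times loss magnitude. Scenario numbers are assumptions.

def annualized_loss_exposure(loss_event_frequency: float,
                             loss_magnitude: float) -> float:
    """Events per year, times the cost of each event."""
    return loss_event_frequency * loss_magnitude

# e.g., a ransomware scenario expected twice a decade at ~$400k per event:
ale = annualized_loss_exposure(0.2, 400_000)
print(f"${ale:,.0f} per year")   # -> $80,000 per year
```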


Software Engineer Charged With Stealing Company Secrets

Xudong Yao, 57, has been indicted on nine federal counts of theft of trade secrets, according to the U.S. Attorney's Office for the Northern District of Illinois, which is overseeing the case along with the FBI. Yao, who also used the first name "William," is believed to be living in China, according to federal prosecutors. During his time with the company, Yao allegedly downloaded thousands of computer files and other documents that contained various company trade secrets and intellectual property, including data related to the system that operates the unnamed manufacturer's locomotives, according to the indictment. While Yao was taking his former employer's intellectual property, he was negotiating for a new job with a firm in China that provided automotive telematics service systems, the Justice Department alleges. Yao was born in China, but he's a naturalized U.S. citizen, according to the FBI. Theft of trade secrets is a federal crime that carries a possible 10-year prison sentence for each count, according to the Justice Department.


IT-Based Attacks Increasingly Impacting OT Systems: Study

While IT systems have been standardized for many years on the TCP/IP protocol, OT systems use a wide array of protocols, many of which are specific to functions, industries, and geographies. The OPC Foundation was established in the 1990s as an attempt to move the industry toward protocol standardization. OPC’s new Unified Architecture (OPC UA) has the potential to unite protocols for all industrial systems, but that consolidation is many years away due to the prevalence of legacy protocols and the slow replacement cycle for OT systems. Cyber criminals have actively attempted to capitalize on this confusion by targeting the weak links in each protocol. These structural problems are exacerbated by the lack of standard protections and poor security hygiene practiced with many OT systems—a legacy of the years when they were air gapped. Figure 2 shows the number of unique threats targeting machines using specific ICS/SCADA protocols. Despite seasonal fluctuations and a wide variety of targets, the data is clear on one thing: IT-based attacks on OT systems are increasing.
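For a sense of what OPC UA looks like in practice, here is a minimal read of one node using the python-opcua package (pip install opcua); the endpoint URL and node id are assumptions for a local test server:

```python
# Reading a single value over OPC UA with python-opcua (pip install opcua).
# Endpoint URL and node id are assumptions for a local test server.

from opcua import Client

client = Client("opc.tcp://localhost:4840/freeopcua/server/")
client.connect()
try:
    node = client.get_node("ns=2;i=2")     # assumed node id
    print("value:", node.get_value())
finally:
    client.disconnect()
```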


4 steps to reaping the software benefits of continuous testing

Evolving your software development process to include continuous testing is a necessary investment so your team’s software is high quality and delivered quickly. Testing mitigates the risk of poor software quality by identifying defects before there is impact to your customers, your business operations and your revenue. Testing is good, but continuous testing is better, because you can find and fix more defects sooner to avoid the accumulation of technical debt. Software technical debt includes defects in your software not yet discovered plus the backlog of lower priority defects waiting to be fixed. Technical debt adds to the complexity and cost of maintaining your software over time. With CT, you can take action immediately to fix defects to avoid adding to your technical debt. CT decreases the time and cost to fix a defect. Research by Perfecto disclosed that a software defect fixed on the same day it was detected took only one hour, while it took eight hours to fix if detected at the end of a two-week sprint.
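Using the Perfecto figures above, the cost difference compounds quickly. A quick sketch; the defect count and hourly rate are assumptions:

```python
# Cost model from the figures above: ~1 hour to fix a defect caught the
# same day vs ~8 hours at end of sprint. Count and rate are assumptions.

HOURS_SAME_DAY, HOURS_END_OF_SPRINT = 1, 8
RATE = 100   # assumed loaded cost per engineering hour, in dollars

def sprint_fix_cost(defects: int, continuous_testing: bool) -> int:
    hours = HOURS_SAME_DAY if continuous_testing else HOURS_END_OF_SPRINT
    return defects * hours * RATE

print(sprint_fix_cost(20, continuous_testing=False))  # 16000
print(sprint_fix_cost(20, continuous_testing=True))   # 2000
```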



Quote for the day:


"Effective team leaders adjust their style to provide what the group can't provide for itself." -- Kenneth Blanchard


Daily Tech Digest - July 14, 2019

German banks are moving away from SMS one-time passcodes

The cyber-security industry has been warning about the insecurity of SMS OTP for years as well, but not because of SIM swapping attacks -- which are essentially social engineering attacks. Rather, the warnings against securing systems with SMS-based authentication stem from inherent and unpatchable weaknesses in the SS7 protocol, which has formed the backbone of all mobile telephony networks for years. Vulnerabilities in this protocol allow attackers to silently hijack a user's phone number, even without a telco's knowledge, allowing threat actors to track users or authorize online payments or login requests. These vulnerabilities have not gone unnoticed in Germany. ... While two-step verification and two-factor authentication are recommended, security experts have been warning against relying on SMS as "the second factor." Instead, experts recommend using authenticator apps or hardware security tokens, two of the methods that German banks are now rolling out to secure their systems and replace SMS-based authentication.
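What makes authenticator apps stronger is that the code is computed locally and never crosses the phone network. A minimal sketch of the TOTP algorithm (RFC 6238) they implement, using only the standard library; the secret below is a throwaway example:

```python
# TOTP (RFC 6238), the algorithm behind authenticator apps: the code is
# derived locally from a shared secret, so nothing traverses SS7 or SMS.

import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period              # moving time factor
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # throwaway demo secret
```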


Four Insurtech Startups Shaking Up The Insurance Industry

Insurtech companies tend to focus on increased personalization and greater speed and efficiency of services to meet changing customer needs, with many using AI to offer deeper data insights. And while some are set on displacing the industry incumbents, others are working with the leading insurance firms as they transition to the age of digital innovation. ... The inconsistent and often unpredictable nature of external data can cause insurers a real headache, leading to delays in processing new opportunities and managing existing books of business. Untangler is the insurtech that gets around that problem by using AI to recognize inbound customer or employee data in any format and transform it into readable data in seconds, from which providers can create quotes without having to convert the data cell by cell. Launched in May this year, the technology was developed by entrepreneurs Richard Stewart and Steve Carter, initially for their own startup Untangl, which provides employee benefits, including insurance products, to SMEs.
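Untangler's models are not public, but the general problem of normalizing arbitrary inbound headers to a canonical schema can be sketched with simple fuzzy matching from the standard library (the canonical fields and cutoff are assumptions):

```python
# Normalizing inbound spreadsheet headers to a canonical schema using
# stdlib fuzzy matching. Canonical fields and cutoff are assumptions;
# Untangler's actual AI models are not public.

import difflib

CANONICAL = ["first_name", "last_name", "date_of_birth", "salary"]

def normalize(headers):
    mapping = {}
    for h in headers:
        cleaned = h.strip().lower().replace(" ", "_")
        match = difflib.get_close_matches(cleaned, CANONICAL, n=1, cutoff=0.6)
        mapping[h] = match[0] if match else None   # None -> human review
    return mapping

print(normalize(["First Name", "Surname", "DOB", "Annual Salary"]))
```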


Beware of Geeks Bearing AI Gifts

Last March, McDonald’s Corp. acquired the startup Dynamic Yield for $300 million, in the hope of employing machine learning to personalize customer experience. In the age of artificial intelligence, this was a no-brainer for McDonald’s, since Dynamic Yield is widely recognized for its AI-powered technology and recently even landed a spot in a prestigious list of top AI startups. Neural McNetworks are upon us. Trouble is, Dynamic Yield’s platform has nothing to do with AI, according to an article posted on Medium last month by the company’s former head of content, Mike Mallazzo. It was a heartfelt takedown of phony AI, which was itself taken down by the author but remains engraved in the collective memory of the internet. Mr. Mallazzo made the case that marketers, investors, pundits, journalists and technologists are all in on an AI scam. The definition of AI, he writes, is so “jumbled that any application of the term becomes defensible.” Mr. Mallazzo’s critique, however, conflates two different issues. The first is the deliberately misleading marketing that is common to many hyped technologies, and is arguably epitomized by some blockchain companies.


Healthcare Organizations Too Confident in Cybersecurity

"There are some surprises in the results, particularly the higher than expected confidence that organizations have in regards to the security of their patient portal and telemedicine platforms given that only 65% deploy multi-factor authentication," said Erin Benson, director of market planning for LexisNexis Health Care. "Multi-factor authentication is considered a baseline recommendation by key cybersecurity guidelines. Every access point should have several layers of defense in case one of them doesn't catch an instance of fraud. At the same time, the security framework should have low-friction options up front to maintain ease of access by legitimate users." The report findings suggest that traditional authentication methods are insufficient, multi-factor authentication should be considered a baseline best practice and the balance between optimizing the user experience and protecting the data must be achieved in an effective cybersecurity strategy, the press release said.


Big Data Governance: 4 Steps to Scaling an Enterprise Data Governance Program

No matter how big your data governance program becomes, it must retain its agility. If you can’t adapt quickly, then you’ll lose momentum and your initiative will start to deliver diminishing returns. A big challenge here is aligning the huge number of people involved in your initiative. We’ve already discussed the need for collaboration, but crowdsourcing solutions to big decisions can soon lead to analysis paralysis. The solution is to develop an efficient decision-making system that allows for everyone’s voice to be heard. A best-practice decision-making framework, such as a DACI approach (Driver, Approver, Contributor, and Informed), can help here. These frameworks establish a continuous cycle of listening and acting, where everyone has a chance to feed into the discussion, but a small group of clearly identified people retain control over decision-making. That way, everyone’s happy and you make steady progress.
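As a data structure, a DACI assignment is tiny; the point is that every decision names exactly one approver. A minimal sketch with illustrative role names:

```python
# Encoding a DACI assignment: one driver, one approver, everyone else
# contributes or is informed. Names and roles are illustrative.

from dataclasses import dataclass, field

@dataclass
class Decision:
    topic: str
    driver: str                 # runs the decision process
    approver: str               # the single person who decides
    contributors: list = field(default_factory=list)  # consulted for input
    informed: list = field(default_factory=list)      # told of the outcome

d = Decision(topic="Adopt a shared metadata catalog",
             driver="data_governance_lead",
             approver="chief_data_officer",
             contributors=["warehouse_team", "analytics_team"],
             informed=["all_data_stewards"])
print(f"{d.topic}: decided by {d.approver}, driven by {d.driver}")
```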


Is Programming knowledge Required to Pursue Data Science?

The answer is No! Data Science is not just about having technical knowledge. Being a domain related to both the computer science world and the business world, the latter has a fair share of the skill set that is vital for becoming a data scientist. In fact, the non-technical skills mentioned below arguably sum up to 60% of the work of a data scientist. These are skills that were not mentioned in the Venn diagram but are equally important in any ideal data-driven project. ... Data science is a field for everyone. From an application developer to a businessman, everyone has a base skill set that enables them to start a fresh career in Data Science. Even those who do not want to learn to program can hone their strengths in the business or mathematical department and still be a part of this wonderful domain. At the end of the day, a sense of problem solving and commitment is all that one will need to excel in any given situation.


Brazil is at the forefront of a new type of router attack


According to Avast researchers David Jursa and Alexej Savčin, most Brazilian users are having their home routers hacked while visiting sports and movie streaming sites, or adult portals. On these sites, malicious ads (malvertising) run special code inside users' browsers to detect the IP address and model of a home router. When they detect the router's IP and model, the malicious ads then use a list of default usernames and passwords to log into users' devices, without their knowledge. The attacks take a while, but most users won't notice anything because they're usually busy watching the video streams on the websites they've just accessed. If the attacks are successful, additional malicious code relayed through the malicious ads will modify the default DNS settings on the victims' routers, replacing the DNS server IP addresses routers receive from the upstream ISPs with the IP addresses of DNS servers managed by the hackers. The next time the users' smartphone or computer connects to the router, it will receive the malicious DNS server IP addresses, and this way, funnel all DNS requests through the attackers' servers, allowing them to hijack and redirect traffic to malicious clones.
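One way to spot this kind of hijack is to compare the answers the router-supplied resolver gives against a trusted public resolver. A minimal sketch with the dnspython package (pip install dnspython); the domain and trusted resolver IP are assumptions, and CDNs can legitimately return different addresses, so treat a mismatch as a signal rather than proof:

```python
# Compare DNS answers from the default (router-supplied) resolver with a
# trusted public resolver. Domain and resolver IP are assumptions.

import dns.resolver

def answers(domain, nameserver=None):
    r = dns.resolver.Resolver()
    if nameserver:
        r.nameservers = [nameserver]
    return {a.to_text() for a in r.resolve(domain, "A")}

domain = "mybank.example"                 # placeholder domain
via_default = answers(domain)             # router-supplied resolver
via_trusted = answers(domain, "1.1.1.1")  # known-good public resolver

# CDNs can return different IPs legitimately; a disjoint set is a signal.
if not via_default & via_trusted:
    print("DNS answers disagree - resolver may be hijacked")
```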


Building Your Federal Data Strategy

The variety of federal agency missions, resourcing and data maturity levels vary greatly and thus each will have a unique perspective on the Federal Data Strategy and how they can leverage it. Regardless of an agency’s start point however, building an agency strategy and implementation plan based on the Federal Data Strategy’s three guiding data principles (i.e., Ethical governance; Conscious Design; Learning Culture) and three best practices (i.e., Building a culture that values data and promotes public use; Governing, managing and protecting data; Promoting efficient and appropriate data use) is imperative. Most of these principles and strategies will likely be driven initially by new policies, reporting requirements, budget planning, and associated activities. Given the necessary policy staffing and approval processes however, it will take some time to get these in place. A sequential, policy-only approach to the strategy is apt to result in extended timelines hindering rapid progress.


What Matters Most with Data Governance and Analytics in the Cloud?


As much as there are benefits, there understandably can be issues to address when it comes to moving data and redeploying processes from legacy platforms to Big Data platforms and the cloud to reduce costs and achieve higher processing performance. “There is a desire to modernize infrastructure into Big Data platforms or clouds,” commented Smith. Syncsort provides solutions for data infrastructure optimization, the cloud, data availability, security, data integration, and data quality. As businesses build data lakes to centralize data for advanced analytics, they need to also ingest mainframe data for Big Data platforms like Hadoop and Spark. According to Smith, the top analytics use cases that drive data lakes and enterprise data hubs are advanced/predictive analytics, real-time analytics, operational analytics, data discovery and visualization, and machine learning and AI. The top legacy data sources that fill the data lake are enterprise data warehouses, RDBMS, and mainframe/IBM i systems.  “Legacy infrastructure is so critical,” Smith commented. “These systems are not going away.” Mainframes and IBM i still run the core transactional apps of most enterprises, according to the company.


Sensitive Data Governance Still a Difficult Challenge


Governing all of an organization’s mountains of sensitive data, or even knowing what sensitive data exists and where it’s located within the enterprise, isn’t easy. Data classification is hard to accomplish. Often it does not occur reliably, given the volume of data that must be discovered, especially when the task is left to business users. A large health care provider stores 4.1 billion columns of data. A financial services company sucks in more than 10 million data sets per day. As the data pours in, only a small percentage of it — the so-called Critical Data Elements (CDEs) — is tagged, in a painfully slow and error-prone manual process that leaves most data miscategorized, lost or still waiting to be discovered, and impossible to track. Most companies have between 100 and 200 CDEs, but customers typically have thousands and sometimes even millions of data elements depending on their business, data organization and representation. This presents a risk because CCPA covers any data you know and possess about your customers.
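Automated discovery usually starts with pattern scanning. A minimal sketch with two common illustrative patterns (a US SSN and a 16-digit card number), nowhere near a full CDE catalog:

```python
# Pattern-based sensitive-data discovery. Two illustrative patterns only;
# a real CDE catalog is far larger and uses more than regular expressions.

import re

PATTERNS = {
    "ssn":  re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def classify(value: str) -> list:
    """Return the sensitive categories a value appears to match."""
    return [name for name, rx in PATTERNS.items() if rx.search(value)]

print(classify("SSN 123-45-6789 on file"))     # ['ssn']
print(classify("card 4111 1111 1111 1111"))    # ['card']
```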



Quote for the day:


"Leaders are the ones who keep faith with the past, keep step with the present, and keep the promise to posterity." -- Harold J. Seymour


Daily Tech Digest - July 13, 2019

Zoom vulnerability reveals privacy issues for users

"On top of this, this vulnerability would have allowed any webpage to DoS (denial of service) a Mac by repeatedly joining a user to an invalid call," Leitschuh added. "Additionally, if you've ever installed the Zoom client and then uninstalled it, you still have a localhost web server on your machine that will happily re-install the Zoom client for you, without requiring any user interaction on your behalf besides visiting a webpage." According to Leitschuh, it took Zoom 10 days to confirm the vulnerability, and in a meeting on June 11 he told Zoom there was a way to bypass the planned fix, but Zoom did not address these concerns when it reported the vulnerability fixed close to two weeks later. The Zoom vulnerability resurfaced on July 7, Leitschuh disclosed on July 8 and Zoom patched the Mac client on July 9. Zoom also worked with Apple on a silent background update for Mac users, released July 10, which removed the Zoom localhost server from systems. "Ultimately, Zoom failed at quickly confirming that the reported vulnerability actually existed and they failed at having a fix to the issue delivered to customers in a timely manner," Leitschuh wrote.
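Leitschuh's disclosure reported the hidden web server listening on localhost port 19421. A minimal check a Mac user could run; treat the port number as an assumption taken from that report:

```python
# Check whether anything still answers on the localhost port the Zoom web
# server was reported to use (19421; treat the port as an assumption).

import socket

def local_port_open(port: int, host: str = "127.0.0.1") -> bool:
    with socket.socket() as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

if local_port_open(19421):
    print("A local web server is listening on 19421 - investigate")
```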


What is enterprise? What is architecture?

What exactly was ‘the enterprise’ in that context? If we think of ‘enterprise’ as organisation’, was it just NASA? Or did it include others in the consortium of US organisations that designed and built the launchers, the landers, all of the equipment for the missions? Did it include the broader international consortium of support-facilities such as ‘The Dish‘ at Parkes, New South Wales, through which the TV transmissions for the first moonwalk came? Did it include ‘competitors’ such as the Russian, Chinese and other national space-organisations? If we think of ‘enterprise’ as ‘a bold endeavour’, what was the respective endeavour? Just one moon-landing mission? – or all of them, from Apollo-11 to Apollo-17? Should we include all the other Apollo missions? All of the other US space-missions, before and after Apollo? To what extent would we include other moon-missions from other nations in the same enterprise? Space-explorations in general? Only crewed-missions, or robotic missions as well? Or could and should we extend this enterprise to include the overall story? – the dream of spaceflight and suchlike?


No, 5G isn't going to make your 4G LTE phone obsolete


"This is the first time so many aspects of [the old and new network] are shared," saidGordon Mansfield, AT&T vice president for converged access and device technology. "Some things we'll do for 5G are inherently backward compatible and will lift the capabilities of 4G." By 2025, 15% of mobile connections in the world will be on 5G, according to a 2019 report by GSMA Intelligence, the research arm of the mobile operator group that hosts Mobile World Congress. But LTE usage will be about 59% by the same year, up from 43% in 2018. (In North America, the split will be more even, with about 47% of 2025's connections on 5G and 44% on 4G). Even if 5G becomes an even bigger part of the market by 2025 than estimated today, "it will complement rather than replace LTE," GSMA said in a separate report from last year.  "For operators in many parts of the world, LTE is and will be the foundation for the next 10 years at least," the GSMA report said. "LTE speeds are improving, which makes 5G less compelling without new services such as AR/VR." 


IT strategy: The CIO’s guide to getting stuff done


Shaun Le Geyt, CIO at Parkinson's UK, is part of a senior digital leadership team driven by using technology to help people with the condition. Every hour, two people in the UK are told they have Parkinson's. As many as 145,000 people are diagnosed with the condition in the UK, which is around one in every 350 adults. Le Geyt says CIOs who want to meet their targets must focus on the people who will benefit, rather than focusing on the technology they're implementing. "Managing cultural change is as important as managing technological change," he says. Taking control of that change process is as true for internal staff in his organisation as it is for the individuals who benefit from the charity's work. Parkinson's UK has an annual income of about £40m and employs 450-plus staff. The charity draws on a dynamic network of expert staff, health and social care professionals, volunteers, and researchers.  "Focus on the needs of the organisation – put people first," says Le Geyt. "There are times when technology comes up and you know it's the right thing to discuss."


DevOps for networking hits chokepoints, tangled communications


Challenges will arise with tighter integration of DevOps and networking. Network automation lags behind other areas of IT automation -- the networking team will argue that this is because automation decreases visibility into this vital component of the application infrastructure. When applications run in the public cloud -- as is often the case with DevOps deployments -- the admins cannot touch the physical network. Instead, the focus shifts to virtual private clouds, autoscaling and failover via setup policies. Cloud certification training can help developers take over these tasks, but developers do not have expertise in network management, just as network admins do not have experience running cloud operations. There's a cultural challenge set up by this change to NetOps. DevOps teams have to collaborate closely with both the in-house network operations team and their cloud service provider. Appoint in-house technical and strategic alliance managers, and request one from a cloud provider to build relationships and overcome these obstacles.


How to learn from inevitable AI failures

Partly this is a problem of skills: To do well with AI or any area of big data, you need a mix of math, programming, and more. That kind of unicorn doesn't readily gallop by. However, it's also the case that finding someone who understands data science may be easier than finding someone who understands your business and the data that makes it hum. This calls to mind Gartner analyst Svetlana Sicular's advice from years ago about big data: "Organizations already have people who know their own data better than mystical data scientists." Therefore, look within your organization because "Learning Hadoop is easier than learning the company's business." Many AI projects fail precisely because the technology is considered in a vacuum. As noted by Greg Satell in Harvard Business Review, any AI project should have a clear business outcome identified, with the right data culled to serve that end. This, in turn, requires (you guessed it!) involving smart folks within the enterprise who understand the business intimately and know where to find the best data. AI, in other words, while ostensibly about replacing people, can't succeed without involving your company's best people.


Asia’s AI agenda: The ethics of AI

Asia’s AI ecosystem participants are aware of and concerned about the potential for embedded biases (race, gender, or socioeconomic status) within AI tools, and the harm this can cause through facilitating overpolicing of minority communities, or economic exclusion. Weaponization and malicious use of AI are also ethical concerns in Asia as applications are increasingly commoditized and industrialized. While Asian decision-makers are concerned about a potentially negative impact, particularly where jobs are concerned, optimism is the more dominant sentiment, which will propel the use of AI in Asia. Asian governments are building institutional capacity and frameworks to increase AI governance—but have yet to develop regulations. Overwhelmingly, more survey respondents believe Asia will lead the world in the development of ethics and governance than any other region: 45%, as compared with only a quarter who see North America as the ethics frontrunner.


The Market Of One

Smart, agile companies like Lifedata.ai are keeping up with such data developments by recognising that people expect a brand experience that matches their digital lifestyles. Today’s consumers demand improved ease of use, relevance and personalisation, delivered across channels in a frictionless and consistent way. Social and media interactions can be mediated by software and turn IoT sensor data into monetisable experiences. These can unlock context-aware personalisation, obtaining behavioural insights and real-time automation based on a timeline of contextual moments, situational context and relevant behavioural profiles. Lifedata.ai aims to capture how people go through their day and identify moments that matter in order to improve their lives. Omar Fogliadini, Managing Partner of Lifedata.ai, says, “In 2019, integrations with digital assistants are no longer a differentiator. Brands will need to showcase what people can actually do with these integrations. The most successful integrations will be those that make people’s lives easier and help them get things done. ...”


Man Vs. Machine: The 6 Greatest AI Challenges To Showcase The Power Of AI

Could artificial intelligence play Atari games better than humans? DeepMind Technologies took on this challenge, and in 2013 it applied its deep learning model to seven Atari 2600 games. This endeavor had to overcome the challenge of reinforcement learning to control agents directly from vision and speech inputs. The breakthroughs in computer vision and speech recognition allowed the innovators at DeepMind Technologies to develop a convolutional neural network for reinforcement learning that enabled a machine to master several Atari games using only raw pixels as input, and in a few games achieve better results than humans. Next up in our review of man versus machine are the achievements of AlphaGo, a machine that is able to learn for itself what knowledge is. The supercomputer was able to learn 3,000 years of human knowledge in a mere 40 days, prompting some to claim it was “one of the greatest advances ever in artificial intelligence.” The system had already learned how to beat the world champion of Go, an ancient board game that was once thought to be impossible for a machine to decipher.
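The Atari result rested on a convolutional network that maps stacked screen frames to one Q-value per action. A minimal PyTorch sketch following the layer sizes of the widely cited DQN architecture; this is a shape sketch, not DeepMind's full training pipeline:

```python
# Shape sketch of the DQN convnet: 4 stacked 84x84 grayscale frames in,
# one Q-value per joystick action out. Not the full training pipeline.

import torch
import torch.nn as nn

class DQN(nn.Module):
    def __init__(self, n_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 7 * 7, 512), nn.ReLU(),
            nn.Linear(512, n_actions),      # Q-value for each action
        )

    def forward(self, x):                   # x: (batch, 4, 84, 84)
        return self.net(x)

q = DQN(n_actions=6)(torch.zeros(1, 4, 84, 84))
print(q.shape)   # torch.Size([1, 6])
```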


Is Enterprise Architecture Relevant To Agile?

The first important insight is that EA is valuable for determining the future of pivotal Agile projects. It provides a better vision to realize, and identifies the applications and projects needed to support this vision. In EA, applications can be introduced as a black box. The Agile project can open this black box. Agile projects can refine the high-level business requirements into the epics and the user stories in EA. Another important insight is that the focus of Agile will only be the teams, and not the enterprise. Dean Leffingwell designed the Scaled Agile Framework for small teams and it does not scale to the enterprise level. Enterprise architects are also working under this framework. The responsibilities of the enterprise architect consist of maintaining the goals and facilitating reuse of emerging solutions, knowledge and patterns. Finally, Agile and Scrum can be considered as enterprise architecture. They can be illustrated in the form of principles and models, core elements of architecture.



Quote for the day:


"The role of leadership is to transform the complex situation into small pieces and prioritize them." -- Carlos Ghosn


Daily Tech Digest - July 12, 2019

Reinforcement learning is an area of machine learning that has received lots of attention from researchers over the past decade. Benaich and Hogarth define it as being concerned with "software agents that learn goal-oriented behavior by trial and error in an environment that provides rewards or penalties in response to the agent's actions (called a "policy") towards achieving that goal." A good chunk of the progress made in RL has to do with training AI to play games, equaling or surpassing human performance. StarCraft II, Quake III Arena and Montezuma's revenge are just some of those games. More important than the sensationalist aspect of "AI beats humans", however, are the methods through which RL may reach such outcomes: Play driven learning, simulation and real-world combination, and curiosity-driven exploration. Can we train AI by playing games? As children, we acquire complex skills and behaviors by learning and practicing diverse strategies and behaviors in a low-risk fashion, i.e., play time. 
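The reward-and-penalty loop in that definition is easiest to see in tabular Q-learning. A minimal sketch on a toy five-state corridor; the environment is an assumption for the demo:

```python
# Tabular Q-learning: trial and error in a toy 5-state corridor where only
# reaching state 4 pays a reward. The environment is a demo assumption.

import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.2    # learning rate, discount, exploration
ACTIONS = [-1, 1]                         # move left / move right
Q = defaultdict(float)

def step(state, action):
    nxt = max(0, min(4, state + action))
    return nxt, (1.0 if nxt == 4 else 0.0)

for _ in range(500):                      # episodes of trial and error
    s = 0
    while s != 4:
        a = (random.choice(ACTIONS) if random.random() < EPSILON
             else max(ACTIONS, key=lambda act: Q[(s, act)]))
        s2, r = step(s, a)
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])  # Bellman update
        s = s2

# The learned policy: expect +1 (move right) for every state.
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(4)})
```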


APT Groups Make Quadruple What They Spend on Attack Tools

"The potential benefit from an attack far exceeds the cost of a starter kit, says Leigh-Anne Galloway, cybersecurity resilience lead at Positive Technologies. For groups like Silence, the profit from one attack is typically more than quadruple the cost of the attack toolset, she says. The ROI for some APT groups can be many magnitudes higher. Positive Technologies, for instance, estimated that APT38, a profit-driven threat group with suspected backing from the North Korean government, spends more than $500,000 for carrying out attacks on financial institutions but gets over $41 million in return on average. A lot of the money that APT38 spends is on tools similar to those used by groups engaged in cyber espionage campaigns. Building an effective system of protection against APTs can be expensive, Galloway says. For most organizations that have experienced an APT attack, the cost of restoring infrastructure in many cases is the main item of expenditure. "It can be much more than direct financial damage from an attack," she says.


Smarter IoT concepts reveal creaking networks

"The internet, as we know it, is based on network architectures of the 70s and 80s, when it was designed for completely different applications,” the researchers say in their media release. The internet has centralized security, which causes choke points, and and an inherent lack of dynamic controls, which translates to inflexibility in access rights — all of which make it difficult to adapt the IoT to it. Device, data, and process management must be integrated into IoT systems, say the group behind the project, called DoRIoT (Dynamische Laufzeitumgebung für Organisch (dis-)Aggregierende IoT-Prozesse), translated as Dynamic Runtime Environment for Organic dis-Aggregating IoT Processes. “In order to close this gap, concepts [will be] developed in the project that transparently realize the access to the data,” says Professor Sebastian Zug of the University of Freiberg, a partner in DoRIoT. “For the application, it should make no difference whether the specific information requirement is answered by a server or an IoT node.”


Managing Third-Party Risks: CISOs' Success Strategies

As more organizations rely on third parties for various services, managing the security risks involved is becoming a bigger challenge. Those risks, indeed, can be significant. For example, earlier this year, Indian IT outsourcing giant Wipro was targeted by hackers who in turn launched phishing attacks against its customers. Among the toughest third-party risk management challenges are: Keeping track of the long list of outsourcers an organization uses and making sure they're assessed for security; Taking steps to minimize the amount of sensitive data that's shared with vendors - and making sure that data is adequately protected; and Holding vendors to a uniform standard for security. "For most organizations, there is still a long way to go in strengthening governance when it comes to vendor management," says Jagdeep Singh, CISO at InstaRem, a Singapore-based fintech company. "We need to look at the broader risk posture that vendors bring in ... which will determine the sort of due diligence you want to carry out."


To encourage an Agile enterprise architecture, software teams must devise a method to get bottom-up input and enforce consistency. Apply tenets of continuous integration and continuous delivery all the way to planning and architecture. With a dynamic roadmap, an organization can change its planning from an annual endeavor to a practically nonstop effort. Lufthansa Systems, a software and IT service provider for the airline industry under parent company Lufthansa, devised a layered approach to push customer demand into product architecture planning. Now, the company can continuously update and improve products, said George Lewe, who manages the company's roster of Atlassian tools that underpin the multi-team collaboration. "We get much more input from the customers -- really cool ideas," Lewe said. "Some requests might not fit into our product strategy or, for technical reasons, it's not possible, but we can look at all of them." Lufthansa Systems moved its support agents, product managers and software developers onto Atlassian Jira, a project tracking tool, with a tiered concept. 


What does the death of Hadoop mean for big data?

The Hadoop software framework, which facilitated distributed storage and processing of big data using the MapReduce programming model, served these data ambitions sufficiently. The modules in Hadoop were developed for computer clusters built from commodity hardware and eventually also found use on clusters of higher-end hardware. But broader adoption of the open-source distributed storage technology, which grew out of systems Google invented, did not come to be, as enterprises began opting to move to the cloud and explore AI, which included machine learning and deep learning as part of their big data initiatives. Worse, several big Hadoop-based solution providers that had been unprofitable for years were forced to merge to minimize losses, and one may be forced to shut down altogether. However, the question remains whether the fate of these vendors is only indicative of the demise of Hadoop-powered solutions and other open source data platforms, or of the death of big data as a whole. Was big data merely a fad or a passing interest of industries?
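The MapReduce model the article refers to fits in a few lines: a map phase that emits partial results per input, and a reduce phase that merges them. A miniature word count in plain Python (Hadoop's contribution was distributing exactly this pattern across commodity clusters):

```python
# Word count expressed as map and reduce steps, in miniature.

from collections import Counter
from functools import reduce

docs = ["big data is big", "hadoop made big data practical"]

# Map phase: each document independently emits word counts.
mapped = [Counter(doc.split()) for doc in docs]

# Reduce phase: merge the partial counts into a single total.
totals = reduce(lambda a, b: a + b, mapped, Counter())
print(totals.most_common(3))   # [('big', 3), ('data', 2), ...]
```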


From Machine Learning to Machine Cognition  

Keeping logic and decisions outside the network is what has been done until now. For decisions, we are using automated systems based on software running on CPUs instead of artificial cognitive networks. While these work very well and will still be present for a long time, they are limited. Basically, these programs perform simple iterative tasks or move controls and numbers on monitor windows with millions of lines of code. This approach may be good when dealing with games and simple narrow tasks but not great when dealing with general concepts. They will not ensure enough internal connections. These will hardly evolve to intelligence. The complexity required is just too high to emulate imagination, intuition etc. Image recognition has been developed with neural networks because it was impossible to generate an iterative algorithm for it. The same should be done with cognition; decisions should use neurons, and cognition should be kept inside the network together with concepts and learning, as they have common neurons.


Open-Source Tool Lets Anyone Experiment With Cryptocurrency Blockchains

In researching blockchains, Shudo and his colleagues searched for a simulator that would help them experiment with and improve the technology. But existing simulators were too hard to use and lacked the features the team wanted. Moreover, these simulators had apparently been created for specific research and were abandoned soon after that work was completed, because many of the tools the group found were no longer being updated.  "The most recent simulator we looked at was developed in October 2016," says Shudo. "And it was no longer being maintained." So, the group developed its own simulator. Dubbed SimBlock, it runs on any personal computer supporting Java and enables users to easily change the behavior of blockchain nodes. Consequently, investigating the effects of changed node-behavior has now become a straightforward matter, says Shudo. "All the parameters of the nodes in SimBlock are written in Java," he explains. "These source files are separated from the main SimBlock Java source code, so the user simply edits [the nodes’] source code to change their behavior."


Visual Studio Code: Stepping on Visual Studio’s toes?

Microsoft describes Visual Studio as a full-featured development environment that accommodates complex workflows. Visual Studio integrates all kinds of tools in one environment, from designers, code analyzers, and debuggers to testing and deployment tools. Developers can use Visual Studio to build cloud, mobile, and desktop apps for Windows and MacOS.  Microsoft describes Visual Studio Code, on the other hand, as a streamlined code editor, with just the tools needed for a quick code-build-debug cycle. The cross-platform editor complements a developer’s existing tool chain, and is leveraged for web and cloud applications. But while Microsoft views the two tools as complementary, developers have been raising questions about redundancy for years. Responses to a query in Stack Overflow, made four years ago, sum up the differences this way: Visual Studio Code is “cross-platform,” “file oriented,” “extensible,” and “fast,” whereas Visual Studio is “full-featured,” “project and solution oriented,” “convenient,” and “not fast.”


Attacks against AI systems are a growing concern


The continuing game of “cat and mouse” between attackers and defenders will reach a whole new level when both sides are using AI, said Hypponen, and defenders will have to adapt quickly as soon as they see the first AI-enabled attacks emerging. But despite the claims of some security suppliers, Hypponen told Computer Weekly in a recent interview that no criminal groups appear to be using AI to conduct cyber attacks. The Sherpa study therefore focuses on how malicious actors can abuse AI, machine learning and smart information systems. The researchers identify a variety of potentially malicious uses for AI that are already within attackers’ reach, including the creation of sophisticated disinformation and social engineering campaigns. Although the research found no definitive proof that malicious actors are currently using AI to power cyber attacks, as indicated by Hypponen, the researchers highlighted that adversaries are already attacking and manipulating existing AI systems used by search engines, social media companies, recommendation websites, and more.



Quote for the day:


"Leadership is a matter of having people look at you and gain confidence, seeing how you react. If you're in control, they're in control." -- Tom Landry


Daily Tech Digest - July 11, 2019

How IoT is reshaping network design

In a world of always-on ubiquitous connectivity, latency and reliability loom over everything, whether you’re talking about self-driving cars or Industry 4.0. These two challenges are driving much of the change that we’ll see in network design over the next few years. If the industry is to realize the promised benefits of IoT, we must increase the ability to support more machine-to-machine communications in near-real time. In applications like autonomous vehicles, latency requirements are on the order of a couple of milliseconds. GSMA, the international association for mobile technology, has specified that 5G's latency should be 1 millisecond, which is 50 times better than 4G's current 50 milliseconds. Satisfying these requirements involves a radical rethink about how and where we deploy assets throughout the network. For example, routing and backing up data using a traditional star-type network design will become increasingly infeasible. The vast amount of traffic and the latency demands would easily overwhelm a north-south data flow.



Cyber security will always be an issue, “until we get rid of passwords” — Frank Abagnale Jr

The password is insecure: a hacker could log into an individual’s bank account and the victim wouldn’t even know. This is the first issue: passwords are easily lost and even more easily stolen, via phishing or malware attacks. Once a cybercriminal has access to the password, they can replay it over and over again. “Unfortunately, because passwords are free and easy, no one gave design much thinking,” said Mr Eisen. “But, now the cost of passwords is obvious”: they’re the great security vulnerability and largely responsible for the data breaches that pepper news headlines. Historically, security and user experience have been at odds with each other, because everyone believed that making systems less user friendly (longer, more complex passwords, for example) made them more secure. This is a fallacy: it hinders adoption rates, ironically making systems less secure. “This is not a computer-to-computer interaction with longer keys. These are humans we’re talking about,” continued Mr Eisen.


Logitech wireless USB dongles vulnerable to new hijacking flaws

The vulnerabilities allow attackers not only to sniff keyboard traffic, but also to inject keystrokes (even into dongles not connected to a wireless keyboard) and take over the computer to which a dongle has been connected. When encryption is used to protect the connection between the dongle and its paired device, the vulnerabilities also allow attackers to recover the encryption key. Furthermore, if the USB dongle uses a "key blacklist" to prevent the paired device from injecting keystrokes, the vulnerabilities allow this security protection to be bypassed. Marcus Mengs, the researcher who discovered these vulnerabilities, said he notified Logitech about his findings, and the vendor plans to patch some of the reported issues, but not all. According to Mengs, the vulnerabilities impact all Logitech USB dongles that use the company's proprietary "Unifying" 2.4 GHz radio technology to communicate with wireless devices.



Financial Firms Face Threats from Employee Mobile Devices

Instead of malware, criminals are using phishing attacks to gain access to financial services networks, but not just any attacks. "We're seeing more targeted attacks within financial services instead of kind of the scattershot approach where you send out a phishing attack to everybody in the organization," he explains. The success of phishing attacks on mobile devices in financial services may be part of a larger pattern of risky mobile behavior by those in the industry. According to the report, 42% of the organizations represented had devices with "side-loaded" apps — apps downloaded and installed from sites other than the app stores approved for the device. Covington says, "You start to see the implications of letting employees manage their own device." And those employees are managing their devices in tremendous numbers, he says. Employee-owned devices, used to conduct company business, are targets because of the sensitive data they contain. "There's no doubt in my mind that the criminal side of the equation is after rich data," he says.


Digital skills — key to driving UK prosperity post-Brexit, according to Salesforce

The data from the Salesforce report highlights concerns of a potential shortage of tech skills post-Brexit, with over half of business leaders believing the UK is at risk of a tech brain drain. To address this, businesses are now recognising the pivotal role they must play in nurturing tech talent and digital skills in the country. One in four business leaders feel responsibility for doing so lies mainly with private enterprise; over half (55%) plan to invest more in developing their own tech talent, with the same number pledging to address the skills gap by re-skilling older generations; and 51% intend to do more to re-skill people from disadvantaged backgrounds. “There are issues that business needs to lead on regardless of what’s happening in the world of politics,” said Paul Smith, EVP and GM, Salesforce UK. “The economy is changing as new technologies emerge.


The Bank of Amazon: How big tech is disrupting banking


Big tech companies have already begun to embark on financial ventures, with payment platforms such as Google’s Google Wallet and Google payments, Amazon lending to SME marketplace sellers, Facebook’s partnership with Clear Bank on a product called Charged, a programme that allows financing for advertising, and Apple’s credit card, launched last year with Goldman Sachs and Marcus in the US. ... Big tech is in the position of having a significant “data advantage” over banks or fintechs, with the ability to glean more information about their users than others could hope to achieve. With the tech resources to offer an improved user experience and services that are integrated into their existing platforms, a grasp of artificial intelligence that traditional banks are only just beginning to deploy, sophisticated cloud computing, and an already loyal user base, up to 40% of the revenue currently generated by the US financial industry could move over to Big Tech, according to McKinsey.


Restoring Vision With Bionic Eyes: No Longer Science Fiction


"Brain-computer interfaces" can be used both for treating neurological and mental disorders as well as for understanding brain function, and now engineers have developed ways to manipulate these neural circuits with electrical currents, light, ultrasound, and magnetic fields. Remarkably, we can make a finger, arm, or even a leg move just by activating the right neurons in the motor cortex. Similarly, we can activate neurons in the visual cortex to make people see flashes of light. The former allows us to treat neurological conditions such as Parkinson's disease and epilepsy, whereas the latter should eventually allow us to restore vision to the blind. ... We have a real opportunity here to tap into the existing neural circuitry of the blind and augment their visual senses much like Google Glass or the Microsoft HoloLens. For example, make things appear brighter the closer they get, use computer vision to mark safe paths and combine it with GPS to give visual directions, warn users of impending dangers in their immediate surroundings, or even extend the range of "visible" light with the use of an infrared sensor. 


The Potential of AI for Utilities

One of the biggest sources of confusion is all the different terms used as synonyms for AI, such as machine learning, deep learning, cognitive computing, and so on. The list grows daily. Keep in mind, these terms are not interchangeable, but they are often used that way, which doesn’t help anyone trying to figure out AI or how to use it. First of all, AI is a branch of computer science that uses complex instruction sets to perform what appears to be human-like intelligence. These programs are powered by algorithms, and that is the ingredient behind the mystique. Without going into a lot of detail, an algorithm is a set of step-by-step computer instructions that can use data to build models that make predictions based on that data. Remember, we are a long way off from the thinking, talking robots seen in movies and on television. Algorithms are how AI appears smart, but it is not truly intelligent, and that is the critical distinction. This type of AI is referred to as Narrow AI or Applied AI. It is said to simulate human thought, but each application can carry out only one specific task with a limited range of functions.
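To make that definition concrete, here is a minimal sketch, not drawn from the article, of "step-by-step instructions that use data to build a model that makes predictions": a one-variable least-squares fit. The data and names are invented for illustration.

```java
// Minimal example of an algorithm that uses data to build a model
// that makes predictions: one-variable least-squares regression.
public final class TinyModel {
    public static void main(String[] args) {
        // Historical data: e.g., temperature (x) vs. power demand (y).
        double[] x = {20, 25, 30, 35, 40};
        double[] y = {310, 330, 370, 420, 480};

        // "Training": compute the slope and intercept that minimize squared error.
        double meanX = mean(x), meanY = mean(y);
        double num = 0, den = 0;
        for (int i = 0; i < x.length; i++) {
            num += (x[i] - meanX) * (y[i] - meanY);
            den += (x[i] - meanX) * (x[i] - meanX);
        }
        double slope = num / den;
        double intercept = meanY - slope * meanX;

        // "Prediction": apply the fitted model to an unseen input.
        double forecast = slope * 32 + intercept;
        System.out.printf("Predicted demand at 32 degrees: %.1f MW%n", forecast);
    }

    private static double mean(double[] v) {
        double s = 0;
        for (double d : v) s += d;
        return s / v.length;
    }
}
```

The program is a fixed recipe, not a thinking machine: it only appears smart because the data it was given contains a usable pattern, which is the Narrow AI point the article is making.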


RiskIQ uncovers new Magecart campaign


This attack introduces yet another method by Magecart that RiskIQ researchers call a “spray and pray” approach. Because skimmers work only when placed on payment or checkout pages, most Magecart attacks target specific e-commerce sites and attempt to drop a skimmer only on pages with payment forms. However, the ease of compromise that comes from finding S3 buckets misconfigured to allow public write access means that even if only a fraction of their skimmer injections return payment data, the attack will yield a substantial return on investment, the researchers said. “This is a brand new twist on Magecart,” said Yonathan Klijnsma, head threat researcher at RiskIQ. “Although this group chose reach over targeting, they likely ended up getting their skimmer on enough payment pages to make their attack lucrative. They have done their cost-benefit analysis.” The scale of this latest attack illustrates how easy it is for threat actors of any kind to compromise a vast quantity of websites at once with scripts stored in misconfigured S3 buckets.
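For defenders, the misconfiguration described here is straightforward to audit. The sketch below, assuming the AWS SDK for Java (v1) is on the classpath and default credentials are configured, flags buckets whose ACL grants the AllUsers group write or full-control permission. It is an illustrative check, not RiskIQ's tooling, and it inspects ACLs only, not bucket policies.

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.AccessControlList;
import com.amazonaws.services.s3.model.Bucket;
import com.amazonaws.services.s3.model.Grant;
import com.amazonaws.services.s3.model.GroupGrantee;
import com.amazonaws.services.s3.model.Permission;

// Flags S3 buckets writable by anyone -- the misconfiguration that
// lets Magecart-style actors inject skimmer scripts at scale.
public final class PublicWriteAudit {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        for (Bucket bucket : s3.listBuckets()) {
            AccessControlList acl = s3.getBucketAcl(bucket.getName());
            for (Grant grant : acl.getGrantsAsList()) {
                boolean everyone = GroupGrantee.AllUsers.equals(grant.getGrantee());
                boolean canWrite = grant.getPermission() == Permission.Write
                        || grant.getPermission() == Permission.FullControl;
                if (everyone && canWrite) {
                    System.out.println("PUBLICLY WRITABLE: " + bucket.getName());
                }
            }
        }
    }
}
```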


Stream Processing Anomaly Detection Using Yurita Framework

Working at PayPal on a next-generation stream processing platform, we started to notice that many of our users wanted to use stream processing to apply anomaly detection models in real time. After exploring different architectures for a flexible, production-grade framework that could scale to real-world workloads, we eventually decided on a pipeline-based API, inspired by other open-source projects such as scikit-learn and Spark MLlib. This work led to the development of Yurita, an open-source anomaly detection framework for stream processing. Yurita is based on the new Spark Structured Streaming framework and utilizes its processing engine to achieve highly scalable, performant execution. The name Yurita comes from a traditional Japanese gold panning tool. ... Without knowing what the normal behavior of a metric is, we would be able to use only simple anomaly detection techniques, such as rule-based decisions, which also require a deep understanding of each specific dataset and are therefore not scalable from a productivity point of view.
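The limitation described here, that fixed rules need hand-tuning per dataset while baseline-aware detection does not, can be sketched in a few lines. The code below is not Yurita's API, just a minimal illustration of the idea: a static threshold versus a rolling-window three-sigma check against learned history.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrates rule-based vs. baseline-aware anomaly detection.
// Not Yurita's API -- just the concept such a framework generalizes.
public final class BaselineDetector {
    private final Deque<Double> history = new ArrayDeque<>();
    private final int window;

    BaselineDetector(int window) { this.window = window; }

    // Rule-based: fires on a hand-tuned constant, chosen per metric.
    static boolean ruleBased(double value) {
        return value > 1000.0; // requires deep knowledge of this dataset
    }

    // Baseline-aware: fires when a value strays far from recent behavior,
    // with no per-metric tuning required.
    boolean isAnomalous(double value) {
        boolean anomalous = false;
        if (history.size() == window) {
            double mean = history.stream()
                    .mapToDouble(Double::doubleValue).average().orElse(0);
            double var = history.stream()
                    .mapToDouble(v -> (v - mean) * (v - mean)).average().orElse(0);
            double std = Math.sqrt(var);
            anomalous = std > 0 && Math.abs(value - mean) > 3 * std; // 3-sigma
            history.removeFirst(); // slide the window forward
        }
        history.addLast(value);
        return anomalous;
    }
}
```

A framework like Yurita generalizes this pattern into composable pipeline stages, so the same baseline logic can be applied across many metrics without per-dataset rules.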



Quote for the day:


"All organizations are perfectly designed to get the results they are now getting. If we want different results, we must change the way we do things." -- Tom Northup