
Daily Tech Digest - April 28, 2025


Quote for the day:

"If a window of opportunity appears, don't pull down the shade." -- Tom Peters



Researchers Revolutionize Fraud Detection with Machine Learning

Machine learning plays a critical role in fraud detection by identifying patterns and anomalies in real time. It analyzes large datasets to spot normal behavior and flag significant deviations, such as unusual transactions or account access. However, fraud detection is challenging because fraud cases are much rarer than normal ones, and the data is often messy or unlabeled. ... “The use of machine learning in fraud detection brings many advantages,” said Taghi Khoshgoftaar, Ph.D., senior author and Motorola Professor in the FAU Department of Electrical Engineering and Computer Science. “Machine learning algorithms can label data much faster than human annotation, significantly improving efficiency. Our method represents a major advancement in fraud detection, especially in highly imbalanced datasets. It reduces the workload by minimizing cases that require further inspection, which is crucial in sectors like Medicare and credit card fraud, where fast data processing is vital to prevent financial losses and enhance operational efficiency.” ... The method combines two strategies: an ensemble of three unsupervised learning techniques using the scikit-learn library and a percentile-gradient approach. The goal is to minimize false positives by focusing on the most confidently identified fraud cases.
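The paper's implementation isn't reproduced in the excerpt, but the combination it describes, averaging scores from several unsupervised detectors and keeping only the top percentile as "confident" fraud flags, can be sketched in plain Python. The `ensemble_flag` helper and the score inputs below are hypothetical stand-ins for the three scikit-learn detectors:

```python
from statistics import mean

def ensemble_flag(score_lists, percentile=99):
    """Average anomaly scores from several unsupervised detectors and
    flag only the records above the given percentile (the most confident)."""
    combined = [mean(scores) for scores in zip(*score_lists)]
    cutoff = sorted(combined)[int(len(combined) * percentile / 100)]
    return [i for i, s in enumerate(combined) if s >= cutoff]

# Three hypothetical detectors each score 100 records;
# record 7 looks anomalous to all of them.
detector_scores = [[0.99 if i == 7 else 0.1 for i in range(100)]
                   for _ in range(3)]
flagged = ensemble_flag(detector_scores, percentile=99)
```

Tightening the percentile is what trades recall for fewer false positives: at the 99th percentile only the records every detector agrees on survive, which matches the article's goal of minimizing the cases that need human inspection.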


Cybersecurity is Not Working: Time to Try Something Else

Many CISOs, changing jobs every two years or so, have not learnt to get things done in large firms; they have not developed the political acumen and management experience they would need. Many have simply remained technologists and firefighters, trapped in an increasingly obsolete mindset, pushing a bottom-up, tools-based, risk-based, tech-driven narrative that is disconnected from what the board wants to hear, which has now shifted towards resilience and execution. This is why we may have to come to the point where we accept that the construction around the role of the CISO, as it was initiated in the late 90s, has served its purpose and needs to evolve. The first step in this evolution, in my opinion, is for the board to own cybersecurity as a business problem, not a technology problem. It needs to be owned at board level in business terms, in line with the way other topics are owned at board level. This is about framing the protection of the business in business terms, not in technology terms. Cybersecurity is not a purely technological matter; it never has been and cannot be. ... There may be a need to amalgamate it with other matters such as corporate resilience, business continuity or data privacy to build up a suitable board-level portfolio, but for me this is the way forward in reversing the long-term dynamics, away from the failed historical bottom-up constructions and towards a progressive top-down approach.


How the financial services C-suite are going beyond ‘keeping the lights on’ in 2025

C-suites need to tackle three core areas: Ensure they are getting high value support for their mission-critical systems; Know how to optimise their investments; Transform their organisation without disrupting day-to-day operations. It is certainly possible if they have the time, capabilities and skill sets in-house. Yet even the most well-resourced enterprises can struggle to acquire the knowledge base and market expertise required to negotiate with multiple vendors, unlock investments or run complex change programmes single-handedly. The reality is that managing the balance needed to save costs while accelerating innovation is challenging. ... The survey demonstrates a growing necessity for CIOs and CFOs to speak each other’s language, marking a shift in organisational strategy, moving IT beyond the traditional ‘keeping the lights on’ approach, and driving a pivotal transformation in the relationship between CIOs and CFOs. As they find better ways to collaborate and innovate, businesses in the financial services space will reap the rewards of emerging technology, while falling in line with budgetary needs. Emerging technologies are being introduced thick and fast, and as a result, hard metrics aren’t always available. Instead of feeling frustrated with a lack of data, CFOs should lean in as active participants, understanding how emerging technologies like AI and cybersecurity can drive strategic value, optimise operations and create new revenue streams. 


Threat actors are scanning your environment, even if you’re not

Martin Jartelius, CISO and Product Owner at Outpost24, says that the most common blind spots that the solution uncovers are exposed management interfaces of devices, exposed databases in the cloud, misconfigured S3 storage, or just a range of older infrastructure no longer in use but still connected to the organization’s domain. All of these can provide an entry point into internal networks, and some can be used to impersonate organizations in targeted phishing attacks. But these blind spots are not indicative of poor leadership or IT security performance: “Most who see a comprehensive report of their attack surface for the first time are surprised that it is often substantially larger than they understood. Some react with discomfort and perceive their prior lack of insight as a failure, but that is not the case. ... Attack surface management is still a maturing technology field, but having a solution bringing the information together in a platform gives a more refined and in-depth insight over time. External attack surface management starts with a continuous detection of exposed assets – in Sweepatic’s case, that also includes advanced port scanning to detect all (and not just the most common) ports at risk of exploitation – then moves on to automated security analysis and then risk-based reporting.
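Sweepatic's scanner itself is proprietary, but the "detect all exposed ports" step it performs reduces to connection probing. A minimal, hypothetical sketch using only the standard library (`open_ports` is an invented helper, not the product's API):

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of ports accepting TCP connections on host."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if s.connect_ex((host, port)) == 0:
                found.append(port)
    return found
```

A real external attack surface management tool sweeps all 65,535 ports across every discovered asset, on a continuous schedule, and feeds the results into automated security analysis and risk-based reporting; this sketch only probes a fixed list once.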


How to Run a Generative AI Developer Tooling Experiment

The last metric, rework, is a significant consideration with generative AI, as 67% of developers find that they are spending more time debugging AI-generated code. Devexperts experienced a 200% increase in rework and a 30% increase in maintenance. On the other hand, while the majority of organizations are seeing an increase in complexity and lines of code with code generators, these five engineers saw a surprising 15% decrease in lines of code created. “We can conclude that, for the live experiment, GitHub Copilot didn’t deliver the results one could expect after reading their articles,” summarized German Tebiev, the software engineering process architect who ran the experiment. He did think the results were persuasive enough to believe speed will be enabled if the right processes are put in place: “The fact that the PR throughput shows significant growth tells us that the desired speed increase can be achieved if the tasks’ flow is handled effectively.” ... Just 17% of developers responded that they think Copilot helped them save at least an hour a week, while a whopping 40% saw no time savings from using the code generator, which is well below the industry average. Developers were also able to share their own anecdotal experience, which is very situation-dependent. Copilot seemed to be a better choice for completing more basic lines of code for new features, and less so when dealing with the complexity of an existing codebase.


Dark Data: Surprising Places for Business Insights

Previously, the biggest problem when dealing with dark data was its messy nature. Even though AI has been able to analyze structured data for years, unstructured or semi-structured data proved to be a hard nut to crack. Unfortunately, unstructured data constitutes the majority of dark data. However, recent advances in natural language processing (NLP), natural language understanding (NLU), speech recognition, and ML have enabled AI to deal with unstructured dark data more effectively. Today, AI can easily analyze raw inputs like customer reviews and social media comments to identify trends and sentiment. Advanced sentiment analysis algorithms can come to accurate conclusions about tone, context, emotional nuances, sarcasm, and urgency, providing businesses with deeper audience insights. For instance, Amazon uses this approach to flag fake reviews. In finance and banking, AI-powered data analysis tools are used to process transaction logs and unstructured customer communications to identify fraud risks and enhance service and customer satisfaction. Another industry where dark data mining might have potentially huge social benefits is healthcare. Currently, this industry generates around 30% of all the data in the world.
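Production sentiment models are far more sophisticated than this, but the core idea of turning raw review text into a signal can be illustrated with a toy lexicon-based classifier (the word lists and function name here are invented for illustration):

```python
# Hypothetical, tiny sentiment lexicons; real systems learn these from data.
POSITIVE = {"great", "love", "excellent", "fast"}
NEGATIVE = {"broken", "terrible", "slow", "refund"}

def classify_sentiment(text):
    """Score text by counting lexicon hits; return a coarse sentiment label."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"
```

A lexicon lookup like this is exactly what modern NLP improves on: it misses sarcasm, context, and nuance entirely, which is why the advances the article mentions were needed before dark data of this kind became usable.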


Is your AI product actually working? How to develop the right metric system

Not tracking whether your product is working well is like landing a plane without any instructions from air traffic control. There is absolutely no way that you can make informed decisions for your customer without knowing what is going right or wrong. Additionally, if you do not actively define the metrics, your team will identify their own back-up metrics. The risk of having multiple flavors of an ‘accuracy’ or ‘quality’ metric is that everyone will develop their own version, leading to a scenario where you might not all be working toward the same outcome. ... the complexity of operating an ML product with multiple customers translates to defining metrics for the model, too. What do I use to measure whether a model is working well? Measuring the outcome of internal teams to prioritize launches based on our models would not be quick enough; measuring whether the customer adopted solutions recommended by our model could risk us drawing conclusions from a very broad adoption metric ... Most metrics are gathered at scale through new instrumentation via data engineering. However, in some instances (like question 3 above), especially for ML-based products, you have the option of manual or automated evaluations that assess the model outputs.
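As an illustration of the automated-evaluation option, the simplest offline eval compares model outputs against a small golden set and reports one agreed-upon number, which avoids the "everyone invents their own accuracy metric" problem. The helper and data below are hypothetical:

```python
def exact_match_rate(outputs, references):
    """Fraction of model outputs that exactly match the reference answers
    (case- and whitespace-insensitive)."""
    pairs = list(zip(outputs, references))
    matches = sum(o.strip().lower() == r.strip().lower() for o, r in pairs)
    return matches / len(pairs)

# Invented golden set and model outputs for illustration.
golden = ["paris", "42", "blue"]
model_out = ["Paris", "41", "blue "]
score = exact_match_rate(model_out, golden)  # 2 of 3 match
```

In practice teams layer richer judgments (rubric scoring, human review) on top, but pinning down even one shared definition like this keeps every stakeholder working toward the same outcome.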


Security needs to be planned and discussed early, right at the product ideation stage

On the open-source side, where a lot of supply chain risks emerge, we leverage a state-of-the-art development pipeline. Developers follow strict security guidelines and use frameworks embedded with security tools that detect risks associated with third-party libraries—both during development and at runtime. We also have robust monitoring systems in place to detect vulnerabilities and active exploits. ... Monitoring technological events is one part, but from a core product perspective, we also need to monitor risk-based activities, like transactions that could potentially lead to fraud. For that, we have strong AI/ML developments already deployed, with a dedicated AI and data science team constantly building new algorithms to detect fraudulent actions. From a product standpoint, the system is quite mature. On the technology side, monitoring has some automation powered by AI, and we’ve also integrated tools like GitHub Copilot. Our analysts, developers, and security engineers use these technologies to quickly identify potential issues, reducing manual effort significantly. ... Security needs to be planned and discussed early—right at the product ideation stage with product managers—so that it doesn’t become a blocker at the end. Early involvement makes it much easier and avoids last-minute disruptions.


14 tiny tricks for big cloud savings

Good algorithms can boost the size of your machine when demand peaks. But clouds don’t always make it easy to shrink all the resources on disk. If your disks grow, they can be hard to shrink. By monitoring these machines closely, you can ensure that your cloud instances consume only as much as they need and no more. ... Cloud providers can offer significant discounts for organizations that make a long-term commitment to using hardware. These are sometimes called reserved instances, or usage-based discounts. They can be ideal when you know just how much you’ll need for the next few years. The downside is that the commitment locks in both sides of the deal. You can’t just shut down machines in slack times or when a project is canceled. ... Programmers like to keep data around in case they might ever need it again. That’s a good habit until your app starts scaling and it’s repeated a bazillion times. If you don’t call the user, do you really need to store their telephone number? Tossing personal data aside not only saves storage fees but limits the danger of releasing personally identifiable information. Stop keeping extra log files or backups of data that you’ll never use again. ... Cutting back on some services will save money, but the best way to save cash is to go cold turkey. There’s nothing stopping you from dumping your data into a hard disk on your desk or down the hall in a local data center. 
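The "stop keeping log files you'll never use again" advice is easy to automate with a retention sweep. A hedged sketch using only the standard library (in production you would archive to cold storage first rather than delete outright; `prune_old_files` is an invented helper):

```python
import os
import time

def prune_old_files(directory, max_age_days=30):
    """Delete regular files in directory older than max_age_days;
    return the names of the files removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Cloud object stores offer the same idea natively (S3 lifecycle rules, GCS object lifecycle management), which is usually the cheaper and safer way to enforce retention at scale.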


Two-thirds of jobs will be impacted by AI

“Most jobs will change dramatically in the next three to four years, at least as much as the internet has changed jobs over the last 30,” Calhoon said. “Every job posted on Indeed today, from truck driver to physician to software engineer, will face some level of exposure to genAI-driven change.” ... What will emerge is a “symbiotic” relationship with an increasingly “proactive” technology that will require employees to constantly learn new skills and adapt. “AI can manage repetitive tasks, or even difficult tasks that are specific in nature, while humans can focus on innovative and strategic initiatives that drive revenue growth and improve overall business performance,” Hoffman said in an interview earlier this year. “AI is also much quicker than humans could possibly be, is available 24/7, and can be scaled to handle increasing workloads.” As AI takes over repetitive tasks, workers will shift toward roles that involve overseeing AI, solving unique problems, and applying creativity and strategy. Teams will increasingly collaborate with AI—like marketers personalizing content or developers using AI copilots. Rather than replacing humans, AI will enhance human strengths such as decision-making and emotional intelligence. Adapting to this change will require ongoing learning and a fresh approach to how work is done.

Daily Tech Digest - January 18, 2025

Beyond RAG: How cache-augmented generation reduces latency, complexity for smaller workloads

RAG is an effective method for handling open-domain questions and specialized tasks. It uses retrieval algorithms to gather documents that are relevant to the request and adds context to enable the LLM to craft more accurate responses. ... First, advanced caching techniques are making it faster and cheaper to process prompt templates. The premise of CAG is that the knowledge documents will be included in every prompt sent to the model. Therefore, you can compute the attention values of their tokens in advance instead of doing so when receiving requests. This upfront computation reduces the time it takes to process user requests. Leading LLM providers such as OpenAI, Anthropic and Google provide prompt caching features for the repetitive parts of your prompt, which can include the knowledge documents and instructions that you insert at the beginning of your prompt. ... And finally, advanced training methods are enabling models to do better retrieval, reasoning and question-answering on very long sequences. In the past year, researchers have developed several LLM benchmarks for long-sequence tasks, including BABILong, LongICLBench, and RULER. These benchmarks test LLMs on hard problems such as multiple retrieval and multi-hop question-answering. 
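The caching idea behind CAG can be sketched without a real model: because the knowledge documents form a fixed prefix of every prompt, the expensive encoding of that prefix is computed once and reused across requests. Everything below is a toy stand-in (`encode_prefix` pretends to be the attention/KV computation; it is not any provider's API):

```python
from functools import lru_cache

# Hypothetical static knowledge inserted at the top of every prompt.
KNOWLEDGE_DOCS = "refund policy text, product manuals, support FAQ"

@lru_cache(maxsize=1)
def encode_prefix(prefix: str) -> int:
    # Toy stand-in for the expensive attention computation over prefix tokens.
    return sum(ord(c) for c in prefix)

def answer(question: str) -> str:
    kv = encode_prefix(KNOWLEDGE_DOCS)  # computed once, then served from cache
    return f"answer to {question!r} using cached prefix state {kv}"

answer("What is our refund policy?")
answer("Who approves discounts?")
hits = encode_prefix.cache_info().hits  # second call reuses the cached prefix
```

Provider-side prompt caching works the same way at the token level: keep the static documents and instructions at the very start of the prompt so the cached span is identical across requests.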


Turning Curiosity into a Career: The Power of OSINT

The beauty of OSINT is that you can start learning and practicing right now, even without a formal background in cybersecurity. Begin by familiarizing yourself with publicly available tools and resources. Social media platforms, search engines and public record databases are great starting points. From there, you can explore specialized tools like Google Dorking for advanced searches, reverse image search for photo analysis, and platforms like Maltego or SpiderFoot for more in-depth investigations. The OSINT Framework provides an extensive list of tools. If you're interested in pursuing OSINT as a career, consider taking advantage of free and paid online courses. Certifications such as GIAC Open Source Intelligence (GOSI) or Certified Ethical Hacker (CEH) can help build your credibility in the field. Participating in OSINT challenges or contributing to community projects is also a great way to hone your skills and showcase your abilities to potential employers. The demand for OSINT skills is growing as technology evolves and data becomes more accessible. Artificial intelligence and machine learning are enhancing OSINT capabilities, making it easier to analyze massive datasets and detect patterns.


Five Trends That Will Drive Software Development in 2025

While organizations worldwide have quickly adopted AI for software development, many still struggle to measure its impact across diverse teams and business functions. Next year, organizations will become more sophisticated about measuring the return on their AI investments and better understand the value this technology can provide. This starts with looking more closely at specific outcomes. Instead of asking a broad question like, ‘How is AI helping my organization?’ leaders should study the impact of AI on tasks, such as test generation, documentation or language translation, and measure the gains in efficiency and productivity for these activities. ... While developers already work at breakneck speed today, technical debt is a persistent issue. The most worrying consequence of this debt is vulnerabilities that can creep into code and go unnoticed or unfixed. Next year, developers will expand their use of AI in software development to significantly reduce technical debt and increase the security of their code. Technical debt often occurs when developers choose an easy or quick solution instead of a better approach that takes longer. Vulnerabilities result when the code is poorly structured, not sufficiently reviewed or when testing is rushed or incomplete.


A Cloud Architect’s Guide to E-Commerce Data Storage

Latency, measured in microseconds, is the enemy of e-commerce storage systems, as slow-performing systems can mean hundreds of thousands of dollars in lost transactions and abandoned shopping carts. Your data platform must be reliable and highly performant even during fluctuating demand; events like Black Friday or unexpected social media trends can put a heavy load on your systems. Infrastructure that supports real-time data processing can be the deciding factor in staying competitive. These challenges necessitate a modern approach to storage — one that is software-defined, scalable and cloud-ready. ... Foundational elements of a modern e-commerce infrastructure consist of software-defined storage often combined with open-source environments like OpenStack, OpenShift, KVM and Kubernetes. The challenge for platform architects, whether building their e-commerce storage platform on premises or in the cloud, is to achieve scale and flexibility without compromising application and site performance. Many legacy storage systems, especially those architected for spinning disks, have performance limitations, resulting in data silos and expensive and time-consuming scaling strategies.


Demand and Supply Issues May Impact AI in 2025

Executives are asking for ROI numbers on analytics, data governance, and data quality programs, and they are demanding dollar values as opposed to “improving customer experience” or “increasing operational efficiency.” ... Organizations have expected quick returns but not realized them because the initial expectations were unrealistic. Later comes the realization that the proper foundation has not been put in place. “Folks are saying they expect ROI in at least three years and more than 30% or so are saying that it would take three to five years when we’ve got two years of generative AI. [H]ow can you expect it to perform so quickly when you think it will take at least three years to realize the ROI? Some companies, some leadership, might be freaking out at this moment,” says Chaurasia. “I think the majority of them have spent half a million on generative AI in the last two years and haven’t gotten anything in return. That’s where the panic is setting in.” Explaining ROI in terms of dollars is difficult, because it’s not as easy as multiplying time savings by individual salaries. Some companies are working to develop frameworks, however. ... If enterprises are reducing AI investments because the anticipated benefits aren’t being realized, vendors will pull back.


4 Strategies To Thrive In A Manager-Less Workplace

One of the most important skills you can build is emotional regulation. Work can be intense, often frustrating. It’s easy to get caught up in your own emotions and—since emotions are catching—other people’s as well. Staying even-keeled pays off in maintaining good relationships with peers and also keeping yourself clear-headed so you can problem-solve when things go wrong. You can work on your emotional self-control by learning the tools of journaling and mindfulness. ... When you communicate powerfully, you navigate more easily. You get what you need more efficiently, you sell your ideas, and you build better relationships. All of these outcomes are useful when you’re on your own to build a case for getting promoted. The best way to build these skills is to practice. Volunteer to give large presentations and ask for feedback. Craft your emails and Slack messages with an understanding of the receiver and ask them if they have suggestions for you. ... Your network inside your company can also provide the emotional support you would have gotten from your manager. And, when it comes time for you to be promoted, in most companies you need your colleagues to support you. Look around at your coworkers to see who are the most interesting, plugged-in, or effective.


Dark Data: Recovering the Lost Opportunities

Dark data is the data collected and stored by an organization but is not analyzed or used for any essential purpose. It is frequently referred to as "data that lies in the shadows" because it is not actively used or essential in decision-making processes. ... Dark data can be highly beneficial to businesses as it offers insights and business intelligence that wouldn't be available otherwise. Companies that analyze dark data can better understand their customers, operations, and market trends. This enables them to make the best decisions and improve overall performance. Dark data can help organizations recoup lost opportunities by uncovering previously unknown patterns and trends. ... Once the dark data has been collected, it must be cleansed before further analysis. This may include deleting duplicate data, correcting errors, and formatting information to make it easier to work with. After the data has been cleansed and categorized, it can be examined to reveal patterns and insights that will aid decision-making. ... Collaborating with cross-functional teams, such as IT, data science, and business divisions, can assist in guaranteeing that dark data is studied in light of the organization's broader goals and objectives. 
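The cleansing steps described, deleting duplicate data and normalizing formatting, can be sketched for simple record sets. This `cleanse` helper is a hypothetical illustration, not a reference to any particular tool:

```python
def cleanse(records):
    """Normalize string fields and drop exact duplicates, preserving order."""
    seen = set()
    cleaned = []
    for rec in records:
        norm = {k: v.strip().lower() if isinstance(v, str) else v
                for k, v in rec.items()}
        key = tuple(sorted(norm.items()))  # hashable fingerprint of the record
        if key not in seen:
            seen.add(key)
            cleaned.append(norm)
    return cleaned

# Two of these records are the same person once whitespace and case are normalized.
raw = [{"name": " Alice ", "dept": "Sales"},
       {"name": "alice", "dept": "sales"},
       {"name": "Bob", "dept": "IT"}]
result = cleanse(raw)
```

Real dark-data pipelines add fuzzy matching, schema inference and error correction on top, but even this trivial normalize-then-dedupe pass is what makes downstream pattern analysis trustworthy.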

The difference between “data deletion” and “data destruction” is critical to understand. “Data deletion” simply means removing a file from a system, making it appear inaccessible, while “data destruction” is a more thorough process that permanently erases data from a storage device, making it completely irretrievable. Deleting data isn’t enough. Without proper destruction protocols, “deleted” data remains recoverable with off-the-shelf recovery tools, leaving it vulnerable to breaches and creating regulatory compliance risk. ... A well-defined data destruction policy is your organization’s first line of defense. It outlines when, how, and under what circumstances data should be destroyed. Without a formal policy, data is often overlooked, forgotten, or destroyed haphazardly, creating compliance and security risks. To implement this, start by identifying and classifying the types of data your organization collects, such as PII or proprietary records. Define clear retention periods based on regulatory requirements like GDPR or CCPA and document the necessary steps, tools, and roles for secure destruction. Assign accountability to ensure oversight and follow-through. A formal policy isn’t just a “nice-to-have.” It’s a compliance requirement under many regulations, including GDPR and CCPA.
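On magnetic media, one common software-level destruction step is overwriting a file's blocks before unlinking it. A hedged sketch (`destroy_file` is an invented helper; note that on SSDs and copy-on-write filesystems overwrites are not guaranteed to reach the old blocks, so firmware secure-erase or physical destruction is the appropriate control there):

```python
import os
import secrets

def destroy_file(path, passes=3):
    """Overwrite a file's contents with random bytes several times,
    then delete it. Suitable only for simple magnetic-disk scenarios."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # force the overwrite to disk each pass
    os.remove(path)
```

This is the operational difference the paragraph describes: `os.remove` alone is "deletion" (the bytes linger), while overwrite-then-remove moves toward "destruction."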


Can GenAI Restore the ‘Humanity’ in Banking that Digital Has Removed?

Abbott is not arguing for turning customers directly over to GenAI — not yet. Even the most advanced pioneers his firm works with aren’t risking that. ... Abbott believes GenAI, as it becomes a standard part of banking, will play out in a similar way. Employees will adapt, often more slowly than anticipated, but they will change. This will lead to shifts in the role of management vis-à-vis employees empowered by GenAI. Abbott says this will likely take a similar path to that seen as banks adopted agile development. Young people came into the bank using the tools, just as many are already experimenting with GenAI. Banking leaders liked the idea of their organizations "doing agile." But what Abbott calls "the frozen middle" management tier had to grin and plunge into unfamiliar turf. "That frozen middle will have to thaw out and find a new way of working," says Abbott. Bank leadership must help by providing tools and opportunities for trying it out. One of the biggest early challenges will be tempering the GenAI tech to the task. Abbott explains that GenAI can be tuned to be "low temperature" or "high temperature," or somewhere in between. The former refers to GenAI working with tight guardrails, such as in sensitive areas like dispute management.
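"Temperature" here is the same knob most LLM APIs expose: sampling logits are divided by the temperature before the softmax, so low values concentrate probability on the top token while high values flatten the distribution. A quick numerical illustration (the logit values are invented):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert logits to a probability distribution at a given temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
cautious = softmax_with_temperature(logits, 0.2)  # tight guardrails: near-deterministic
creative = softmax_with_temperature(logits, 2.0)  # flatter: more varied outputs
```

At temperature 0.2 the top token takes almost all the probability mass, which is why "low temperature" suits sensitive areas like dispute management, where the bank wants predictable, constrained answers.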


Federated learning: The killer use case for generative AI

Federated learning is emerging as a game-changing approach for enterprises looking to leverage the power of LLMs while maintaining data privacy and security. Rather than moving sensitive data to LLM providers or building isolated small language models (SLMs), federated learning enables organizations to train LLMs using their private data where it resides. Everyone who worries about moving private enterprise data to a public space, such as uploading it to an LLM, can continue to have “private data.” Private data may exist on a public cloud provider or in your data center. The real power of federation comes from the tight integration between private enterprise data and sophisticated LLM capabilities. This integration allows companies to leverage their proprietary information and broader knowledge in models like GPT-4 or Google Gemini without compromising security. ... As enterprises struggle to balance AI capabilities against data privacy concerns, federated learning provides the best of both worlds. Also, it allows for a choice of LLMs. You can leverage LLMs that are not a current part of your ecosystem but may be a better fit for your specific application. For instance, LLMs that focus on specific verticals are becoming more popular. 
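The classic mechanism underneath federated learning is federated averaging (FedAvg): each site trains on its own data and only weight updates, never the raw records, are aggregated centrally. The article's vendor specifics aren't shown, so this is a minimal generic sketch with flat weight vectors and invented client sizes:

```python
def fed_avg(client_weights, client_sizes):
    """Aggregate per-client model weights by a data-size-weighted average.
    Raw training data never leaves the clients; only weights are shared."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
            for i in range(dim)]

# Two hypothetical sites; the second holds three times as much data,
# so its local weights pull the global model harder.
global_weights = fed_avg([[1.0, 2.0], [3.0, 4.0]], [100, 300])
```

In a real deployment this aggregation runs over many rounds, with the updated global weights redistributed to clients between rounds, and is typically combined with secure aggregation or differential privacy so individual updates can't be inspected either.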



Quote for the day:

"Too many of us are not living our dreams because we are living our fears." -- Les Brown

Daily Tech Digest - November 24, 2020

Why securing the DNS layer is crucial to fight cyber crime

When left insecure, DNS servers can result in devastating consequences for businesses that fall victim to attack. Terry Bishop, solutions architect at RiskIQ, says: “Malicious actors are constantly looking to exploit weak links in target organisations. A vulnerable DNS server would certainly be considered a high-value target, given the variety of directions that could be taken once compromised. “At RiskIQ, we find most organisations are unaware of about 30% of their external-facing assets. That can be websites, mail servers, remote gateways, and so on. If any of these systems are left unpatched, unmonitored or unmanaged, it presents an opportunity for compromise; whether any further exploit is aimed at company assets or at more valuable infrastructure such as DNS servers depends on the motives of the attacker and the specifics of the breached environment.” Kevin Curran, senior member at the Institute of Electrical and Electronics Engineers (IEEE) and professor of cyber security at Ulster University, agrees that DNS attacks can be highly disruptive. In fact, an improperly working DNS layer would effectively break the internet, he says.


The Dark Side of AI: Previewing Criminal Uses

Criminals' top goal is profit. If that's the high level, the applied level is that criminals have never shied away from finding innovative ways to earn an illicit profit, be it through social engineering refinements, new business models or adopting new types of technology. And AI is no exception. "Criminals are likely to make use of AI to facilitate and improve their attacks by maximizing opportunities for profit within a shorter period, exploiting more victims and creating new, innovative criminal business models - all the while reducing their chances of being caught," according to the report. Thankfully, all is not doom and gloom. "AI promises the world greater efficiency, automation and autonomy," says Edvardas Šileris, who heads Europol's European Cybercrime Center, aka EC3. "At a time where the public is getting increasingly concerned about the possible misuse of AI, we have to be transparent about the threats, but also look into the potential benefits from AI technology." ... Even criminal uptake of deepfakes has been scant. "The main use of deepfakes still overwhelmingly appears to be for non-consensual pornographic purposes," according to the report. It cites research from last year by the Amsterdam-based AI firm Deeptrace, which "found 15,000 deepfake videos online..."


Flash storage debate heats up over QLC SSDs vs. HDDs

Rosemarin said some vendors front end QLC with TLC flash, storage class memory or DRAM to address caching and performance issues, but they run the risk of scaling problems and destroying the cost advantage that the denser flash technology can bring. "We had to launch a whole new architecture with FlashArray//C to optimize and run QLC," Rosemarin said. "Otherwise, you're very quickly going to get in a position where you're going to tell clients it doesn't make sense to use QLC because [the] architecture can't do it cost-efficiently." Vast Data's Universal Storage uses Intel Optane SSDs, built on faster, more costly 3D XPoint technology, to buffer writes, store metadata and improve latency and endurance. But Jeff Denworth, co-founder and chief marketing officer at the startup, said the system brings cost savings over alternatives through better longevity and data-reduction code, for starters. "We ask customers all the time, 'If you had the choice, would you buy a hard drive-based system, if cost wasn't the only issue?' And not a single customer has ever said, 'Yeah, give me spinning rust,'" Denworth said. Denser NAND flash chip technology isn't the only innovation that could help to drive down costs of QLC flash. Roger Peene, a vice president in Micron's storage business unit, spotlighted the company's latest 176-layer 3D NAND that can also boost density and lower costs.


Instrumenting the Network for Successful AIOps

The highest quality network data is obtained by deploying devices such as network TAPs that mirror the raw network traffic. Many vendors offer physical and virtual versions of these to gather packet data from the data center as well as virtualized segments of the network. AWS and Google Cloud have both launched Virtual Private Cloud (VPC) traffic/packet mirroring features in the last year that allow users to duplicate traffic to and from their applications and forward it to cloud-native performance and security monitoring tools, so there are solid options for gathering packet data from cloud-hosted applications too. The network TAPs let network monitoring tools view the raw data without impacting the actual data plane. When dealing with high-sensitivity applications such as ultra-low-latency trading, high quality network monitoring tools use timestamping with nanosecond accuracy to identify bursts with millisecond resolution which might cause packet drops that normal SNMP-type counters can’t explain. This fidelity of data is relevant in other high quality applications such as real-time video decoding, gaming multicast servers, HPC and other critical IoT control systems.


How to create an effective software architecture roadmap

The iteration model demonstrates how the architecture and related software systems will change and evolve on the way to a final goal. Each large iteration segment represents one milestone goal of an overall initiative, such as updating a particular application database or modernizing a set of legacy services. Then, each one of those segments contains a list of every project involved in meeting that milestone. For instance, a legacy service modernization iteration requires a review of the code, refactoring efforts, testing phases and deployment preparations. While architects may feel pressured to create a realistic schedule from the start of the iteration modeling phase, Richards said that it's not harmful to be aspirational, imaginative or even mildly unrealistic at this stage. Since this is still an unrefined plan, try to ignore limitations like cost and staffing, and focus on goals. ... Once an architect has an iteration model in place, the portfolio model injects reality into the roadmap. In this stage, the architect or software-side project lead analyzes the feasibility of the overall goal. They examine the initiative, the requirements for each planned iteration and the resources available for the individual projects within those iterations. 
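The two-stage structure described above — an aspirational iteration model first, then a portfolio pass that injects reality — can be sketched in a few lines of code. All names and effort figures here are hypothetical, chosen to mirror the legacy-modernization example in the text:

```python
from dataclasses import dataclass, field

@dataclass
class Project:
    name: str
    effort_weeks: int

@dataclass
class Iteration:
    milestone: str                       # e.g. "Modernize legacy services"
    projects: list = field(default_factory=list)

    def total_effort(self):
        return sum(p.effort_weeks for p in self.projects)

# Iteration model: list every project needed to hit the milestone,
# ignoring cost and staffing limits at this stage.
iteration = Iteration("Modernize legacy services", [
    Project("Code review", 2),
    Project("Refactoring", 6),
    Project("Testing", 3),
    Project("Deployment prep", 1),
])

def feasible(iteration, capacity_weeks):
    """Portfolio-model check: does available capacity cover the iteration?"""
    return iteration.total_effort() <= capacity_weeks

print(iteration.total_effort(), feasible(iteration, 10))
```

The point of keeping the two passes separate is that the iteration model stays a statement of goals, while the feasibility check is where staffing and budget constraints finally enter.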


How new-age data analytics is revolutionising the recruitment and hiring segment

There are innumerable advantages attached to opting for AI over an ordinary recruitment team. With the introduction of AI, companies can easily lower the costs involved in maintaining a recruitment team. The highly automated screening procedures select quality candidates who in turn help the organization grow and retain better personnel – a factor that is otherwise overlooked in the conventional recruitment process. Employing AI and ML automates the whole recruitment process and helps eliminate the probability of human error. Automation increases efficiency and improves the performance of other departments of the company. The traditional recruitment process tends to be very costly. Several teams are often needed for the purpose of hiring people in a company. But with the help of AI and ML, the unnecessary costs can be done away with and the various stages of hiring can all be conducted on a single dedicated platform. Additionally, if the company engages in a lot of contract work, then AI can be used for analysing the project plan and predicting the kinds, numbers, ratio and skills of workers that may be required for the purpose. The scope of AI and ML should not be judged by the capabilities of current systems alone.


6 experts share quantum computing predictions for 2021

"Next year is going to be when we start seeing what algorithms are going to show the most promise in this near term era. We have enough qubits, we have really high fidelities, and some capabilities to allow brilliant people to have a set of tools that they just haven't had access to," Uttley said. "Next year what we will see is the advancement into some areas that really start to show promise. Now you can double down instead of doing a scattershot approach. You can say, 'This is showing really high energy, let's put more resources and computational time against it.' Widespread use, where it's more integrated into the typical business process, that is probably a decade away. But it won't be that long before we find applications for which we're using quantum computers in the real world. That is in more the 18-24 month range." Uttley noted that the companies already using Honeywell's quantum computer are increasingly interested in spending more and more time with it. Companies working with chemicals and the material sciences have shown the most interest he said, adding that there are also healthcare applications that would show promise.


How Industrial IoT Security Can Catch Up With OT/IT Convergence

The bigger challenge, he says, is not in the silicon of servers and networking appliances but in the brains of security professionals. "The harder problem, I think, is the skills problem, which is that we have very different expertise existing within companies and in the wider security community, between people who are IT security experts and people who are OT security experts," Tsonchev says. "And it's very rare to find one individual where those skills converge." It's critical that companies looking to solve the converged security problem, whether in technology or technologists, to figure out what the technology and skills need to look like in order to support their business goals. And they need to recognize that the skills to protect both sides of the organization may not reside in a single person, Tsonchev says. "There's obviously a very deep cultural difference that comes from the nature of the environments characterized by the standard truism that confidentiality is the priority in IT and availability is the priority in OT," he explains. And that difference in mindset is natural – and to some extent essential – based on the requirements of the job. Where the two can begin to come together, Tsonchev says, is in the evolution away from a protection-based mindset to a way of looking at security based on risk and risk tolerance.


Dark Data: Goldmine or Minefield?

The issue here is that companies are still thinking in terms of sandboxes even when they are face-to-face with the entire beach. A system that considers analytics and governance flip sides of the same coin and incorporates them synergistically across all enterprise data is called for. Data that has been managed has the potential to capture the corpus of human knowledge within the organization, reflecting the human intent of a business. It can offer substantial insight into employee work patterns, communication networks, subject matter expertise, and even organizational influencers and business processes. It also holds the potential for eliminating duplicative human effort, which can be an excellent tool to increase productivity and output. This alone is a sure-fire way to boost productivity, spot common pain points in the workstream and surface insights for parts of the organization where untapped potential may lie. Companies that have successfully bridged information management with analytics are answering fundamental business questions that have massive impact on revenue: Who are the key employees? ... With the increase in sophistication of analytics and its convergence with information governance, we will likely see a renaissance for this dark data that is presently largely a liability.


NCSC issues retail security alert ahead of Black Friday sales

“We want online shoppers to feel confident that they’re making the right choices, and following our tips will reduce the risk of giving an early gift to cyber criminals. If you spot a suspicious email, report it to us, or if you think you’ve fallen victim to a scam, report the details to Action Fraud and contact your bank as soon as you can.” Helen Dickinson, chief executive of the British Retail Consortium (BRC), added: “With more and more of us browsing and shopping online, retailers have invested in cutting-edge systems and expertise to protect their customers from cyber threats, and the BRC recently published a Cyber Resilience Toolkit for extra support to help to make the industry more secure. “However, we as customers also have a part to play and should follow the NCSC’s helpful tips for staying safe online.” The NCSC’s advice, which can be accessed on its website, offers a number of tips, including being selective about where you shop, only providing necessary information, using secure and protected payments, securing online accounts, identifying potential phishing attempts, and how to deal with any problems. Carl Wearn, head of e-crime at Mimecast, commented: “Some of the main things to look out for include phishing emails and brand spoofing, as we are likely to see an increase in both.



Quote for the day:

“Focus on the journey, not the destination. Joy is found not in finishing an activity but in doing it.” -- Greg Anderson

Daily Tech Digest - October 04, 2020

What Is Dark Data Within An Organisation?

In the universe of information assets, data may be deemed dark for a number of reasons: because it’s unstructured, because it sits behind a firewall, because of its speed or volume, or because people simply have not made the connections between the different data sets. It may also be dark because it does not live in a relational database, or because until recently the techniques required to leverage it effectively did not exist. Dark data is often text-based and stays within company firewalls but remains very much untapped. For instance, supply chain complexity is a significant challenge for organisations. The supply chain is a data-driven industry traversing a network of global suppliers, distribution channels and customers. This industry churns out data in huge volumes, yet an estimated 5% of that data is actually used. While the other 95% is not being utilised for analytics, it presents an opportunity for big data technologies to bring this dark data to light. To date, organisations have explored only a small fraction of the digital universe for data analytic value. Dark analytics is about turning dark data into intelligence and insight that a company can use.


Quantum computing meets cloud computing

As part of Leap, developers can also use a feature called the hybrid solver service (HSS), which combines both quantum and classical resources to solve computational problems. This "best-of-both-worlds" approach, according to D-Wave, enables users to submit problems of ever-larger sizes and complexities. Advantage comes with an improved HSS, which can run applications with up to one million variables – a jump from the previous generation of the technology, in which developers could only work with 10,000 variables. "When we launched Leap last February, we thought that we were at the beginning of being able to support production-scale applications," Alan Baratz, the CEO of D-Wave, told ZDNet. "For some applications, that was the case, but it was still at the small end of production-scale applications." "With the million variables on the new hybrid solver, we really are at the point where we are able to support a broader array of applications," he continued. A number of firms, in fact, have already come to D-Wave with a business problem, and a quantum-enabled solution in mind. According to Baratz, in many cases customers are already managing the small-scale deployment of quantum services, and are now on the path to full-scale implementation.
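Hybrid solvers like D-Wave's take problems formulated as a QUBO (quadratic unconstrained binary optimization): a dictionary of coefficients over binary variables whose weighted products define an energy to minimize. A toy sketch below shows the formulation at three variables, solved by brute force; the structure is the same whether three variables or, as Advantage's hybrid solver claims, a million (at real scale the same Q would be submitted through D-Wave's Ocean SDK rather than enumerated). The penalty encoding and helper names here are illustrative assumptions:

```python
from itertools import product

def qubo_energy(Q, x):
    """Energy of binary assignment x under QUBO coefficients Q, keyed by (i, j)."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Penalty encoding of "pick exactly one of three options":
# (x0 + x1 + x2 - 1)^2 expands, dropping the constant, to this QUBO.
Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1,
     (0, 1): 2, (0, 2): 2, (1, 2): 2}

# Brute force is fine at toy scale; a hybrid solver accepts the same Q
# at sizes where enumeration is hopeless.
best = min(product([0, 1], repeat=3), key=lambda x: qubo_energy(Q, x))
print(best, qubo_energy(Q, best))
```

Any one-hot assignment reaches the minimum energy of -1, so the penalty terms correctly enforce the "exactly one" constraint without an explicit constraint solver.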


H&M Hit With Record-Breaking GDPR Fine Over Illegal Employee Surveillance

Swedish multinational retail company H&M has been hit with a monumental €35 million ($41.3 million) GDPR fine for illegally surveilling employees in Germany. The Data Protection Authority of Hamburg (HmbBfDI) announced the fine on Thursday after the company was found to have excessively monitored several hundred employees in a Nuremberg service centre. The watchdog said that since at least 2014, parts of the workforce had been subject to "extensive recording of details about their private lives".  "After absences such as vacations and sick leave the supervising team leaders conducted so-called Welcome Back Talks with their employees. After these talks, in many cases not only the employees' concrete vacation experiences were recorded, but also symptoms of illness and diagnoses,” HmbBfDI said. “In addition, some supervisors acquired a broad knowledge of their employees' private lives through personal and floor talks, ranging from rather harmless details to family issues and religious beliefs.” The extensive data collection was exposed in October 2019 when such data became accessible company-wide for several hours due to a configuration error.


How CIOs can convert Data Lakes into profit centres

Most CIOs today are comfortable with traditional concepts of BI and Data Warehousing. These mature technologies have worked well to help the organization gain insights into what happened in the past - but are no longer sufficient by themselves. ML and AI are required technologies today for generating the next set of competitive advantages - predicting the future, gaining deep insights from unstructured data and creating data-driven products. Relational Databases are often incapable of handling rapidly evolving data formats and unstructured data, like natural language text and multimedia, which are the fuel for this ML and AI-driven revolution. ... Exploding data sizes, increasing Data Democratization and increasingly rich and complex data processing workloads mean the traditional on-premise hardware has a hard time keeping up. The processing power of modern AI and ML processors (GPUs/TPUs) is doubling every few months - leaving Moore’s law in the dust. Capital sunk in on-premise hardware becomes obsolete faster than ever. Rapidly innovating hardware in the cloud enables new classes of applications or breaks performance barriers for old ones.


Data Governance & Privacy Best Practices to Lower Risk and Drive Value

An enterprise-wide data governance program is your key to accelerating digital transformation programs such as cloud migration, improving customer experience with trust assurance, and lowering operating expenses when data use is optimized, in line with your corporate policies. In today’s world with more data being available from more sources, it’s no surprise that we look for an automated and scalable methodology to manage all this information. Data governance is a discipline that encompasses the rules, policies, roles, responsibilities, and tools we put in place to ensure our data is accurate, consistent, complete, available, and secure to enable trust in the outcomes we plan to achieve.  From my experience, these are three best practices around governing data to maximize the success of business transformation agendas, reduce uncertainty, and ensure safe and appropriate data use. ... Leading global organizations are leveraging Informatica’s integrated and intelligent Data Governance and Privacy solution portfolio to proactively add value to their bottom line today. It is about getting the right information to the right people at the right time, enabling the entire organization to be proactive, in order to identify and act on new opportunities and plan for the best results, instead of reacting to unanticipated surprises.


Cyber-attack victim CMA CGM struggling to restore bookings, say customers

As CMA CGM’s IT engineers continue, for the fifth day, to try to restore its systems following a cyber-attack at the weekend, the French carrier has come under mounting criticism from customers that its back-up booking process is inadequate. Yesterday, the carrier said its “back-offices [shared services centres] are gradually being reconnected to the network, thus improving bookings and documentation processing times”. And it reiterated that bookings could still be made through the INTTRA portal, as well as manually via an Excel form attached to an email. However, Australian forwarder and shipper representatives, the Freight & Trade Alliance (FTA) and Australian Peak Shippers Association (APSA), described the measures as “failing to adequately provide contingency services”. John Park, head of business operations at FTA/APSA, said its members ought to be due compensation from the carrier and its subsidiary, Australia National Line, which operates some 14 services to Australia, according to the eeSea liner database. “FTA/APSA has reached out again to senior CMA CGM management to seek advice as to when we can expect full service to be re-instated, implementation of workable contingency arrangements and acceptance that extra costs incurred ...


Researchers create a graphene circuit that makes limitless power

The breakthrough is an offshoot of research conducted three years ago at the University of Arkansas, which discovered that freestanding graphene, a single layer of carbon atoms, ripples and buckles in a way that holds potential for energy harvesting. The idea was controversial because it refutes a well-known assertion from physicist Richard Feynman that the thermal motion of atoms, known as Brownian motion, cannot do work. However, the university researchers found that, at room temperature, the thermal motion of graphene does induce an alternating current in a circuit, an achievement previously thought to be impossible. The researchers also discovered that their design increased the amount of power delivered. They found the on-off, switch-like behavior of the diodes amplifies the power delivered rather than reducing it, as previously believed. Scientists on the project were able to use a relatively new field of physics, called stochastic thermodynamics, to prove that diodes increase the circuit’s power. Researchers say that the graphene and the circuit share a symbiotic relationship.


Frameworks for Data Privacy Compliance

As new privacy regulations are introduced, organizations that conduct business and have employees in different states and countries are subject to an increasing number of privacy laws, making the task of maintaining compliance more complex. While these laws require organizations to administer reasonable security implementations, they do not outline what specific actions should be taken to satisfy this requirement. As a result, many risk managers are turning to proven security frameworks that specifically address privacy. Doing so can help organizations build privacy and security programs that make compliance more manageable, even when beholden to multiple regulations. While no two frameworks are the same, each is designed to help organizations identify and address potential security gaps that could negatively impact data privacy. Such frameworks include the Center for Internet Security (CIS) Top 20, Health Information Trust Alliance Common Security Framework (HITRUST CSF), and the National Institute of Standards and Technology (NIST) Framework. ... Originally designed for health care organizations and third-party vendors that serve health care clients, HITRUST CSF leads organizations beyond baseline security practices to establish a strong, mature security program.


Selecting Security and Privacy Controls: Choosing the Right Approach

The baseline control selection approach uses control baselines, which are pre-defined sets of controls assembled to address the protection needs of a group, organization, or community of interest. Security and privacy control baselines serve as a starting point for the protection of information, information systems, and individuals’ privacy. Federal security and privacy control baselines are defined in draft NIST Special Publication 800-53B. The three security control baselines contain sets of security controls and control enhancements that offer protection for information and information systems categorized as low-impact, moderate-impact, or high-impact—that is, by the potential adverse consequences for the organization’s missions or business operations, or the loss of assets, if the system is breached or compromised. The system security categorization, risk assessment, and security requirements derived from stakeholder protection needs, laws, executive orders, regulations, policies, directives, and standards can help guide and inform the selection of security control baselines from draft Special Publication 800-53B.
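The mechanics of baseline selection reduce to a lookup: the system's impact categorization determines which pre-defined control set is the starting point. The sketch below illustrates the idea only; the control IDs shown are real SP 800-53 identifiers, but their assignment to each baseline here is illustrative, not a reproduction of the actual 800-53B tables.

```python
# Illustrative baseline control sets keyed by FIPS 199 impact level.
# Baselines are cumulative: higher impact adds controls and enhancements.
BASELINES = {
    "low":      {"AC-1", "AC-2", "IR-4"},
    "moderate": {"AC-1", "AC-2", "AC-6", "IR-4", "SI-4"},
    "high":     {"AC-1", "AC-2", "AC-6", "AC-6(1)", "IR-4", "SI-4", "SI-4(2)"},
}

def select_baseline(impact):
    """Return the starting control set for a system's impact categorization."""
    if impact not in BASELINES:
        raise ValueError(f"impact must be one of {sorted(BASELINES)}")
    return BASELINES[impact]

print(sorted(select_baseline("moderate")))
```

The baseline is only the starting point: the organization then tailors it, adding or withdrawing controls based on the risk assessment and stakeholder requirements the paragraph above describes.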


Emerging challenges and solutions for the boards of financial-services companies

Actions by boards reflect the increased attention all financial firms are now devoting to cyberrisk. Ninety-five percent of board committees, for example, discuss cyberrisks and tech risks four times or more a year (Exhibit 1). One such firm holds optional deep-dive sessions the week before each quarter’s board meeting. These sessions cover relevant topics, such as updates on the current intelligence on threats, case studies of recent breaches that could affect the company or others in the industry, and the impact of regulatory changes. ... There has been a remarkable shift in board awareness of cybersecurity in the past few years: for example, earlier McKinsey research, from 2017, suggested that only 25 percent of all companies gave their boards information-technology and security updates more than once a year. More frequent and consistent communication between board members and senior management on this topic now enables boards to understand the financial, operational, and technological implications of emerging cybersecurity threats for the business and to guide its direction accordingly. Firms increasingly recruit experts for these committees.



Quote for the day:

"Superlative leaders are fully equipped to deliver in destiny; they locate eternally assigned destines." -- Anyaele Sam Chiyson

Daily Tech Digest - April 24, 2020

Data: The Fabric of Developers’ Lives

Storage-as-a-Service—we hardly knew about it. Thanks in large part to containers, which offer exceptional scalability, simplicity and high availability, the speed of application development has increased dramatically. Developers need to be able to quickly provision their own data, in just the right amounts, to match that velocity. And, like containers, that data needs to be portable. Provisioning quickly means no more going through storage administrators to get the services they need, which can be a cumbersome and time-consuming process. Solutions like Kubernetes’ on-demand clusters enable developers to procure the data they need when they need it. The abstraction layer provided by a data fabric can empower developers even further. They can write their own APIs, provision data services as needed and move that data between clouds with ease. This is particularly important when dealing with cloud providers that offer different services. Sometimes a developer may need a service that exists in one cloud but not another. It’s critical to have an underlying storage infrastructure that enables applications and their data to be transferred as needs require.


Remember when open source was fun?

When Daniel Stenberg set out to make currency exchange rates available to IRC users, he wasn’t trying to “do open source.” It was 1996 and the term “open source” hadn’t even been coined yet (that came in February 1998). No, he just wanted to build a little utility (“how hard can it be?”), so he started from an existing tool (httpget), made some adjustments, and released what would eventually become known as cURL, a way to transfer data using a variety of protocols. It wasn’t Stenberg’s full-time job, or even his part-time job. “It was completely a side thing,” he says in an interview. “I did it for fun.” Stenberg’s side project has lasted for over 20 years, attracted hundreds of contributors, and has a billion users. Yes, billion with a B. Some of those users contact him with urgent requests to fix this or that bug. Their bosses are angry and they need help RIGHT NOW. “They are getting paid to use my stuff that I do at home without getting paid,” Stenberg notes. Is he annoyed? No. “I do it because it’s fun, right? So I’ve always enjoyed it. And that’s why I still do it.”


New research by the data protection and management software supplier has found 5.8 million tonnes of carbon dioxide will be pumped into the atmosphere this year resulting from the use of storage systems to house and process dark data. Veritas derived the figure by mapping industry data on power consumption from data storage, industry data on emissions from datacentres and its own research. On average, 52% of all data stored by organisations worldwide is likely to be dark data, according to Veritas. With the amount of data growing from 33 zettabytes in 2018 to 175 zettabytes by 2025, there will be 91 zettabytes of dark data in five years’ time – over four times the volume of dark data today. Ravi Rajendran, vice-president and managing director for the Asia South region at Veritas Technologies, said that although companies are trying to reduce their carbon footprint, dark data is often neglected. And with dark data producing more carbon dioxide than 80 countries do individually, Rajendran called for organisations to start taking it seriously. 
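The growth projection quoted above can be checked with a few lines of arithmetic, applying Veritas's 52% dark-data share to the 33 ZB (2018) and 175 ZB (2025) totals:

```python
# Sanity-check the dark-data projections quoted above.
total_2018_zb = 33
total_2025_zb = 175
dark_share = 0.52

dark_2018 = total_2018_zb * dark_share   # roughly 17 ZB of dark data today
dark_2025 = total_2025_zb * dark_share   # 91 ZB of dark data by 2025

print(round(dark_2025), round(dark_2025 / dark_2018, 1))
```

This reproduces the 91 ZB figure and a growth factor of about 5.3x, consistent with the article's "over four times the volume of dark data today".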


How different generations approach remote work

Maybe it's more millennials that are really pushing the work from home, but if you would think it would be more of your generation. I say that I'm Gen X. Veronica and I both are, of course. But, you would think that it'd be the younger ones that would be all for working from home, to have that freedom. ... When I'm in an office, as you both know, I tend to be a bit of a chatterbox, so it's good for me to have that alone time to really lock things down. But it's different for people. But, Veronica, you and I would be able to speak on this for Gen X, at least, in the research that I saw, NRG found that most Gen X-ers enjoyed working from home because they were really comfortable, and they liked that independence. And they also liked being around their families, and having that quality time, and felt a little more relaxed. Would you say that's accurate? ... You can get up and take a break whenever, and reset your brain to shift tasks, or to find inspiration if you're stuck on something. I think if you can close the door or close your family off, it's OK. My kids are older now, but if they were little, it would be so hard to work from home now. I have an 11-year-old and a 15-year-old, so they can make their own lunch, and walk the dog, and be self-sufficient while I'm down here.



Netgear is ahead of the game with its WiFi 6 router portfolio and it is paying off as the company is seeing a surge in home network upgrades. The catch for Netgear is that its supply chain, sales channels and markets have all been upended by the COVID-19 pandemic. CEO Patrick Lo outlined the moving parts of Netgear's first quarter: "We saw two distinct phenomena during the Covid-19 pandemic. Whenever a shelter-in-place lockdown was declared, business activities fell and demand for our SMB products dropped significantly. At the same time, consumers are quickly finding out that high performance WiFi at home is a necessity and are rushing to upgrade their home WiFi, driving upticks in our consumer WiFi and mobile hotspot sales. We also saw significant channel shift from physical retail channel purchases to online purchases which put strain on the logistics of some of our online sales partners." On an earnings conference call, it became clear that Netgear had a lot to navigate as it pulled its guidance due to COVID-19. The company reported a first quarter net loss of $4.17 million on revenue of $229.96 million, down from $249 million a year ago. On a non-GAAP basis, Netgear's earnings of 21 cents a share were a nickel better than estimates.


Researchers say deep learning will power 5G and 6G ‘cognitive radios’


For decades, amateur two-way radio operators have communicated across entire continents by choosing the right radio frequency at the right time of day, a luxury made possible by having relatively few users and devices sharing the airwaves. But as cellular radios multiply in both phones and Internet of Things devices, finding interference-free frequencies is becoming more difficult, so researchers are planning to use deep learning to create cognitive radios that instantly adjust their radio frequencies to achieve optimal performance. As explained by researchers with Northeastern University’s Institute for the Wireless Internet of Things, the increasing varieties and densities of cellular IoT devices are creating new challenges for wireless network optimization; a given swath of radio frequencies may be shared by a hundred small radios designed to operate in the same general area, each with individual signaling characteristics and variations in adjusting to changed conditions. The sheer number of devices reduces the efficacy of fixed mathematical models when predicting what spectrum fragments may be free at a given split second.


Outsourced DevOps brings benefits, and risks, to IT shops


When IT teams outsource DevOps planning to a third-party service provider, it only exacerbates existing planning issues. Another option is to hire a contract Scrum Master or product manager with DevOps experience to work with the in-house teams. Either way, proceed with an end game of knowledge transfer to build in-house planning expertise. Depending on the organization's attitude toward contractors, the addition of an outside contractor to work on planning can bring some cultural challenges. Some organizations treat contractors as valued members of the team, while others treat them as outsiders -- which makes it challenging to have a contractor in any subject matter expert position. Planning tools, however, are ripe for outsourcing. For example, if an organization lacks the in-house expertise to implement and maintain Atlassian Jira or another planning tool, it can outsource that platform and use a managed version. While it's more common to outsource the build phase of DevOps than it is the planning phase, it still has risks.


Tech Leaders Map Out Post-Pandemic Return to Workplace

Businesses will be turning to enterprise technology to smooth out the process of getting employees back to the workplace in the wake of the coronavirus pandemic, according to a report by Forrester Research. Technology leaders say safety will be a top priority. The information-technology research firm’s report lays out an early-stage road map for IT executives preparing to reopen corporate offices—a process that will vary by industry, but for most businesses will involve multiple stages. Chief information officers and their teams will likely be in the first wave of employees returning to the job site, said Andrew Hewitt, a Forrester analyst serving infrastructure and operations professionals. He said their initial task will be to develop a strategy for keeping employee tech tools—including PCs, mobile devices, monitors, keyboards and mice—germ-free without damaging them. “IT teams will need to have a staging area that’s outside of the front door of the office where employees can bring their home technology in and sanitize it,” Mr. Hewitt said.


Five Attributes of a Great DevOps Platform

Culture plays a significant role in establishing the guidelines while embracing DevOps in any organization. Through DevOps culture, companies seek to bring dev and ops teams into harmony to promote collaboration, automation, process improvements, and continuous iterative development and deployment methodologies. But above everything else, a sound DevOps culture fundamentally solves one of IT’s biggest people problems: bridging the gap between dev and ops teams to get them to stop working in silos and have common goals. According to Gartner estimates, DevOps efforts fail 90% of the time when infrastructure and operations teams try to drive a DevOps initiative without nurturing a cultural shift in the first place. It is not just about efficient tools or expert staff; it is about the behavioral changes and mindset necessary to effect cultural change. Hence, it is important for firms to consider company culture before selecting a potential DevOps tool for their development.


Use tokens for microservices authentication and authorization


STS enables clients to obtain the credentials they need to access multiple services that live across distributed environments. It issues digital security tokens that stay with users from the beginning of their session and continuously validate their permission for each service they call. An STS can also reissue, exchange and cancel security tokens as needed. The STS must connect with an enterprise user directory that contains all the details about user roles and responsibilities. This directory, and any connection made to it, should be properly secured as well, otherwise users could elevate their permissions just by editing policies on their own. Consider segmenting user access policies based on roles and activities. For instance, identify the individuals who have administrative capabilities. Or, you might limit a developer's access permissions to only include the services they are supposed to work on. ... Not all microservices permission and security checks are based around a human user.
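The issue/validate cycle described above can be sketched in a few lines. This is a minimal, self-contained illustration using an HMAC-signed claims blob, with hypothetical names throughout; a real STS would use standardized token formats such as JWT, and the signing key would live in an HSM or KMS, never in source code.

```python
import base64, hashlib, hmac, json, time

SECRET = b"demo-sts-signing-key"   # illustrative only; never hard-code real keys

def issue_token(subject, roles, ttl_s=300):
    """Mint a compact signed token, STS-style: claims + expiry + signature."""
    claims = {"sub": subject, "roles": roles, "exp": int(time.time()) + ttl_s}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def validate_token(token, required_role):
    """Each downstream service re-checks signature, expiry and role per call."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(base64.urlsafe_b64decode(body))
    return claims["exp"] > time.time() and required_role in claims["roles"]

t = issue_token("dev-42", ["billing:read"])
print(validate_token(t, "billing:read"), validate_token(t, "billing:admin"))
```

Because validation needs only the shared key, every microservice can enforce the role segmentation described above on each call, without a round-trip to the user directory.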



Quote for the day:


"I'm not crazy about reality, but it's still the only place to get a decent meal." -- Groucho Marx


Daily Tech Digest - November 25, 2018

Artificial intelligence: Germans see no reason to fear robot coworkers

One example of how AI can benefit people is automated driving. Bosch is striving to make road transportation emissions-free, accident-free, and stress-free. With nine out of ten accidents currently attributable to human error, smart technology could use AI to prevent many of them from happening in the first place. Connected manufacturing is another banner field for AI. In a smart factory, people and machines will work together as an intelligent team. Robots will relieve people of strenuous and dangerous tasks and learn from experience, reducing people’s burden. The Bosch survey found that many Germans could imagine accepting this situation. Two-thirds of respondents – 67 percent – believe that manufacturing and mobility are going to benefit greatly from artificial intelligence. They are also open to working with a robot if it takes over routine chores. Half of all respondents could well imagine such a situation, and would above all devote the free time gained to social or creative activities.


Women in Blockchain: CryptoWendyO talks about her motivation

There’s so much negative energy directed at crypto from mainstream financial institutions because the public “doesn’t like change”. “Because crypto is intangible, it’s hard for the masses to understand. We saw this with the internet and credit cards. If you notice, the group of folks present when credit cards became mainstream still write cheques – as time progresses, so will the masses.” The recent falls after the hard fork mean the market – which is basic supply and demand – needs a “catalyst to bring in new money”. WendyO says: “There’s nothing we can do individually to stop negative price action. What we can do is support one another and continue to support the entrepreneurs building in the space. They are the key to mass adoption. Once blockchain projects are seamless and make life easier for the masses, they will come.” Asked why people are panicking so much, she believes: “Price impacts the human psyche so much. People are entering into positions without proper risk management and education.



We all know how the media and the film industry overhype AI with androids and over-intelligent systems. Some computer pioneers, with Alan Turing (you may want to watch The Imitation Game to appreciate the legend he is) at the forefront, set off on projects with a view to making machines that think. Turing, however, realised that this would be abysmally difficult, and in 1950 proposed: instead of trying to produce a programme to simulate the adult mind, why not rather try to produce one which simulates the child’s? If this were then subjected to an appropriate course of education, one would obtain the adult brain. This idea grew into what became deep learning. Fast forward to 2018: we have, and are still gathering, massive amounts of data. We have, and are still developing, more and more advanced algorithms. But do we have the hardware to crunch all those calculations within reasonable time? And if we do, can it be done without all those GPUs causing another global warming on their own by literally heating up from all the processing?


Forget Robots, Blockchain Technology May Be the Real Threat to Your Job

Blockchain isn't just the technology behind the Bitcoin craze. It could also mean the end of the middle manager.
Traditionalists say this is a necessary component of an organization, freeing senior management to think strategically and move away from the day-to-day, while building a talent bench of the next generation of senior managers. Detractors ask what a middle manager actually adds to the bottom line, pointing to an unclear or difficult to define return on investment. The truth, as is often the case, lies somewhere in the middle. But it may not matter. Many organizations have clear, tangible, quantifiable key performance indicators for day-to-day functions, like sales closed or widgets shipped. With the advent of smart contracts on blockchain, it’s clear: robots aren’t the only ones gunning for your job. Blockchain technology is too. A smart contract is code designed to facilitate, verify or enforce performance of set terms. ... Notably, this is not a far-off concept—it’s something that, in many situations, could be implemented tomorrow.
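As a toy illustration (plain Python, not an actual on-chain contract), the enforce-on-KPI idea can be sketched as an escrow that pays out the moment a verified metric hits its target, with no manager sign-off in the loop; the class and field names are invented for the example:

```python
class KpiEscrow:
    """Toy 'smart contract': releases payment automatically once a
    verifiable KPI (e.g. sales closed, widgets shipped) hits its target."""

    def __init__(self, payee, kpi_target, amount):
        self.payee = payee
        self.kpi_target = kpi_target
        self.amount = amount
        self.released = False

    def report(self, kpi_value):
        """Called with a verified KPI reading; pays out exactly once."""
        if not self.released and kpi_value >= self.kpi_target:
            self.released = True
            return (self.payee, self.amount)  # funds released, no middle manager
        return None
```

The contract code itself performs the verification step that a middle manager would otherwise sign off on, which is the article's point about quantifiable day-to-day functions.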


Rebooting analytics leadership: Time to move beyond the math
CAOs often find themselves doing this heavy lifting with a limited sphere of influence. They typically do not have the profit-and-loss or revenue accountability that would grant them due power in the organization. Moreover, like chief marketing officers a decade ago, CAOs need—but typically lack—a true seat at the C-suite table, placing them at a disadvantage when trying to obtain adequate funding or resources to power the analytics agenda. ... Arguably, none of the previous CAO personas could succeed in today’s landscape. We’ve entered an era that requires a new CAO persona—the Catalyst—who embraces a style of leadership geared toward addressing the current demands, roadblocks, and scrutiny most companies face today when it comes to deploying AI and advanced analytics at scale. Catalysts approach their role very differently than did past CAO personas, in ways that those with more scientific and technical career backgrounds might not have ever done before.



How voice biometrics catches fraudsters


According to Costain, it is relatively easy for the system to identify a new voice. Often, a fraudster will phone in to check whether stolen credentials are valid, but in certain cases, the fraudster may scam the customer to obtain these credentials. “It’s a bit like epidemiology with Patient Zero,” he said. The same voice may try to access multiple accounts, which would signal an attempted fraud. RBS has also been compiling a database of evidence, which Costain said has led to a few police arrests of people who have made fraudulent calls. Over the next six months, the bank will have technology to enable customers to determine whether a call they receive from the bank is genuine, he said. Experian’s Global fraud report 2018 found that customers want to be recognised, while businesses want to address the growing fraud they are experiencing.
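RBS's actual system is not public, but the "same voice, many accounts" signal Costain describes can be sketched with cosine similarity over hypothetical voiceprint embeddings; the threshold and vectors below are illustrative assumptions:

```python
import math


def cosine(a, b):
    """Cosine similarity between two voiceprint embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)


def flag_shared_voices(calls, threshold=0.9):
    """calls: list of (account_id, embedding) pairs from inbound calls.
    Flag every account that one similar-sounding voice has tried to access."""
    flagged = set()
    for i in range(len(calls)):
        for j in range(i + 1, len(calls)):
            (acc_i, emb_i), (acc_j, emb_j) = calls[i], calls[j]
            if acc_i != acc_j and cosine(emb_i, emb_j) >= threshold:
                flagged.update({acc_i, acc_j})
    return flagged
```

One voice appearing across two different accounts trips the flag, while an unrelated caller does not, which is the "Patient Zero" pattern described above.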


AI and Neuroscience: A virtuous circle


Another key challenge in contemporary AI research is known as transfer learning. To be able to deal effectively with novel situations, artificial agents need the ability to build on existing knowledge to make sensible decisions. Humans are already good at this - an individual who can drive a car, use a laptop or chair a meeting is usually able to cope even when confronted by an unfamiliar vehicle, operating system or social situation. Researchers are now starting to take the first steps towards understanding how this might be possible in artificial systems. For example, a new class of network architecture known as a “progressive network” can use knowledge learned in one video game to learn another. The same architecture has also been shown to transfer knowledge from a simulated robotic arm to a real-world arm, massively reducing the training time. Intriguingly, these networks bear some similarities to models of sequential task learning in humans. These tantalising links suggest that there are great opportunities for future AI research to learn from work in neuroscience.
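A minimal sketch of the progressive-network idea is a frozen column trained on task A feeding a new, trainable column for task B through lateral connections. The dimensions and random weights below are illustrative, not the published architecture:

```python
import random

random.seed(1)


def linear(W, b, x):
    """y = W·x + b for a small dense layer stored as nested lists."""
    return [sum(wij * xj for wij, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]


def relu(v):
    return [max(0.0, u) for u in v]


def rand_matrix(rows, cols):
    return [[random.uniform(-0.5, 0.5) for _ in range(cols)] for _ in range(rows)]


class ProgressiveColumn:
    """Task-B column that reads the frozen task-A column's features laterally."""

    def __init__(self, in_dim, hidden):
        self.W1, self.b1 = rand_matrix(hidden, in_dim), [0.0] * hidden  # frozen (task A)
        self.W2, self.b2 = rand_matrix(hidden, in_dim), [0.0] * hidden  # trainable (task B)
        self.U = rand_matrix(hidden, hidden)                            # lateral connection

    def forward(self, x):
        h1 = relu(linear(self.W1, self.b1, x))           # frozen column; never updated
        lateral = linear(self.U, [0.0] * len(self.U), h1)
        pre = [p + l for p, l in zip(linear(self.W2, self.b2, x), lateral)]
        return relu(pre)
```

Because the task-A weights stay frozen, learning task B cannot destroy task-A knowledge, while the lateral connections let the new column reuse it; that is the transfer mechanism the article alludes to.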


6 ways to include dark data in analytic strategies

The goal for CIOs is simple: find out what data is under company management, including data the company may not have known it had. Then, develop a strategic data plan with executives that addresses what to do with this data so that it delivers its highest value to the company. ... As soon as certain areas of data are determined to be useful, begin to digitalize and exploit them for value so you can get the data working for you. ... Outside data sources can enhance the value of data you already have under management. A prime example is the monitoring of Greenland's ice pack. If you monitor climate change and are concerned about the pace of global warming, you can study historical photos of Greenland's land mass from decades ago. Comparing Greenland as it was decades ago with how it is today can demonstrate both the impact and progression of global warming. ... As paper-based forms of unstructured data are digitalized, it is essential for the data to undergo quality assurance checks for integrity and quality.
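A first pass at "find out what data is under management" can be as simple as an extension-by-size scan of a file share; this is a rough sketch, not a substitute for a real data catalog:

```python
import os
from collections import Counter


def inventory(root):
    """Walk a directory tree and tally total bytes per file extension,
    giving a rough first map of what data is actually under management."""
    totals = Counter()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            ext = os.path.splitext(name)[1].lower() or "<none>"
            totals[ext] += os.path.getsize(os.path.join(dirpath, name))
    return totals
```

Sorting the result with `totals.most_common()` surfaces the largest pools of dormant data, a reasonable starting point for deciding what to digitalize and exploit first.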


Generative Adversarial Networks (GANs) – The Basics You Need To Know

As the name suggests, it is called an adversarial network because it is made up of two neural networks contesting with each other, each with a different job. The first network, the generator, generates new data instances. The second, the discriminator, evaluates the generator's work for authenticity. The cycle continues until the results approach near-perfect accuracy. ... To understand generative adversarial networks, it is very important to differentiate between supervised learning and unsupervised learning. ... GANs are a fairly new architecture in the deep learning domain. They fall under the unsupervised neural network category, and for generative tasks their results can far surpass those of traditional neural nets.
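The two-network contest can be made concrete with a toy one-dimensional example in plain Python with hand-derived gradients. The distributions, parameterisation, and learning rate are illustrative assumptions, not a real GAN architecture: a generator g(z) = a*z + b tries to mimic samples from N(4, 1), while a logistic discriminator learns to tell real from fake.

```python
import math
import random

random.seed(0)


def sigmoid(x):
    x = max(-60.0, min(60.0, x))  # clamp for numerical safety
    return 1.0 / (1.0 + math.exp(-x))


# Discriminator D(x) = sigmoid(w*x + c); generator G(z) = a*z + b
w, c = 0.1, 0.0          # discriminator parameters
a, b = 1.0, 0.0          # generator parameters
lr = 0.05
real_mean = 4.0          # "real" data drawn from N(4, 1)

for step in range(2000):
    x_real = random.gauss(real_mean, 1.0)
    z = random.gauss(0.0, 1.0)
    x_fake = a * z + b
    # Discriminator step: ascend log D(real) + log(1 - D(fake))
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    w += lr * ((1 - d_real) * x_real - d_fake * x_fake)
    c += lr * ((1 - d_real) - d_fake)
    # Generator step: ascend log D(fake) (non-saturating loss)
    d_fake = sigmoid(w * x_fake + c)
    grad_x = (1 - d_fake) * w        # d log D / dx at the fake sample
    a += lr * grad_x * z
    b += lr * grad_x
```

As training proceeds, the generator's offset b drifts from 0 toward the real mean: each network's improvement forces the other to improve, which is the adversarial cycle described above.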


Distributed Machine Learning Is The Answer To Scalability And Computation Requirements


It was this challenge of handling large-scale data, given the limits on the scalability and efficiency of learning algorithms with respect to computational and memory resources, that gave rise to distributed ML. For example, if an algorithm's memory requirements outgrow main memory, it will not scale well: it will be unable to process the training data set, or will not run at all due to memory restrictions. Distributed ML algorithms arose to handle very large data sets and to provide algorithms that are efficient and scalable with regard to both accuracy and computational requirements. They are part of large-scale learning, which has received considerable attention over the last few years thanks to its ability to distribute the learning process across several workstations, using distributed computing to scale up learning algorithms. It is these advances that make ML tasks on big data scalable, flexible and efficient.
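The core data-parallel trick behind this can be sketched for one-dimensional linear regression: each worker computes a gradient on its own data shard, and the coordinator averages them. With equal-size shards the averaged gradient exactly equals the full-batch gradient, so partitioning does not change the update. This is a generic sketch, not any specific framework's API:

```python
def mse_gradient(w, b, xs, ys):
    """Mean gradient of 0.5*(w*x + b - y)^2 over one worker's data shard."""
    n = len(xs)
    gw = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n
    gb = sum((w * x + b - y) for x, y in zip(xs, ys)) / n
    return gw, gb


def distributed_step(w, b, shards, lr=0.1):
    """One data-parallel step: each 'worker' computes a local gradient on
    its shard; the coordinator averages them and applies the update."""
    grads = [mse_gradient(w, b, xs, ys) for xs, ys in shards]
    gw = sum(g[0] for g in grads) / len(grads)
    gb = sum(g[1] for g in grads) / len(grads)
    return w - lr * gw, b - lr * gb
```

Only gradients (two floats per worker) cross the network, not the shards themselves, which is what lets the training set exceed any single machine's memory.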



Quote for the day:


"You can't just wish change; you have to live the change in order for it to become a reality." -- Steve Maraboli