
Daily Tech Digest - November 14, 2025


Quote for the day:

"The only way to achieve the impossible is to believe it is possible." -- Charles Kingsleigh



When will browser agents do real work?

Vision-based agents treat the browser as a visual canvas. They look at screenshots, interpret them using multimodal models, and output low-level actions like “click (210,260)” or “type ‘Peter Pan’.” This mimics how a human would use a computer—reading visible text, locating buttons visually, and clicking where needed. ... DOM-based agents, by contrast, operate directly on the Document Object Model (DOM), the structured tree that defines every webpage. Instead of interpreting pixels, they reason over textual representations of the page: element tags, attributes, ARIA roles, and labels. ... Running a browser agent once successfully doesn’t mean it can repeat the task reliably. The next frontier is learning from exploration: transforming first-time behaviors into reusable automations. A promising strategy, now seeing wider deployment, is to let agents explore workflows visually, then encode those paths into structured representations like DOM selectors or code. ... With new large language models excelling at writing and editing code, these agents can self-generate and improve their own scripts, creating a cycle of self-optimization. Over time, the system becomes similar to a skilled worker: slower on the first task, but exponentially faster on repeat executions. This hybrid, self-improving approach—combining vision, structure, and code synthesis—is what makes browser automation increasingly robust.
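The explore-then-encode idea can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: a raw trace from a vision-driven exploration run is distilled into structured, selector-keyed steps that a cheap replay engine can execute without the vision model.

```python
from dataclasses import dataclass

# Hypothetical sketch: a visually explored trace is distilled into
# structured steps keyed by DOM selectors, so repeat runs can skip
# the expensive multimodal model. Field names are illustrative.

@dataclass
class Step:
    selector: str   # CSS selector resolved during exploration
    action: str     # "click" or "type"
    value: str = ""

def distill(trace):
    """Convert raw vision-agent events into replayable DOM steps."""
    steps = []
    for event in trace:
        if event["action"] in ("click", "type"):
            steps.append(Step(event["selector"], event["action"],
                              event.get("value", "")))
    return steps

trace = [
    {"action": "click", "selector": "input#search"},
    {"action": "type", "selector": "input#search", "value": "Peter Pan"},
    {"action": "scroll", "selector": ""},          # noise: dropped
    {"action": "click", "selector": "button[type=submit]"},
]
workflow = distill(trace)
```

On repeat executions the agent replays `workflow` directly against the DOM, falling back to visual exploration only when a selector no longer resolves.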


Security Degradation in AI-Generated Code: A Threat Vector CISOs Can’t Ignore

LLMs have been a boon for developers since OpenAI’s ChatGPT was publicly released in November 2022, followed by other AI models. Developers were quick to utilize the tools, which significantly increased productivity for overtaxed development teams. However, that productivity boost came with security concerns, such as AI models trained on flawed code from internal or publicly available repositories. Those models introduced vulnerabilities that sometimes spread throughout the entire software ecosystem. One way to address the problem was by using LLMs to make iterative improvements to code-level security during the development process, under the assumption that LLMs, given the job of correcting mistakes, would amend them. A recent study, however, turns that assumption on its head. Although previous studies (and extensive real-world experience, including our own data) have demonstrated that an LLM can introduce vulnerabilities in the code it generates, this study went a step further, finding that iterative refinement of code can introduce new errors. ... The security degradation introduced in the feedback loop raises troubling questions for developers, tool designers and AI safety researchers. The answer to those questions, the authors write, involves human intervention. Developers, for instance, must maintain control of the development process, viewing AI as a collaborative assistant rather than an autonomous tool.
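One way to keep a human in the loop, sketched below, is to gate every refinement pass behind a security scan and stop when a regression appears. This is an illustrative pattern, not the study's method; `scan` is a toy stand-in for a real SAST tool, and `llm_refine` is a hypothetical hook for the model call.

```python
# Illustrative sketch: gate each LLM refinement pass behind a security
# scan so regressions surface instead of compounding silently.

def scan(code):
    # Toy placeholder for a real static-analysis tool.
    findings = []
    if "eval(" in code:
        findings.append("use of eval")
    if "verify=False" in code:
        findings.append("TLS verification disabled")
    return findings

def refine_with_gate(code, llm_refine, max_rounds=3):
    baseline = set(scan(code))
    for _ in range(max_rounds):
        candidate = llm_refine(code)
        new_findings = set(scan(candidate)) - baseline
        if new_findings:
            # New vulnerability introduced: reject and hand to a human.
            return code, sorted(new_findings)
        code = candidate
    return code, []

# Simulated "refinement" that quietly introduces a risky call:
bad_refiner = lambda c: c + "\nresult = eval(user_input)"
final, flagged = refine_with_gate("x = 1", bad_refiner)
```

The point of the gate is that the loop never accepts a candidate that scores worse than the baseline, which is exactly the failure mode the study observed in ungated iteration.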


Are We in the Quantum Decade?

It would be prohibitively expensive even for a Fortune 100 company to own, operate and maintain its own quantum computer. It would require a quantum ecosystem that includes government, academia and industry entities to make it accessible to an enterprise. In most cases, the push and funding could come from the government or through cooperation among nations. Historically, new computing technology was rented and used as a service. Compute resources financed by government were booked in advance. Processing occurred in batches using resource-sharing techniques such as time slicing. Equivalent models are expected for quantum processing. ... The era of quantum computing looms large, but enterprises and IT teams should be thinking about it today. Infrastructure needs to be deployed and algorithms need to be written for executing business use cases. "For several years to come, CIOs may not have much to do with quantum computing. But they need to know what it is, what it can do and how much it costs," said Lawrence Gasman, president of Communications Industry Researchers. "Quantum networks and cybersecurity will become necessary for secure communications by 2030 or even earlier." Quantum computing will not replace classical computing, but data center providers need to be thinking about how they will integrate the two architectures using interconnects like co-packaged optics.


When Data Gravity Meets Disaster Recovery

The more data aggregates in one place, the more it pulls everything else toward it: apps, analytics, integrations, even people and processes. Over time, that environment becomes a tightly woven web of dependencies. While it may be fine for day-to-day operations, it becomes a nightmare when something breaks. At that point, DR turns into a delicate task of relocating an entire ecosystem, not simply a matter of copying files. You have to think about relationships, which systems rely on which datasets, how permissions are mapped, and how applications expect to find what they need. Of course, the bigger that web gets, the heavier the “gravitational field.” Moving petabytes of interconnected data across regions or clouds isn’t fast or easy. It takes time, bandwidth, and planning, and every extra gigabyte adds friction – in other words, the more gravity your data has, the harder it is to recover from disaster quickly. ... To push back against gravity, organizations are rethinking their architectures. Instead of forcing all data into one environment, they’re distributing it intelligently, keeping mission-critical workloads close to where they’re created, while replicating copies to nearby or complementary environments for protection. Hybrid and multi-cloud DR strategies have become the go-to solution for this. They blend the best of both worlds: the low-latency performance of local infrastructure with the flexibility and geographic reach of cloud storage.


What’s Driving the EU’s AI Act Shake-Up?

The move to revise the AI Act follows sustained lobbying from US tech giants. In October, the Computer and Communications Industry Association (CCIA), whose members include Apple, Meta, and Amazon, launched a campaign pushing for simplification not only of the AI Act but of the EU’s entire digital rulebook. Meanwhile, EU officials have reportedly engaged with the Trump administration on these issues. ... The potential delay reflects pressure from national authorities. Denmark and Germany have both pushed for a one-year extension. A spokesperson from Germany’s Federal Ministry for Digital Transformation and Government Modernization said that a delay “would allow sufficient time for the practical application of European standards by AI providers, with standards still currently being elaborated.” ... Another major reform under consideration is expanding and centralizing oversight powers within the Commission’s AI Office. Currently responsible for general-purpose AI models (GPAI), the office would gain new authority to oversee all AI systems based on GPAI and conduct conformity assessments for certain high-risk systems. The Commission would also supervise online services deemed to pose “systemic risk” under the Digital Services Act. This would shift more power to Brussels and expand the mandate of the Commission’s AI Office beyond its current role supervising GPAI.


BITS & BYTES : The Foundational Lens for Enterprise Transformation

BITS serves as high-level strategic governance—ensuring balanced maturity assessments across business alignment, information-centric decision-making, technology enablement, and security resilience—while leveraging BDAT’s detailed sub-domains (layers and components) for tactical implementation and operational oversight. This allows organizations to maintain BDAT’s precision in decomposing complex IT landscapes (e.g., mapping specific data architectures or application portfolios) within BITS’s overarching pillars, fostering adaptive governance that scales from atomic “bits” of change to enterprise-wide transformations ... If BITS defines what must be managed, BYTES (Balanced Yearly Transformation to Enhance Services) defines how change must be processed. BYTES is more than a set of principles; it is a derivative of the core architectural lifecycle: Plan (Balanced Yearly), Design & Build (Transformation Enhancing), and Run (Services). Each component of BYTES directly maps to the mandatory stages of a continuous transformation framework, enabling architects to manage change at its source. ... The BITS & BYTES framework is not intended to replace existing architecture frameworks (e.g., TOGAF, Zachman, DAMA, IT4IT, SAFe). Instead, it acts as a meta-framework—a simplified, high-level matrix that accommodates and contextualizes the applicability of all existing models.


Unlocking GenAI and Cloud Effectiveness With Intelligent Archiving

Unlike tiering, which functions like a permanent librarian selectively fetching individual files from deep storage, true archiving is a one-time event that moves files based on defined policies, such as last access or modification date. Once archived, files are stored on a long-term platform and remain accessible without reliance on any intermediary system or application. In this context, one of the main challenges is that most enterprise data is unstructured, including everything from images and videos to emails and social media content. Collectively, these vast and diverse data lakes present a formidable management challenge, and without rigorous control, organizations risk falling victim to the classic “garbage in, garbage out” problem. ... Modern archiving technologies that connect directly to both primary and archive storage platforms eliminate the need for a middleman, drastically improving migration speed, accuracy, and long-term data accessibility. This means organizations can migrate only what’s necessary, ensuring high-value data is cloud-ready while offloading cold data to cost-efficient archival platforms. This not only reduces cloud storage costs but also supports the adoption of cloud-native formats, enabling greater scalability and performance for active workloads. ... In modern enterprises, more than 60% of data is typically inactive and often goes untouched for years, yet organizations still keep it on high-performance (and high-cost) storage.
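A policy of the kind described (select files by last modification date) is simple to express. This is a minimal sketch, not a production archiver: a real product would also consult last-access time, handle permissions, and write to the archive platform.

```python
import time
from pathlib import Path

# Minimal sketch of a policy-driven archive pass: select files whose
# last modification is older than a cutoff. `max_age_days` is the
# policy knob; a real archiver would then move the selected files.

def select_for_archive(root, max_age_days=365, now=None):
    now = now if now is not None else time.time()
    cutoff = now - max_age_days * 86400
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.stat().st_mtime < cutoff
    )
```

Running this against a share and totaling the sizes of the returned paths is also a quick way to estimate how much cold data is sitting on expensive primary storage.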


Why 60% of BI Initiatives Fail (and How Enterprises Can Avoid It)

Many BI projects fail because goals and outcomes aren’t clearly defined. While enterprises may be confident that they understand BI gaps, their goals are often vague, lacking proper detail and internal consensus. ... Poor project management practices, vague processes, and changing responsibilities create even more confusion. In many failed BI projects, BI is viewed as “just another IT initiative,” whereas it should be treated as part of a business transformation program. Without active sponsorship and accountability, the technology may be delivered, but its adoption and impact suffer. ... Agile and iterative methods are often preferred because they are effective for BI, whereas the waterfall method is not recommended for BI projects since it lacks the agility needed to adapt to changing requirements, iterative data exploration, and continuous business feedback. Under the waterfall approach, users are engaged only at the beginning and end of the project, which leaves gaps when development or data-analysis issues arise. ... A system is only as good as the people who use it; research has shown that 55% of users lack confidence in BI tools due to insufficient training. Enterprises often expend considerable resources on deployment, but neglect enablement. If employees can’t work out how to navigate dashboards, understand data quality and visualizations, or use insights to make daily decisions, adoption rates suffer.


Authentication in the age of AI spoofing

Unlike traditional malware, which may find its way into networks through a compromised software update or download, AI-powered threats utilize machine learning to analyze how employees authenticate themselves to access networks, including when they log in, from which devices, typing patterns and even mouse movements. The AI learns to mimic legitimate behavior while collecting login credentials and is ultimately deployed to evade basic detection. ... Beyond the statistics, AI’s effectiveness is driven by its exponentially improving abilities to social engineer humans — replicating writing style, voice cadence, facial expressions or speech with subtle nuance and adding realistic context by scanning social media and other publicly available references. The data is striking and reflects the crucial need for a multi-layer approach to help sidestep the exponentially escalating ability for AI to trick humans. ... Cryptographic protection complements biometric authentication, which verifies “Is this the right person?” at the device level, while passkeys are used to verify “Is this the right website or service?” at the network level. Multi-modal biometrics, such as facial recognition plus fingerprint scanning or biometrics plus behavioral patterns, further strengthen this approach. As AI-powered attacks make credential theft and impersonation attacks more sophisticated, the only sustainable line of defense is a form of authentication that cannot be tricked because it is cryptographically verified.
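The core reason cryptographic authentication resists spoofing can be shown with a tiny challenge-response sketch: the server issues a fresh challenge each time, so a stolen or replayed response is useless. This is a deliberately simplified illustration — real passkeys (WebAuthn) use per-site asymmetric keypairs; an HMAC shared secret stands in here only to keep the example stdlib-only.

```python
import hmac, hashlib, secrets

# Simplified challenge-response: the response is bound to a one-time
# challenge, so phished or replayed responses fail verification.
# (Real passkeys use public-key signatures, not a shared HMAC key.)

def issue_challenge():
    return secrets.token_bytes(32)  # fresh, unpredictable per login

def sign(key, challenge):
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key, challenge, response):
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign(key, challenge), response)

key = secrets.token_bytes(32)   # provisioned at enrollment
c1 = issue_challenge()
resp = sign(key, c1)            # computed by the user's device
```

No matter how convincing an AI-generated lure is, there is nothing a victim can type or read aloud that lets an attacker produce a valid response to the next challenge.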


Why your security strategy is failing before it even starts

The biggest mistake I see among organizations is initiating cybersecurity efforts with technology rather than prioritizing risk and business alignment. Cybersecurity is often mischaracterized as a technical issue, when in reality it’s a business risk management function. Failure to establish this connection early often results in fragmented decision-making and limited executive engagement. Effective cybersecurity strategies should be embedded into business objectives from the outset. This requires identifying the business’s critical assets, assessing potential threats and motivations, and evaluating the impact of assets becoming compromised. Too often, CISOs jump straight into acquiring cybersecurity tools without addressing these questions. ... First, the threat landscape shifted dramatically. Cybersecurity attacks today target OT and ICS. In food manufacturing, those systems run production lines, refrigeration, and safety processes. A cyber incident in these areas extends beyond data loss; it can disrupt production and even compromise food safety, introducing a far more complex level of risk. Second, it became evident to me that cybersecurity cannot operate in isolation. It must support and enable business operations and growth. Today, my approach is risk-based and aligned with our business priorities, while still built on zero trust principles. We focus on resilience, not just compliance, and OT security is a core pillar of that strategy.

Daily Tech Digest - September 09, 2021

How a National Digital Twin could help catapult sustainability in the UK

Digital twins continue to remain an area that is underfunded and underdeveloped in the UK. This is largely due to an awareness issue. Until recently, digital twins have largely sat in the remit of academia and therefore much of the theory hasn’t turned into action. Any innovation that has been brought to the table has mainly remained siloed between organisations and sectors. To counter this requires strong, central guidance on what can be achieved through digital twins. The Government is primed to take on this leading role, particularly the Department for Business, Energy & Industrial Strategy (BEIS). In an ideal scenario, we’d see it set up small scrum teams of digital twin experts to support, educate and consult organisations across the private and public sectors to, first, develop business cases and proof of value and, second, get them to a place where they can develop their own information management strategy to support the digital twin. This cohesive education will help to underpin a National Digital Twin strategy. Hand-in-hand with the awareness issue is a lack of digital maturity and understanding of how to get to that point.


Technical Debt Isn't Technical: What Companies Can Do to Reduce Technical Debt

The biggest problem is that unlike a dirty kitchen, technical debt is mostly invisible to our non-technical stakeholders. They can only see the slowing down effect it has, but when they do, it’s often already too late. It’s all about new features, constantly adding new code on already fragile foundations. Another problem is that too much tech debt causes engineering teams to be in fire-fighting mode. Tech debt impacts the whole company, but for engineers, more tech debt means more bugs, more performance issues, more downtime, slow delivery, lack of predictability in sprints, and therefore less time spent building cool stuff. ... Controlling technical debt is a prerequisite to delivering value regularly, just like an organized and clean kitchen is a prerequisite to delivering delicious food regularly. That doesn’t mean you shouldn’t have technical debt. You will always have some mess and that’s healthy too. The goal isn’t to have zero mess; the goal is to get rid of the mess that slows you down and prevents you from running a great kitchen.


When a scammer calls: 3 strategies to protect customers from call spoofing

Humans are invariably going to be the weakest link in the chain; not even the most robust technology can prevent a victim from unwittingly handing over their private credentials. That said, while many financial institutions are investing in educational programs to teach their customers basic principles around protecting their accounts, they need to make it a continuous and ongoing initiative. Likewise, these efforts should extend to the customer-facing workers and especially contact center employees who are ultimately responsible for authenticating a customer’s identity. ... Phone-based scams almost always culminate with the victim transmitting funds, buying untraceable gift cards, or sharing critical data that can be used to create synthetic identities to open new accounts. For financial institutions this means that they need to be able to establish a behavioral baseline of their customers to distinguish normal interactions from anomalous activities that could be earmarks of potential fraud threats.
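The behavioral-baseline idea reduces, in its simplest form, to comparing a new event against a customer's history. The toy sketch below flags a transfer amount that deviates sharply from past amounts; real systems model many signals (device, timing, call metadata, destination), not a single number, and the threshold shown is arbitrary.

```python
from statistics import mean, stdev

# Toy behavioral baseline: flag a transfer whose amount deviates from
# the customer's history by more than `threshold` standard deviations.

def is_anomalous(history, amount, threshold=3.0):
    if len(history) < 2:
        return False  # not enough data to form a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > threshold

history = [120, 95, 130, 110, 105]  # customer's recent transfers
```

A sudden $5,000 wire against that history trips the check, while a $115 payment does not — the kind of signal that lets a bank pause a likely scam in progress before funds leave the account.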


Agile Enterprise Architecture Framework: Enabler for Enterprise Agility

The Agile EA Framework (AEAF) helps in breaking barriers between IT and business, ideally with increasing levels of co-location by unit and with fast-forming teams that coalesce for new projects. The initial goal of the architect is to bring out a Minimum Viable Product (MVP), improve upon it, and evolve with each iteration. It would also consider real-time customer feedback while adding more features through the iterations. The overall idea is to adopt just enough architecture to deliver the MVP, thus avoiding any big upfront designs. The AEAF helps in defining an architecture using an iterative life cycle, allowing the architectural design to evolve gradually as the problem and the constraints are better understood. The architecture and the gradual building of the system must go hand in hand, with subsequent iterations addressing architectural issues and decisions to arrive at a flexible architecture. The following diagram depicts the AEAF framework and the constituent steps associated with it.


6 Hobbies You Should Have if You’re Interested in Cybersecurity

Ethical hacking (or "white-hat hacking") occurs when people get permission to try and break into a company’s systems. They then report their methods and how quickly they accomplished the task. Ethical hackers would ideally find problems before malicious parties do, giving companies time to act. Some people specializing in ethical hacking recommend having a wide but shallow knowledge pool. This equips them to find issues in areas such as cloud software and to identify the vulnerabilities that let malware flourish. ... Hack the Box is a platform for cybersecurity enthusiasts that combines hacking with gamification. The online modules cater to individuals, universities, and companies, providing content to help people hone their penetration testing skills. Think of Hack the Box as a springboard for people interested in hacking who aren’t sure where to start. Besides offering an educational component, there’s a community aspect. For example, people can discuss their methods and get recommendations for different techniques to apply in the future.


SEC Warns of Fraudulent Cryptocurrency Schemes

Several security and blockchain experts draw a direct line between this fraudulent activity and increasingly sophisticated social engineering attempts, or blatantly false advertising that may lead to poor or unsafe crypto investments. James McQuiggan, education director for the Florida Cyber Alliance and security awareness advocate for the firm KnowBe4, says, "Cybercriminals will always find emotional lures to exploit users through social engineering. Asking yourself the question, 'Is this too good to be true?' is the first step to determine if the organization is worthwhile." Further, Julio Barragan, director of cryptocurrency intelligence at the firm CipherTrace, warns against ongoing scams in which victims are lured by a convincing fraudster sending them direct messages on social media or through a friend's hacked account, promoting massive gains. Neil Jones, cybersecurity evangelist for the firm Egnyte says: "Significant change [in the space] will only occur when cryptocurrency platforms become subject to the same standardized IT requirements as traditional investment platforms ..."


Are you stuck in a “logic box”?

The point of the logic box is to help develop self-awareness, an essential skill of leadership that is becoming more important as we negotiate our VUCA—volatile, uncertain, complex, and ambiguous—world. Leaders and their subordinates must always examine the basic premises of a key decision and interrogate its surface validity. This came up in a recent conversation I had with Dambisa Moyo, a widely published economist who is a board member at Chevron and 3M. One of the most important qualities she looks for when assessing leaders is their ability to use different mental models for analyzing choices, an idea that she attributed to Buffett’s partner at Berkshire Hathaway, Charlie Munger. “It’s this idea of road-testing their thinking using different paradigms,” she said. “So, if, say, an investment looks quite attractive from a financial perspective, it might look less attractive through a geopolitical or environmental lens. Given the world that we live in now, people who think about complex problems in a more versatile way have an advantage.”


Protecting your company from fourth-party risk

Since fourth parties are not generally obligated to share information with partners of their clients, organizations are now adapting their TPRM programs to address fourth-party concerns. Fortunately, there are steps companies can take to give them greater visibility into – and protection from – downstream risk. Despite growing awareness of the threat of fourth-party risk, clear guidelines and uniform processes for fourth parties have not been established, resulting in disjointed, ad-hoc processes. Most of these processes are manual, requiring significant investment in time and labor, and opening the possibility of error and oversight. ... The first step is for companies to understand how their third parties are monitoring their vendors. This includes direct monitoring (i.e., what are they doing to monitor their third parties) and general vendor management (i.e., do they have their own vendor management program and how effective is it). Companies can ask these questions through periodic performance reviews as well as through their annual risk and due diligence reassessments.


Putting people at the heart of digital marketing

A strong marketing team is made up of people with a diverse range of skills – from strategists and data analysts to identify strengths and map trends and focus plans, to creatives and ‘doers’ to design and deliver beautifully tailored campaigns. A good marketer needs to understand how technology can help to enhance, personalise and deliver these campaigns through the appropriate channels – but also to be able to think beyond the barriers of what technology can provide. Technology makes it easy to execute, analyse and measure a marketing strategy with the push of a button and while this is helpful – especially at scale – where we see the most effective personalised marketing is in teams with marketers who are not afraid to ask questions. They need to be able to query the ‘why’, ‘how’ and ‘who’ behind every marketing decision – whether technology or human driven – to ensure it is relevant, beneficial and being delivered to the right people in the best possible way. Good marketers know this and understand that if we want customers to continue to agree to share their data, we need to earn their trust.


How to Enable Team Learning and Boost Performance

Very often, a team with a performance problem lacks knowledge of the strategy. They do not feel they are doing meaningful work. As a leader, you should define a framework within which you regularly communicate goals and connect them with strategy. You also need to be open to collecting feedback from your team on whether they feel the goals are achievable. It might be that you have clear goals, but you communicate them once per year. Unfortunately, that might be too rare. Based on your context, you need to define the best cadence to remind the team and yourself about the goals. For teams working in complex, fast-changing environments, you need to review the goals at least once every three months, maybe even more often. For example, you can schedule release planning or delivery planning sessions with your team. Once every three months, review the delivery roadmap and release plans with your team. Compare them with your team's current velocity and capacity. Discuss the expectations and collect feedback from your team. Afterwards, use sprint review and sprint planning sessions to track progress towards the goal.



Quote for the day:

"A positive attitude will not solve all your problems. But it will annoy enough people to make it worth the effort" -- Herm Albright

January 03, 2016

Enterprise Architecture - Guiding Principles

The usefulness of principles is in their general orientation and perspective; they do not prescribe specific actions. A given principle applies in some contexts but not all contexts. Different principles may conflict with each other, such as the principle of accessibility and the principle of security. Therefore, applying principles in the development of EA requires deliberation and often tradeoffs. The selection of principles to apply to a given EA is based on a combination of the general environment of the enterprise and the specifics of the goals and purpose of the EA. The application of appropriate principles facilitates grounding, balance, and positioning of an EA. Deviating from the principles may result in unnecessary and avoidable long-term costs and risks.


How to Flush DNS

A wide array of DNS issues can arise at the network administrator or power user level. For the end user, however, the majority of DNS problems arise from either bad configuration entries or the local computer’s DNS cache requiring flushing. Independent of the operating system, many home computer users will input the DNS server for their respective Internet Service Provider (ISP) incorrectly, resulting in a failed Internet connection. Each ISP has a slightly different configuration process; however, the IP address of the DNS server for your home network will be provided on registration for service. Often the ISP will use the address of its actual DNS server, while in other cases it will be the same as the gateway IP for the service.
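The flush itself is one command per platform. The sketch below wraps the commonly cited commands; note the Linux entry assumes systemd-resolved (other resolvers such as dnsmasq or nscd use different commands), and on macOS some versions also require `killall -HUP mDNSResponder`.

```python
import platform
import subprocess

# Commonly cited cache-flush commands per OS. The Linux entry assumes
# systemd-resolved; other local resolvers need different commands.
FLUSH_COMMANDS = {
    "Windows": ["ipconfig", "/flushdns"],
    "Darwin":  ["dscacheutil", "-flushcache"],   # macOS
    "Linux":   ["resolvectl", "flush-caches"],   # systemd-resolved
}

def flush_dns(system=None):
    """Flush the local DNS cache for the current (or given) OS."""
    system = system or platform.system()
    cmd = FLUSH_COMMANDS.get(system)
    if cmd is None:
        raise RuntimeError(f"no known flush command for {system}")
    subprocess.run(cmd, check=True)
```

The underlying commands generally need an elevated prompt (Administrator on Windows, sudo on macOS/Linux).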


The Disciplined Agile Framework

IT departments are complex adaptive organizations. What we mean by that is that the actions of one team will affect the actions of another team, and so on and so on. For example, the way that your agile delivery team works will have an effect on, and be affected by, any other team that you interact with. If you’re working with your operations teams, perhaps as part of your overall DevOps strategy, then each of those teams will need to adapt the way they work to collaborate effectively with one another. Each team will hopefully learn from the other and improve the way that they work. These improvements will ripple out to other teams. The challenge is that every area within IT has one or more bodies of knowledge, and in some cases published “books of knowledge”, that provide guidance for people working in those areas.


Designing the Business of IT

One of the core benefits that organisations can expect is a more cost-efficient IT environment. Senior IT leaders from MunichRe, Shell and Achmea, as well as research from Gartner, predict that IT4IT will help organisations manage an increasingly complex IT estate in a more cost-effective fashion. It will also free up time and budget for innovation and new products. They feel the Reference Architecture provides a strong framework for managing multi-sourcing approaches, which are becoming more prominent in organisations around the world. Another key benefit of IT4IT is that it is not being introduced as an alternative to methodologies or frameworks such as TOGAF and ITIL.


Google's 'Lego' Smartphone, Smarter TVs: What We're Excited About In 2016

The Internet of Things should continue to provide the foundation for the technology industry's ambitions next year, framed by machine learning, analytics, networking, and ever-smaller devices. Connected sensors will proliferate. Intelligent software agents will learn new tricks that automate discrete tasks in a way that's similar to Gmail's Smart Reply service. Robots will emerge from private businesses to begin grocery deliveries on public sidewalks. If regulatory approval can be secured, drones will begin lawful package deliveries, following in the footsteps of flying contraband couriers.


TLS Client Authentication

Why TLS client authentication? Because that’s the most standard way to authenticate a user who owns a certificate. Of course, smartcard certificates are not the only application – organizations may issue internal certificates to users that they store on their machines. The point is to have an authentication mechanism that is more secure than a simple username/password pair. It is a usability problem, especially with smartcards, but that’s beyond the scope of this post. So, with TLS clientAuth, in addition to the server identity being verified by the client, the client identity is also verified by the server. This means the client has a certificate that is issued by an authority, which the server explicitly trusts.
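On the server side, requiring clientAuth comes down to loading the server's own certificate, loading the CA that issued the client certificates, and demanding a client certificate. A minimal sketch using Python's standard `ssl` module (the file paths are placeholders for your own PEM files):

```python
import ssl

# Sketch of a server-side TLS context for mutual TLS: the server
# presents its own certificate and refuses any client that cannot
# present a certificate signed by the explicitly trusted CA.

def make_mutual_tls_context(server_cert="server.pem",
                            server_key="server.key",
                            client_ca="client-ca.pem"):
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(server_cert, server_key)  # server identity
    ctx.load_verify_locations(client_ca)          # trusted client CA
    ctx.verify_mode = ssl.CERT_REQUIRED           # demand clientAuth
    return ctx
```

Wrapping a listening socket with this context makes the handshake itself enforce the policy: a client without a certificate from the trusted CA never completes the connection, so no application code has to check credentials afterward.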


Market Police Deploy New Algorithm Weapons Against Spoofers

“We have to capture every trade now,” O’Brien said. “In today’s markets it’s all about analyzing patterns and contexts.” Yet given how rapidly fraudsters can change their methods to hoodwink human beings, outwitting surveillance software could be even easier. Algorithms are sophisticated but they’re incapable of determining whether a flurry of buy and sell orders are legitimate or unlawful. “The surveillance tools are merely the first line of defense,” said Haim Bodek, founder of Decimus Capital Markets, a New York-based algorithmic investing firm. “These tools can help bring suspicious activity to the attention of regulators, trading venues and brokers, but they’re a poor substitute for a compliance program that monitors activity across affiliated accounts and groups of traders.”
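As a "first line of defense," the pattern analysis described above can be as crude as flagging accounts whose order flow is dominated by cancellations — a classic layering/spoofing proxy. The sketch below is a toy illustration only: real surveillance systems weigh timing, price levels, order sizes, and linked accounts, which is exactly why Bodek argues they cannot replace a full compliance program.

```python
from collections import Counter

# Toy surveillance pass: flag traders whose cancel-to-order ratio
# exceeds a threshold over a minimum number of orders. A real system
# would also analyze timing, price levels, and affiliated accounts.

def flag_suspects(orders, min_orders=10, cancel_ratio=0.9):
    placed, cancelled = Counter(), Counter()
    for trader, event in orders:
        if event == "place":
            placed[trader] += 1
        elif event == "cancel":
            cancelled[trader] += 1
    return sorted(
        t for t, n in placed.items()
        if n >= min_orders and cancelled[t] / n >= cancel_ratio
    )

orders = ([("spoofy", "place")] * 20 + [("spoofy", "cancel")] * 19
          + [("honest", "place")] * 20 + [("honest", "cancel")] * 2)
```

Output of a tool like this is only a lead for human review — it cannot by itself distinguish a spoofer from a market maker legitimately repricing quotes.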


2025: the five key attributes for your business surviving the next ten years in tech

The two make-or-break traits that rose to the top for these leaders were being able to spot new opportunities predictively and being able to innovate in an agile way. The survey also asked these leaders how prepared they believe their organisations are in each of these two dimensions. The gaps were quite remarkable. While 62% of those surveyed identified predictively spotting opportunities as being very important for their businesses, only 12% thought that their businesses had this capability. And only 9% believed their organisations were capable of innovating extremely well in an agile way.


Podcast: Portfolio Management & The Agile Extension

In agile, we need to be prepared to constantly adapt our plans. That approach works extremely well at the project or initiative level, but at an organizational level, budgets and plans tend to be longer term and less adaptable. The current rate of change often means that those plans are negated and organizations find it difficult to adapt quickly to changing market conditions. We need to take the concept of backlog management and apply it at a higher level to programs and portfolios so that we are able to adaptively respond to changes in the world around us. The traditional definition of project success has been on time, on scope, and on budget. Those constraints still exist, but they are not the driving factors today.


Cybersecurity in 2016: will it come down to luck or leadership?

Unfortunately in most respects, 2016 won’t change much: users will still unknowingly click on malicious links; IT departments will still be bad at staying up to date with patching; the bad guys will continue to attack; and the tide of misery from breaches will persist. What matters most is whether your organisation will be a victim or not. Of course you could do nothing, and be lucky. But the only way to control your fate is to lead your organisation to the high ground based on a well-considered, security-first strategy. It is important to remember that, despite their claims, most security vendors cannot help you. Within the market we see too many 'me too' vendors, whose main focus is on the staple of detection.



Quote for the day:

"It is literally true that you can succeed best and quickest by helping others to succeed." -- Napoleon Hill

December 25, 2015

Using Advanced Analytics to Sniff Out Spoofing

So far, the fight has yielded little. More than five years after the Dodd-Frank Act made spoofing a crime, the Commodity Futures Trading Commission sued just three traders for spoofing in 2015. And while the number of enforcement cases on CME Group Inc., which owns futures markets including the Chicago Board of Trade, doubled this year from 2014, there were only 16. But tips keep arriving, and officials expect more cases next year. That’s ratcheted up expectations from market participants, who can see that spoofing cops finally have the will to take on wrongdoers and, with emerging technology, will have better tools to detect them. “It’s taken the regulators and market participants who are blowing the whistle several years to know what to look for,” said Kevin McPartland, head of research for market structure and technology at Greenwich Associates in Stamford, Connecticut.


Google joins Mozilla, Microsoft in pushing for early SHA-1 crypto cutoff

"In line with Microsoft Edge and Mozilla Firefox, the target date for this step is January 1, 2017, but we are considering moving it earlier to July 1, 2016 in light of ongoing research," Google Chrome team members Lucas Garron and David Benjamin said Friday in a blog post. "We therefore urge sites to replace any remaining SHA-1 certificates as soon as possible." Until then, starting with Chrome version 48, which is expected to land early next year, the browser will display errors if the certificates served by websites have SHA-1 signatures and were issued after Jan. 1, 2016. That's because public certificate authorities (CAs) are not supposed to issue new SHA-1-signed certificates after that date.
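The policy described above can be sketched as a small decision function — an approximation of the browser behaviour the excerpt reports, not Chrome's actual implementation: a SHA-1 certificate issued after the Jan. 1, 2016 issuance cutoff is an error immediately, any remaining SHA-1 certificate becomes an error at the sunset date, and legacy SHA-1 certificates merely warn in between.

```python
from datetime import date

SHA1_ISSUANCE_CUTOFF = date(2016, 1, 1)  # CAs must not issue SHA-1 certs after this
SHA1_SUNSET = date(2017, 1, 1)           # target date for rejecting SHA-1 outright

def sha1_cert_verdict(sig_algorithm: str, issued: date, today: date) -> str:
    """Return "ok", "warn", or "error" per the policy sketched above."""
    if "sha1" not in sig_algorithm.lower():
        return "ok"  # SHA-256 and stronger are unaffected
    if issued >= SHA1_ISSUANCE_CUTOFF or today >= SHA1_SUNSET:
        return "error"
    return "warn"  # legacy SHA-1 cert issued before the cutoff
```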


5 favorite open source Django packages

Django is built around the concept of reusable apps: self-contained packages that provide reusable features. You can build your site by composing these reusable apps, together with your own site-specific code. There's a rich and varied ecosystem of reusable apps available for your use—PyPI lists more than 8,000 Django apps—but how do you know which ones are best? ... We also recommend you check out Django Packages, a directory of reusable Django apps. Django Packages organizes Django apps into "grids" that allow you to compare similar packages and choose between them. You can see which features are offered by each package, as well as usage statistics.
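Composing a site from reusable apps boils down to listing them in `INSTALLED_APPS` in your settings module. A sketch — the third-party app names are examples, not recommendations from the article:

```python
# settings.py (fragment): a site composed from Django's built-in apps,
# third-party reusable apps, and your own site-specific code.
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    # reusable apps installed from PyPI (examples)
    "rest_framework",   # e.g. an API toolkit
    "crispy_forms",     # e.g. form-rendering helpers
    # your site-specific code
    "myproject.blog",
]
```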


How Do I Become a Data Scientist? / Data Science Aspects

What is beyond question is that the theoretical language of data analysis is mathematics; the practical language is, at least for another few decades, programming, in whatever dialect (programming language); the execution platform is computers; and the subject of analysis is data, in whatever form and wherever and however it might be produced. This is independent of whether you might assign the processing of a matrix to mathematics, programming or statistics, and with this we should be able to avoid the problems of the Venn diagram approach. It depends on your point of view. The theoretical or abstract description of problems and their solution is mathematics; the practical side is programming.
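The matrix example makes the point nicely: the same object belongs to mathematics (C = A·B with c_ij = Σ_k a_ik·b_kj), to programming (nested loops over indices), and to statistics (say, XᵀX on the way to a regression). A small illustration of the "practical dialect" of that one idea:

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows: the code form of
    c_ij = sum_k a_ik * b_kj."""
    rows, inner, cols = len(A), len(B), len(B[0])
    assert len(A[0]) == inner, "inner dimensions must agree"
    return [
        [sum(A[i][k] * B[k][j] for k in range(inner)) for j in range(cols)]
        for i in range(rows)
    ]
```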


Banks crunch big data for better service

The sources of information that banks use look set to rise. They are keen on acquiring travel data from bus and phone companies to ascertain travel patterns and where peak traffic occurs, said Mr Bill Padfield, chief executive of Dimension Data Asia Pacific, an IT solutions and services company. "Most countries link the bus pass to their identity card, so (bus companies) actually understand who's getting on and off the buses, when they're getting on and off the buses, what age they are and what sex they are." Such information can help banks assess where they should set up branches and when they will need the most staff, added Mr Padfield. Getting the data is one thing, analysing it is another, given the technical challenges.
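The kind of analysis hinted at here — turning raw tap-on records into "where and when is traffic heaviest" — can be sketched in a few lines. The data shape below is invented for illustration; real bus-pass feeds would be messier and subject to privacy constraints.

```python
from collections import Counter

def peak_hours_by_stop(taps):
    """From (stop_id, hour_of_day) tap-on records, find each stop's
    busiest hour — a proxy for where and when branch staff are needed."""
    traffic = Counter(taps)  # (stop, hour) -> number of taps
    peaks = {}
    for (stop, hour), count in traffic.items():
        if stop not in peaks or count > peaks[stop][1]:
            peaks[stop] = (hour, count)
    return peaks  # stop_id -> (busiest hour, tap count)
```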


The three-point big data analytics action plan

Data modeling gives us the ability to decide how various elements of data relate to each other and how the heart of the data will behave inside our chosen database. Put simply, data modeling is the classification, documentation and formalization of procedures and events involved within the software in hand. Potentially hugely complex as an overall task, data modeling tools help capture and translate multifaceted system designs into representations of data that are more easily comprehended. For a retail firm approaching its first major implementation of big data analytics, the ability to model the data form allows the business function to specify and describe all the myriad components of the business into data.
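For the retail firm in the example, "deciding how elements of data relate to each other" might start with a sketch like the following — dataclasses standing in for a formal modeling tool, with illustrative entity names:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Product:
    sku: str
    name: str
    unit_price: float  # in the store's base currency

@dataclass
class Customer:
    customer_id: int
    segment: str  # e.g. "loyalty", "walk-in"

@dataclass
class Sale:
    # A Sale is the relationship: it ties a Customer to a Product at a
    # point in time — the "how the data will behave" part of the model.
    customer: Customer
    product: Product
    quantity: int
    sold_at: datetime

    @property
    def revenue(self) -> float:
        return self.quantity * self.product.unit_price
```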



Why So Many Organizations Struggle With Data Management

Organizations are generally pretty good at profiling and, to some degree, at diagnosing why something happened: what drove an operational cost up or down, or what happened in terms of customer acquisition over the last five years. Where there is still a lot of room for improvement is in diagnosing and really understanding the underlying drivers. If costs are up, why? That means understanding the five variables that are most correlated, and then saying: we’ve actually built models to prove a cause-and-effect relationship. Once you have that, you can say: all right, I know there is cause and effect, so I can deploy a team differently to reduce cost, or I can cut headcount here to reduce cost. I think there is room for a lot of improvement there.
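The "variables most correlated" step can be sketched in a few lines: rank candidate cost drivers by the absolute value of their Pearson correlation with cost. As the passage stresses, correlation is only the screening step — cause and effect still requires a model.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_drivers(cost, drivers):
    """Order candidate drivers (name -> series) by |correlation| with cost."""
    return sorted(drivers,
                  key=lambda k: abs(pearson(cost, drivers[k])),
                  reverse=True)
```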


Weird Science: 10 Strange Tech Stories From 2015

These are the light and fizzy reports of dumb criminals and animal hijinks that regularly pop up on news sites, broadcasts, and social media. Interestingly, many straight-up articles about science and technology research end up in the weird news section. Because the world of high tech moves so fast, these items surface for a couple of days, raise a few eyebrows, then recede under the relentless waves of information overload. In fact, these technology stories flash and fade so quickly that we often don't appreciate how genuinely bananas they are, nor do we ponder their larger implications. Here we take a look at 10 of the weirder tech stories of 2015, including updates on self-replicating machines, bacteria-powered sportswear, and time-traveling computers.


3 self-improvement techniques to carry through the New Year

The major problem with New Year's resolutions is that they're often unrealistic objectives, without the supporting changes in habit and support structures. Rather than adding "lose some weight" or "work out more" to your list of resolutions, here are some suggestions for new habits that will help make you a more effective person, in and outside the office. One aspect to long-term change that I've found interesting is the effectiveness of tricks to instill a habit. Put your running shoes and workout clothes next to the bed in the evening and you're more likely to put them on in the morning. If you struggle to respond to your alarm clock, put one far away from the bed. Rather than relying on willpower, rely on the nuances of your own nature to help instill some of the techniques below. There's no problem with cheating when the game is self-improvement.


Spoofing Went Mainstream in 2015

Nobody knows how widespread spoofing is. The CFTC in Washington receives complaints every week, Aitan Goelman, the agency’s head of enforcement, said earlier this year.  The frequency ebbs and flows, according to several industry executives who would only discuss the matter anonymously. It increased earlier this year before calming down, they said. ... Spoofing is difficult to prove. When prosecutors sent traders to prison and punished financial firms for colluding in the international currency market and manipulating one of the world’s benchmark interest rates, they combed through phone records, e-mail and Internet chats. When it comes to spoofing, investigators must also sift through reams of trading data.



Quote for the day:


"The time is always right to do what is right." -- Martin Luther King Jr.