Who wants to go threat hunting?
To become a threat hunter, one typically starts as a security analyst and then moves into the incident response (IR) and cyber threat intelligence fields. Combined with a solid knowledge of attacker methodology and tactics, that background makes threat hunting a highly coveted skill, and one of the most advanced skill sets in information security today. The core skills of a threat hunter include security operations and analytics, IR and remediation, attacker methodology, and cyber threat intelligence. Combined, the hunter is the special operations team of an organization’s defensive and detection capabilities. A threat hunter takes the traditional indicators of compromise (IoCs) and, instead of passively waiting to detect them, aggressively goes out looking for them. Traditional intrusion detection does not do a great job against the crafty adversary, who will avoid tripping the normal intrusion detection defenses; it takes a threat hunter to find them. Not every company can have one; it takes a certain size and sophistication. ... Threat hunting teams need threat intelligence plus a network person, an endpoint person, a malware analyst, and a scalable set of tools. A threat hunting team is like special operations forces.
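To make "aggressively going out looking for them" concrete, here is a minimal, hypothetical sketch of one hunting task, not taken from the article: sweeping an export of endpoint process events for known-bad file hashes instead of waiting for an alert to fire. The file name, event fields and hash set are illustrative placeholders.

```python
import json

# Illustrative set of known-bad SHA-256 hashes; in a real hunt this would be
# loaded from a cyber threat intelligence feed rather than hard-coded.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # placeholder value
}

def hunt(process_log_path):
    """Yield process events whose binary hash matches a known indicator of compromise."""
    with open(process_log_path) as fh:
        for line in fh:
            event = json.loads(line)
            if event.get("sha256") in KNOWN_BAD_HASHES:
                yield event

# Sweep a (hypothetical) newline-delimited JSON export of endpoint process events.
for hit in hunt("endpoint_process_events.jsonl"):
    print(f"possible compromise on {hit.get('host')}: {hit.get('process')} ({hit['sha256']})")
```

The point of the sketch is the direction of the work: the hunter starts from intelligence (the hash set) and proactively queries the environment, rather than waiting for an intrusion detection rule to trigger.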
Quantum computers and Bitcoin mining – Explained
With the application of the science behind electrons, the energy required to mine bitcoins will be drastically reduced, with a direct impact on protecting the environment. Where the computers are located also becomes immaterial, because one of the properties this technology exploits is that objects can be in more than one place at the same time. Because of the energy challenge, most mining companies using classical computers have opted to set up their data centers in regions that are cold for most of the year; quantum computers adequately address that issue. The economics of mining will definitely improve, with miners no longer having to worry about the exorbitant electricity bills they currently contend with. A large number of mining companies have had to migrate to China to capitalize on its relatively cheap electricity; with quantum computing, where particles can exist in multiple locations at once, this need not be the case anymore. The technology of quantum computing is itself an incentive for more people to take up mining.
Should you let IT teams choose their own collaboration tools?
While CIOs should consider stepping back from dictating what IT can use, they are still responsible for vetting, integrating and maintaining the solutions IT chooses, Palm says. “You’re not just a strategic advisor to the business as far as how these tools can enable efficiency and innovation, but you’re helping your teams choose the best tools to help them do the best job they can,” he says. Red Hat’s Kelly notes a fundamental issue CIOs encounter when shifting to a choose-your-own approach: “What you don’t want to do is completely let go, and then all of a sudden you have fifty different ways people are communicating with each other — that’s a mess. You as a CIO have to walk a line between standing there and saying, ‘You are required to use this and only this,’ and making it a free-for-all.” One chief concern is the constraints that regulatory compliance and data governance may place on these decisions, depending on your industry, Kelly says.
Financial sector cyber-related laws are a bellwether, says Deloitte
“While it is generally not possible to control when you have a crisis, quite often the cause of these crises is a cyber security incident, so it is worth information security teams in organisations engaging with the privacy teams to help understand where the organisation’s core risks lie, so they can prepare for these crises. A good response makes a huge difference.” Another thing that “absolutely attracts regulator attention”, said Bonner, is “pockets of complaints”, because even if the regulator does not have the resources to follow up on every single isolated complaint, if there are several customer complaints about a single organisation, the regulator will pay attention. “The lack of resources means that regulators will draw conclusions based on the nature and volume of the complaints,” he said. “So it could be by chance that a couple of entirely separate parts of your organisation have an issue that gets escalated to the regulator, but the conclusion will be that the organisation has a systemic problem.”
Making sense of Handwritten Sections in Scanned Documents
It is challenging to achieve acceptable extraction accuracy when applying traditional search and knowledge extraction methods to these documents. Chief among these challenges are poor document image quality and handwritten annotations. The poor image quality stems from the fact that these documents are frequently scanned copies of signed agreements, stored as PDFs, often one or two generations removed from the original. This causes many optical character recognition (OCR) errors that introduce nonsense words. Also, most of these contracts include handwritten annotations which amend or define critical terms of the agreement. ... In recent years, computer vision object detection models using deep neural networks have proven to be effective at a wide variety of object recognition tasks, but they require a vast amount of expertly labeled training data. Fortunately, models pre-trained on standard datasets such as COCO, which contains hundreds of thousands of images and millions of labeled object instances, can be used to create powerful custom detectors with limited data via transfer learning – a method of fine-tuning an existing model to accomplish a different but related task. Transfer learning has been demonstrated to dramatically reduce the amount of training data required to achieve state-of-the-art accuracy for a wide range of applications.
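As a sketch of the transfer-learning step described above, assuming a PyTorch/torchvision setup rather than the authors' actual pipeline, the snippet below takes a Faster R-CNN detector pre-trained on COCO and swaps its classification head so it can be fine-tuned to localize handwritten annotations on scanned pages. The class list and the `annotated_page_loader` data loader are hypothetical placeholders.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 2  # background + "handwritten annotation" (hypothetical label set)

# Load a detector whose backbone and region-proposal network were trained on COCO.
# ("DEFAULT" weights on torchvision >= 0.13; older releases use pretrained=True.)
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the COCO-specific box predictor with one sized to our classes;
# everything else keeps its pre-trained weights.
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# Fine-tune on the small labeled set of scanned contract pages.
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad],
    lr=0.005, momentum=0.9, weight_decay=0.0005,
)

model.train()
for images, targets in annotated_page_loader:  # assumed loader yielding (image list, target dict list) batches
    loss_dict = model(images, targets)          # in train mode, returns classification and box-regression losses
    loss = sum(loss_dict.values())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because the backbone and region-proposal network keep their COCO weights, only the new head (plus a light fine-tune of the rest) has to be learned from the small annotated set, which is what lets transfer learning reach useful accuracy with limited data.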
Smarter cities: why data-driven cities are the only option
New and exciting mobile and static infrastructure technologies are enabling safer communities, new low-cost utility services, more efficient city operations, and intelligent low-emissions transportation systems. One example of a UK smart city is Milton Keynes, which showcases driverless pods that ferry citizens along fixed routes across the city. ... Smart city programmes involve mass deployments of sensors, which are needed to gather the data that justifies and manages change. While individual sensors can be very cheap, deploying them can often be very expensive, especially at a city-wide scale. Rather than rolling out an entirely new network of sensors, cities and local authorities can improve efficiency by leveraging existing sensor networks and so avoid expensive infrastructure investments. Telematics companies may seem unlikely partners in this endeavour, but with some of the world’s largest organically grown vehicle datasets, telematics providers can grant access to aggregated data already blanketing cities across the globe – without the expense of building a sensor network.
11 Industries That Will Soon Be Disrupted By Blockchain
Many people are resistant to technological change, both in their personal lives and at the office. However, what they often lack is the vision to see how the new technology they are resisting will improve their lives in the future. Emerging technologies are exciting and bring innovation and new opportunities across the globe. They change our lives by altering the way we think and operate on a daily basis. Technological innovation can impact a lot more than our daily lives, though. In fact, it can disrupt entire industries and change the way we do business. As new technologies are developed, affected industries are forced to adapt or be replaced. The newest technology that is quickly becoming the next major disruptor is blockchain. Blockchain is a digital ledger system used to securely record transactions, and it is poised to impact the way business is done across the globe. Here are 11 prominent industries that are slated to be overhauled by blockchain technology in the near future.
Microsoft's Project Brainwave brings fast-chip smarts to AI at Build conference
Project Brainwave differs from conventional AI offerings in two important ways. First, it uses a fast and flexible but unusual processor type called an FPGA, short for field-programmable gate array, which can be updated often to accelerate AI chores with the latest algorithms and handles AI tasks rapidly enough to be used for real-time jobs where response time is crucial. Second, customers eventually will be able to run AI jobs on Microsoft hardware at their own sites, not just by tapping into Microsoft's data centers, which speeds up operations another notch. "This is a unique offering," said Forrester analyst Mike Gualtieri. The project is a microcosm of the AI revolution sweeping the tech industry. For one thing, it's maturing fast enough to become useful for countless tasks -- digesting legal contracts, finding empty parking spaces, looking for hiring biases and generating 3D models of people's bodies, limbs and heads from video.
More time equals more opportunity for cyber attackers
Given enough time, a criminal siphoning data can slow the attack down to a level where it looks like normal network traffic noise, rather than attempting to send out gigabytes of data from a database in one go, for example. New data also accrues over time, such as new oil well exploration or pharmaceutical research; if it arrives in an already compromised database, the attacker is positioned, ready and waiting, and only needs to exfiltrate it. Third, the effects of a rushed attack can often be rolled back to a previous backup without too much trouble or data loss. If exploitation of a database occurs today and is discovered, restoring the database leaves only a short batch of transactions that may need to be re-applied once the route in has been secured. As a result, the business impact is low. Conversely, an attack that takes place over many months may mean long periods of compromised backups, requiring extensive manual work to rebuild from the last known good backup. In extreme cases, relying on these backups may not be possible at all, as tapes deteriorate or are reused or recycled.
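To illustrate the first point, a slow drip of data hides because no single transfer is large enough to trip a per-event threshold; detection has to aggregate outbound volume per destination over a long window. The rough sketch below is not from the article, and its thresholds and flow format are made-up placeholders.

```python
from collections import defaultdict

BURST_THRESHOLD = 500 * 1024 * 1024    # a single ~500 MB transfer would trip most per-event alerts
CUMULATIVE_THRESHOLD = 2 * 1024 ** 3   # ~2 GB drip-fed over the window is the quieter signal

def flag_slow_exfiltration(flows):
    """flows: iterable of (timestamp, destination, bytes_out) records covering, say, 30 days."""
    totals = defaultdict(int)
    for ts, dest, sent in flows:
        if sent > BURST_THRESHOLD:
            # The noisy case: a rushed attacker moving gigabytes at once.
            print(f"{ts} burst to {dest}: {sent} bytes")
        totals[dest] += sent
    # The patient case only shows up when outbound volume is summed over weeks.
    return {dest: total for dest, total in totals.items() if total > CUMULATIVE_THRESHOLD}
```

The same logic explains the backup point: the longer the window over which small changes accumulate, the further back a defender has to reach to find a backup that predates the compromise.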
What is edge computing?
As centralized as this all sounds, the truly amazing thing about cloud computing is that a seriously large percentage of all companies in the world now rely on the infrastructure, hosting, machine learning, and compute power of a very select few cloud providers: Amazon, Microsoft, Google, and IBM. ... The advent of edge computing as a buzzword you should perhaps pay attention to is the realization by these companies that there isn’t much growth left in the cloud space. Almost everything that can be centralized has been centralized. Most of the new opportunities for the “cloud” lie at the “edge.” So, what is edge? The word edge in this context means literal geographic distribution. Edge computing is computing that’s done at or near the source of the data, instead of relying on the cloud at one of a dozen data centers to do all the work. It doesn’t mean the cloud will disappear. It means the cloud is coming to you.
Quote for the day:
"Leadership is working with goals and vision; management is working with objectives." -- Russel Honore