Commenting on the issue, Phil Bindley, managing director at The Bunker, said: “Prioritising compliance gives early-stage Fintechs a significant head-start in getting to market faster. To comply with the financial services sector’s strict regulations, Fintechs must use data centres that not only guarantee UK data sovereignty, but conform to the most demanding industry standards. Navigating this landscape can be particularly challenging, and many Fintech businesses, while heavy in technology innovation, can benefit massively from service providers that are experienced in delivering technology and cyber security services in the financial services sector. That’s why it is crucial that they seek out partners with the relevant experience and expertise who can help them overcome these potential obstacles.”
The discovery information from configuration management tools can also uncover rogue equipment on the platform. Discoveries should show what assets appeared in the IT estate through shadow IT, so that operations admins can bring them under proper control. It can also flag things such as unauthorized Wi-Fi access points and other equipment that could grant malicious network access. Good configuration management processes also catalog user devices: tablets, smartphones, laptops and other computers on the network. Check the configuration of these devices as they touch the network, and grant access only if they meet a set of basic policies. For example, the device must have antivirus software installed or connect via a virtual private network.
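The admission check described above can be sketched as a simple policy gate. This is a minimal illustration, not a real network-access-control product's API; the policy predicates and device fields (`antivirus_installed`, `on_vpn`, `in_inventory`) are assumptions for the example.

```python
# Hypothetical policy gate: a device touching the network is granted
# access only if it satisfies every configured policy. Field names and
# the policies themselves are illustrative assumptions.

POLICIES = [
    # Example policy from the text: antivirus installed OR on a VPN.
    lambda d: d.get("antivirus_installed") or d.get("on_vpn"),
    # A device discovered via shadow IT must first be inventoried.
    lambda d: d.get("in_inventory", False),
]

def grant_access(device: dict) -> bool:
    """Grant network access only if the device meets all policies."""
    return all(policy(device) for policy in POLICIES)

print(grant_access({"antivirus_installed": True, "in_inventory": True}))  # True
print(grant_access({"on_vpn": True, "in_inventory": False}))              # False
```

In a real deployment these checks would run at the point where the device joins the network (e.g. in a NAC or MDM hook), with discovery data from the configuration management tool feeding the inventory check.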
Like many a dysfunctional family, the key to greater harmony is communication. The UK’s Financial Conduct Authority (FCA) has launched an industry sandbox for exactly that purpose, offering a forum for continuous feedback between fintechs, incumbents, regulators – and RegTech. RegTech, or regulatory technology, translates complex regulation into API code. It streamlines burdensome compliance processes to keep both risk and human resources low. And there’s an urgent need for it: startup fintech providers simply don’t have the means to hire an army of compliance officers. With new regulatory technology, they don’t have to. Innovations including machine learning, biometrics and distributed ledgers help ensure compliance with fewer resources, and the benefits are significant.
Truth be told, many enterprise IT shops were so happy to get out of managing physical servers in a data center that many limitations of the existing public IaaS clouds were forgiven. However, now that we’ve lived a few years with public IaaS clouds, developers and CloudOps pros are giving a huge thumbs down to the constant monitoring of servers, provisioned or not, that’s required to support the workloads. Here are two things happening with traditional IaaS that contribute to the problem. First, teams over-provision the servers they need, going for a “You can’t have too many resources” model. Or, second, they do not provision enough resources, and instead go for a “Make them ask for more” model. Both are the wrong approaches. While estimates vary, public IaaS cloud resources are over-provisioned by almost 40 percent beyond what’s actually needed.
While other tech-related chief titles have a clearer path to the role, chief digital officers can come from many different backgrounds, he says. They may have technology backgrounds, data science backgrounds, marketing backgrounds, or they may come from consulting or research firms. “Sometimes it’s a good strategy person,” he says. “It depends what the organization needs.” “Often, it has to do with someone’s ability to influence others,” adds Mike Doonan, partner at executive search firm SPMB. “They’re usually coming into an old-line company that’s used to doing things one way. This is the one intangible I advise my clients to look for — you want someone who’s a visionary but also someone who understands people can’t absorb that vision all at once.”
With the demand for processing large amounts of data, Apache Kafka has become a standard message queue in the big data world. Apache Kafka is publish-subscribe messaging rethought as a distributed, partitioned, replicated commit log service, and it has convenient APIs for many languages. ... Integrating Spark Streaming and Kafka is incredibly easy. Your middleware, backend (proxy-like) services, or IoT devices can send millions of records per second to Kafka while Kafka handles them efficiently. Spark Streaming provides simple parallelism, a 1:1 correspondence between Kafka partitions and Spark partitions, and access to offsets and metadata. To start, we need to pass Kafka’s parameters to Spark — host, port, offset-commit strategy, and so on.
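That parameter hand-off might look like the sketch below. The broker address, topic name, and offset settings are assumptions for illustration; the actual wiring requires pyspark and the Spark–Kafka integration package on the classpath.

```python
# Sketch of the Kafka parameters Spark Streaming's direct approach
# consumes (broker host:port, starting offset, and commit strategy
# are assumed values, not from the article).
kafka_params = {
    "metadata.broker.list": "broker1:9092",  # host:port of a Kafka broker
    "auto.offset.reset": "smallest",         # where to start with no saved offset
    "enable.auto.commit": "false",           # commit offsets ourselves after processing
}
topics = ["events"]  # assumed topic name

# With pyspark available, the wiring is roughly:
#   from pyspark.streaming.kafka import KafkaUtils
#   stream = KafkaUtils.createDirectStream(ssc, topics, kafka_params)
# The direct stream gives one Spark partition per Kafka partition and
# exposes each batch's offset ranges, matching the 1:1 parallelism and
# offset access described above.
```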
The estimated impact of the breach increased following investigations by cyber security firm Mandiant, but Equifax said forensic investigators had not found any evidence of new or additional hacking activity or unauthorised access to new databases or tables. Equifax previously disclosed that about 400,000 consumers in the UK and 100,000 in Canada may also have been affected by the breach, but it now says it believes only 8,000 Canadians were affected. The company said the forensic investigation related to UK consumers has been completed and the resulting information is now being analysed in the UK. “Equifax is continuing discussions with regulators in the UK regarding the scope of the company’s consumer notifications as the analysis of the completed forensic investigation is completed,” it said.
We're thinking about and driving a level of automation of the work we do beyond anything we've done before. So, for infrastructure professionals, I'm asking them to drive what we do to the cloud and toward automation. I'm asking them to dramatically change how we work. It's a structure where professionals need skills that look more like those of application development professionals -- they have to write code, treat code like an asset and watch it evolve over time. That's a different skill from what we asked infrastructure people to have in the past. It changes how people do the work and the work we ask them to do. It really requires nimbleness, constant curiosity and a willingness to keep evolving skills. It's a different mindset from what IT demanded previously, when the skills you nurtured lasted for a long period of time.
Fog colocates computing to where the data is and pushes intelligence and processing capabilities closer to where the data originates. Fog differs from Edge Computing in that fog has an association with cloud services. Data is processed and stored at a fog node and pertinent data is transmitted back. There could be multiple fog nodes between the actual sensor device and the cloud data center itself. Fog devices perform all the actions of an Edge Computing device, but are flexible in partitioning workloads between the fog nodes and cloud data centers. Fog Computing also offers the benefits of well-defined software frameworks, making the fog and cloud transparent to the user and developer.
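The partitioning idea above can be sketched in a few lines: a fog node processes readings locally and transmits only the pertinent subset back to the cloud. The threshold and reading format here are hypothetical, chosen purely to illustrate the pattern.

```python
# Illustrative sketch (assumptions, not from the article) of fog-style
# workload partitioning: process sensor readings at the fog node and
# forward only pertinent data to the cloud data center.

def fog_node_filter(readings, threshold=75.0):
    """Process readings locally; return the pertinent subset worth
    transmitting upstream to the cloud."""
    return [r for r in readings if r >= threshold]

# Most readings stay at the fog node; only notable values travel upstream.
print(fog_node_filter([20.0, 80.0, 99.9]))  # → [80.0, 99.9]
```

In a multi-tier deployment the same decision would repeat at each fog node between the sensor and the cloud data center, which is what lets fog flexibly shift work between tiers.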
The feature set may not be revealed until mid-2018, when Microsoft releases a preview of the suite. For his part, Spataro hinted at some of what will make it into Office 2019, calling out such features as Ink replay in Word and Morph in PowerPoint, which have been available to Office 365 subscribers for one and two years, respectively. ... There's little to no chance that Office 2019 will include any groundbreaking new features. Why? Because the perpetually licensed version of the suite is built by taking the accumulated changes since the predecessor appeared — the changes issued to Office 365 subscribers over the past several years. Microsoft will take the version of Office 2016 that Office 365 ProPlus users have in, say, the spring of 2018 — and that version of Office 2016 is different from the 2015 version of Office 2016 sold as a one-time purchase — freeze the code, and call it Office 2019.
Quote for the day:
"Always do right. This will gratify some people and astonish the rest." -- Mark Twain