Blockchain has been wildly mis-sold, but underneath it is a database with performance and scalability issues and a lot of baggage. Any claim made for blockchain could be made for databases, or simply publishing contractual or transactional data gathered in another form. Its adoption by non-technical advocates is faith-based, with vendors' and consultants' claims being taken at face value, as Eddie Hughes MP (Con, Walsall North) cheerfully confessed to the FT recently. "I'm just a Brummie bloke who kept hearing about blockchain, read a bit about it, and thought: this is interesting stuff. So I came up with this idea: blockchain for Bloxwich," said Hughes. As with every bubble, whether it's Tulip Mania or the Californian Gold Rush, most investors lose their shirts while a fortune is being made by associated services – the advisors and marketeers can bank their cash, even if there's no gold in the river.
Building Resilient Data Multiclouds
Resilience is risk mitigation engineered into all of your IT assets: the confidence that your infrastructure won’t fail you, especially in times of crisis. Resilience baked into the True Private Cloud ensures that businesses can weather the once-in-a-lifetime “black swan” and “perfect storm” scenarios that could otherwise put them and their stakeholders out of business permanently. To evolve toward this resilience architecture, enterprises must ensure that their migration of data, analytics, and other IT infrastructure to cloud environments is comprehensively resilient. The path to the True Private Cloud requires a keen focus on building unshakeable resilience into distributed data assets. Management information systems and enterprise data warehouses have historically been treated as “second-class citizens” among IT infrastructure platforms, with their occasional failures simply tolerated.
Global Financial Services Bullish On AI, The 'Disruptive Tech' Frontrunner
AI, and its application through machine learning, is increasingly used to automate processes such as credit decision-making and customer interaction, as well as to help detect fraud, money laundering and even terrorist activity. Capital markets-focused organizations such as investment banks are the furthest down the road in the financial services industry in adopting new disruptive technologies, with a little over half (51%) saying that AI ... compared with just 17% among those in the private wealth industry. Stephanie Miller, Chief Executive Officer of Intertrust, commented in the wake of the findings: “With the hype surrounding disruptive technology in the financial sector it is easy to lose sight of reality. The findings from this study suggest that while the industry is positive towards new technology such as AI, blockchain and robotics, only a minority of firms are currently putting it to use and the speed of travel remains cautious.”
Geospatial Data Brings Value Across Industries
Though it may seem like a highly technical concept, most people use some type of geospatial data system every day because such programs are used to route Uber drivers, assess credit risk and lending rates based on zip code, and determine insurance rates by identifying homes at risk of flooding, earthquakes, and other natural disasters. Even kids use geospatial data to play games like Pokemon Go. Geospatial information is everywhere and, in a world where everyone is attached to a smartphone, we’re constantly connected to it. Put simply, geospatial data just means that the information set is tied to zip codes, addresses, or coordinates, among other possibilities. It’s a map or an address book, reinterpreted for a digital ecosystem. Though there are plenty of groups building geospatial data sets, one of the factors that has most contributed to this new digital world is the availability of open data sets.
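The core idea above is simply data keyed by location. A common primitive once records carry coordinates is computing the distance between two points, e.g. to match a rider with a nearby driver or to check whether a home falls inside a flood zone. Below is a minimal sketch using the standard haversine great-circle formula (the function name and the London/Paris example are illustrative, not from the article):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    R = 6371.0  # mean Earth radius in km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Central London to central Paris: roughly 344 km
print(round(haversine_km(51.5074, -0.1278, 48.8566, 2.3522)))
```

Real geospatial systems layer indexing (geohashes, R-trees) on top of primitives like this so that "find everything near X" queries stay fast over millions of records.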
Built for realtime: Big data messaging with Apache Kafka
Apache Kafka's architecture is very simple, which can result in better performance and throughput in some systems. Every topic in Kafka is like a simple log file. When a producer publishes a message, the Kafka server appends it to the end of the log file for its given topic. The server also assigns an offset, a number that uniquely and permanently identifies each message within the topic. Offsets increase as messages are appended: if the producer publishes three messages, the first gets an offset of 0, the second an offset of 1, and the third an offset of 2. When a Kafka consumer first starts, it sends a pull request to the server, asking to retrieve any messages for a particular topic starting from offset 0. The server checks the log file for that topic and returns the three new messages. The consumer processes them, then requests messages starting from offset 3, and so on.
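The append-only log and offset mechanics described above can be sketched as a toy in-memory model. To be clear, this simulates the behaviour only; it is not the Kafka client API, and the class and method names are invented for illustration:

```python
class TopicLog:
    """Toy model of a single Kafka topic: an append-only log with offsets."""

    def __init__(self):
        self.messages = []

    def publish(self, message):
        # The "server" appends to the end of the log; the offset
        # is simply the message's position in that log.
        self.messages.append(message)
        return len(self.messages) - 1  # offset assigned to this message

    def fetch(self, from_offset):
        # A consumer pull: return (offset, message) pairs at or
        # after from_offset.
        return list(enumerate(self.messages))[from_offset:]

log = TopicLog()
for m in ("a", "b", "c"):
    log.publish(m)

# First pull: everything from offset 0.
batch = log.fetch(0)           # [(0, 'a'), (1, 'b'), (2, 'c')]
next_offset = batch[-1][0] + 1
print(log.fetch(next_offset))  # prints [] -- nothing new yet
```

Because the server only appends and the consumer tracks its own next offset, the broker stays stateless about consumer progress, which is a large part of why the design scales.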
Expert Excuses for Not Writing Unit Tests
Many studies do show a correlation between LoC and the overall cost and length of development, and between LoC and the number of defects. So while it may not be a precise indication of progress, it is not a completely useless metric: the lower your LoC count, the better off you are in terms of defect counts. For a tool that calculates this for you, try https://github.com/boyter/scc/ which will also give you a COCOMO estimation. Be sure to run it over projects that have tests and see how much additional cost the tests add. Do this internally if you can with projects that have tests, and point out that the tests add some percentage of cost. The more you can cherry-pick projects that make tests look expensive, the better off you will be. If someone counters that the project with tests was more successful, point out, using the same model, that it cost more. More money spent means more quality to most people. If you mix metaphors and ideas here, you can also impress and confuse people to the point that they will be afraid to challenge you further. Be sure to point out that adding tests means writing more code, which takes longer and so impacts cost. Also point out that while tests are being written, nobody will be fixing bugs. This is usually enough to stop everything dead in its tracks.
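For context on the COCOMO estimation mentioned above: the classic basic-COCOMO model predicts effort in person-months as a * KLOC^b. A minimal sketch using the standard organic-mode constants (a = 2.4, b = 1.05) follows; scc's actual implementation is fuller than this, and the 30%-more-code-for-tests figure is purely an illustrative assumption:

```python
def cocomo_basic(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort estimate in person-months.

    Defaults are the standard organic-mode constants.
    """
    return a * kloc ** b

# A 10 KLOC project vs the same project with 30% more code for tests
# (an assumed ratio, for illustration only):
without_tests = cocomo_basic(10)
with_tests = cocomo_basic(13)
print(round(without_tests, 1), round(with_tests, 1))  # roughly 26.9 vs 35.5
```

Note how the superlinear exponent makes any added code look expensive on paper, which is exactly the rhetorical lever the excuse above exploits.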
Confused by AI Hype and Fear? You’re Not Alone
Although AI leaves the door open for other paths to machine intelligence, most advances towards this goal so far have been made using machine-learning algorithms. These have some key characteristics that separate them from other algorithms, and that will define the field even if another route to AI is discovered in the near future. Machine learning is primarily concerned with algorithms that can make connections between annotated data and the desired output. Crucially, they are also able to learn from new, varied data, improving their models without the need for human intervention. This approach lends itself to many of AI’s defining use cases, such as computer vision and machine translation. It’s debatable whether any AI application to date hasn’t derived from machine learning in some way. Almost all current chatbots have been built with machine learning, but there is another approach that some data scientists are considering: rule-based models, founded on linguistic systems developed by experts to imitate the ways humans structure their speech.
Man-in-the-disk attacks: A cheat sheet
Cue a recent discovery by researchers at the security firm Check Point: an attack they dubbed "man-in-the-disk" (MITD), which exploits a weakness in Android's handling of external storage to inject malicious code. The exploit enabling MITD attacks has serious repercussions for Android users because it exists at a level that's integral to Android's design. If man-in-the-disk sounds similar to man-in-the-middle (MITM) attacks, it's because the two are alike in many ways: both involve intercepting, and often modifying, data for nefarious purposes. The difference is the vector--MITM sits on the network path, while MITD abuses the shared external storage on the device itself. Check Point's researchers found a number of apps--including some from major distributors such as Google--that were vulnerable to MITD attacks, and they also managed to build their own apps that took advantage of the exploit.
Want A Bigger Bang From AI? Embed It Into Your Apps
A key element of application-centric AI: context. Say a sales executive wants to call on important customers in several cities. AI can review the accounts, predict which customers might increase business after a sales call based on their history, and suggest an itinerary that would maximize ROI from the trip. One common factor in all those buckets is that integrating AI and machine learning into applications lets the app take some type of action automatically. Automation allows many tasks to be performed without human intervention—and without human error, says Swan. AI systems can execute relatively straightforward actions, such as booking a rental car for that sales trip. They can also tackle harder tasks that normally require not only time but also some level of expertise, such as optimizing business workflows, reviewing financials for anomalies, or finding expense report violations. Often there's still a human review, but that review can be done faster, and more accurately, with the AI's assistance in laying the groundwork, presenting recommendations, and providing the background, documentation, and reasoning behind those recommendations.
Why open standards are the key to truly smart cities
In collaboration with several partners, including The Open Group, academic institutions and industry players, bIoTope is running a series of cross-domain smart city pilot projects which will provide proofs-of-concept for a wide range of applications, including smart metering, smart lighting, weather monitoring, and the management of shared electric vehicles. These projects will reveal the benefits that can be realised through the use of IoT technology, such as greater interoperability between smart city systems. They will also deliver a much-needed framework for security, privacy and trust to facilitate responsible access to, and ownership of, data on the IoT. Ultimately, bIoTope will deploy smart city pilots in Brussels, Lyon, Helsinki, Melbourne and Saint Petersburg. It is hoped that these pilot schemes will showcase the sustainable business ecosystems that will generate value to end users, solution providers, municipalities and other stakeholders.
Quote for the day:
"Risk more than others think is safe. Dream more than others think is practical." -- Howard Schultz