Gamification, big data and social business – The way forward for cutting-edge enterprises

Gamification, big data and social business have been trending topics in building the modern millennial enterprise. But to date, many organizations have operated them in silos, and it remains to be seen how they can be truly integrated. This article explains how big data and social business can be used together with gamification to drive more value for the enterprise and create competitive advantage for its products and services. Continue reading


Searching for that elusive vertical-specific CMS

Recently, Gartner published its 2014 review of Content Management Systems. Despite the excitement, there were no real surprises. Adobe and Sitecore sit atop their respective technology stacks (Java and .NET) while the rest of the field lines up behind them. Having been a solution architect for many years, I have had plenty of experience with both, and I am a big fan.

Adobe has continued to invest in Apache technologies, which makes its architecture both robust and powerful. You will be hard-pressed to find another CMS with as many OOTB features, engines, capabilities and controls. And with its newest version, 6.0, Adobe Experience Manager offers a wide set of new features that should make it a very attractive choice for customers. These are highlighted by the new mobile app integration with PhoneGap and the new repository architecture built on Apache Jackrabbit Oak. Continue reading


Next best action in insurance: Using analytics to unleash your intuition

The world witnessed something very novel in 1996 as Garry Kasparov began to play chess in the Pennsylvania Convention Center. There was nothing unusual about Kasparov’s game of chess; what was unusual was his unseen opponent, IBM’s ‘Deep Blue’, playing from the IBM center in New York. The media even hyped the event as “the future of humanity is on the line”. For every move by Kasparov, Deep Blue, powered by 256 processors running in parallel, could evaluate more than 100 million positions per second before making its move. Kasparov won the match 4-2, winning three games, losing one and drawing two. But Deep Blue returned a year later with enhanced logic and defeated Kasparov 3½-2½. This historic match demonstrated that computers can mimic, or even beat, the best human experts when programmed with combinatorial analysis that lets them evaluate multiple parameters simultaneously. Continue reading
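
The full article applies that idea to insurance. As a flavor of what “evaluating multiple parameters simultaneously” looks like in a next-best-action setting, here is a minimal, hypothetical sketch: like a chess engine scoring candidate moves, it scores each candidate action with a weighted evaluation function and picks the best. All action names, feature values and weights are invented for illustration.

```python
# Hypothetical sketch: choose a "next best action" by evaluating several
# parameters at once, the way a chess engine evaluates candidate moves.

CANDIDATE_ACTIONS = {
    # action: (propensity_to_accept, expected_margin, retention_lift)
    "offer_bundled_home_policy": (0.42, 310.0, 0.08),
    "proactive_renewal_call":    (0.65, 120.0, 0.12),
    "premium_discount_5pct":     (0.71,  80.0, 0.15),
}

WEIGHTS = (1000.0, 1.0, 2000.0)  # invented business weighting of the parameters

def score(features, weights=WEIGHTS):
    """Linear evaluation function: weighted sum over all parameters."""
    return sum(f * w for f, w in zip(features, weights))

def next_best_action(candidates):
    """Evaluate every candidate and return the highest-scoring action."""
    return max(candidates, key=lambda action: score(candidates[action]))

if __name__ == "__main__":
    best = next_best_action(CANDIDATE_ACTIONS)
    print(best, round(score(CANDIDATE_ACTIONS[best]), 1))
```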


Client onboarding – The importance of business process assessment and optimization

Business process improvement and the implementation of business process management (“BPM”) technology can truly become a “cart before the horse” dilemma for many organizations. Software vendors and implementers are often called upon to deliver the nirvana state they sold their customer on by simply automating a process, without considering that the process they are automating is inefficient and, frankly, broken. On the other hand, the client may want to implement a point solution that resolves an immediate issue, without fully considering how that software will impact the existing operating model and current business processes. Continue reading


Mission and make up of a data sciences group

The market is abuzz with data trends like big data and data science, with more to come. Numerous organizations across the world are trying to find the best way to establish an effective data group. However, organizations face various challenges in setting up a best-in-class data group, and those that have managed to set one up face issues sustaining it. Hence, it has become critical to analyze and understand the factors that make this task so challenging.

One of the key reasons for the early demise of such data groups is their inability to continuously demonstrate their real potential to the business. These groups usually have technologists and evangelists who build on existing successes, take on greater volume and velocity of data and a wider variety of data types—including structured, unstructured, and semi-structured—and provide real-time data processing and analytics capabilities. Continue reading


7 trends to watch to stay ahead of the digital era curve

This is the time of year when you start seeing a mad dash of articles looking back at the trends of 2014 and ahead to what to expect in 2015. While this is par for the course, something much more significant occurred in 2014: this was the year it became clear that digital disruption is here to stay. According to a recent Zinnov study, almost 50 percent of the companies on the Forbes 2000 list will drop off it because of disruption and the impact of the digital era. The study notes that enterprises will need to spend $70 billion in 2015 to compete with emerging digital organizations.

As we look back, 2014 will be remembered as the year wearable technology became fashionable, “things” of all types got connected to the Internet, advanced analytics made everything “smart,” 3-D printing turned manufacturing lead times on their head, immersive technologies moved beyond gaming and payments finally became consumer friendly. These technologies will continue to mature in 2015, but it’s clear many of them have reached a tipping point and are now affecting how enterprises compete and survive in this new digital era. Continue reading


Six ways companies can succeed by exploring social media networks using DWBI

Social media platforms are rapidly becoming a new force behind the Global 2000 companies, allowing them to reach and understand their valuable customers like never before. Social network analysis is the study of patterns of social relations; through it, the structure and behavior of those relations can be examined. Social media networks and peer groups are creating large data sets that now enable organizations to gain a competitive advantage and improve performance. These data sets provide vital insights into customer behavior, brand reputation and the overall customer experience. Intelligent early adopters are beginning to monitor and collect this data from proprietary and open social media networks. Continue reading
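
To make “network analysis” concrete, here is a minimal sketch using the open-source networkx library: it builds a tiny follower graph and ranks accounts by degree centrality, one common way to surface influential customers. The handles and edges are invented for illustration; a real DWBI pipeline would source this data from social media APIs.

```python
# Minimal sketch of social network analysis: rank accounts in a small,
# hypothetical follower graph by degree centrality.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("alice", "brand_x"), ("bob", "brand_x"), ("carol", "brand_x"),
    ("carol", "alice"),   ("dave", "carol"),  ("erin", "carol"),
])

# Degree centrality: each node's connections as a fraction of the
# maximum possible, a simple proxy for influence in the network.
centrality = nx.degree_centrality(G)
for account, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{account:10s} {score:.2f}")
```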


End user computing: Grabbing the tiger by the tail

Finance organizations find themselves grappling with a dilemma around the use of Excel. On one hand, Excel, with its simplicity and advanced functionality, has allowed even the most unsophisticated end user to build complex and powerful financial models. On the other hand, Excel has proliferated so widely that organizations are unable to track where it is used, and decisions and reports often rest on figures derived from ungoverned, unaudited Excel models. Some organizations have spent years trying to devise a solution to govern the use of Excel, with very little success.
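
A common first step toward taming that proliferation is simple discovery: knowing where the spreadsheets actually live. The sketch below is a hypothetical illustration rather than a prescribed solution; it walks a network share (the path is an assumption) and inventories every Excel file it finds, so ungoverned models can at least be located and triaged.

```python
# Minimal sketch: build an inventory of Excel files on a shared drive so
# an EUC governance program knows where its spreadsheet models live.
# SHARE_ROOT is a hypothetical path; point it at a real share to use it.
import os
import time

SHARE_ROOT = "/mnt/finance_share"
EXCEL_EXTENSIONS = (".xls", ".xlsx", ".xlsm")

def inventory_spreadsheets(root):
    """Yield (path, last_modified) for every Excel file under root."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(EXCEL_EXTENSIONS):
                path = os.path.join(dirpath, name)
                modified = time.ctime(os.path.getmtime(path))
                yield path, modified

if __name__ == "__main__":
    for path, modified in inventory_spreadsheets(SHARE_ROOT):
        print(f"{modified}  {path}")
```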

In 1846, Augustus De Morgan, the first Chair of Mathematics at University College London, created an array of rows and columns to tabulate mathematical computations. This framework became the foundation upon which modern spreadsheets are based. Continue reading


5 things you may not know about FATCA

The Foreign Account Tax Compliance Act (FATCA), a US anti-tax-evasion law with global implications, has been at center stage of the regulatory space for the past two years. It aims to curb cross-border tax evasion by implementing a reporting framework that requires institutions worldwide to provide information on their US customers’ financials.

The IRS has placed a special focus on Financial Institutions (FIs), since they generally handle substantial volumes of global payment transactions and are thus more likely to be exploited. As a consequence, FIs face stringent reporting and due-diligence requirements that cover an extensive set of data points. This contrasts with the much lighter requirements on Non-Financial Entities (NFEs), which need to disclose only information related to their ownership structure and income. Continue reading


SDN: A leveler for CSPs

According to data growth statistics, over the coming years we will experience a tremendous jump in data generation and consumption. Cisco forecasts that global cloud traffic will grow 4.5-fold, a 35% CAGR, from 1.2 zettabytes of annual traffic in 2012 to 5.3 zettabytes by 2017. Overall, Cisco expects global data center traffic to grow threefold, reaching a total of 7.7 zettabytes annually by 2017.
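
As a quick sanity check, the implied compound annual growth rate can be recomputed directly from the endpoints Cisco cites (a minimal sketch; only the 1.2 ZB and 5.3 ZB figures come from the forecast above):

```python
# Recompute the growth figures cited above: the compound annual growth
# rate (CAGR) implied by going from 1.2 ZB (2012) to 5.3 ZB (2017).
start, end, years = 1.2, 5.3, 2017 - 2012

growth_factor = end / start                 # ~4.4x from the rounded endpoints
cagr = (end / start) ** (1 / years) - 1     # compound annual growth rate

print(f"growth factor: {growth_factor:.1f}x")  # Cisco cites 4.5-fold
print(f"CAGR: {cagr:.0%}")                     # ~35%, matching the forecast
```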

If those numbers seem impressive, consider that IDC estimates the Internet of Everything will amount to an $8.9 trillion market in 2020. To coordinate IDC’s estimated 212 billion connected things, data centers will need to shoulder the burden. This clearly puts the onus on networks to support the enormous volume of data being transmitted across channels. As customers’ demand for bandwidth increases, there is a need for strong back-end network infrastructure and intelligent devices to minimize service latency and support this data growth. Continue reading