I developed the data services strategy for New York City. After working closely with agency and internal stakeholders on data sharing needs and validating solutions with proofs of concept (POCs), I defined the overarching Architecture Vision for data services in New York City. This was elaborated in more detailed Architecture Vision documents for specific areas such as the Data Lake, Data Science Platform, Hybrid Integration Platform and Logical Data Warehouse, most of which were reviewed by multiple internal peers and externally by Gartner. This vision forms the basis of point 9 in Commissioner Samir Saini's ten-point strategic plan for DoITT, and it prompted the formation of a 20+ person unit to implement it. The process showed how DoITT, as a central service provider, can add value to agencies through dialogue around needs and thought leadership on technologies.

I researched and defined the new data integration solution for New York City. The city has an ageing legacy integration system that is both complex and costly, and developing any new integration takes months. My responsibility was to modernise the integration solution so that it would continue to do what it does today, supporting mission-critical systems, while also offering self-service and reducing cost and complexity. Through engagement with stakeholders and developers we ran POCs and validated the architecture of a solution that would not only match today's capabilities but do so in a more resilient fashion, while offering self-service capabilities to city users. On top of this we were able to show cost reductions of close to 80% for the new solution, saving New Yorkers millions of dollars in taxpayer money in the years to come.

I led a multi-agency IoT work group in New York City. The purpose of the work group was to make a combined effort to develop cross-agency guidelines and policies around IoT. Traditionally these efforts had been fragmented and no real governance had been established. We updated and wrote a handful of policies covering security, retention, privacy and architecture.

I started and led the open source project alvelor. The collaboration was unique in that the City of New York, which I represented, collaborated with city residents on an open source project to use machine learning to count vehicles in the streets. See more about the project here.

I developed the idea of the 7 dimensions of data. In a post on this site I developed the idea of shifting the focus on data from quality to value. In support of this I created the data value scorecard, a template for describing and defining your data needs.

I defined the architecture for implementing Customer Master Data on a Northern European GSIB bank's Hadoop platform: The bank had to implement highly sensitive personal data on a new platform, a Hadoop platform supplied by Cloudera. My responsibility was to design the solution and get it approved.

I defined the architecture roadmap and deliverables for a Northern European GSIB bank's anti-financial-crime initiative: The Retail Anti Financial Crime programme in a major Northern European bank needed to deliver a number of IT solutions. It was the highest-prioritised programme in the bank at the time. Its purpose was to implement the system support necessary to comply with the FSA's requirements, in order to avoid multimillion-dollar fines and potentially losing the banking licence. It involved significant use of big data components in a heterogeneous architecture characterised by legacy systems and multiple parallel ad hoc solutions. I created a plan for the architectural deliverables across 12 projects, more than 200 individuals and three different organisational silos. Everything was done according to company guidelines in a very challenging, fragmented stakeholder environment. Part of this work even won us the 2016 Cloudera Data Impact Award. For a presentation of what we did, see this PowerPoint: Big data in a regulatory context

Created the product Sensor Six from nothing: I researched, designed and architected all aspects of the solution for the SaaS product management tool Sensor Six. I also financed everything with my own funds and set up an offshore development team. Sensor Six was featured favourably as one of the 11 most attractive product management solutions on the market by the independent analyst firm SiriusDecisions in the SiriusDecisions Product Management Field Guide. One example of a customer review reads:

Probably, the biggest reason to opt for the solution is its approach to decision-making. I liked that it broke down complex decision-making to specific, quantifiable decision points and establishes a process to make decisions. That you can do this using a single and simple solution is even more of an advantage because it integrates functionality and features from multiple solutions into a single comprehensive solution. I also liked that the solution combines qualitative and quantitative aspects of decision-making. Multiple roles and security permission only add to the solution’s attractiveness. This is because it enables you to reinvent and categorize inputs from multiple stakeholders for a decision.

Created a decommissioning strategy and plan for a 40-year-old finance system for a major Danish retailer: I created the strategy for decommissioning a number of legacy mainframe IT systems as SAP Finance was rolled out. As part of this I did an in-depth analysis of the sparsely documented systems and constructed a detailed plan of decommissioning activities. This entailed retiring approximately 2,000 programs, 1,000 database tables and 400 mainframe jobs while ensuring that the remaining legacy systems and hundreds of Access databases remained functional. The plan spans a 6-year period with more than 50 timed actions, which were recorded in the Service Management tool for future implementation.