Responsibilities
- Develop, test and support future-ready data solutions for customers across industry verticals
- Develop, test and support end-to-end batch and near real-time data flows/pipelines
- Demonstrate understanding of data architectures, modern data platforms, big data, ML/AI, analytics, cloud platforms, data governance, information management and associated technologies
- Design and implement data models according to business requirements for specific use cases and/or the client's business domains
- Demonstrate understanding of data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models and Data Vault
- Demonstrate understanding of the Data Asset concept or Data Mesh architecture and its domain data products
- Develop and demonstrate proofs of concept and working demos
- Lead or collaborate with other internal/external consultants in consulting, workshops and delivery engagements
- Mentor junior IBM consultants in the practice and in delivery engagements

Required Professional and Technical Expertise
- 10+ years of total work experience in a solution designer, developer, tester or support role in IT consulting or other technology business units across the industry
- 3+ years of hands-on development, testing and administration experience with Big Data/Data Lake services in cloud and on-premises environments
- 3+ years of experience designing and implementing data models in Big Data/Data Lake or Data Warehouse environments, using modelling approaches and techniques such as Dimensional Modelling on Star or Snowflake schemas
- Experience implementing data model products (IFW/BDW/FSDM) from vendors such as IBM and Teradata
- Hands-on experience with data model design tools such as Erwin and Sparx Systems Enterprise Architect
- Experience applying industry best practices, design patterns and first-of-a-kind technical solutions as a developer or administrator on data and non-data platforms and applications
- Experience implementing near real-time data flows/pipelines and distributed streaming applications
- Experience implementing traditional ETL, data warehousing and BI solutions
- Hands-on development experience and working proficiency in Python, Scala, SQL, shell scripting and other programming/scripting languages
- Hands-on experience implementing services with strict performance SLAs, high availability, fault tolerance, automatic failover and geographical redundancy
- Working knowledge of and hands-on experience with data services (e.g. Azure Synapse, Cosmos DB) on cloud platforms such as Azure, AWS, GCP and IBM Cloud, as well as other modern data platforms
- Solid understanding of containerisation, virtualisation, infrastructure and networking
- Diverse experience with software development methodologies and project delivery frameworks such as Agile sprints, Kanban and Waterfall
- Experience presenting to and influencing stakeholders and senior managers
- Team leadership and people management experience
- Degree in Computer Science, Information Technology or a related engineering discipline

Desired Attributes and Skills
- One or more industry-recognised technology certifications or badges