Open Data Science job portal

Azure Data Engineer 317 views

The Azure Technical Architect (Delivery) is responsible for delivering Data on Cloud projects for Azure-based deals. The ideal candidate will also develop and deliver Azure cloud solutions to meet today's high demand in areas such as AI/ML, IoT, advanced analytics, open source, enterprise collaboration, microservices, serverless, etc. The Azure Data Engineer is a high-performing engineer responsible for delivering cloud-based Big Data and analytical solutions at client sites. Responsibilities include evangelizing data-on-cloud solutions with customers, leading business and IT stakeholders through the design of robust, secure, and optimized Azure architectures, and being hands-on in delivering the target solution. This role works with customers and leads internal engineering teams in delivering big data solutions on the cloud. Using Azure public cloud technologies, Data Engineer professionals implement state-of-the-art, scalable, high-performance Data on Cloud solutions that meet the needs of today's corporate and emerging digital applications.

Role & Responsibilities:

  • Provide subject matter expertise and hands-on delivery of data capture, curation and consumption pipelines on Azure and Hadoop
  • Build cloud data solutions and provide a domain perspective on storage, big data platform services, serverless architectures, the Hadoop ecosystem, vendor products, RDBMS, DW/DM, NoSQL databases, and security.
  • Participate in deep architectural discussions to build confidence and ensure customer success when building new solutions and migrating existing data applications on the Azure platform.
  • Conduct full technical discovery, identifying pain points, business, and technical requirements, “as is” and “to be” scenarios.
  • Build a full technology stack of services required, including PaaS (Platform-as-a-Service), IaaS (Infrastructure-as-a-Service), SaaS (Software-as-a-Service), operations, management, and automation.
  • Apply Accenture methodology, Accenture reusable assets, and previous work experience to deliver consistently high-quality work.
  • Stay educated on new and emerging technologies/patterns/methodologies and market offerings that may be of interest to our clients.
  • Adapt existing methods and procedures to create possible alternative solutions to moderately complex problems.
  • Understand the strategic direction set by senior management as it relates to team goals.
  • Use considerable judgment to define a solution and seek guidance on complex problems.
  • Primary upward interaction is with direct supervisor. May interact with peers and/or management levels at a client and/or within Accenture. Establish methods and procedures on new assignments with guidance.
  • Manage small teams of engineers, ensuring successful delivery of work efforts

Basic Qualifications

  • At least 5 years of consulting or client service delivery experience on Azure
  • At least 5 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL and data warehouse solutions
  • Extensive experience providing practical direction within Azure-native and Hadoop environments – minimum of 5 years of hands-on experience with Azure and Big Data technologies such as PowerShell, C#, Java, Node.js, Python, SQL, ADLS/Blob, Spark/Spark SQL, Hive/MapReduce, Pig, Oozie, and streaming technologies such as Kafka, Event Hubs, NiFi, etc.
  • Extensive hands-on experience implementing data migration and data processing using Azure services: Networking, Windows/Linux virtual machines, Containers, Storage, load balancing, autoscaling, Azure Functions, serverless architecture, ARM Templates, Azure SQL DB/DW, Data Factory, Azure Stream Analytics, Azure Analysis Services, HDInsight, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
  • Experience with cloud migration methodologies and processes, including tools such as Azure Data Factory, Event Hubs, etc.
  • 5+ years of hands-on experience in programming languages such as Java, C#, Node.js, Python, PySpark, Spark, SQL, Unix shell/Perl scripting, etc.
  • Minimum of 5 years of RDBMS experience
  • Experience using Hadoop file formats and compression techniques
  • Experience working with developer tools such as Visual Studio, GitLab, Jenkins, etc.
  • Experience with private and public cloud architectures, pros/cons, and migration considerations.
  • Bachelor’s or higher degree in Computer Science or a related discipline.
  • Able to travel up to 100% (M-TH)

Candidate Must Have Completed The Following Certifications

  • MCSA Cloud Platform (Azure) Training & Certification
  • MCSE Cloud Platform & Infrastructure Training & Certification
  • MCSD Azure Solutions Architect Training & Certification

Nice-to-Have Skills/Qualifications:

  • DevOps on an Azure platform
  • Experience developing and deploying ETL solutions on Azure
  • IoT, event-driven, microservices, containers/Kubernetes in the cloud
  • Familiarity with the technology stack available in the industry for metadata management: Data Governance, Data Quality, MDM, Lineage, Data Catalog, etc.
  • Familiarity with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
  • Multi-cloud experience a plus – Azure, AWS, Google Cloud

Professional Skill Requirements

  • Proven ability to build, manage, and foster a team-oriented environment
  • Proven ability to work creatively and analytically in a problem-solving environment
  • Desire to work in an information systems environment
  • Excellent communication (written and oral) and interpersonal skills
  • Excellent leadership and management skills
  • Excellent organizational, multi-tasking, and time-management skills
  • Proven ability to work independently

All of their professionals receive comprehensive training covering business acumen and technical and professional skills development. You'll also have opportunities to hone your functional skills and expertise in an area of specialization. They offer a variety of formal and informal training programs at every level to help you acquire and build specialized skills faster. Learning takes place both on the job and through formal training conducted online, in the classroom, or in collaboration with teammates. The sheer variety of work they do, and the experience it offers, provide an unbeatable platform from which to build a career.

Company Information
  • Location Boston

Here at the Open Data Science Conference we gather the attendees, presenters, and companies that are working on shaping the present and future of AI and data science. ODSC hosts one of the largest gatherings of professional data scientists with major conferences in the USA, Europe, and Asia.