The sender of this email is registered with Naukri.com as GCC Services India Private Limited. To respond directly to the employer, please click the Reply button or send an email to kumar.v@ab-inbev.com
Do not forward this email; it contains links that allow direct login to your Naukri account.
Dear Job Seeker,
You are receiving this email because the recruiter considers your profile suitable for the following job opportunity posted on Naukri.com and would like you to apply.
The job details are as follows:
Apply Now
Job Synopsis
AB InBev is Hiring - Data Architect - Bangalore Location
Company: Anheuser-Busch InBev
Experience: 10 to 16 years
Salary: INR 12,00,000 - 20,00,000 P.A.
Location: Bengaluru/Bangalore
Job Description
AB InBev is hiring a Data Architect - Bangalore location
Responsibilities:
Create and maintain efficient and scalable data pipeline architectures
Be highly proficient in relational and dimensional data modelling using ERwin or other data design tools, and understand data profiling, data quality management, and data cataloging
Govern the usage, design, and management of data in the enterprise data lake
Act as subject matter expert for Big Data platforms (preferably on Azure Cloud) and provide technical leadership around the delivery of projects on the enterprise data lake
Design, develop, and deliver large-scale projects to support Big Data ingestion, curation, and delivery using an Agile approach, resulting in software delivered on time and on budget
Establish and enforce guidelines to ensure consistency, quality, and completeness of data assets
Work with all stakeholders, including executive, product, and engineering teams, to assist with data-related issues and needs
Define, implement, and evolve the data models necessary to support our big data lifecycle needs
Help establish analytics tools that enable data scientists and business analysts to build and optimize models that support our business goals
Refine the existing processes, guidelines, and tools for data management solutions covering data movement, data security, data privacy, and metadata management
Requirements:
Hands-on experience with Talend or other ETL tools, SQL RDBMS, Hadoop (preferably Hortonworks), and Hive databases (on cloud, especially Azure, preferred)
Experience in Azure HDInsight, ADLS, Azure SQL DW & Snowflake is a plus
Experience/knowledge of Spark and Kafka is a plus
Experience designing, implementing, and validating integrations with third-party applications
End-to-end solution experience, including strategy development, design patterns, product selection, technical implementation, and configuration of Big Data analytics solutions
Good understanding of file formats, including JSON and ORC, and of HDFS
Resolve technical issues faced by the team
Experience with business and architecture principles/patterns, processes, and frameworks
Good resource management and communication skills; ability to guide, train, and negotiate technical decisions
Have an affinity for new challenges, a self-starting attitude, and a willingness to work with a global team
Able to take ownership of complex technical problems and drive them to resolution individually, as well as work effectively through escalations with the engineering team
Self-motivated, enthusiastic, approachable, and people oriented
Must demonstrate a positive, team-focused attitude
Ability to work effectively with technical and business staff across geographies and multicultural environments
Is comfortable in a fast-paced, dynamic work environment
Can manage multiple tasks simultaneously; is adept at managing competing priorities
Strong analytical, problem-solving, and decision-making skills
Excellent written and verbal communication skills
FMCG or Retail Domain knowledge is a plus
Exposure to industry-leading metadata tools such as Talend Data Catalog is a plus, as is experience with metadata management, data lineage, and data governance, especially as related to Big Data
IT skills required:
Data modelling, ETL experience, BI experience, data profiling, data quality management, Big Data architecture, data architecture, enterprise data lake architecture
Experience working with medium to large engineering teams.
Shows creativity and initiative to improve project coverage and effectiveness.
Ability to effectively communicate with technical and non-technical stakeholders across all levels of the organization.
If interested, kindly share your updated resume to kumar.v@ab-inbev.com with the details below:
Total years of experience
Total years of relevant experience
Current CTC
Expected CTC
Notice period
Preferred time to reach
Comfortable with shifts
Salary: INR 12,00,000 - 20,00,000 P.A.
Role: Database Architect/Designer
Role Category: Programming & Design
Industry Type: IT-Software / Software Services
Functional Area: IT Software - DBA, Data Warehousing
Keywords: spark, big data, data modeling, sql, design patterns, etl, data architecture
Desired Candidate Profile
Education: (UG - Any Graduate - Any Specialization, Graduation Not Required) AND (PG - Any Postgraduate - Any Specialization, Post Graduation Not Required) AND (Doctorate - Any Doctorate - Any Specialization, Doctorate Not Required)
Company Profile
Anheuser-Busch InBev
Anheuser-Busch InBev is the world's largest brewer. Our dream is to be the Best Beer Company Bringing People Together For a Better World! We are a company of owners that believe in achieving excellence in all that we do. We embrace and lead change. We're focused and we work hard. And although we're spurred on by our desire for success, we're equally driven by the people around us - inspired and motivated by the great minds we work with. AB InBev is setting up a Global Analytics Centre in Bangalore, where we envision state-of-the-art analytics work delivering benefits of over $1B a year to the company. We are looking for smart, driven, entrepreneurial people who can work in a start-up-like environment and deliver high-impact, cutting-edge analytics. Are you ready to join our team?
Regards,
Kumar V
Anheuser-Busch InBev
Apply Now