Fabric and Azure Specialist

Total-TECH Co.

The Job Description

  1. Design and configure Microsoft Fabric workspaces (F8 Dev, F16 Staging, F32 Production).
  2. Implement a four-layer lakehouse architecture (Raw, Bronze, Silver, and Gold layers).
  3. Configure OneLake storage with appropriate partitioning and optimization.
  4. Set up Azure Data Lake Storage Gen2 with proper security and access controls.
  5. Implement Fabric capacity management and auto-scaling policies.
  6. Configure Fabric SQL analytics endpoints for reporting.
  7. Design and implement Azure Data Factory pipelines for internal data sources.
  8. Configure API integrations for external data sources.
  9. Implement incremental data loading and change data capture (CDC).
  10. Set up orchestration, scheduling, and monitoring for all pipelines.
  11. Implement error handling, retry logic, and alerting mechanisms.
  12. Optimize pipeline performance and cost.
  13. Implement data quality frameworks and cleansing routines.
  14. Design data validation and exception handling processes.
  15. Configure data profiling and anomaly detection.
  16. Implement master data management procedures.
  17. Set up data lineage tracking and impact analysis.
  18. Create data transformation logic using Spark/Python in Fabric (see the illustrative sketch after this list).
  19. Configure and implement a “Talk-to-Your-Data” assistant using Azure OpenAI Service.
  20. Set up Azure Cognitive Services for Arabic and English NLP.
  21. Implement Azure AI Search for semantic search capabilities.
  22. Configure retrieval-augmented generation (RAG) pipelines.
  23. Set up conversation flows using Azure Bot Service.
  24. Implement prompt engineering and optimization.
  25. Configure Azure AutoML for predictive analytics use cases.
  26. Implement sentiment analysis models for customer feedback.
  27. Set up clustering algorithms for customer segmentation.
  28. Deploy regression models for forecasting.
  29. Implement classification models for risk assessment.
  30. Manage model training, versioning, and deployment pipelines.
  31. Configure Azure API Management for secure data access.
  32. Implement Azure App Service for web applications.
  33. Set up Azure Static Web Apps for frontend deployment.
  34. Configure Power BI Embedded integration.
  35. Implement authentication using Azure AD B2C.
  36. Set up monitoring using Azure Monitor and Application Insights.
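
To illustrate the kind of work described in items 9, 18, and 24 above, here is a minimal sketch of an incremental (CDC-style) load from a Bronze table into a Silver Delta table using PySpark in a Fabric lakehouse notebook. The table names (bronze_sales, silver_sales), the ingested_at watermark column, and the order_id key are hypothetical placeholders, not details taken from this posting.

```python
# Minimal illustrative sketch, not a definitive implementation.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the Fabric Spark runtime

# Read only rows added since the last successful run.
# In practice the watermark would come from a control/metadata table.
last_watermark = "2024-01-01T00:00:00"
bronze_increment = (
    spark.read.table("bronze_sales")                      # hypothetical Bronze table
    .where(F.col("ingested_at") > F.lit(last_watermark))  # assumed watermark column
)

# Light cleansing on the way to Silver: dedupe and standardise types.
silver_increment = (
    bronze_increment.dropDuplicates(["order_id"])
    .withColumn("order_total", F.col("order_total").cast("decimal(18,2)"))
)

# Upsert (merge) the increment into the Silver Delta table, keyed on order_id.
silver = DeltaTable.forName(spark, "silver_sales")        # hypothetical Silver table
(
    silver.alias("t")
    .merge(silver_increment.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```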

Requirements:

  • Bachelor’s degree in Computer Science, Information Technology, or related field.
  • 4+ years of experience with Microsoft Azure services.
  • Hands-on experience with Microsoft Fabric or similar platforms.
  • Azure certifications preferred (DP-600, AI-102, AZ-305).
  • Strong background in data engineering and ETL/ELT processes.
  • Experience with Python/PySpark programming.

Technical Skills:

  • Expert knowledge of Microsoft Fabric components.
  • Proficiency in Azure Data Factory and integration patterns.
  • Experience with Azure AI Services (OpenAI, Cognitive Services, ML).
  • Strong Python, SQL, and Spark programming skills.
  • Understanding of Delta Lake and lakehouse architectures.
  • Experience with DataOps and MLOps practices.
  • Knowledge of data security and governance.
