Greetings from #petadata! We are celebrating 30K followers. We appreciate your interest, your continued following, and your engagement with our organization. We strive to be helpful, interesting, and ultimately useful with the content we share with you. Please know that we appreciate each one of you for your continued support. Keep following #petadata for more opportunities. #thanksforyoursupport #celebrations #30kfollowers #linkedin #keepsupporting #hiring #recruiting #selecting #USITRecruiters #USITStaffing #2024hiring #newopportunities #domestichiring
More Relevant Posts
-
Hiring Alert!! Share suitable profiles with [email protected]. No H1Bs.
Role: MarkLogic DBA (1 position)
Location: 100% remote work (EST timings)
Duration: long-term contract
Requirements:
3+ years of experience with MarkLogic cluster administration
MarkLogic clustering, replication, and failover
Experience with MarkLogic re-indexing, rebalancing, and merge activities
#hiring #remoterolesinusa #marklogicdba #dbaroles #completeremoterolesdba #clusteradministration #marklogiccluster #replication #failover
-
Hello All, Greetings of the Day!!! Please find the hotlist below and share Corp-to-Corp requirements with [email protected] or +1 972-325-1221. #dataengineer #pythondeveloper #javadeveloper #businessanalyst #scrummaster #sapbwhana #splunkengineer #bss #oss #citrixadmin #vmware #windowsadmin
-
Dear Staffing Partner,

Pradeep is a Teradata DBA and Snowflake Architect/Admin with 15+ years of experience, available for projects immediately. He is located in Apex, NC and open to positions anywhere in the USA. If you'd like to receive his resume, please reply to this email with job details or contact me at 630-480-4203.

SUMMARY:
Leveraged 14+ years of expertise in Business Intelligence, Global Data Warehousing, and Development across diverse cross-functional teams, spanning database architecture, development, administration and maintenance, reporting, and database performance tuning.
Demonstrated strong proficiency in relational databases (OLTP), data warehousing systems (OLAP), and cloud services, specializing in dimensional data modeling including Star Schema and Snowflake modeling.
Led data mapping, data profiling, and data mining initiatives to support the creation of robust data warehouse environments.
Served as a Snowflake Architect and Admin for 2+ years at Comcast, focusing on large-scale EDW implementation and executing strategic data migration initiatives.
Established efficient data pipelines from Teradata to Snowflake and developed Python frameworks for ad hoc data ingestion, alongside expert-level UNIX shell scripting.
Applied deep expertise in performance tuning, query optimization, and the creation and monitoring of Snowflake virtual warehouses to enhance system efficiency.
Managed high-volume parallel CSV file generation using Informatica for seamless Snowflake data ingestion and led the lift-and-shift of the historic EDW Star Schema to the Snowflake EDW.
Successfully ported 250+ SAP BO reports from Teradata to the Snowflake data warehouse, specializing in dimensional modeling, Star and Snowflake schemas, and Extended Star Schema designs.
Implemented robust data masking policies for PII and EBS SOC compliance.
Brought 12+ years of Teradata DBA/Architect experience, proficient in Teradata Administrator for managing databases, users, tables, indexes, and permissions.
Extensively utilized Teradata 17.20/16.20/15.10 for application performance tuning, SQL optimization, detailed design, and database maintenance tasks.
Expertise in Teradata workload management, collaborating with user groups to define TASM workloads, develop exceptions, and implement filters and throttles as needed.
Managed production environments, identifying bottlenecks and ensuring workload balancing through shell scripts, stored procedures, and macro automation.
Executed internal and external data stages, transformed data during loads, and performed zero-copy cloning for DEV and QA environments, with hands-on experience in Teradata QueryGrid configuration.
Contributed to the analysis, design, and development of application packages/systems in Python and UNIX shell script for the banking/telecom industry.

Thanks,
Deepthi - Thoughtwave Software & Solutions, Inc.
Phone: 630-480-4203
Email: [email protected]
Web: https://lnkd.in/gxVjZ4aZ
-
Let's Discuss: Rights is one of the Metadata Elements. According to Dublin Core, Rights is one of the metadata elements that needs to be captured. Their definition of Rights is: "Information about rights held in and over the resource." Typically, rights information includes a statement about various property rights associated with the resource, including intellectual property rights. How do we enforce that in an organization? Daniel Lundin, a topic close to you. Husna Khairuddin (CDMP, PMP) Let's test the power of LinkedIn. #metadata #rights
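For illustration, here is a minimal sketch of how a Rights statement can be expressed using the Dublin Core XML namespace; the resource title and rights text are hypothetical examples:

```xml
<metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:title>Annual Data Governance Report</dc:title>
  <!-- Rights: information about rights held in and over the resource -->
  <dc:rights>Copyright 2024 Example Organization. Internal use only; all rights reserved.</dc:rights>
</metadata>
```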
-
When you are focused on business growth, finding an in-house administrator with all the skills needed to manage your IT products and solutions is nearly impossible, and hiring a larger team is often cost-prohibitive. Partner with BsCubes Managed Services and get a flexible team with a diverse skill set that can deliver the exact expertise you need, for exactly as long as you need it, in a cost-effective manner. #highered #Database #higheredsupport #technology #ItSolution #It #DataInsight #datatheft #datasecuritybreach
-
Hey LinkedIn Community! As I near the end of my notice period, I have given many interviews as a Splunk engineer, and most interviewers expect you to explain performance tuning in Splunk. I am happy to share a developer's perspective on how to achieve the best performance when using SPL.

In the world of big data and real-time insights, Splunk stands out as a powerful tool for searching, monitoring, and analyzing machine-generated data. However, the efficiency of your Splunk searches can make or break your data-driven decisions. That's where performance tuning using Search Processing Language (SPL) comes into play!

✌️ Why Performance Tuning Matters: Optimizing your SPL queries not only speeds up search results but also reduces the load on your Splunk infrastructure, ensuring smoother and faster operations.

📈 Top Tips for Tuning Your SPL Queries (a combined sketch applying several of them follows below):

1. Use the `tstats` Command:
- Leverage `tstats` for faster searches, especially when dealing with large datasets. It bypasses raw event data, querying indexed fields directly.
- Example:
```spl
| tstats count where index=_internal by host
```

2. Optimize Field Extraction:
- Extract only the necessary fields to reduce processing time. Use the `fields` command to specify required fields.
- Example:
```spl
index=main | fields host, source, sourcetype
```

3. Leverage `| where` instead of `search`:
- Use `| where` for conditional logic after the initial dataset is retrieved, as it can be more efficient than using `search` commands multiple times.
- Example:
```spl
index=main | where status="200"
```

4. Summary Indexing:
- Use summary indexing for long-term reports to reduce the search load on your main index.
- Example:
```spl
| collect index=summary
```

5. Limit Data Early:
- Filter and limit data as early as possible in your search pipeline to minimize the amount of data processed.
- Example:
```spl
index=main sourcetype=access_combined | head 100
```

6. Avoid Costly Wildcards:
- Use wildcards wisely. Avoid leading wildcards and opt for more specific patterns to speed up searches.
- Example:
```spl
index=main host=webserver*
```

By implementing these practices, you can significantly improve the performance of your Splunk queries, making your data analytics more efficient and actionable. 🔍 Ready to supercharge your Splunk experience? Start tuning your SPL queries today and watch your data insights soar! #Splunk #SPL #DataAnalytics #PerformanceTuning #BigData #ITOps
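Putting a few of these tips together, here is a minimal sketch of one search that filters at the index level first, prunes fields early, and then aggregates; the `access_combined` source type and the `status` and `host` fields are assumptions about the data:

```spl
index=main sourcetype=access_combined status=200 earliest=-1h
| fields host, status
| stats count by host
```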
-
SR US IT Recruiter | Currently hiring professionals | LinkedIn Recruiter | Expert in Talent Acquisition for Tech Industry Leaders | Driving Success Through Strategic Recruitment Solutions
We are still #hiring. Know anyone who might be interested? Please share profiles with [email protected]

Job Title: Transition Lead with #splunk
Location: #Iselin #nj / #somerset NJ (Day 1 #onsite)
Duration: #contract / #fulltime

Job Description: As a Splunk Consultant, the candidate's role will involve evaluating and enhancing existing #splunk deployments for the customer to ensure optimal performance, efficiency, and utilization of the Splunk platform. The candidate will be responsible for conducting thorough assessments of Splunk environments, identifying areas of improvement, and implementing strategies to enhance the overall effectiveness of the system.

Requirements:
Expertise in #siem (Security Information and Event Management) tools such as Splunk
Transition management of a Security Operations Center (#soc) from the current setup to the proposed new state, including defining its #roadmap, transition plan, action items, responsibilities, and project schedule
Proven experience (8-12 years) in Splunk administration, optimization, and performance tuning in enterprise-level environments
Deep understanding of Splunk architecture, configuration, and best practices for data ingestion, indexing, search, and storage
Strong knowledge of Splunk Search Processing Language (SPL) and experience in optimizing complex search queries
Familiarity with Splunk data models, pivot, and visualization capabilities
Good understanding of IT infrastructure components, including networking, systems, applications, and security
Strong communication and interpersonal skills, with the ability to effectively communicate technical concepts to non-technical stakeholders
Splunk certifications (e.g., Splunk Certified Architect, Splunk Certified Admin) are a plus

Responsibilities:
Splunk Environment Assessment: Evaluate existing Splunk deployments to identify areas of improvement, including data ingestion, indexing, search performance, storage utilization, and overall system health, and perform in-depth analysis.
Performance Optimization: Analyze and optimize search queries, data models, and indexing strategies to improve search performance.
Data Onboarding and Integration: Review data sources and data ingestion processes to ensure efficient and accurate data collection. Advise on best practices for onboarding different data types, including logs, events, and metrics.
Dashboard and Report Optimization: Evaluate existing dashboards, reports, and visualizations to enhance their usability, relevance, and performance. Collaborate with stakeholders to understand their reporting requirements.
Capacity Planning and Scalability: Assess current resource utilization and provide recommendations for scaling the Splunk infrastructure to accommodate future data growth. Analyze system capacity.
Documentation and Reporting: Prepare detailed reports and documentation summarizing the findings of the Splunk review and optimization process.
-
Master's in Economics || Big Data || Machine Learning || Statistics || Python || SQL || R || Java || AWS Cloud
#Kafka #1.0.0 #server.properties Follow me on LinkedIn: https://lnkd.in/gWDUngbz

🔶 The server.properties file is the magic behind Kafka's seamless performance and scalability. It is your gateway to configuring brokers, optimizing performance, ensuring security, and scaling effortlessly.

➡ Broker Configuration:
◽ Broker ID assignment.
◽ Port settings for communication.
◽ Configuration for log directories.

➡ Performance Optimization:
◽ Batch size configurations.
◽ Replication factors.
◽ Message retention policies.

➡ Security Measures:
◽ Authentication mechanism setup.
◽ Encryption protocol enablement.
◽ Authorization configurations.

➡ Scalability Strategies:
◽ Dynamic broker configurations.
◽ Partition assignments.
◽ Replication strategies.

➡ Monitoring and Management:
◽ Logging level configurations.
◽ JMX metrics enablement.
◽ Custom listener setup for monitoring.

A minimal example of some of these settings follows below.

#Job #Jobsearch #Jobopening #Recruitment #Recruiting #Jobposting #HR #LinkedIn #Hiring #Openings #Jobvacancy #Jobalert #Interviewing #Jobhunters #CV #dataengineering #datascience #dataanalytics #kafka #apache
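For reference, a minimal sketch of a server.properties fragment touching the broker, performance, and retention areas above; all values here are illustrative assumptions, not tuning recommendations:

```properties
# Broker identity, listener, and log directory (illustrative values)
broker.id=0
listeners=PLAINTEXT://0.0.0.0:9092
log.dirs=/var/lib/kafka/logs

# Defaults affecting throughput and durability for new topics
num.partitions=3
default.replication.factor=2

# Message retention policy: keep data for 7 days
log.retention.hours=168
```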
-
#Hiringnow
Title: Lead Splunk Administrator
Location: Austin, Texas

Job Requirements:
Build, deploy, and manage the enterprise Lucene DB systems (Splunk & Elastic) to ensure that legacy physical, virtual, and container infrastructure for business-critical services is rigorously and effectively served with high-quality, highly available logging services.
Support periodic observability and infrastructure monitoring tool releases and upgrades, environment creation, and performance tuning of large-scale Prometheus systems.
Serve as DevOps/SRE for the internal observability systems in Visa's various data centers across the globe, including cloud environments.
Lead the evaluation, selection, design, deployment, and advancement of the portfolio of tools used to provide infrastructure and service monitoring. Ensure the tools utilized can provide critical visibility on modern architectures leveraging technologies such as cloud and containers.
Maintain, upgrade, and troubleshoot issues with Splunk clusters.
Monitor and audit configurations and participate in the Change Management process to ensure that unauthorized changes do not occur.
Manage patching and updates of Splunk hosts and/or Splunk application software.
Design, develop, recommend, and implement Splunk dashboards and alerts in support of the Incident Response team.
Ensure the monitoring team increases its use of automation and adopts a DevOps/SRE mentality.

Engagement Deliverables:
Splunk administration support, including operation and maintenance of the log aggregation and Security Information and Event Management (SIEM) platform.
Perform systems analysis; modify and update systems and related data ingestion parameters based on the results of analysis; deploy applications and tools; perform testing of deployed applications and tools; and communicate updates to the customer.
Establish and maintain configuration and technical support, assist in the technical design process, and provide guidance and direction to the customer on how to best get value from Splunk products.
Maintain, upgrade, and troubleshoot Splunk servers, clusters, and management systems.
Install, upgrade, and maintain required Splunk applications and add-ons.
Provide performance and license tuning for systems and troubleshoot Splunk components across multiple network environments.
Provide solution engineering support to ensure systems and components meet current and future standards.
Develop, create, deploy, and manage custom Splunk monitors, alerts, and dashboards.
Monitor Splunk for cluster status, health status, and other issues, and resolve as needed.
Build and integrate contextual data into notable events.

Mandatory Skills: Splunk Admin, AIOps
Email ID: [email protected]
#Hiring #Splunk #Admin #AIOPS