We're #hiring a new Backend Engineer with Relational Database Experience in Brazil. Apply today or share this post with your network.
-
💡 As a backend engineer, it's really important to think about how to make our API responses as fast and efficient as possible. It's our job to deliver high-quality products to our clients. To achieve this, there are several key areas we need to focus on:
✅ Database design
✅ Proper indexing
✅ Normalization/denormalization
✅ Partitioning
✅ Caching
✅ Query optimization
If you have any additional points, please feel free to add them below.
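To make the caching and query-optimization points a little more concrete, here is a minimal sketch assuming a Node.js/TypeScript service that reads from PostgreSQL via node-postgres; the orders table, cache key format, and TTL are illustrative, not taken from the post.

```typescript
// Minimal sketch (TypeScript, node-postgres). Table, query, and TTL are hypothetical.
import { Pool } from "pg";

const pool = new Pool(); // connection settings come from the standard PG* env vars

type CacheEntry = { value: unknown; expiresAt: number };
const cache = new Map<string, CacheEntry>();
const TTL_MS = 30_000; // keep results for 30 seconds

async function getCached<T>(key: string, load: () => Promise<T>): Promise<T> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.value as T;
  const value = await load();
  cache.set(key, { value, expiresAt: Date.now() + TTL_MS });
  return value;
}

// A hot read path: this query benefits from an index on orders(customer_id, created_at)
async function getRecentOrders(customerId: number) {
  return getCached(`orders:${customerId}`, async () => {
    const { rows } = await pool.query(
      "SELECT id, total, created_at FROM orders WHERE customer_id = $1 ORDER BY created_at DESC LIMIT 20",
      [customerId]
    );
    return rows;
  });
}
```

The in-memory map is only a stand-in here; the same wrapper shape works with Redis or any other cache once a single process is no longer enough.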
-
What Should a Backend Engineer Do? The backend engineer role is pivotal in software development, responsible for creating and optimizing the server-side systems that support web and mobile applications. Backend engineers ensure that applications are scalable, secure, and efficient, handling critical components such as databases, APIs, and infrastructure design. The role requires expertise in system architecture, database management, API development, and security. It is highly technical and demands the ability to solve complex problems, optimize performance, and build scalable solutions. Backend engineers are essential to delivering robust, reliable systems that drive the functionality and success of modern applications.
Main Aspects of the Role:
1. System Architecture & Scalability: Design backend systems that can handle growing user demand, ensuring high availability, performance, and scalability.
2. API Development: Build and maintain secure, efficient APIs that enable seamless communication between frontend systems and backend services.
3. Database Management: Develop and optimize relational (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB) databases to ensure efficient data storage and retrieval.
4. Security & Data Integrity: Implement strong security measures, including encryption and authentication protocols, to protect sensitive data and maintain system integrity.
5. Performance Optimization: Improve the speed and efficiency of backend processes by fine-tuning code, caching mechanisms, and optimizing server performance.
6. Collaboration: Work closely with frontend developers, product managers, and other teams to ensure backend systems meet the needs of the overall project.
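As a small illustration of the API-development and data-integrity aspects above, here is a sketch assuming an Express + node-postgres stack; the /users route, table, and columns are hypothetical, not from the post.

```typescript
// Minimal sketch (TypeScript, Express + node-postgres). Route and schema are illustrative.
import express from "express";
import { Pool } from "pg";

const app = express();
app.use(express.json());
const pool = new Pool();

// Validate input before it reaches the database, and always use parameterized
// queries rather than string concatenation (basic data-integrity and security hygiene).
app.post("/users", async (req, res) => {
  const { email, name } = req.body ?? {};
  if (typeof email !== "string" || !email.includes("@") || typeof name !== "string") {
    return res.status(400).json({ error: "email and name are required" });
  }
  try {
    const { rows } = await pool.query(
      "INSERT INTO users (email, name) VALUES ($1, $2) RETURNING id",
      [email, name]
    );
    res.status(201).json({ id: rows[0].id });
  } catch (err) {
    res.status(500).json({ error: "internal error" });
  }
});

app.listen(3000);
```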
-
HR Technical Recruiter at HAN Staffing | If you are looking for an H-1B transfer or GC process, please reach out at [email protected]
AWS Developer - Malvern, PA (Hybrid). Must be 80% AWS / 20% Java. Contact: [email protected]
Backend Developer JD:
We strictly don't accept fake H-1B and fake GCs. We need a passport number to check travel history. Kindly don't send resumes from candidates who are not willing to share a passport number.
Using the Python/Java programming language to create scalable code.
Application testing and bug fixing.
Creating the back-end elements.
Utilizing server-side logic to incorporate user-facing components.
Evaluating and ranking customer feature requests.
Integrating storage methods for data.
Design and implementation of high-performance, low-latency applications.
Working in concert with front-end programmers.
Upgrading the functionality of current databases.
Creating digital technologies to track online activity.
Previous profiles did not seem to be 80% AWS, so they have been passed on. Resumes must be 80% Java development and not for third-party apps, and we need to include a short write-up of the candidate's AWS experience in the submission.
-
Java / Kotlin (JVM), TypeScript (Node.js) and Rust! - Microservice & Event-Based Cloud Architectures
KISS and show some love for PostgreSQL 💋
Do you really need the operational complexity that comes with deploying multiple database solutions? PostgreSQL is every kind of database and it's transactional.
📄 Document: Built-in JSON and JSONB data types.
🔍 Vector: pgvector extension (semantic/similarity search for ML applications).
🌐 Graph: Apache AGE extension (supports Cypher queries).
🔎 Search: pg_bm25 extension (tf-idf-style full-text ranking, like Elasticsearch).
📑 Columnar: Citus extension.
🔑 Key-value cache: Unlogged tables (it's not Redis, but it could work - measure it!).
🔗 Relational: doh.
Solve first, optimize later: PostgreSQL is a pragmatic solution to get started with. The problem you actually have may not be the one you would like to have; those millions of customers may never show up!
Do you know someone who is looking for a pragmatic senior engineer? I am currently looking for a Java, Kotlin, TypeScript or Rust role. AWS, Google Cloud, Kubernetes etc. Yes - I also do PostgreSQL. Bristol and/or remote. Please feel free to message me, like or repost.
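A rough illustration of the "document" point above, assuming a TypeScript service using node-postgres; the events table and payload shape are made up for the example.

```typescript
// Minimal sketch (TypeScript, node-postgres): JSONB documents in plain PostgreSQL.
import { Pool } from "pg";

const pool = new Pool();

async function demo() {
  await pool.query(
    "CREATE TABLE IF NOT EXISTS events (id bigserial PRIMARY KEY, payload jsonb NOT NULL)"
  );
  // A GIN index makes containment (@>) queries on the JSON documents fast.
  await pool.query(
    "CREATE INDEX IF NOT EXISTS events_payload_idx ON events USING gin (payload)"
  );

  // node-postgres serializes plain objects to JSON for jsonb parameters.
  await pool.query("INSERT INTO events (payload) VALUES ($1)", [
    { type: "signup", user: { id: 42, plan: "pro" } },
  ]);

  // Find documents containing a given sub-object: no separate document store needed.
  const { rows } = await pool.query(
    "SELECT id, payload FROM events WHERE payload @> $1",
    [{ type: "signup" }]
  );
  console.log(rows);
}

demo().catch(console.error).finally(() => pool.end());
```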
-
Working at Seekersspark Career as Sr. HR Executive | Talent Acquisition Executive | Contract Hiring | IT Recruitment | HR Domain | Collaboration | Helping Hands | Sourcing
Hi All, 📢 We're Hiring: Senior Rust Developer (7+ Years of Experience)
A forward-thinking tech company is looking for experienced Rust Developers to join their team. The role involves designing and implementing back-end services with a focus on security, dependability, and performance. You will work in secure environments, collaborating with cross-functional teams to create microservices and data pipelines. This is an excellent opportunity for engineers who are passionate about cutting-edge technologies and back-end architecture.
Job Details:
Work Mode: Full-Time (Remote) (Contractual)
Experience: 7+ years
Shift: 8 hours/day
Notice Period: Immediate
Key Responsibilities:
Design and implement secure, dependable, and high-performance back-end services using Rust.
Develop and maintain data intake pipelines with scalability and performance in mind.
Collaborate to create microservices and implement CI/CD pipelines.
Work in secure computing environments and adhere to security guidelines.
Optimize databases for performance and scalability.
Participate in the creation and maintenance of frameworks and tools for data observability.
Stay updated on the latest trends and innovations in the industry.
Job Requirements:
Education: Bachelor's/Master's degree in Engineering, Computer Science, or equivalent experience.
Experience: 7+ years as a software developer, with hands-on experience in Rust.
Extensive experience creating secure, scalable back-end systems.
Proficiency in microservices architecture and CI/CD processes.
Strong background in secure computing environments and database architecture.
Experience with AWS, Azure, and data management.
Knowledge of Kubernetes, Docker, and data observability principles.
Growth mindset with a commitment to continuous learning.
Fluent in English (written and verbal).
Interview Process:
Round 1: Internal technical interview (live coding in Rust, microservices, system design)
Round 2: Culture-fit interview with the client
Round 3: Live coding interview / problem-solving with the client
Round 4: System design interview with the client
Pre-Screening Questions:
Do you have experience with secure computing environments? Please explain.
Can you share your experience working with microservices architecture?
Have you worked on any project that involved data observability? Please provide details.
Have you designed and delivered a high-volume, high-performing product? If yes, explain.
What is your experience with Kubernetes and Dockerized environments?
Mandatory Skills:
Rust: 4+ years
Microservices, Docker, Kubernetes: 5+ years
Opportunity: This is a fantastic opportunity for experienced Rust developers to work on innovative projects, driving secure, high-performing back-end solutions.
Apply Now and be part of a cutting-edge technology company! Send your resume to [email protected] or WhatsApp at 7678425419.
#RustDeveloper #Hiring #TechJobs #BackendDeveloper #Microservices #Kubernetes #Docker #RemoteWork
-
Position: Mid-Level Golang Developer
Location: Plano, TX (3 days onsite/week)
Duration: 6 Months+
Face-to-Face Interview - Need local to Plano/Dallas
Pay Rate: $60/hr on C2C
Must Have:
3+ years of experience
Monitoring experience (e.g. Datadog, Prometheus or other)
Golang (Channels, Go Routines)
Microservices
Event-driven architecture experience such as Kafka, Kinesis or other
Docker / Kubernetes preferred
Testing
Nice to Have:
Protobuf
Go event architecture
AWS - ECS, Beanstalk
Responsibilities:
Develop new features.
GoLang (not too concerned about the version; being able to explain hands-on work with concurrency in Go - channels, goroutines - is a strong signal of the right fit. Library-wise, experience with protobufs would be nice, as would Go event architecture.)
They run in a Kubernetes cluster. Experience with AWS ECS, Beanstalk, or AWS services in general is nice to have but not required; AWS is more DevOps and not too important for them to have.
Docker is used for containerization.
Testing - just what you would expect from a dev perspective.
Distributed systems (microservices): debug issues, root cause analysis. Dealing with microservices that talk to third parties.
Observability - Datadog (log files, tracing, monitoring). Specific Datadog experience is not required; monitoring experience in a microservices environment with a comparable tool would do. Understand the difference between logs and metrics (have you set up Prometheus metrics, for example?).
Multiple different microservices (understand the traceability, debug).
SK SAHID HUSSAIN. [email protected]
-
Backend Engineer | Building High-Impact Projects from the ground up | TypeScript, Node.js, Python | Great Plans Lead to Great Results
Hey everyone, 👋 I'm Doron, a Backend Engineer. Today, I want to talk about Handling Database Migrations in Production.
Deploying new features often means updating your database schema, but doing so in a live production environment can be risky. You don't want to cause downtime or break existing functionality. Here's how I handle database migrations seamlessly without disrupting the user experience:
1. Non-blocking Schema Changes
Using a tool like gh-ost, I can perform non-blocking schema changes in MySQL. It creates a shadow table and migrates data without locking your production tables, keeping your system online while changes are made.
2. Backward-Compatible Code
With Flyway, I version my database schema and ensure that the code supports both the old and new schema before fully migrating. This avoids breaking functionality if some parts of your system still rely on the old structure.
3. Feature Flags for Safe Rollouts
Feature flagging tools like LaunchDarkly allow me to toggle between old and new features that rely on different schema versions. This lets me safely test changes and revert quickly if needed, without affecting the database itself.
4. Phased Schema Updates
Instead of doing all updates at once, tools like Sqitch let me roll out incremental changes. First, I add a new column, then deploy the app changes that write to it, and finally remove the old column once everything is stable - this minimizes risks. (A small sketch of this pattern follows below.)
5. Testing & Rollback Plans
Before applying migrations in production, I simulate changes using Testcontainers in a staging environment with production-like data. Having rollback plans ready, like snapshots or reverse migrations, is critical in case something goes wrong.
By following these steps and using the right tools, you can handle database migrations in production with confidence and avoid downtime. How do you handle migrations in your projects? Any war stories or tips to share? Let's discuss below! 👇
#DatabaseMigrations #BackendDevelopment #NoDowntime #ProductionReady
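Here is a rough sketch of the expand/contract idea behind steps 3 and 4, assuming a TypeScript service using node-postgres; the users columns and the isEnabled() helper are hypothetical stand-ins (in practice the flag would come from LaunchDarkly or a similar service).

```typescript
// Minimal sketch (TypeScript, node-postgres). Column names and flag helper are hypothetical.
import { Pool } from "pg";

const pool = new Pool();

// Phase 1 (expand): the new column was added in an earlier, backward-compatible migration, e.g.
//   ALTER TABLE users ADD COLUMN full_name text;   -- nullable, old readers keep working

// Placeholder flag check: swap in your real feature-flag client here.
async function isEnabled(flag: string): Promise<boolean> {
  return process.env[`FLAG_${flag}`] === "on";
}

// Phase 2: dual-write while the flag rolls out, so code on either schema version keeps working.
async function updateName(userId: number, first: string, last: string) {
  if (await isEnabled("WRITE_FULL_NAME")) {
    await pool.query(
      "UPDATE users SET first_name = $2, last_name = $3, full_name = $2 || ' ' || $3 WHERE id = $1",
      [userId, first, last]
    );
  } else {
    await pool.query(
      "UPDATE users SET first_name = $2, last_name = $3 WHERE id = $1",
      [userId, first, last]
    );
  }
}

// Phase 3 (contract): once everything reads full_name and a backfill has run, a later
// migration drops the old columns:
//   ALTER TABLE users DROP COLUMN first_name, DROP COLUMN last_name;
```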
-
Node.js Powerhouse: Expert Database Integration for Robust Applications
Need a top Node.js developer to craft high-performance applications that seamlessly connect to various databases? Look no further! We're a leading Node.js development company with a team of top Node.js developers who are experts in database integration.
Why Choose Us?
Skilled Node.js Developers: Our team of dedicated Node.js developers comprises experts who can handle complex database interactions effortlessly.
Database Proficiency: We possess in-depth knowledge of popular databases (MongoDB, PostgreSQL, MySQL) and SQL concepts to ensure optimal performance.
Custom Solutions: We design database integrations that precisely align with your application's requirements and data model.
Let's collaborate to create a Node.js application that exceeds your expectations! https://lnkd.in/geYMkKF6
#nodejsdevelopment #nodejsdevelopmentservices #webdevelopment #webdevelopmentservices #javascriptdevelopment #javascriptdevelopmentservices