MD Rasul, QA, SDET

MD Rasul

QA, SDET

American Express

Location
United States
Education
Bachelor's degree, Computer Information Systems
Experience
19 years, 8 Months

Work Experience

Total years of experience: 19 years, 8 months

QA, SDET at American Express
  • Egypt
  • My current job since October 2015

Installing and administering web servers such as IIS 6.0, Apache, and JBoss. Customizing e-commerce environments such as osCommerce, Magento, and Joomla. Practical experience with the SEO (search engine optimization) process.

Test Automation Developer at Sirius XM Radio Inc
  • November 2013 to August 2015

Design and implement reusable automation test scripts using industry-standard tools (HP/Mercury QTP, VBScript) and open-source tools like Selenium.
•Develop and maintain test scripts that automate testing of enterprise applications through the entire product life cycle.
•Create and execute test scripts, cases, and scenarios that will validate system performance according to specifications.
•Produce reports and documentation for all automated testing efforts, results, activities, data, logging and tracking.
•Coordinate with other team members to ensure maximum coverage is obtained through automation.
•Frequently communicate test progress, test results, and other relevant information to team members and management.
•Develop/enhance and document automated testing methodology
•Worked with a highly data-sensitive, keyword-driven, complex hybrid framework: modified modules when needed, added new modules without affecting the rest of the framework, and deleted modules or code when necessary.
•Maintained the framework from build to build; enhanced its power by updating it with new elements as they appeared in the IT world.
•Automated web service testing using QTP.
•Developed VBScript code that reads data from an external data source such as Excel or a database table, formats it as an input XML request, sends it to an endpoint, then retrieves, manipulates, and compares the response XML against the web service requirement criteria and generates reports.
•Used a Selenium WebDriver Java project to perform cross-browser testing of the web UI (a minimal sketch follows this list).
•Automated the 'Pipe-Line Project' using Selenium WebDriver and Java in the Eclipse IDE.
•Automated the regression test for 'Omega Web Repository' (an online search engine for the company's internal scanned documents) using Selenium WebDriver and Java in the Eclipse IDE.
•Started automation testing for the 'ECA Project' using Selenium WebDriver and Java in the Eclipse IDE.
•Developed and maintained test scripts for automating the testing procedure using QuickTest Professional, Selenium WebDriver, and Eclipse.
•Performed white-box and unit testing using Selenium with the JUnit framework.
•Developed test automation using a BDD approach with the Cucumber framework (a step-definition sketch appears after the environment summary below).
•Verified data validation points in the UNIX environment to make sure they met the requirements.
•Accessed terminals such as PuTTY/Telnet/Humming through automation and sent UNIX commands to perform various application configuration steps.
•Worked with the engineering services team to run tests in a continuous integration environment (Jenkins) and report on the results.
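
The cross-browser work above with Selenium WebDriver and JUnit could look roughly like the following minimal sketch; the URL, element locator, and class names are illustrative placeholders, not the actual project code.

import org.junit.After;
import org.junit.Assert;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

// Minimal cross-browser check: the same verification runs against more
// than one browser. The URL and locator below are placeholders.
public class CrossBrowserSmokeTest {

    private WebDriver driver;

    @Test
    public void headingIsPresentInFirefox() {
        driver = new FirefoxDriver();
        verifyHomePageHeading(driver);
    }

    @Test
    public void headingIsPresentInChrome() {
        driver = new ChromeDriver();
        verifyHomePageHeading(driver);
    }

    // Shared verification so each browser exercises identical steps.
    private void verifyHomePageHeading(WebDriver driver) {
        driver.get("https://example.com/app");                              // placeholder URL
        String heading = driver.findElement(By.id("page-title")).getText(); // placeholder locator
        Assert.assertFalse("Heading should not be empty", heading.isEmpty());
    }

    @After
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}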

Environment/ Testing Tools: Windows 2000/2003 Server, Oracle, Solaris, HP Quick Test Professional, Quality Center ALM, TOAD, .Net, UNIX
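
For the Cucumber-based BDD automation mentioned above, a minimal Java step-definition sketch is shown below; the feature wording, class names, and the in-memory SearchPage stand-in are hypothetical, and in a real suite the steps would drive the application through Selenium WebDriver.

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import static org.junit.Assert.assertTrue;

// Step definitions for a hypothetical feature file such as:
//   Scenario: Search returns results
//     Given the user is on the search page
//     When the user searches for "jazz"
//     Then at least one result is shown
public class SearchSteps {

    // Stand-in for the real page/driver layer; in the actual suite this
    // would wrap Selenium WebDriver calls rather than simulate results.
    static class SearchPage {
        private int results;
        void open() { results = 0; }
        void searchFor(String term) { results = term.isEmpty() ? 0 : 3; } // simulated count
        int resultCount() { return results; }
    }

    private final SearchPage searchPage = new SearchPage();

    @Given("the user is on the search page")
    public void theUserIsOnTheSearchPage() {
        searchPage.open();
    }

    @When("the user searches for {string}")
    public void theUserSearchesFor(String term) {
        searchPage.searchFor(term);
    }

    @Then("at least one result is shown")
    public void atLeastOneResultIsShown() {
        assertTrue(searchPage.resultCount() > 0);
    }
}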

Application Quality Analyst III / QA Automation Lead at Fannie Mae
  • United States
  • August 2009 to November 2013

Set up QTP scripts to run in batch mode at the beginning of each QA cycle.
•Designed a data-driven testing framework to automate the functional validation task once execution completed, using Quality Center/QTP, UNIX shell scripts, and PL/SQL.
•Reduced maintenance cost and improved efficiency by developing a data-driven testing framework for an ETL application that extracts, transforms, and loads data into multiple databases and produces reports in formats such as XML, Excel, and CSV. In each build the main automation script did not need to be touched: it connected to different external data sources, loaded the data into the script's native variables or data repository, compared the values, generated reports with the passed or failed state, and provided a cell-by-cell breakdown of the data.
•Designed a keyword-driven framework to reduce the time manual testers spent on the GUI/web UI testing process, using Quality Center/QTP, UNIX shell scripts, and PL/SQL.
•Developed the driver script for the keyword-driven framework in VBScript; it reads the functionality of the application under test and the step-by-step instructions for each test from an external data source such as an Excel or text file, transforms each step/action into keys, and calls the functions from the function library associated with those keys (a Java analogue of this driver is sketched after this list).
•Trained the team on how to use the developed framework and quality standards, implement the change management system in the automation suites, and prepare user-friendly documentation for the scripts.
•Built, customized, and configured tools for automating execution and verification of results for various test efforts.
•Developed and ran PL/SQL scripts in TOAD to populate test data in the backend Oracle tables.
•Wrote complex SQL queries to verify deployed file data in the Oracle tables.
•Created UNIX scripts and used them as wrappers to perform various tasks such as moving files from one UNIX box to another, renaming files, and clearing files.
•Worked with development, business, QA, and other automation engineers to determine the root cause of batch job execution problems while automating test cases.
•Provided design documentation and participated in technical reviews for the test automation projects.
•Provided metrics and project planning updates for the testing effort.
•Maintained the functional components implemented as batch processes (both inbound and outbound file processing).
•Created a Java-based framework using the Eclipse IDE to load data from different data sources (XML and CSV files), transform it with the required rules, and load it into a database such as Oracle (a sketch follows the environment summary below).
•Designed and developed scripts using Eclipse and Core Java for unit-testing purposes with JUnit and Log4j.
•Led a team of automation engineers; provided supervision and maintained test automation code under version control with ClearCase.
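
The keyword-driven driver described above was built in VBScript with QTP; the following is only a rough Java analogue of the same idea, with placeholder keywords, actions, and file name, to illustrate how each row of an external step file is mapped onto a function library.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Keyword-driven driver sketch: each line of the step file is
// "keyword,argument", and each keyword maps to a function in a library.
// The keywords and file name below are illustrative placeholders.
public class KeywordDriver {

    // Function library: keyword -> action taking one argument.
    private static final Map<String, Consumer<String>> LIBRARY = new HashMap<>();
    static {
        LIBRARY.put("OPEN_URL",    arg -> System.out.println("Opening " + arg));
        LIBRARY.put("CLICK",       arg -> System.out.println("Clicking " + arg));
        LIBRARY.put("VERIFY_TEXT", arg -> System.out.println("Verifying text " + arg));
    }

    public static void main(String[] args) throws IOException {
        // Each row of steps.csv drives one call into the library.
        try (BufferedReader reader = new BufferedReader(new FileReader("steps.csv"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] parts = line.split(",", 2);
                String keyword = parts[0].trim();
                String argument = parts.length > 1 ? parts[1].trim() : "";
                Consumer<String> action = LIBRARY.get(keyword);
                if (action == null) {
                    System.err.println("Unknown keyword: " + keyword);
                    continue;
                }
                action.accept(argument);
            }
        }
    }
}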

Environment/Testing Tools: Windows 2000/2003 Server, Oracle, Solaris, Ab Initio (ETL), HP QuickTest Professional, Quality Center, TOAD, ClearCase, ClearQuest, DOORS
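
A minimal sketch of the Java-based CSV-to-Oracle loading framework mentioned above might look like the following; the JDBC URL, credentials, file name, and the LOAN_STAGING table and its columns are hypothetical placeholders rather than the actual schema.

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Loads rows from a CSV file into an Oracle table over JDBC.
// All connection details and table/column names are placeholders.
public class CsvToOracleLoader {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCL"; // placeholder connection string
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass");
             BufferedReader reader = new BufferedReader(new FileReader("loans.csv"))) {
            conn.setAutoCommit(false); // commit once after the whole batch
            try (PreparedStatement insert = conn.prepareStatement(
                    "INSERT INTO LOAN_STAGING (LOAN_ID, AMOUNT) VALUES (?, ?)")) {
                String line;
                while ((line = reader.readLine()) != null) {
                    String[] cols = line.split(",");
                    insert.setString(1, cols[0].trim());                                // LOAN_ID
                    insert.setBigDecimal(2, new java.math.BigDecimal(cols[1].trim()));  // AMOUNT
                    insert.addBatch();                                                  // batch for efficiency
                }
                insert.executeBatch();
            }
            conn.commit();
        }
    }
}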

Test Automation Developer at CareFirst
  • May 2006 to March 2009

Developed user-friendly web-based GUI using JSP, Java in Eclipse, CSS, HTML, and DHTML.
•Designed and developed Test Plans, Test Cases, and Expected Results and Prioritized tests for the applications using Quality Center.
•Created test criteria in Quality Center using existing use cases and requirements documents.
•Performed Functional and Regression testing manually and logged defect using HP Quality Center.
•Involved in writing and implementation of the test plan and developed test cases and test scripts in Mercury Test management tool Quality Center.
•Documented Defects found during test on the Quality Center and communicated to the responsible QA or development personnel for the recorded problems.
•Used Quality Center for Planning, execution and tracking bugs and involved in generation of reports.
•Performed incident tracking and reporting through Quality Center and used Quality Center for storing requirements, test cases, and defects.
•Managed the automation testing effort.
•Installed HP automation tools in designated machines.
•Developed Automation test scripts Using VB script for the GUI, Functional, Data Driven and Regression testing using QTP.
•Used QTP checkpoints to automatically capture and verify properties of objects on web page to verify proper functionality. Used standard checkpoints to verify radio buttons. Text checkpoints were used in QTP scripts to verify text data appeared as expected.
•Enhanced QTP scripts by using VBScript in the Expert view, to verify whether buttons were enabled or not as well as maximizing fields and retrieving the maximum values held in these fields.
•Developed automated scripts using Quick Test Professional that saved 200 hours of testing per QA cycle by the deployment of files to the servers and testing basic functionalities of the application.
•Performed data-driven testing using parameterization in QuickTest Professional for the deployment of files and creation of engagement data.
•Prepared test data tables to run the automation suite (a minimal data-driven sketch follows this list).
•Created baseline QTP scripts that effectively performed smoke testing to verify the stability and sanity of the application.
•Set up QTP scripts to run in batch mode at the beginning of each QA cycle.
•Assisted with setting up the QA build including internationally deploying files across territories for use by the testing team.
•Wrote simple SQL queries to verify deployed file data in the Oracle tables.
•Ran PL/SQL scripts in TOAD to populate test data on the backend Oracle tables.
•Set up user test machines based on configurations approved by management for User Acceptance Testing purposes.
•Provided regular status reports to management on tasks completed.
•Contributed significantly to the document created for the establishment of standards and procedures for automation.
•Used regression and integration testing to manually test the application repeatedly across several QA environments and scenarios till the product was released.
•Used LoadRunner for Performance Testing, Load testing and Stress testing.
•Created VUser Scripts in Virtual User Generator (VuGen) as per the Business requirement and created various scenarios using controller in LoadRunner.
•Analyzed the results to find the bottlenecks by using Analysis Component in LoadRunner.
•Updated weekly status to QA Manager for the progress of the testing efforts and open issues to be resolved.
•Actively Participated in status reporting meetings and interacted with development team to discuss the technical issues.
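
The parameterized, data-driven approach above was implemented with QTP data tables; as a rough illustration only, the same idea in Java with JUnit's Parameterized runner could look like the sketch below, where the file name, columns, and the assertion stand in for the real application checks.

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

import static org.junit.Assert.assertEquals;

// Each row of testdata.csv ("input,expected") becomes one test run,
// mirroring a QTP data table. File name and columns are placeholders.
@RunWith(Parameterized.class)
public class DataDrivenEngagementTest {

    private final String input;
    private final String expected;

    public DataDrivenEngagementTest(String input, String expected) {
        this.input = input;
        this.expected = expected;
    }

    @Parameters
    public static List<Object[]> testData() throws Exception {
        List<Object[]> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new FileReader("testdata.csv"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] cols = line.split(",");
                rows.add(new Object[] { cols[0].trim(), cols[1].trim() });
            }
        }
        return rows;
    }

    @Test
    public void normalizedInputMatchesExpected() {
        // Placeholder check standing in for the real application call.
        assertEquals(expected, input.toUpperCase());
    }
}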

Environment/ Testing Tools: Windows 2000/2003 Server, Oracle, Solaris, Lotus Notes Release, HP Quick Test Professional, Quality Center, TOAD

Test Automation Engineer at Bank of New York
  • United States
  • August 2005 to April 2006

Preparing a manual test case refactor list to select the test cases in scope for automation.
•Estimating automation timelines.
•Preparing Test Data requirement sheet to analyze data required to run automation test suite
•Preparing automation framework using QTP.
•Building Keyword, Application, Business, Generic function libraries.
•Maintaining Object repositories.
•Used a data-driven framework to automate client-side regression test scripts through QTP.
•Performed Regression and Functional testing using Quick Test Professional (QTP).
•Analyzed the test results generated by QTP and triggered suitable corrective actions.
•Responsible for performing Functional Testing on the application by creating Automated VB Script using QTP.
•Automated regression testing using QTP, VB script for testing GUI functions, acceptance and limit validations.
•Worked on QTP to validate links, objects, images and text on GUI interface to verify its proper functionality.
•Conducted UAT (User Acceptance) Testing.
•Version controlling of scripts using Rational ClearCase.
•Maintaining / Executing QTP Test scripts from Quality Center.
•Logging defects into Quality Center.
•Send the Status Report (Daily, Weekly etc.) to the Client.
•Daily status-check meetings with the Team.
•Act as single point of contact between Offshore and Onshore.
•Developed the lower level business requirements from higher-level functional business requirements.
•Involved in developing the Design documents.
•Developed Test Plan and Test cases in HP Quality Center from business requirements.
•Participated in various meetings with Project Managers to discuss the status of the testing and/or decide the best suitable test process/procedure to be followed.
•Involved manually in Integration testing, Positive testing, Negative testing, Functional testing, GUI testing and UAT testing on various browsers.
•Used manual and automatic correlation and parameterization techniques in generating the test scripts for LoadRunner.
•Executed LoadRunner scenarios using LoadRunner to perform Stress and scalability test.
•Worked with LoadRunner Controller for configuring and execution of performance test scenarios with multiple virtual users and multiple virtual user scripts, managed and collected metrics for the various system monitors.
•Used LoadRunner to analyze the response times of the business transactions under load; developed reports and graphs to present the stress test results to the management.
•Load tested the application and monitored CPU usage, memory usage, and bandwidth on the server (a plain-Java load-measurement sketch follows this list).
•Responsible for sending Release notes of each build to the Users, Management and testing team.
•Sending electronic UAT Signoff for each build and getting approval from the users.
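
The LoadRunner scenarios above were scripted in VuGen; purely to illustrate the underlying idea of many virtual users hitting the application while response times are collected, here is a small plain-Java sketch with a placeholder endpoint and arbitrary user counts.

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Plain-Java illustration of the load-testing idea (not LoadRunner itself):
// a pool of "virtual users" hits a placeholder URL and response times are
// collected so an average can be reported.
public class SimpleLoadProbe {

    public static void main(String[] args) throws Exception {
        final String target = "https://example.com/health"; // placeholder endpoint
        final int virtualUsers = 10;
        final int requestsPerUser = 5;
        List<Long> timingsMs = new CopyOnWriteArrayList<>();

        ExecutorService pool = Executors.newFixedThreadPool(virtualUsers);
        for (int u = 0; u < virtualUsers; u++) {
            pool.submit(() -> {
                for (int r = 0; r < requestsPerUser; r++) {
                    try {
                        long start = System.nanoTime();
                        HttpURLConnection conn =
                                (HttpURLConnection) new URL(target).openConnection();
                        conn.getResponseCode();                              // complete the request
                        timingsMs.add((System.nanoTime() - start) / 1_000_000);
                        conn.disconnect();
                    } catch (Exception e) {
                        System.err.println("Request failed: " + e.getMessage());
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(2, TimeUnit.MINUTES);

        double average = timingsMs.stream().mapToLong(Long::longValue).average().orElse(0);
        System.out.printf("Requests: %d, average response time: %.1f ms%n",
                timingsMs.size(), average);
    }
}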

Environment/Testing Tools: Windows NT/2000, HP Quality Center, QTP, UNIX

QA Analyst at AT&T
  • United States
  • May 2004 to July 2005

Preparing and implementing Test Plan, planning, tracking and prioritizing project deliverables.
•Arranged the Hardware and software requirement for Test Setup. Escalated the issues about project requirements (Software, Hardware, Resources) to Project Manager / Test Manager.
•Analyzed requirements, kept track of new/changing project requirements, and forecast/estimated the project's future requirements.
•Attend the regular client call and discuss the weekly status with the client.
•Defect control through Quality Center.
•Send the Status Report (Daily, Weekly etc.) to the Client.
•Frequent status-check meetings with the Team.
•Act as the single point of contact between Development and Testers for iterations, testing and deployment activities.
•Maintenance of QTP scripts, their execution, and debugging.
•Creating / Reviewing Test Scripts by using QTP.
•Implementing automation framework using QTP.
•Executing test scripts / test cases from Quality Center.
•Creating / Reviewing / Tracking the Test Cases in Quality Center.
•Writing SQL queries for testing the mapping of UI fields to backend fields (a minimal JDBC verification sketch follows this list).
•Maintaining Test Assets (Requirements, Test Scripts, Test Cases, Test Execution, and Defects) in Quality Center.
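
As an illustration of the UI-to-backend field-mapping checks above, a small JDBC sketch is shown below; the connection string, credentials, ACCOUNTS table, and column names are hypothetical placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Verifies that a value entered in the UI landed in the expected backend
// column. Connection details, table, and column names are placeholders.
public class UiToBackendFieldCheck {

    public static void main(String[] args) throws Exception {
        String accountId = "A-1001";       // value created through the UI (placeholder)
        String expectedStatus = "ACTIVE";  // what the UI displayed (placeholder)

        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCL"; // placeholder connection string
        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass");
             PreparedStatement query = conn.prepareStatement(
                     "SELECT STATUS FROM ACCOUNTS WHERE ACCOUNT_ID = ?")) {
            query.setString(1, accountId);
            try (ResultSet rs = query.executeQuery()) {
                if (rs.next() && expectedStatus.equals(rs.getString("STATUS"))) {
                    System.out.println("PASS: backend column matches UI value");
                } else {
                    System.out.println("FAIL: backend column does not match UI value");
                }
            }
        }
    }
}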

Education

Bachelor's degree, Computer Information Systems
  • at DeVry University / American InterContinental University / National Institute of Information Technology (NIIT)
  • January 2003

High school or equivalent, Computer Information Systems
  • at DeVry University / American InterContinental University / National Institute of Information Technology (NIIT)
  • January 2003

Specialties & Skills

Test Automation
Selenium
Software Testing
Automation
Automate
Microsoft Windows 2000
Progress
Quality
Requirements
Shell Scripting