Basic Guidelines for Storage Performance …
Your software might work for a given workload and not for others. Large-scale systems are sensitive to the workloads and environments they operate under. It is of the utmost importance to test software against realistic workloads and test data, yet it is not easy to anticipate or build a test model that reflects the production environment. What […]
Data Analysis Tests for Big Data. Key features of the workloads we choose for testing: Representative workloads: the workloads in these benchmarks are each representative of their own field. Diverse programming models: data centers host a large number of programming models, e.g., MapReduce and Dryad, for developers or users to […]
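To make the MapReduce model mentioned above concrete, here is a minimal single-process sketch of its three phases, using a word-count example. The function names (map_phase, shuffle_phase, reduce_phase) are illustrative assumptions; real frameworks such as Hadoop distribute these phases across a cluster.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit (word, 1) pairs from each input document."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Shuffle: group emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate (here, sum) the values for each key."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data workloads", "big data benchmarks"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
```

A benchmark workload in this model is characterized by the cost of the map and reduce functions and the volume of data moved during the shuffle.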
Table of Contents: TL;DR; Getting Started: T-SQL; Introduction; Create & Use Database; Create & Describe Table; Insert Data; Update Data; Select & Update; Creating Views; Create Stored Procedures; Drop View; Drop Procedure; Drop Table; Remove Database; References. Getting Started: T-SQL This is a practical, hands-on getting-started guide to Microsoft SQL Server. It is intended to get your hands dirty quickly and to revise SQL commands and syntax […]
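The create/insert/update/select/view/drop lifecycle that the guide's table of contents walks through can be sketched in a runnable form using Python's built-in SQLite engine. This is an analogue, not T-SQL itself: SQL Server specifics such as CREATE DATABASE, stored procedures, and T-SQL data types are omitted, and the table name and values are made up for illustration.

```python
import sqlite3

# In-memory database stands in for the guide's CREATE DATABASE step.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Create & describe a table, then insert and update data.
cur.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary REAL)")
cur.execute("INSERT INTO employees (name, salary) VALUES (?, ?)", ("Alice", 50000.0))
cur.execute("UPDATE employees SET salary = salary * 1.10 WHERE name = ?", ("Alice",))

# Views work much as in T-SQL: a named, stored SELECT.
cur.execute("CREATE VIEW well_paid AS SELECT name FROM employees WHERE salary > 52000")
rows = cur.execute("SELECT name FROM well_paid").fetchall()

# Tear down in reverse order of creation.
cur.execute("DROP VIEW well_paid")
cur.execute("DROP TABLE employees")
conn.close()
```

The same statement order (create, populate, query, drop dependents before their base objects) applies on SQL Server, even where the surrounding syntax differs.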
The ‘Apriori’ to Storage Performance Benchmarking Storage systems, whether SAN, NAS, or hyperconverged (block, file, object), are trying to keep up with the ever-increasing workloads they must support. A storage subsystem performs best in known environments; all hell breaks loose when a subsystem tries to cater to every workload. Storage performance as a benchmarking study concerns itself with how fast […]
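"How fast" in storage benchmarking is usually expressed through the relationship between IOPS, block size, and throughput. A back-of-envelope sketch of that relationship (a simplified model, ignoring latency, queue depth, and caching effects) might look like:

```python
def throughput_mb_s(iops, block_size_kib):
    """Approximate sequential throughput in MiB/s from IOPS and block size.

    Simplified model: throughput = IOPS * block size, with no account
    of latency, queueing, or cache behavior.
    """
    return iops * block_size_kib / 1024

# 10,000 IOPS at 4 KiB blocks is roughly 39 MiB/s.
mbps = throughput_mb_s(10_000, 4)
```

The same workload profile can therefore look "fast" on an IOPS chart and "slow" on a throughput chart, which is one reason benchmarks must be matched to the intended environment.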
Test planning requires a deep understanding of the product. Test coverage helps target problematic functions across all transitionable states; what to cover and how to cover it depends on your test-case design strategy. Test coverage also reveals which functions are not exercised, and it yields the minimum number of test cases needed to exercise or traverse each function or line of code; these can become part of the sanity or functional test suites.
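The idea of a minimum test set that traverses every branch can be shown with a deliberately small, hypothetical function: three inputs are enough to exercise all three of its branches, and those three inputs would form the sanity test set.

```python
def classify(n):
    """Toy function with three branches, used to illustrate coverage."""
    if n < 0:
        return "negative"
    if n == 0:
        return "zero"
    return "positive"

# Three inputs are the minimum needed to traverse every branch;
# together they form the sanity/functional test set for classify().
cases = {-1: "negative", 0: "zero", 5: "positive"}
results = {n: classify(n) for n in cases}
```

Coverage tooling automates the discovery of unexercised branches in real code, but the principle is the same: each branch not hit by the current test set demands at least one more case.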