Data Migration Testing Strategy


Data migration testing strategy: a complete guide to successful data migration testing

Data migration testing requires a comprehensive strategy in order to reduce risk and deliver a successful migration for the end users. In this article, David Katzoff, Managing Director of Valiance Partners, a specialist data migration technology and service provider, outlines a blueprint for designing an effective data migration testing strategy. David rounds off the article with a handy checklist of recommendations that readers can use to benchmark their own approach.

How to implement an effective data migration testing strategy

Compliance and business risk play a significant role in the implementation of enterprise information systems, and the risks associated with these systems are generally well understood. As part of the implementation process, however, many of these systems are populated with legacy data, and the compliance and business risks associated with the data and content migration are not necessarily as well understood. In this context, the risk associated with a data migration is a direct result of migration error. Furthermore, the industry's testing strategies for mitigating these risks, or more specifically data migration error, lack consistency and are far from deterministic. This article offers ideas and recommendations on how to create a more robust and consistent data migration testing strategy.

Before diving into the details, a little background: Valiance Partners has tested hundreds of data and content migrations, primarily in FDA-regulated industries (pharmaceuticals, medical devices, biotechnology and food products), manufacturing and automotive. The information presented here includes lessons learned from our customers' quality control and the actual error history from testing migrations of hundreds of thousands of fields and terabytes of content.

The recommended approach to designing migration testing strategies is to document the risks and their likelihood of occurrence, and then to identify the means of mitigating those risks through testing where appropriate. Identifying the risks is challenging, and much of this process will be specific to the system being migrated.
Let us review two systems to illustrate the point. In the first case, migrating financial data in retail banking typically involves high-volume migrations (tens or hundreds of millions of records) where the source and destination records are very similar, with minimal translation and little if any data enrichment. For a second example, consider complaint management at a consumer products company. These systems are typically less mature, and the newer implementation, and the business processes associated with it, must adapt to different business and compliance requirements. These systems have comparatively modest volume (tens or hundreds of thousands of records), with complex translation and data enrichment to complete the newer record as it is migrated.

In both cases, getting the data migrated accurately into the destination system is critical. However, the process by which accuracy is defined differs considerably between these two systems and their associated migrations. In the first case, the financial services industry has evolved to the point of having data exchange standards, which greatly simplifies the process. In the complaint management migration, more up-front analysis will be needed in order to 'fit' the legacy data into the new system. This analysis will derive the data enrichment needed to populate incomplete records, identify data cleansing requirements through pre-migration analysis, and dry-run the migration process and verify the results before the final data migration requirements are understood.

System characteristics aside, there are several options for minimizing the occurrence of migration error through testing. The following discussion reviews these options and offers a set of recommendations to consider.

Data migration testing: what are the options?

The de facto approach to testing data and content migrations relies on sampling, where some random subset of the data or content is selected and inspected to confirm that the migration completed 'as designed'. Those who have tested migrations using this approach are typically familiar with the iterative test, fix and retest cycle, where subsequent executions of the test process expose different error conditions as new samples are reviewed. Sampling works, but it relies on an accepted level of error and on an assumption regarding repeatability.
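As an illustration, the sampling step can be sketched in a few lines of Python. The record IDs and sample size below are hypothetical, and a fixed seed keeps the audit sample reproducible across test cycles:

```python
import random

def sample_for_inspection(record_ids, sample_size, seed=42):
    """Pick a random subset of migrated record IDs for manual inspection.

    Sampling assumes errors occur randomly and with statistical
    independence; if repeated test cycles keep surfacing new error
    types, that assumption is being violated.
    """
    rng = random.Random(seed)  # fixed seed so the audit sample is reproducible
    size = min(sample_size, len(record_ids))
    return sorted(rng.sample(record_ids, size))

# e.g. a migration of 100,000 records, inspecting a 50-record sample
migrated_ids = list(range(1, 100_001))
audit_sample = sample_for_inspection(migrated_ids, 50)
```

A sampling standard such as ANSI/ASQ Z1.4 would dictate the sample size from the lot size and the acceptable quality level; the fixed size here is purely illustrative.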
An accepted level of error means that less than 100% of the data will be migrated without error, and that the error level is inversely proportional to the number of samples tested (see sampling standards such as ANSI/ASQ Z1.4). Regarding the assumption about repeatability, the fact that many migrations require four, five or more test iterations with differing results implies that one of the key tenets of sampling is not upheld, namely that "non-conformance occurs randomly and with statistical independence...". Even with these flaws, sampling has a role in a well-defined testing strategy, but what are the other testing options? The following lists the options for testing by phase of the migration process.

Pre-data-migration testing

These tests occur early in the migration process, before any migration is completed, even a migration for testing purposes. Pre-migration testing options include:

- Verify the scope of the source systems and data with the user community and IT. The verification should cover the data to be included as well as excluded and, if appropriate, be tied to the specific queries being used for the migration.
- Define source-to-target high-level mappings for each category of data or content, and verify that the required type has been defined in the destination system.
- Verify destination system data requirements such as field names, field types, mandatory fields, valid value lists and other field-level validations.
- Using the source-to-destination mappings, test the source data against the destination system's requirements. For example, if the destination system has a mandatory field, verify that the corresponding source field is not null; or if a destination system field has a list of valid values, test that the corresponding source fields contain those valid values.
- Test the fields that uniquely link source and target records, and verify that there is a definitive mapping between the record sets.
- Test the source system and target system connections from the migration platform.
- Configure the testing tool against the migration specification; this can often be completed via field-by-field black box testing. If the testing is smart here, it can also be used to verify that the migration specification's mappings are complete and accurate.

Formal data migration design review
Conduct a formal design review of the migration specification when pre-migration testing is complete, or during the early stages of migration tool configuration. The specification should include:

- Definition of the source systems
- Source system data sets and queries
- Mappings between source system and destination system fields
- Number of source records
- Number of source system records created per unit of time (to be used to determine migration timing and downtime)
- Identification of supplemental sources
- Data cleansing requirements
- Performance requirements
- Testing requirements

The formal design review should include representatives from the appropriate user communities, IT and management. The outcome of the formal design review should include a list of open issues, the means for closing each issue, approval of the migration specification, and a process for keeping the specification synchronized with the configuration of the migration tool (which seems to change constantly until the production migration).

Post-data-migration testing

Once a migration process has been executed, additional end-to-end testing can be performed. Expect a significant number of errors to be identified during the initial test run, although they will be reduced if pre-migration testing was executed well. Post-migration testing typically takes place in a test environment and includes:

- Testing the throughput of the migration process (number of records per unit of time). This test will be used to verify that the planned downtime is sufficient. For planning purposes, also allow time to verify that the migration completed successfully.
- Comparing migrated records to records generated by the destination system, to verify that migrated records are complete and of the appropriate context.
- Summary verification. Several techniques provide summary information, including record counts and checksums. Here, counts of the migrated records are aggregated from the destination system and compared with the counts of records to be migrated. This approach provides only summary information, and when there is a problem it often provides no insight into the root cause.
- Comparing migrated records to sources. Testing should verify that field values are migrated according to the migration specification. In short, the source values and the field-level mappings are used to compute the expected results at the destination.
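A minimal sketch of the record-count reconciliation described under summary verification, assuming the per-table counts have already been collected from each system (the table names and counts are invented for illustration):

```python
def count_mismatches(source_counts, destination_counts):
    """Compare per-table record counts between source and destination.

    Both arguments map table name -> record count. Returns
    {table: (source_count, destination_count)} for every table whose
    counts differ, including tables missing on either side.
    """
    mismatches = {}
    for table in set(source_counts) | set(destination_counts):
        src = source_counts.get(table, 0)
        dst = destination_counts.get(table, 0)
        if src != dst:
            mismatches[table] = (src, dst)
    return mismatches

# Illustrative counts: one attachment record went missing in transit
source = {"complaints": 48210, "attachments": 9175}
destination = {"complaints": 48210, "attachments": 9174}
```

As the article notes, a mismatch found this way flags that something is wrong but says nothing about which records failed; field-level comparison is needed for root cause.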
This record-to-source comparison can be completed using sampling if appropriate, or, if the migration includes data that poses significant business or compliance risk, 100% of the migrated data can be verified using an automated testing tool. The advantages of an automated approach include the ability to identify errors that are otherwise unlikely to be found (the proverbial needles in a haystack). In addition, since an automated testing tool can be configured in parallel with the configuration of the migration tool, the capability to test 100% of the migrated data is available immediately after the first migration test run. Compared with sampling approaches, it is easy to see that automated testing saves significant time and reduces the iterative test, fix and retest cycle typically found with sampling.

Migrated content carries special considerations. For cases where content is migrated without change, testing should verify that the integrity of the content is preserved and that the content is linked to the correct destination record. This can be completed using sampling or, as described previously, automated tools can be used to verify 100% of the result.

User acceptance testing for data migration

The functional subtleties associated with mixing migrated data and data created in the destination system can be difficult to identify early in the migration process. User acceptance testing provides an opportunity for the user community to interact with the legacy data in the destination system before the production release, and this is often the first such opportunity for the users. Attention should be given to reporting, downstream feeds and other system processes that depend on the migrated data.

Production migration

All the testing completed before the production migration does not guarantee that the production run will complete without error. Challenges that appear at this point include procedural errors and, occasionally, production system configuration errors. If an automated testing tool was used to test the data and content migration, executing another test run is straightforward and recommended. If an automated approach was not used, some level of sampling or summary verification is still recommended.

Data migration testing strategy: design recommendations
In the context of data and content migrations, business and compliance risk is a direct result of data migration error, and a comprehensive testing strategy reduces the likelihood of data and content migration errors. The list below offers a set of recommendations for defining such a testing strategy for a given system:

- Create an inclusive migration team, including representatives from the user community, IT and management. Verify the appropriate level of expertise of each team member, and train as required on data migration principles and on the source and destination systems.
- Analyze the business and compliance risks associated with the specific systems being migrated. These risks should become the foundation of the data migration testing strategy.
- Create, formally review and manage a complete migration specification. While this is easy to state, very few migrations take this step.
- Verify the migration scope with the user community and IT. Understand that the migration scope may be refined over time, as pre- and post-migration testing may expose shortcomings in the initial scope.
- Identify (or anticipate) likely sources of migration error and define specific testing strategies to identify and address those errors. This gets easier with experience, and the error categories and conditions mentioned here provide a good starting point.
- Use the field-level source-to-destination mappings to define data requirements for the source system. Use these data requirements to complete pre-migration testing. If needed, cleanse or supplement the source data as necessary.
- Complete the appropriate level of post-migration testing. For migrations that need to minimize errors, 100% verification using an automated tool is recommended. Verify that this automated testing tool is independent of the migration tool. Look closely at the ROI of automated testing if there is concern about the cost and time commitment, or about the iterative nature of verifying a migration via sampling.
- Complete user acceptance testing with the migrated data. This approach tends to identify application errors with the migrated data as designed.
- Test the production run.
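A production-run verification harness along the lines recommended above might look like the following sketch. The check names and the lambda probes are placeholders for real probes (a database ping, a health-check URL, a query against a downstream feed):

```python
def run_smoke_checks(checks):
    """Run named post-migration smoke checks and collect the failures.

    Each check is a zero-argument callable returning True on success.
    An exception inside a probe is treated as a failure rather than
    aborting the whole run.
    """
    failures = []
    for name, check in checks.items():
        try:
            ok = check()
        except Exception:
            ok = False
        if not ok:
            failures.append(name)
    return failures

# Illustrative stand-ins for real connectivity and process probes
checks = {
    "database_up": lambda: True,
    "frontend_reaches_backend": lambda: True,
    "reports_feed": lambda: False,  # simulate a broken downstream feed
}
failed = run_smoke_checks(checks)
```

A harness like this is cheap enough to rerun against the production environment itself, which is exactly where procedural and configuration errors tend to surface.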
If an automated testing tool has been selected, it is likely that 100% of the migrated data can be tested during the production run with minimal additional cost or downtime. If a manual testing approach is used, complete summary verification.

About the author: David Katzoff is Managing Director of Product Development at Valiance Partners, Inc. He brings over fifteen years of software engineering, project management and compliant business experience to this role. In addition to overseeing the development of Valiance's migration products, David has played lead consulting roles in large-scale migrations at Amgen, Pfizer, Wyeth and J&J.

Dylan Jones (Editor), 30 November 2009.

Data migration testing: validate and test your data migration process easily

80 percent of data migration projects fail to meet expectations, running over time and over budget. Whether you are migrating from legacy systems to a new system, or moving from one vendor's software to another, data migration has become one of the most challenging initiatives for IT managers. Although these projects deliver high business benefits, they tend to involve a high level of risk due to the volume and criticality of the data.

Data migration: scary metrics

- Average cost of a data migration: $875,000
- 34% of migrations had lost or missing data
- 38% had some form of data corruption
- 64% of migration projects had unexpected outages/downtime
- Typical cost of downtime: $6.5 million/hour (brokerage industry), $2.8 million/hour (energy industry)
- 72% of organizations have put off a data migration because of its high degree of risk
Data migration: key risks

- Unexpected downtime
- Budget overruns
- Customer or brand impact
- Data corruption
- Application performance problems
- Data loss

To reduce risk and ensure that your data is migrated and transformed correctly, you need to implement a comprehensive validation and testing strategy. QuerySurge helps you test your data quickly and easily.

Automating the migration testing process: a two-step approach

Step #1: The Wizards - fast and easy, no coding required. In 2018 QuerySurge introduced in-product Query Wizards. The Query Wizards help the team validate data without having to write code; they speed up the verification process considerably and simplify the workload. The Wizards are useful for:

- quickly verifying table-to-table comparisons, validating hundreds of tables in minutes
- ensuring that all rows came across without issue, and verifying row counts
- performing any necessary column-to-column comparisons
- verifying data types and data thresholds

Step #2: Testing transformations (ETL). For data with transformations, you create QueryPairs in the Design Library: one query aimed at the current system and the other at the new system. You can run the queries either immediately or at a set date and time, or they can be triggered automatically by an event, such as the completion of an ETL or build process. This DevOps task is executed through the QuerySurge API, and reports can be emailed automatically to your team.

Supported data technologies

QuerySurge supports data warehouses, Hadoop data lakes, NoSQL data stores, traditional databases, flat files, XML, web services, JSON and many other data stores as sources or targets.

Finding bad data (also known as "data errors")

Using QuerySurge allows your team to implement a repeatable data validation and testing strategy that avoids the negative impact any of these flaws can have on your data and on your business intelligence and analytics efforts. The typical kinds of problems that QuerySurge will find in your data migration projects can be seen in the webinar slide deck and video.
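The QueryPair idea, running the same logical query against the current and the new system and comparing the result sets, can be illustrated generically. The sketch below is not QuerySurge's API but a hand-rolled equivalent, with in-memory SQLite databases standing in for the real source and target systems:

```python
import sqlite3

def fetch(conn, query):
    """Run a query and return its rows in a deterministic order."""
    return sorted(conn.execute(query).fetchall())

# Stand-ins for the legacy and new databases (in-memory SQLite for the sketch)
legacy = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (legacy, target):
    db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
legacy.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
target.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

# A 'query pair': the same logical question asked of both systems
legacy_rows = fetch(legacy, "SELECT id, amount FROM orders")
target_rows = fetch(target, "SELECT id, amount FROM orders")
assert legacy_rows == target_rows  # migration preserved the data
```

For transformed data, the query aimed at the new system would encode the expected transformation, so the two result sets should still match exactly.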
Go through the webinar slide deck, or watch the video of our webinar titled "Data Migration: How to Avoid the #1 Project Shock" (viewed more than 30,000 times), and learn:

- why data validation and testing are so important
- how to integrate data testing into your project timeline
- how to automate data validation and testing
- how to prove to project stakeholders that your migration is free of bad data

QuerySurge will help you achieve the following: improve your data quality and data governance, speed up your data delivery cycles, reduce your costs and risks, and deliver a huge ROI. But don't take our word for it (or our clients'): try it for yourself with the free tutorial.

Data Migration Testing Tutorial: A Complete Guide

Overview of data migration testing:

We often hear that an application has been moved to a different server, its technology has been changed, it has been updated to the next version, or it has been moved to a different database server, and so on. What does this actually mean? What is expected from the testing team in these cases?

From a testing point of view, it means that the application has to be tested thoroughly end-to-end, along with a successful migration from the existing system to the new system. System testing in this case has to be performed with all the data that is used in the old application, as well as with new data. The existing functionality needs to be verified along with the new/modified functionality.

Rather than just migration testing, this can also be called data migration testing, since the user's entire data will be migrated to a new system. So migration testing includes testing with old data, with new data, or with a combination of both, and with both old features (unchanged features) and new features. The old application is usually called the "legacy" application. Along with the new/upgraded application, it is also mandatory to keep testing the legacy application until the new/upgraded one becomes stable and consistent.
Extensive migration testing of the new application will reveal new issues that were not found in the legacy application.

What you will learn: What is migration testing?

Migration testing is the process of verifying the migration of a legacy system to a new system with minimal disruption/downtime, with data integrity and no loss of data, while ensuring that all the specified functional and non-functional aspects of the application are met post-migration.

A simple representation of system migration: legacy system → migration activity → new/upgraded system.

Why migration testing?

As we know, an application may be migrated to a new system for various reasons: system consolidation, obsolete technology, optimization or any other reason. Hence, while the system in use is migrated to a new system, it is essential to ensure the following points:

- Any kind of disruption/inconvenience caused to the user by the migration needs to be avoided or minimized (e.g. downtime, loss of data).
- Ensure that the user can continue to use all the features of the software, with minimal or no damage during migration (e.g. a change in functionality, or removal of a particular functionality).
- It is also important to anticipate and rule out all the possible glitches/obstacles that might occur during the actual migration of the live system. Hence, in order to ensure a smooth transition of the live system by eliminating those defects, it is essential to carry out migration testing in the lab.

This testing has its own importance, and it plays a vital role when data comes into the picture. Technically, it is also required for the following purposes:

- To ensure compatibility of the new/upgraded application with all the possible hardware and software that the legacy application supports. New compatibility should also be tested for new hardware and software platforms.
- To ensure that all the existing functionality works as in the legacy application. There should not be any change in the way the application works compared with the legacy one.
- The possibility of a large number of defects due to migration is very high. Many of the defects are usually related to data, and hence these defects need to be identified and fixed during testing.
- To ensure that the response time of the new/upgraded application is the same as, or less than, that of the legacy application.
- To ensure that the connections between servers, hardware, software etc. are all intact and do not break during testing. Data flow between the different components should not break under any condition.

When is this testing required?

The testing has to be carried out both before and after the migration. The different phases of migration testing to be executed in the test lab can be classified as:

- Pre-migration testing
- Migration testing
- Post-migration testing

In addition to the above, the following tests are also carried out as part of the entire migration activity:

- Rollback testing
- Backward compatibility testing

Before carrying out this testing, it is essential for any tester to clearly understand the following points:

- The changes happening as part of the new system (server, front end, DB, schema, data flow, functionality, etc.)
- The actual migration strategy laid out by the team: how the migration happens, step by step
- The changes happening at the back end of the system, and the scripts responsible for those changes

Hence it is essential to carry out a complete study of the legacy and the new system, then plan and design the test cases and test scenarios to be covered as part of the above phases of testing, and prepare the testing strategy.

Data migration testing strategy

Designing the testing strategy for a migration includes a set of activities to be performed and a few aspects to be considered, in order to minimize the errors and risks that occur as a result of migration and to perform the migration testing effectively.

Activities in this testing:

#1) Specialized team formation: Form the testing team with members having the required knowledge and experience, and provide training related to the system that is being migrated.

#2) Business risk analysis: The current business should not be hampered after the migration; hence, hold business risk analysis meetings involving the right stakeholders (test manager, business analyst, architects, product owners, business owners etc.), and identify the risks and the actionable mitigations. The testing should include scenarios for uncovering those risks, and for verifying whether the proper mitigations have been implemented. Also carry out probable-error analysis using suitable error-guessing approaches, and then design tests around those errors to unearth them during testing.
#3) Migration scope analysis and identification: Analyze the clear scope of the migration testing: what needs to be tested, and when.

#4) Identify the appropriate tool for migration: While defining the strategy for this testing, automated or manual, identify the tools that are going to be used, e.g. an automated tool for comparing source and destination data.

#5) Identify the appropriate test environment for migration: Identify separate environments for the pre-migration and post-migration systems, to carry out any verification required as part of the testing. Understand and document the technical aspects of both the legacy system and the new system, to ensure that the test environment is set up accordingly.

#6) Migration test specification document and review: Prepare a migration test specification document which clearly describes the test approach, the areas of testing, the testing methods (automated, manual), the testing methodology (black box, white box testing techniques), the number of cycles of testing, the schedule of testing, the approach for creating data and for using live data (sensitive information needs to be masked), the test environment specification, testers' qualifications etc., and run a review session with the stakeholders.

#7) Production launch: Analyze and document the checklist of tasks for the production migration, and publish it well in advance.

Different phases of migration

The following are the different phases of migration.

Phase #1: Pre-migration testing

Before migrating the data, a set of testing activities is carried out as part of the pre-migration testing phase. This is ignored or not considered for simpler applications, but when complex applications are migrated, pre-migration activities are a must. Below is the list of actions taken during this phase:

- Set a clear scope of the data: what data has to be included, what data has to be excluded, which data needs transformations/conversions, etc.
- Perform data mapping between the legacy and the new application: for each type of data in the legacy application, compare its relevant type in the new application and map them (higher-level mapping). If the new application has a field that is mandatory, but that was not the case in the legacy, then ensure that the legacy data for that field is not null (lower-level mapping).
- Study the new application's data schema clearly: field names, types, minimum and maximum values, lengths, mandatory fields, field-level validations, etc.
- Note down the number of tables in the legacy system, and whether any tables are dropped or added, to be verified after the migration.
- Note down the number of records in each table, and in the views, in the legacy application.
- Study the interfaces in the new application and their connections. Data flowing through an interface should be highly secure and not broken.
- Prepare test cases, test scenarios and use cases for the new conditions in the new application.
- Execute a set of test cases and scenarios with a set of users, and keep the results and logs stored. These have to be verified after the migration, to ensure that the legacy data and functionality are intact.
- The count of data records should be noted down clearly, and verified after the migration to confirm there is no loss of data.

Phase #2: Migration testing

The "migration guide" prepared by the migration team has to be followed strictly when carrying out the migration activity. Ideally, the migration activity starts with the data backed up on tape, so that at any time the legacy system can be restored.

Verifying the documentation part of the migration guide is also part of data migration testing: check whether the document is clear and easy to follow. All scripts and steps have to be documented correctly, without any ambiguity. Any kind of documentation error, or mismatch in the order of execution of the steps, also needs to be considered important, so that it can be reported and fixed.

The migration scripts, guide and other information related to the actual migration need to be picked up from the version control repository for execution.

Noting the actual time for migration, from the point the migration starts until the system is successfully restored, is one of the test cases to be executed; hence the "time taken to migrate the system" has to be recorded in the final test report, which will be delivered as part of the migration test results. This information will be useful during the production launch. The downtime recorded in the test environment is extrapolated to calculate the approximate downtime on the live system.

It is the legacy system on which the migration activity will be carried out. During this testing, all the components of the environment are usually brought down and removed from the network in order to carry out the migration activities.
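The extrapolation from a test run to the production downtime estimate is simple arithmetic; the sketch below assumes throughput scales roughly linearly with record count, and the safety factor is an illustrative assumption to pad for verification time and surprises:

```python
def estimate_downtime_hours(test_records, test_hours, production_records,
                            safety_factor=1.25):
    """Extrapolate the production migration window from a test run.

    Assumes throughput (records per hour) scales roughly linearly with
    record count; the safety factor pads the estimate for verification
    steps and the unexpected.
    """
    throughput = test_records / test_hours  # records migrated per hour
    return (production_records / throughput) * safety_factor

# e.g. a dry run moved 200,000 records in 2 hours; production holds 1.5M
estimate = estimate_downtime_hours(200_000, 2, 1_500_000)
```

Real throughput often degrades non-linearly at production volumes (indexes, locking, I/O contention), so treating this linear estimate as a lower bound is the safer reading.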
Hence it is necessary to note the downtime required for the migration testing; ideally, it will be the same as the migration time.

Generally, the migration activity defined in the migration guide document includes:

- The actual migration of the application
- Modifying firewalls, ports, hosts, hardware and software configurations as per the new system onto which the legacy is being migrated
- Data leakage and security checks
- Verifying connectivity between all the components of the application

It is advisable for testers to verify the above at the back end of the system, or by carrying out white box testing.

Once the migration activity specified in the guide is completed, all the servers are brought up and the basic tests related to verifying a successful migration are carried out. These ensure that all the end-to-end systems are appropriately connected, that all the components are talking to each other, that the DB is up and running, and that the front end is communicating with the back end successfully. These tests need to be identified in advance and recorded in the migration test specification document.

There is a possibility that the software supports multiple platforms; in such a case, the migration needs to be verified on each of these platforms separately.

Verification of the migration scripts is part of migration testing. Sometimes an individual migration script is also verified using white box testing in a standalone testing environment, so migration testing becomes a combination of both white box and black box testing.

Once the migration-related verification is complete and the corresponding tests have passed, the team can proceed with the post-migration testing activity.

Phase #3: Post-migration testing

Once the application is migrated successfully, post-migration testing comes into the picture. Here, end-to-end system testing is performed in the testing environment. Testers execute the identified test cases, test scenarios and use cases with legacy data as well as with a new set of data. In addition to these, there are specific items to be verified in the migrated environments, which are listed below; all of these are documented as test cases and included in the "Test Specification" document.
- Check whether all the data in the legacy application has been migrated to the new application within the downtime that was planned. To ensure this, compare the number of records between the legacy and the new application for each table and view in the database. Also, report the time taken to move, say, 10,000 records.
- Check whether all the schema changes (fields and tables added or removed) as per the new system are updated.
- Data migrated from the legacy to the new application should retain its value and format unless specified otherwise. To ensure this, compare data values between the legacy and the new application's databases.
- Test the migrated data against the new application, covering the maximum number of possible cases. To ensure 100% coverage of data migration verification, use an automated testing tool.
- Check for database security.
- Check for data integrity for all possible sample records.
- Check and ensure that the functionality supported in the legacy system works as expected in the new system.
- Check the data flow within the application across most of the components. The interfaces between the components should be extensively tested, as the data should not be modified, lost or corrupted while going through the components. Integration test cases can be used to verify this.
- Check for redundancy of legacy data: no legacy data should be duplicated during the migration.
- Check for data mismatch cases, such as a changed data type, a changed storage format, etc.
- All the field-level checks in the legacy application should be covered in the new application as well.
- Any data added in the new application should not reflect back on the legacy.
- Updating the legacy application's data through the new application should be supported. Once updated in the new application, it should not reflect back on the legacy.
- Deleting the legacy application's data in the new application should be supported. Once deleted in the new application, it should not delete the data in the legacy as well.
- Verify that the changes made to the legacy system support the new functionality delivered as part of the new system.
- Verify that users from the legacy system can continue to use both the old functionality and the new functionality, especially where changes are involved. Execute the test cases and compare against the test results stored during the pre-migration testing.
- Create new users on the system and carry out tests to ensure that functionality from the legacy as well as from the new application supports the newly created users and works fine.
- Carry out functionality-related tests with a variety of data samples (different age groups, users from different regions, etc.).
- It is also required to verify whether "feature flags" are enabled for the new features, and that switching them on/off turns the features on and off.
- Performance testing is important to ensure that the migration to the new system/software has not degraded the performance of the system. It is also required to carry out load and stress tests to ensure system stability.
- Verify that the software upgrade has not opened up any security vulnerabilities; carry out security testing, especially in the areas where changes were made to the system during the migration.
- Usability is another aspect to be verified: if the GUI layout/front-end system or any functionality has changed, what is the ease of use that the end user experiences compared with the legacy system?

Since the scope of post-migration testing becomes very large, it is ideal to segregate the important tests that need to be done first, to qualify that the migration is successful, and then carry out the remaining tests later. It is also advisable to automate the end-to-end functional test cases and other possible test cases, so that the testing time can be reduced and the results are available quickly.
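The field-by-field value comparison called for in the checks above can be sketched as follows; the records and the key field below are illustrative, and in practice the rows would be fetched from the legacy and new databases:

```python
def diff_records(legacy_rows, new_rows, key_field):
    """Field-by-field comparison of legacy and migrated records.

    Both inputs are lists of dicts sharing a unique key field. Returns
    {key: {field: (legacy_value, new_value)}} for every field that
    differs; a record missing from the new system shows all its fields
    paired with None.
    """
    new_by_key = {r[key_field]: r for r in new_rows}
    diffs = {}
    for row in legacy_rows:
        other = new_by_key.get(row[key_field], {})
        changed = {
            f: (v, other.get(f))
            for f, v in row.items()
            if f != key_field and other.get(f) != v
        }
        if changed:
            diffs[row[key_field]] = changed
    return diffs

legacy_rows = [{"id": 1, "status": "OPEN"}, {"id": 2, "status": "CLOSED"}]
new_rows = [{"id": 1, "status": "OPEN"}, {"id": 2, "status": "open"}]
```

Unlike a bare count reconciliation, this pinpoints which record and which field changed, which is the detail needed to chase a root cause.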
A few tips for testers writing the test cases for post-migration execution:

- When the application is migrated, it does not mean that test cases have to be written for a whole new application. The test cases already designed for the legacy should still hold good for the new application. So, as far as possible, reuse the old test cases and convert the legacy test cases into the new application's cases wherever required.
- If there is any feature change in the new application, then the test cases related to that feature should be modified.
- If there is any new feature added in the new application, then new test cases should be designed for that particular feature.
- When a feature is dropped in the new application, the related legacy application's test cases should not be considered for post-migration execution; they should be marked as not valid and set apart.
- Test cases designed should always be reliable and consistent in terms of usage. Verification of critical data should be covered in test cases so that it is not missed during execution.
- When the design of the new application differs from that of the legacy (UI), then the UI-related test cases should be modified to adapt to the new design. The decision either to update the old test cases or to write new ones can be taken by the tester based on the volume of change involved.

Backward compatibility testing

Migration of the system also calls for the testers to verify backward compatibility, wherein the new system introduced is compatible with the old system (at least the two previous versions) and functions perfectly with those versions. Backward compatibility testing is to ensure:

- Whether the new system supports the functionality supported in the earlier two versions, along with the new functionality.
- Whether the system can be migrated successfully from the earlier two versions without any hassle.
Hence it is essential to ensure the backward compatibility of the system by specifically carrying out tests that exercise this support. The backward-compatibility tests need to be designed and included in the Test Specification document for execution. Rollback Testing. In case of any issues while carrying out the migration, or if there is a migration failure at any point during migration, it should be possible for the system to roll back to the legacy system and resume functioning quickly, without impacting the users or the functionality supported earlier. To verify this, migration-failure test scenarios need to be designed as part of negative testing, and the rollback mechanism needs to be tested. The total time required to fall back to the legacy system also needs to be recorded and reported in the test results. After rollback, the main functionality tests and the (automated) regression suite should be run to ensure that the migration has not impacted anything and that the rollback successfully restored the legacy system. Migration Test Summary Report. The test summary report should be produced after testing completes and should summarize the various tests/scenarios carried out during each phase of the migration, with result status (pass/fail) and the test logs. The time recorded for the following activities should be clearly reported: total time for migration; downtime of the applications; time spent migrating 10,000 records; time spent on rollback. In addition to the above information, any observations/recommendations can also be reported. Challenges in Data Migration Testing. The challenges faced in this testing are mainly with data. A few are listed below: #1) Data quality: We may find that the data used in the legacy application is of poor quality in the new/upgraded application. In such cases, data quality has to be improved to meet business standards.
Factors like assumptions, data conversions after migration, invalid data entered in the legacy application itself, poor data analysis, etc. lead to poor data quality. This results in high operational costs, increased data-integration risks, and deviation from the purpose of the business. #2) Data mismatch: Data migrated from the legacy to the new/upgraded application may be found to mismatch in the new one. This may be due to a change in data type or data storage format, or because the purpose for which the data is used has been redefined. This results in a huge effort to either correct the mismatched data or accept it and tweak it to that purpose. #3) Data loss: Data might be lost while migrating from the legacy to the new/upgraded application, in either mandatory or non-mandatory fields. If the lost data is in non-mandatory fields, the record is still valid and can be updated again. But if a mandatory field's data is lost, the record itself becomes void, and the data has to be retrieved either from the backup database or from the audit logs, if captured correctly. #4) Data volume: Huge data can require more time to migrate than the downtime window of the migration activity allows. E.g., scratch cards in the telecom industry, users on an Intelligent Network platform, etc. Here the challenge is that by the time the legacy data is cleared, a large amount of new data will have been created, which needs to be migrated again. Automation is the solution for huge data migration. #5) Simulation of a real-time environment (with the actual data): Simulation of a real-time environment in the testing lab is another real challenge, where testers run into different kinds of issues with the real data and the real system that are not encountered during testing with samples. So data sampling, replication of the real environment, and identification of the volume of data involved in the migration are quite important while carrying out data Migration Testing.
#6) Simulation of the volume of data: Teams need to study the data in the live system very carefully and come up with a typical analysis and sampling of the data (e.g., users in the age group below 10 years, 10-30 years, etc.). As far as possible, data from the live system should be obtained; if not, data creation needs to be done in the testing environment. Automated tools should be used to create a large volume of data, and extrapolation can be applied wherever the volume cannot be simulated. Tips to Reduce Data Migration Risks. Given below are a few tips for reducing data migration risks: Standardize the data used in the legacy system, so that when migrated, standard data is available in the new system. Enhance the quality of the data, so that when migrated, there is qualitative data to test, giving the feel of testing as an end user. Clean the data before migrating, so that duplicate data is not carried into the new system and the entire system stays clean. Recheck the constraints, stored procedures, and complex queries that must yield accurate results, so that after migration, correct data is returned in the new system as well. Identify the correct automation tool to perform data checks/record checks in the new system in comparison with the legacy one. Conclusion. Considering the complexity involved in carrying out data Migration Testing, and keeping in mind that a small miss in any aspect of verification during testing can lead to the risk of migration failure in production, it is very important to carry out a careful and thorough study and analysis of the system before and after migration. Plan and design an effective migration strategy with robust tools and skilled, trained testers.
As migration has a huge impact on the quality of the application, a good amount of effort must be put in by the entire team to verify the entire system in all aspects (functionality, performance, security, usability, availability, reliability, compatibility, etc.), which in turn will ensure successful 'Migration Testing'. The 'Different types of Migrations' that typically happen in reality, and the ways to handle their testing, will be explained briefly in the next tutorial in this series. About the Authors: This guide is written by STH author Nandini, who has 7+ years of experience in software testing. Thanks also to STH author Gayathri S. for reviewing and providing her valuable suggestions for improving this series; Gayathri has 18+ years of experience in software development and testing services. Let us know your comments/suggestions about this tutorial.
2 SQL Developer: Migrating Third-Party Databases. Migration is the process of copying the schema objects and data from a source MySQL or third-party (non-Oracle) database, such as Microsoft SQL Server, Sybase Adaptive Server, Microsoft Access, or IBM DB2 (UDB), to Oracle Database. You can perform the migration in an efficient, largely automated way. Thus, you have two options for working with databases other than Oracle Database in SQL Developer: Creating database connections so that you can view schema objects and data in these databases. Migrating these databases to Oracle, to take advantage of the full range of Oracle Database features and capabilities. This topic contains the following topics: 2.1 Migration: Basic Options and Steps. To migrate all or part of a third-party database to Oracle, you have the following basic options: However, before you perform any migration actions, you may want to prepare by setting any appropriate Migration user preferences (such as date and timestamp masks and Is Quoted Identifier On?) and by reading relevant topics in Section 2.2, "Migration: Background Information and Guidelines". After you migrate by using the wizard or by copying tables to Oracle, verify that the results are what you expected. For a description of the user interface for database migrations, see Section 2.3, "SQL Developer User Interface for Migration".
For a walk-through of a typical migration, go to the sqldeveloper\sqldeveloper\bin folder and enter the following command: 2.1.1 Migrating Using the Migration Wizard. The Migration wizard provides convenient, comprehensive guidance through the actions that can be involved in database migration (capturing the source database, converting it to Oracle format, generating DDL to perform the conversion, and so on). This is the recommended approach when performing a migration: you can resolve issues during these phases, and you can then inspect or modify objects to suit your needs. The migration wizard is invoked in a variety of contexts, such as when you right-click a third-party database connection and select Migrate to Oracle, or when you click Tools, then Migration, then Migrate. Sometimes the wizard is invoked at a page other than the first step. On all pages except the last, enabling Proceed to Summary Page causes Next to go to the Summary page. The Repository page of the wizard requires that you specify the database connection for the migration repository to be used. The migration repository is a collection of schema objects that SQL Developer uses to manage metadata for migrations. If you do not already have a migration repository and a database connection to the repository, create them as follows: Create an Oracle user named MIGRATIONS with default tablespace USERS and temporary tablespace TEMP; and grant it at least the RESOURCE role and the CREATE SESSION, CREATE VIEW, and CREATE MATERIALIZED VIEW privileges. (For multischema migrations, you must grant the RESOURCE role with the ADMIN option; and you must also grant this user the CREATE ROLE, CREATE USER, and ALTER ANY TRIGGER privileges, all with the ADMIN option.) Create a database connection named Migration_Repository that connects to the MIGRATIONS user. Right-click the Migration_Repository connection, and select Migration Repository, then Associate Migration Repository to create the repository.
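The repository-user setup described above can be scripted rather than typed interactively. The sketch below only assembles the documented statements as strings; the password is a placeholder, and the statements would be run via SQL*Plus or a driver such as python-oracledb against the target instance.

```python
def migrations_user_ddl(multischema=False):
    """Build the DDL for the MIGRATIONS repository user described above."""
    ddl = [
        # <password> is a placeholder to be replaced before running.
        "CREATE USER MIGRATIONS IDENTIFIED BY <password> "
        "DEFAULT TABLESPACE USERS TEMPORARY TABLESPACE TEMP",
        "GRANT CREATE SESSION, CREATE VIEW, CREATE MATERIALIZED VIEW TO MIGRATIONS",
    ]
    if multischema:
        # Multischema migrations need stronger privileges, all WITH ADMIN OPTION.
        ddl.append("GRANT RESOURCE TO MIGRATIONS WITH ADMIN OPTION")
        ddl.append("GRANT CREATE ROLE, CREATE USER, ALTER ANY TRIGGER "
                   "TO MIGRATIONS WITH ADMIN OPTION")
    else:
        ddl.append("GRANT RESOURCE TO MIGRATIONS")
    return ddl

for stmt in migrations_user_ddl():
    print(stmt + ";")
```

Keeping the two variants in one function makes it harder to forget the extra ADMIN-option grants when a single-schema project later grows into a multischema migration.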
If you do not already have a database connection to the third-party database to be migrated, create one. (For migrations other than from Microsoft Access, you should set the third-party JDBC driver preference before creating the connection.) For example, create a database connection named Sales_Sybase to a Sybase database named sales. Connection: The database connection to the migration repository to be used. Truncate: If this option is enabled, the repository is cleared (all data from previous migrations is removed) before any data for the current migration is created. The Project page of the wizard specifies the migration project for this migration. A migration project is a container for migration objects. New lets you create a new project, or Existing lets you select from a list of existing projects. Name: Name to be associated with this migration project. Description: Optional descriptive comments about the project. Output Directory: The directory or folder in which all scripts generated by the migration wizard will be placed. Enter a path or click Choose to select the location. The Source Database page of the wizard specifies the third-party database to be migrated. Mode: Online causes the migration to be performed by SQL Developer when you have completed the necessary information in the wizard; Offline causes SQL Developer to perform the migration using a file (the Offline Capture Source File) that you specify. Connection (Online mode): The database connection to the third-party database to be migrated. To add a connection to the list, click the Add (+) icon; to edit the selected connection, click the Edit (pencil) icon. Available Source Platforms (Online mode): List of third-party databases that you can migrate.
If the desired platform is not listed, you probably need the appropriate JDBC driver, which you can get by clicking Help, then Check for Updates, or by clicking the Add Platform link and adding the necessary entry on the Database: Third Party JDBC Drivers preferences page. Offline Capture Source File (Offline mode): The .ocp file (or .xml file for a Microsoft Access migration). This is a file that you previously created by clicking Tools, then Migration, then Create Database Capture Scripts. Cannot Connect error: If you receive the Cannot Connect error, this means that the .ocp file that is normally in the generated Offline Capture data is not present to identify the type of database, and therefore SQL Developer cannot select the appropriate plugin to perform the conversion. Ensure that the correct, valid .ocp file is present. The Capture page of the wizard lets you specify the database or databases (of the platform that you specified) to be migrated. Select the desired items under Available Databases, and use the arrow icons to move them individually or collectively to Selected Databases. The Convert page of the wizard lets you examine and modify, for each data type in the source database, the Oracle Database data type to which columns of that source type will be converted in the migrated database. For each source data type entry, the possible Oracle Data Type values reflect the valid possible mappings (which might be only one). Add New Rule: Lets you specify mappings for other source data types. Edit Rule: Lets you modify the mapping for the selected source data type. Advanced Options: Displays the Migration: Identifier Options preferences page. The Translate page of the wizard lets you specify the SQL objects to be translated. Select the desired items under Available SQL Objects, and use the arrow icons to move them individually or collectively to Selected SQL Objects.
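The Convert page's behavior can be pictured as a simple lookup with per-project overrides. The mappings below are illustrative examples only, not SQL Developer's actual default rules (which the Convert page displays for the specific source platform); Add New Rule and Edit Rule correspond to the overrides parameter.

```python
# Illustrative source-to-Oracle type mappings; a real migration would use the
# default rules shown on the Convert page for the source platform in question.
DEFAULT_RULES = {
    "VARCHAR": "VARCHAR2",
    "DATETIME": "TIMESTAMP",
    "INT": "NUMBER(10)",
    "TEXT": "CLOB",
}

def convert_column(source_type, overrides=None):
    """Resolve a source column type via overrides first, then default rules."""
    rules = {**DEFAULT_RULES, **(overrides or {})}
    try:
        return rules[source_type.upper()]
    except KeyError:
        raise ValueError(f"no mapping rule for {source_type!r}") from None

print(convert_column("datetime"))                    # TIMESTAMP
print(convert_column("int", {"INT": "NUMBER(19)"}))  # NUMBER(19)
```

Reviewing the full rule set before generation, rather than accepting every default, is exactly what the Convert page exists for: a single wrong mapping (say, a wide integer squeezed into NUMBER(10)) surfaces later as the data-mismatch problems described in the testing tutorial above.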
The Target Database page of the wizard specifies the Oracle database to which the third-party database or databases will be migrated. Mode: Online causes the migration to be performed by SQL Developer when you have completed the necessary information in the wizard; Offline causes SQL Developer to generate scripts after you have completed the necessary information in the wizard, and you must later run those scripts to perform the migration. Connection: The database connection to the Oracle Database user into whose schema the third-party database or databases are to be migrated. To add a connection to the list, click the Add (+) icon; to edit the selected connection, click the Edit (pencil) icon. Generated Script Directory: The directory or folder in which migration script files will be generated (derived based on your previous entry for the project Output Directory). Drop Target Objects: If this option is enabled, any existing database objects in the target schema are deleted before the migration is performed (thus ensuring that the migration will be into an empty schema). Advanced Options: Displays the Migration: Generation Options preferences page. The Move Data page of the wizard lets you specify options for moving table data as part of the migration. Moving the table data is independent of migrating the table definitions (metadata). Note that if you do not want to move the table data, you can specify the mode as Offline and then simply not run the scripts for moving the data. Mode: Online causes the table data to be moved by SQL Developer when you have completed the necessary information in the wizard; Offline causes SQL Developer to generate scripts after you have completed the necessary information in the wizard, and you must later run those scripts if you want to move the data. (Online moves are convenient for moving small data sets; offline moves are useful for moving large volumes of data.)
Connections for online data move: The Source and Target connections for the third-party and Oracle connections, respectively. To add a connection to either list, click the Add (+) icon; to edit the selected connection, click the Edit (pencil) icon. Truncate Data: If this option is enabled, any existing data in a target (Oracle) table that has the same name as the source table is deleted before the data is moved. If this option is not enabled, any data from a source table with the same name as the corresponding target (Oracle) table is appended to any existing data in the target table. The Summary page of the wizard provides a summary of your specifications for the project, repository, and actions, in an expandable tree format. If you want to make any changes, go back to the relevant wizard page. To perform the migration actions that you have specified, click Finish. 2.1.2 Copying Selected Tables to Oracle. To copy one or more tables from a third-party database to an Oracle database, you can select the third-party tables and use the Copy to Oracle feature. With this approach, you do not need to create or use a migration repository, or to capture and convert objects. Note that this approach does not perform a complete migration. It only lets you copy the table, and optionally the table data, from the third-party database to an Oracle database. It does not migrate or re-create primary and foreign key definitions and most constraints. (Any UNIQUE constraints or default values are not preserved in the copy. NOT NULL constraints are preserved in most cases, but not for Microsoft Access tables.) The approach also does not consider any non-table objects, such as procedures. In addition, this approach supports autoincrement columns only if the INCREMENT BY value is 1, and if the sequence starts at 1 or is adjusted to MAX VAL + 1 at the first call to the trigger. If these restrictions are acceptable, this approach is fast and convenient.
For example, many Microsoft Access database owners only need the basic table definitions and the data copied to an Oracle database, after which they can add keys and constraints in the Oracle database using SQL Developer. To copy selected tables, follow these steps: Create and open a database connection for the third-party database. (For migrations other than from Microsoft Access, you should set the third-party JDBC driver preference before creating the connection.) For example, create a database connection named Sales_Access to a Microsoft Access database named sales.mdb, and connect to it. In the Connections navigator, expand the display of Tables for the third-party database connection, and select the table or tables to be migrated. To select multiple tables, use the standard method for individual and range selections (using the Ctrl and Shift keys) as appropriate. Right-click and select Copy to Oracle. In the Choose Database for Copy to Oracle dialog box, select the appropriate entries: Destination Database Name: Database connection to use for copying the selected tables into the Oracle database. (Only Oracle Database connections are shown for selection.) Include Data: If this option is enabled, any data in the table in the third-party database is copied to the new table after it is created in the Oracle database. If this option is not enabled, the table is created in the Oracle database but no data is copied. If Table Exists: Specifies what happens if a table with the same name as the one to be copied already exists in the destination Oracle database: Indicate Error generates an error and does not perform the copy; Append adds the rows from the copied table to the destination Oracle table; Replace replaces the data in the destination Oracle table with the rows from the copied table.
Note that if the two tables with the same name do not have the same column definitions and if Include Data is specified, the data may or may not be copied, depending on whether the source and destination column data types are compatible. To perform the copy operation, click Apply. If a table with the same name as the one to be copied already exists in the destination Oracle database, then: If the two tables do not have the same column definitions, the copy is not performed. If the two tables have the same column definitions and if Include Data was specified, the data is appended (that is, the rows from the table to be copied are inserted into the existing Oracle table). 2.2 Migration: Background Information and Guidelines. The following topics provide background information and guidelines that are helpful in planning for a database migration: 2.2.1 Overview of Migration. An Oracle database provides you with better scalability, reliability, increased performance, and better security than third-party databases. For this reason, organizations migrate from their current database, such as Microsoft SQL Server, Sybase Adaptive Server, Microsoft Access, or IBM DB2, to an Oracle database. Although database migration can be complicated, SQL Developer enables you to simplify the process of migrating a third-party database to an Oracle database. SQL Developer captures information from the source database and displays it in the captured model, which is a representation of the structure of the source database. This representation is stored in a migration repository, which is a collection of schema objects that SQL Developer uses to store migration information. The information in the repository is used to generate the converted model, which is a representation of the structure of the destination database as it will be implemented in the Oracle database.
You can then use the information in the captured model and the converted model to compare database objects, identify conflicts with Oracle reserved words, and manage the migration progress. When you are ready to migrate, you generate the Oracle schema objects, and then migrate the data. SQL Developer contains logic to extract data from the data dictionary of the source database, create the captured model, and convert the captured model to the converted model. Using SQL Developer to migrate a third-party database to an Oracle database provides the following benefits: Reduces the effort and risks involved in a migration project. Enables you to migrate an entire third-party database, including triggers and stored procedures. Enables you to see and compare the captured model and converted model and to customize each if you wish, so that you can control how much automation there is in the migration process. 2.2.1.1 Migration Implemented as SQL Developer Extensions. Migration support is implemented in SQL Developer as a set of extensions. If you want, you can disable migration support or support for migrating individual third-party databases. To view the installed extensions, and to enable or disable individual extensions, click Tools, then Preferences, then Extensions. Note that SQL Developer ships with all extensions and third-party database "plugins" available at the time of release, so to begin migrations other than for Microsoft Access, only the third-party drivers need be installed. 2.2.2 Preparing a Migration Plan. This topic describes how to create a migration plan. It identifies the sections to include in the migration plan, describes how to determine what to include in each section, and explains how to avoid the risks involved in a migration project. This information includes: 2.2.2.1 Task 1: Determining the Requirements of the Migration Project.
In this task, you identify which databases you want to migrate and the applications that access those databases. You also evaluate the business requirements and define testing criteria. To determine the requirements of the migration project: Define the scope of the project. There are several choices you must make about the third-party database and the applications that access it in order to define the scope of the migration project. To obtain a list of migration issues and dependencies, you should consider the following questions. What third-party databases are you migrating? What is the version of the third-party database? What is the character set of the third-party database? What source applications are affected by migrating the third-party database to an Oracle database? What is the third-party application language? What version of the application language are you using? In the scope of the project, you should have identified the applications you must migrate. Ensure that you have included all the necessary applications that are affected by migrating the database. What types of connectivity issues are involved in migrating to an Oracle database? Do you use connectivity software to connect the applications to the third-party database? Do you need to modify the connectivity software to connect the applications to the Oracle database? What version of the connectivity software do you use? Can you use this same version to connect to the Oracle database? Are you planning to rewrite the applications or modify the applications to work with an Oracle database? Use Table 2-1 to determine whether you have a complex or simple source database environment. Identify the requirements based on the specific scenario. If the migration project is a simple scenario, you may not have to complete all possible migration tasks. You make decisions based on your specific environment.
For example, if you have a complex scenario, you may require extra testing based on the complexity of the application accessing the database. Table 2-1 Complex and Simple Scenarios. Complex scenario, involving more than one of the following: Large database (greater than 25 GB); Large applications (more than 100 forms, reports, and batch jobs); Database used by multiple lines of business; Large user base (more than 100); High availability requirement (such as a 24 x 7 x 365 environment). Simple scenario, involving the following: Small database (less than 25 GB); Simple online transaction processing (OLTP); Small application (less than 100 forms, reports, and batch jobs); Database used by one department; Small user base (less than 100); Average availability (business hours). Determine whether the destination database requires additional hardware and rewriting of backup schedules. Define testing and acceptance criteria. Define tests to measure the accuracy of the migration. You then use the acceptance criteria to determine whether the migration was successful. The tests that you develop from the requirements should also measure stability, evaluate performance, and test the applications. You must decide how much testing is necessary before you can deploy the Oracle database and applications into a production environment. Create a requirements document with a list of requirements for the migration project. The requirements document should have clearly defined tasks and number each specific requirement, breaking these into sub-requirements where necessary. 2.2.2.2 Task 2: Estimating Workload. In this task, you use SQL Developer to make calculated decisions on how much of the work can be automated and how much is manual. To estimate the workload: Capture the source database into the captured model, create the converted model, and migrate to the destination database. You can analyze the source database through the captured model and preview the destination database through the converted model.
After you have captured the source database, analyze the captured data contained in the captured model and the converted model. Ensure the content and structure of the migration repository are correct, and determine how much time the entire process takes. Use the Migration Log pane to evaluate the capture and migration process, categorize the total number of database objects, and identify the number of objects that can be converted and migrated automatically. The migration log provides information about the actions that have occurred and records any warnings and errors. The logs identify the changes that have been made to the converted model so that you can evaluate whether you should make changes to the applications that access the destination database. Evaluate and categorize the issues that occurred. The migration log can help by providing information about: Tables that did not load when you captured the source database. Stored procedures, views, and triggers that did not parse when you created the converted model. Syntax that requires manual intervention. Database objects that were not created successfully when you migrated the destination database. Data that did not migrate successfully when you migrated the destination database. For each error or warning in the migration log, evaluate the following: Number of times the issue occurred. Time required to fix the issue, in person-hours. Number of resources required to fix the issue. After you have solved a complex problem, it should be easier and quicker to resolve it the next time you encounter the same problem. 2.2.2.3 Task 3: Analyzing Operational Requirements. In this task, you analyze the operational requirements, as follows: Evaluate the operational considerations in migrating the source database to a destination database. Consider the following questions: If the scope of the migration project is a complex scenario as defined in Table 2-1, Oracle recommends that you answer all of these questions.
If you have a simple scenario, determine the answers to the most appropriate questions. What backup and recovery changes do you require? What downtime is required during the migration? Have you met the performance requirements? Are you changing the operational time window? What effect does the downtime have on the business? What training requirements or additional staff considerations are required? Is it necessary to have the third-party and the Oracle database running simultaneously? For each task, determine the resources and time required to complete it. Create an initial project plan. Use the information that you have gathered during the requirements and planning stage to develop an initial project plan. 2.2.2.4 Task 4: Analyzing the Application. In this task, you identify the users of the applications that run on the source database, the hardware the applications require, what the applications do, and how they interface with the source database. You also analyze the method the application uses to connect to the database and identify necessary modifications. If the migration project is a complex scenario as defined in Table 2-1, Oracle recommends that you consider all of the following items. If you have a simple scenario, consider the most relevant items. To analyze the application: Determine whether changes to the applications are required to make them run effectively on the destination database. If changes are required, determine whether it is more efficient to rewrite or modify the applications. If you are rewriting the application to use the Oracle database, consider the following: Create the necessary project documentation to rewrite the application. For example, you need a design specification and requirements documentation. Rewrite the application according to the specification. Test that the application works against the Oracle database.
If you are modifying the application to use the Oracle database, consider the following: Identify the number of connections to the database that are in the application and modify these connections to use the Oracle database. You may need to change the connection information to use an ODBC or JDBC connection. Identify the embedded SQL statements that you need to change in the application before you can test it against the Oracle database. Test the application using the Oracle database. Allocate time and resources to address each issue associated with rewriting or modifying the application. Update the general requirements document for the project that you created in Task 1.

2.2.2.5 Task 5: Planning the Migration Project.

In this task, you evaluate the unknown variables that the migration project may contain, such as the difference in the technologies of the source database and the destination database. During the planning stage, you: Estimate the budget constraints of the project. Gather information to produce a migration plan. Estimate how much time the migration project should take. Calculate how many resources are required to complete and test the migration. To plan a migration project: Define a list of tasks required to successfully complete the migration project requirements of Task 1. Categorize the list of tasks required to complete the migration project. You should group these tasks according to your business; this allows you to schedule and assign resources more accurately. Update and finalize the migration project plan based on the information that you have obtained from Task 3 and Task 4. Make sure the migration project plan meets the requirements of the migration project. The migration plan should include a project description, resources allocated, training requirements, migration deliverables, general requirements, environment analysis, risk analysis, application evaluation, and a project schedule.

2.2.3 Before You Start Migrating: General Information.
You may need to perform certain tasks before you start migrating a third-party database to an Oracle database. See the following for more information: See also any information specific to the source database that you will be migrating, as explained in Section 2.2.4. SQL Developer does not migrate grant information from the source database. The Oracle DBA must adjust (as appropriate) user, login, and grant specifications after the migration. Oracle recommends that you make a complete backup of the source database before starting the migration. For more information about backing up the source database, see the documentation for that type of database. If possible, begin the migration using a development or test environment, not a production database.

2.2.3.1 Creating a Database User for the Migration Repository.

SQL Developer requires a migration repository to migrate a third-party database to an Oracle database. To use an Oracle database for the migration repository, you must have access to that database using a database user account. Oracle recommends that you use a specific user account for migrations. For example, you may want to create a user named MIGRATIONS, create a database connection to that user, and use that connection for the migration repository; if you wish, you can later delete the MIGRATIONS user to remove all traces of the migration from the database. When you create a user for migrations, specify the tablespace information as in the following example, instead of using the defaults for tablespaces: Do not use a standard account (for example, SYSTEM) for migration. When SQL Developer creates a migration repository, it creates many schema objects that are intended only for its own use. For example, it creates tables, views, indexes, packages, and triggers, many with names starting with MD_ and MIGR. You should not directly modify these objects or any data stored in them.

2.2.3.2 Requirements for Creating the Destination Oracle Objects.
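Returning to the repository user of Section 2.2.3.1 above: the tablespace example referred to there is not reproduced in this text. A minimal sketch of what it might look like (the user name, password, and tablespace names are illustrative and should be adapted to your database):

```sql
-- Sketch only: a dedicated repository user with explicit tablespace
-- settings instead of the defaults (names and password illustrative).
CREATE USER MIGRATIONS IDENTIFIED BY migrations
  DEFAULT TABLESPACE users
  TEMPORARY TABLESPACE temp
  QUOTA UNLIMITED ON users;

GRANT CONNECT, RESOURCE, CREATE VIEW TO MIGRATIONS;
```

Dropping this user later (DROP USER MIGRATIONS CASCADE) removes the repository objects, such as the MD_ and MIGR tables mentioned above, along with it.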
The user associated with the Oracle database connection used to perform the migration (that is, to run the script containing the generated DDL statements) must have the following roles and privileges: You must grant these privileges directly to a user account. Granting the privileges to a role, which is subsequently granted to a user account, does not suffice. You cannot migrate a database as the user SYS. For example, you can create a user called migrations with the minimum privileges required to migrate a database by using the following commands: After you have created the converted model and generated the first DDL for the new database, the generated scripts will make clear which privileges are required for your situation.

2.2.4 Before You Start Migrating: Source-Specific Information.

Depending on the third-party database that you are migrating to an Oracle database, you may have to configure connection information and install drivers. For more information about specific third-party database requirements, see the following:

2.2.4.1 Before Migrating From IBM DB2.

To configure an IBM DB2 database for migration: Ensure that the source database is accessible by the IBM DB2 database user that is used by SQL Developer for the source connection. This user must be able to see any objects to be captured in the IBM DB2 database; objects that the user cannot see are not captured. For example, if the user can execute a stored procedure but does not have sufficient privileges to see the source code, the stored procedure cannot be captured. Ensure that you can connect to the IBM DB2 database from the system where you have installed SQL Developer. Ensure that you have downloaded the db2jcc.jar and db2jcc_license_cu.jar files from IBM. In SQL Developer, do the following: Click Tools, then Preferences, then Database, then Third Party JDBC Drivers. Click Add Entry. Select the db2jcc.jar file. Repeat steps b through d for the db2jcc_license_cu.jar file.
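The commands referred to in Section 2.2.3.2 above are not reproduced in this text. As noted there, the exact privilege list only becomes clear after the first DDL generation and must be granted directly, not through a role; the following is a hedged sketch under those assumptions (the privilege set is illustrative and should be extended as your generated scripts require):

```sql
-- Sketch only: privileges granted directly to the user, not via a role,
-- because role-based grants do not suffice for running the generated DDL.
CREATE USER migrations IDENTIFIED BY migrations
  DEFAULT TABLESPACE users TEMPORARY TABLESPACE temp
  QUOTA UNLIMITED ON users;

GRANT CREATE SESSION,
      CREATE TABLE,
      CREATE VIEW,
      CREATE SEQUENCE,
      CREATE PROCEDURE,
      CREATE TRIGGER
  TO migrations;
```

If the generated DDL creates objects in other schemas, additional system privileges (for example, CREATE ANY TABLE) may be needed; check the generated scripts before granting anything broader.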
2.2.4.2 Before Migrating From Microsoft SQL Server or Sybase Adaptive Server.

To configure a Microsoft SQL Server or Sybase Adaptive Server database for migration: Ensure that the source database is accessible by the Microsoft SQL Server or Sybase Adaptive Server user that is used by SQL Developer for the source connection. This user must be able to see any objects to be captured in the Microsoft SQL Server or Sybase Adaptive Server database; objects that the user cannot see are not captured. For example, if the user can execute a stored procedure but does not have sufficient privileges to see the source code, the stored procedure cannot be captured. Ensure that you can connect to the Microsoft SQL Server or Sybase Adaptive Server database from the system where you have installed SQL Developer. Ensure that you have downloaded the JTDS JDBC driver from jtds.sourceforge/ . In SQL Developer, if you have not already installed the JTDS driver using Check for Updates (on the Help menu), do the following: Click Tools, then Preferences, then Database, then Third Party JDBC Drivers. Click Add Entry. Select the jar file for the JTDS driver you downloaded from jtds.sourceforge/ . In SQL Developer, click Tools, then Preferences, then Migration: Identifier Options, and ensure that the setting is correct for the Is Quoted Identifier On option (that is, that the setting reflects the database to be migrated). If this option is enabled, quotation marks (double-quotes) can be used to refer to identifiers; if this option is not enabled, quotation marks identify string literals. As an example of the difference in behavior, consider the following T-SQL code: If the Is Quoted Identifier On option is enabled (checked), the following PL/SQL code is generated: If the Is Quoted Identifier On option is disabled (not checked), the following PL/SQL code is generated:

2.2.4.3 Before Migrating From Microsoft Access.
To configure a Microsoft Access database for migration: Make backup copies of the database file or files. Ensure that the necessary software (Microsoft Access, perhaps other components) is installed on the same system as SQL Developer. Ensure that the Admin user has at least Read Design and Read Data permissions on the MSysObjects, MSysQueries, and MSysRelationships system tables, as explained in the information about the Access tab in the Create/Edit/Select Database Connection dialog box. If security is enabled, turn it off by copying the contents of the secured database into a new database, as follows. (SQL Developer does not support the migration of Microsoft Access databases that have security enabled. By default, SQL Developer uses the name of the Microsoft Access MDB file as the user name for the destination Oracle user; if you create an Oracle user in this way, the password is ORACLE.) From the File menu in Microsoft Access, select New Database. Select the Blank Database icon, then click OK. In the File New Database option, type a name for the database, then click Create. From the File menu within the new database, select Get External Data, then select Import. Select the secured Microsoft Access database that you want to import, then click Import. From the Import Objects dialog, click Options. Select the Relationships and Definition and Data options. From the Tables tab, choose Select All. All Microsoft Access objects are copied over to the new Microsoft Access database, except for the security settings. If the application contains linked tables to other Microsoft Access databases, refresh these links by opening the application in Microsoft Access and performing the following: From the Tools menu in Microsoft Access 97, select Add Ins, then select Linked Table Manager. From the Tools menu in Microsoft Access 2000, select Database Utilities, then select Linked Table Manager.
Ensure that the Microsoft Access database is not a replica database, but a master database. When you use the Exporter for Microsoft Access to export, an error message is displayed if the database is a replica; SQL Developer does not support the migration of a replica database. From the Tools menu within Microsoft Access, select Database, then select Compact Database to compact the Microsoft Access database files. Ensure that the Microsoft Access database file is accessible from the system where you have installed SQL Developer. Use the Oracle Universal Installer to verify that you have the Oracle ODBC driver installed. If you need to install the driver, it is available on the Oracle Database Server or Database Client CD. You can also download the Oracle ODBC driver from the Oracle Technology Network (OTN) website. Install the Oracle ODBC driver into an Oracle home directory that contains Oracle Net Services. You can obtain Oracle Net Services from the Oracle Client or Oracle Database CD. You install Oracle Net Services to obtain the Net Configuration Assistant and Net Manager; these allow you to create a net configuration in the tnsnames.ora file. For more information about installing the networking products needed to connect to an Oracle database, see the installation guide for your Oracle Database release.

2.2.4.3.1 Creating Microsoft Access XML Files.

To prepare for capturing a Microsoft Access database, the Exporter for Microsoft Access tool must be run, either automatically or manually, as explained in Section 2.2.5, "Capturing the Source Database". This tool is packaged as a Microsoft Access MDE file, and it allows you to export the Microsoft Access MDB file to an XML file. Do not modify any of the files created by the Exporter tool. Each Microsoft Access database that you selected is exported to an XML file. The Exporter tool currently does not support creating XML files from secured or replica databases.

2.2.4.4 Before Migrating From MySQL.
To configure a MySQL database for migration, install MySQL Connector/J release 3.1.12 or 5.0.4 on the system where you have installed SQL Developer and set the appropriate SQL Developer preference. Follow these steps: Ensure that you can connect to the MySQL database from the system where you have installed SQL Developer. Ensure that you have downloaded the MySQL Connector/J API from the MySQL website at mysql/ . In SQL Developer, if you have not already installed the MySQL JDBC driver using Check for Updates (on the Help menu), do the following: Click Tools, then Preferences, then Database, then Third Party JDBC Drivers. Click Add Entry. Select the jar file for the MySQL driver you downloaded from mysql/ . Ensure that the source database is accessible by the MySQL user that is used by SQL Developer for the source connection. This user must be able to see any objects to be captured in the MySQL database; objects that the user cannot see are not captured. For example, if the user can execute a stored procedure but does not have sufficient privileges to see the source code, the stored procedure cannot be captured.

2.2.4.5 Before Migrating From Teradata.

Note that for the current release of SQL Developer, the following Teradata objects will not be migrated to Oracle: procedures, functions, triggers, views, macros, and BTEQ scripts. To configure a Teradata database for migration: Ensure that the source database is accessible by the Teradata database user that is used by SQL Developer for the source connection. This user must be able to see any objects to be captured in the Teradata database; objects that the user cannot see are not captured. Ensure that you can connect to the Teradata database from the system where you have installed SQL Developer. Ensure that you have downloaded the tdgssconfig.jar and terajdbc4.jar files from Teradata. In SQL Developer, do the following: Click Tools, then Preferences, then Database, then Third Party JDBC Drivers.
Click Add Entry. Select the tdgssconfig.jar file. Repeat steps b through d for the terajdbc4.jar file.

2.2.5 Capturing the Source Database.

Before migrating a third-party database, you must extract information from the database. This information is a representation of the structure of the source database, and it is called the captured model. The process of extracting the information from the database is called capturing the source database. The capture can be done online or offline: Online capture is performed in a convenient guided sequence as part of migrating using the Migration Wizard. Offline capture involves creating a script that you run later, as explained in Section 2.2.5.1, "Offline Capture". You can use offline capture with IBM DB2, MySQL, Microsoft SQL Server, and Sybase Adaptive Server databases. After capturing the source database, you can view the source database information in the captured model in SQL Developer. If necessary, you can modify the captured model and change data type mappings. Oracle recommends that you do not change the default data type mappings unless you are an experienced Oracle database administrator.

2.2.5.1 Offline Capture.

To perform an offline capture of an IBM DB2, MySQL, Microsoft SQL Server, or Sybase Adaptive Server database, you create a set of offline capture scripts, run these scripts outside SQL Developer to create the script output (a dump of the third-party metadata tables), and load the script output (the .ocp file containing the converted model) using SQL Developer. To create the script file (a Windows .bat file or a Linux or UNIX .sh file) and related files, click Tools, then Migration, then Create Database Capture Scripts. When this operation completes, you are notified that several files (.bat, .sql, .ocp) have been created, one of which is the controlling script.
You must run the controlling script (outside SQL Developer) to populate the object capture properties (.ocp) file with information about the converted model. To load the converted model from the object capture properties (.ocp) file generated by the offline capture controlling script, click Tools, then Migration, then Third Party Database Offline Capture, then Load Database Capture Script Output.

2.2.5.1.1 IBM DB2 Offline Capture Notes.

Script files and the db2_x.ocp file are generated in the target folder. The main script is startDump.xxx, which you must execute to produce the schema dump. The script files prompt you for the database name, user name, and password, and they use this information to connect to the local DB2 database. The scripts generate the schema dump for database objects within object-specific folders. To capture the schema information in offline file format, use a command in the following format (with the db2 executable in the run path): To export the schema data in offline file format, use a command in the following format (with the db2 executable in the run path): For DB2 version 9 data export: For DB2 version 8 data export: DB2 version 9 supports LOB data in separate files, which is better for migrating large data sizes. With version 8, to support large LOB data, you must modify the Oracle .ctl file command and the db2 command in unload_script.bat or unload_script.sh. The table data is exported to files with names in the format <catalog>.<schema>.<table>.dat. The format of each file is as follows: data1<COL_DEL>data2<COL_DEL>…<ROW_DEL>, where COL_DEL and ROW_DEL come from the migration offline preference settings. Before you execute the DB2 data dump script, you must log in by entering a command in the following format: You can then execute the script using the logged-in connection session.

2.2.6 Creating and Customizing the Converted Model.

After you capture a third-party database, the next step is to convert it, creating the converted model.
The converted model is a representation of the structure of the destination database. SQL Developer creates the converted model using the information from the captured model. By default, all procedures, functions, triggers, and views are copied to the converted model during translation and translated to Oracle PL/SQL. However, if translation fails for any of the objects, those objects appear in the converted model but their original SQL code remains unchanged. Objects that remain in their original SQL code will not be used when the generation scripts are created. Therefore, to have any such objects migrated, you must either fix the problem in the original SQL code before generating the script or edit the generated script to replace the original SQL code with valid PL/SQL code. The conversion of the captured model to a converted model is done as part of migrating using the Migration Wizard. You can specify or accept the defaults for data mappings. The following topic describes how to modify the converted model, if this becomes necessary:

2.2.6.1 Correcting Errors in the Converted Model.

If error messages with the prefix Parse Exception are listed in the migration log, manual intervention is required to resolve the issues. To complete the converted model: Note the converted model schema object that failed. Select that schema object in the converted model. Copy the schema object's DDL and paste it into the translation scratch editor (displayed by clicking Migration, then Translation Scratch Editor). Inspect the properties of the schema object in the translation scratch editor for possible causes of the error. Modify a property of the schema object in the translation scratch editor. For example, you might comment out one line of a stored procedure. Translate using the appropriate translator. If the error appears again, repeat steps 2 to 6. If the error cannot be resolved in this way, it is best to modify the object manually in the converted model.
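As an illustration of the kind of manual fix Section 2.2.6.1 describes (this is a hypothetical example, not taken from the product documentation), a T-SQL construct that fails translation can often be rewritten by hand into equivalent PL/SQL and pasted back into the converted model:

```sql
-- Hypothetical T-SQL fragment that might fail to translate:
--   SELECT @emp_count = COUNT(*) FROM emp WHERE dept_id = @dept
--
-- Hand-written PL/SQL equivalent (table and variable names illustrative):
DECLARE
  emp_count NUMBER;
  dept      NUMBER := 10;  -- illustrative value
BEGIN
  SELECT COUNT(*) INTO emp_count
    FROM emp
   WHERE dept_id = dept;
END;
/
```

The point of the scratch-editor cycle is exactly this: isolate the failing construct, rewrite it, retranslate, and repeat until the object parses.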
2.2.7 Generating the DDL for the Oracle Schema Objects.

To generate the DDL statements that create the Oracle schema objects, you must already have created the captured model and the converted model. After you generate the DDL, you can run the DDL statements to create the objects in the Oracle database. At this point, the database schema is migrated to Oracle. After you generate and run the DDL statements to migrate the schema objects, you can migrate the data from the original source database, as explained in Section 2.2.8.

2.2.8 Migrating the Data.

The Migration Wizard lets you choose whether to migrate (move) any existing data from the source database to the Oracle database. If you choose to migrate the data: If you are performing the migration in online mode, you can perform the data migration in online or offline mode. (However, for PostgreSQL migrations, the data migration must be performed in online mode.) If you are performing the migration in offline mode, the data migration is included in the generated files. Online data moves are suitable for small data sets, whereas offline data moves are useful for moving large volumes of data.

2.2.8.1 Transferring the Data Offline.

To transfer the data offline, you generate and use scripts to copy data from the source database to the destination database. During this process you must: Use SQL Developer to generate the data unload scripts for the source database and corresponding data load scripts for the destination database. Run the data unload scripts to create data files from the source database, using the appropriate procedure for your source database. For Teradata, perform the offline data move using BTEQ and SQL*Loader. Run the data load scripts using SQL*Loader to populate the destination database with the data from these data files, as described in Section 2.2.8.1.4.

2.2.8.1.1 Creating Data Files From Microsoft SQL Server or Sybase Adaptive Server.
To create data files from a Microsoft SQL Server or Sybase Adaptive Server database: Copy the contents of the directory where SQL Developer generated the data unload scripts onto the computer where the source database is installed. Edit the BCP extract script to include the name of the source database server. On Windows, edit the unload_script.bat script to alter the bcp lines to include the appropriate variables. The following shows a line from a sample unload_script.bat script: Run the BCP extract script. On Windows, enter: This script creates the data files in the current directory. Copy the data files and scripts, if necessary, to the target Oracle database system, or to a system that has access to the target Oracle database and has SQL*Loader (Oracle Client) installed.

2.2.8.1.2 Creating Data Files From Microsoft Access.

To create data files from a Microsoft Access database, use the Exporter for Microsoft Access tool. For information about how to create data files from a Microsoft Access database, see the online help for the Exporter tool.

2.2.8.1.3 Creating Data Files From MySQL.

To create data files from a MySQL database: Copy the contents of the directory where SQL Developer generated the data unload scripts, if necessary, onto the system where the source database is installed or a system that has access to the source database and has the mysqldump tool installed. Edit the unload_script script to include the correct host, user name, password, and destination directory for the data files. On Windows, edit the unload_script.bat script. On Linux or UNIX, edit the unload_script.sh script. The following shows a line from a sample unload_script.bat script: Edit this line to include the correct values for USERNAME, PASSWORD, and DESTINATION PATH. Do not include the angle brackets in the edited version of this file. In this command line, localhost indicates a loopback connection, which is required by the -T option.
(See the mysqldump documentation for more information.) On Windows, enter: On Linux or UNIX, enter: This script creates the data files in the current directory. Copy the data files and scripts, if necessary, to the target Oracle database system, or to a system that has access to the target Oracle database and has SQL*Loader (Oracle Client) installed.

2.2.8.1.4 Populating the Destination Database Using the Data Files.

To populate the destination database using the data files, you run the data load scripts using SQL*Loader: Navigate to the directory where you created the data unload scripts. Edit the oracle_ctl.bat (Windows systems) or oracle_ctl.sh (Linux or UNIX systems) file to provide the appropriate user name and password strings. Run the SQL Load script. On Windows, enter: On Linux or UNIX, enter: For Microsoft SQL Server and Sybase migrations, if you are inserting into BLOB fields with SQL*Loader, you will receive the following error: To handle situations indicated by this error, you can use either of the following options: Enable the Generate Stored Procedure for Migrate Blobs Offline SQL Developer preference (see Migration: Generation Options). Alternatively, load the data (which is in hex format) into an additional CLOB field and then convert the CLOB to a BLOB through a PL/SQL procedure. The only way to export binary data properly through the Microsoft SQL Server or Sybase Adaptive Server BCP is to export it in hexadecimal (hex) format; however, to get the hex values into Oracle, you save them in a CLOB (which holds text) column and then convert the hex values to binary values and insert them into the BLOB column. The problem is that the HEXTORAW function in Oracle converts a maximum of 2000 hex pairs. Consequently, you must write your own procedure that converts your hex data to binary piece by piece. (In the following steps and examples, modify START.SQL and FINISH.SQL to reflect your environment.)
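The piece-by-piece conversion described above, working around the 2000-hex-pair limit of HEXTORAW, might be sketched as a PL/SQL procedure like the following. This is an illustrative reconstruction, not the actual start.sql/finish.sql code; the table and column names are assumptions, and the BLOB column is assumed to have been initialized with EMPTY_BLOB():

```sql
-- Sketch: convert hex text staged in a CLOB column into a BLOB column
-- in chunks, because HEXTORAW handles at most 2000 hex pairs at a time.
CREATE OR REPLACE PROCEDURE hex_clob_to_blob(p_id NUMBER) IS
  v_clob  CLOB;
  v_blob  BLOB;
  v_raw   RAW(1000);
  v_len   PLS_INTEGER;
  v_pos   PLS_INTEGER := 1;
  c_chunk CONSTANT PLS_INTEGER := 2000;  -- hex chars per piece = 1000 bytes
BEGIN
  SELECT hex_data, bin_data INTO v_clob, v_blob
    FROM t_images                          -- table/columns illustrative
   WHERE id = p_id
     FOR UPDATE;
  v_len := DBMS_LOB.GETLENGTH(v_clob);
  WHILE v_pos <= v_len LOOP
    v_raw := HEXTORAW(DBMS_LOB.SUBSTR(v_clob, c_chunk, v_pos));
    DBMS_LOB.WRITEAPPEND(v_blob, UTL_RAW.LENGTH(v_raw), v_raw);
    v_pos := v_pos + c_chunk;
  END LOOP;
END;
/
```

After the conversion succeeds for all rows, the staging CLOB column can be dropped.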
The following shows the code for two scripts, start.sql and finish.sql, that implement this workaround. Read the comments in the code, and modify any SQL statements as needed to reflect your environment and your needs. After you run start.sql and before you run finish.sql, run BCP; and before you run BCP, change the relevant line in the .ctl file from:

2.2.9 Making Queries Case Insensitive.

With several third-party databases, it is common for queries to be case insensitive. For example, in such cases the following queries return the same results: If you want queries to be case insensitive for a user in the Oracle database, you can create an AFTER LOGON ON DATABASE trigger in which you set, for that database user, the NLS_SORT session parameter to an Oracle sort name with _CI (for "case insensitive") appended. The following example causes queries for user SMITH to use the German sort order and to be case insensitive:

2.2.10 Testing the Oracle Database.

During the testing phase, you test the application and Oracle database to make sure that the: Migrated data is complete and accurate. Applications function in the same way as with the source database. Oracle database produces the same results as the source database. Applications and Oracle database meet the operational and performance requirements. You may already have a collection of unit tests and system tests from the original application that you can use to test the Oracle database. You should run these tests in the same way that you ran tests against the source database. However, regardless of added features, you should ensure that the application connects to the Oracle database and that the SQL statements it issues produce the correct results. The tests that you run against the application vary depending on the scope of the application. Oracle recommends that you thoroughly test each SQL statement that is changed in the application.
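The trigger example referenced in Section 2.2.9 above is not reproduced in this text. A sketch consistent with its description (German sort order, case insensitive, for user SMITH) would be roughly:

```sql
-- Sketch: per-user case-insensitive comparisons at logon.
-- NLS_COMP = LINGUISTIC is added here so that WHERE-clause comparisons,
-- not only ORDER BY, honor the case-insensitive sort; verify this
-- against your Oracle release.
CREATE OR REPLACE TRIGGER set_nls_on_logon
  AFTER LOGON ON DATABASE
BEGIN
  IF SYS_CONTEXT('USERENV', 'SESSION_USER') = 'SMITH' THEN
    EXECUTE IMMEDIATE 'ALTER SESSION SET NLS_SORT = GERMAN_CI';
    EXECUTE IMMEDIATE 'ALTER SESSION SET NLS_COMP = LINGUISTIC';
  END IF;
END;
/
```

With these settings in effect, queries such as WHERE name = 'smith' and WHERE name = 'SMITH' return the same rows for that user.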
You should also test the system to make sure that the application functions the same way as with the third-party database.

2.2.10.1 Testing Methodology.

Many constraints shape the style and amount of testing that you perform on a database. Testing can include one or all of the following: Simple data validation. A full life cycle of testing addressing individual unit tests. System and acceptance testing. You should follow a strategy for testing that suits your organization and circumstances. Your strategy should define the process by which you test the migrated application and Oracle database. A typical test method is the V-model, a staged approach in which each phase of the database creation is mirrored by a testing phase. Figure 2-1, "V-model with a Database Migration" shows an example of the V-model in a database migration scenario: Figure 2-1 V-model with a Database Migration. There are several types of tests that you use during the migration process. During the testing stage, you go through several cycles of testing to enhance the quality of the database. The test cases you use should make sure that any issues encountered in a previous version of the Oracle database are not introduced again. For example, if you have to make changes to the migrated schema based on test results, you may need to create a new version of the Oracle database schema. In practice, you use SQL Developer to create a baseline Oracle schema at the start of testing and then edit this schema as you progress with testing. Oracle recommends that you track issues that you find during a testing cycle in an issue tracking system. Track these issues against the version of the database or application that you are testing.

2.2.10.2 Testing the Oracle Database.

Use the test cases to verify that the Oracle database provides the same business logic results as the source database. Oracle recommends that you define completion criteria so that you can determine the success of the migration.
This procedure explains one way of testing the migrated database. Other methods are available and may be more appropriate to your business requirements. To test the Oracle database: Create a controlled version of the migrated database. Oracle recommends that you keep the database migration scripts in a source control system. Design a set of test cases that you can use to test the Oracle database from unit to system level. The test cases should ensure the following: All the users in the source database have migrated successfully. Privileges and grants for users are correct. Tables have the correct structure, defaults are functioning correctly, and errors did not occur during mapping or generation. Validate that the data migrated successfully by comparing the number of rows in the Oracle database with those in the source database, and by calculating the sum of numerical columns in the Oracle database and comparing them with those in the source database. Ensure that the following applies to constraints: You cannot enter duplicate primary keys. Foreign keys prevent you from entering inconsistent data. Check constraints prevent you from entering invalid data. Check that indexes and sequences are created successfully. Ensure that views migrated successfully by comparing the number of rows in the Oracle database with those in the source database, and by calculating the sum of numerical columns in the Oracle database and comparing them with those in the source database. Ensure that triggers, procedures, and functions are migrated successfully. Check that the correct values are returned for triggers and functions. Run the test cases against the migrated database. Create a report that evaluates the test case results. These reports allow you to evaluate the data to qualify the errors, file problem reports, and provide a customer with a controlled version of the database. If the tests pass, go to step 7.
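The row-count and numeric-sum comparisons used in the procedure above can be sketched as paired queries run against both databases; the table and column names here are illustrative:

```sql
-- Run both queries on the source and on the destination database;
-- the results must match for the table to be considered migrated.
SELECT COUNT(*)    FROM emp;   -- row-count check
SELECT SUM(salary) FROM emp;   -- checksum over a numeric column
```

Matching counts and sums are a necessary but not sufficient check; they will not catch character-data truncation, so spot-check text columns as well.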
If all tests in the test cases pass or contain acceptable errors, the test passes. If acceptable errors occur, document them in an error report that you can use for audit purposes. If the test cases fail: Identify the cause of the error. Identify the test cases needed to check the errors. Log an issue against the controlled version of the migrated database code in the problem report. Add the test case and a description of the problem to the incident tracking system of your organization, which could be a spreadsheet or bug reporting system. Aside from the test case, the incident log should include the following: A clear, concise description of the incident encountered. A complete description of the environment, such as platform and source control version. The output of the test, if useful. The frequency and predictability of the incident. The sequence of events leading to the incident. The effect on the current test, diagnostic steps taken, and results noted. Any persistent after-effects. Attempt to fix the errors. Identify acceptance tests that you can use to make sure the Oracle database is at an acceptable quality level.

2.2.10.2.1 Guidelines for Creating Tests.

You may already have a collection of unit tests and system tests from the original application that you can use to test the Oracle database. However, if you do not have any unit or system tests, you need to create them. When creating test cases, use the following guidelines: Plan, specify, and execute the test cases, recording the results of the tests. The amount of testing you perform is proportional to the time and resources that are available for the migration project. Typically, the testing phase in a migration project can take anywhere from 40% to 60% of the effort for the entire project. Identify the components that you are testing, the approach to the test design, and the test completion criteria.
Define each test case so that it is reproducible; a test that is not reproducible is not acceptable for issue tracking or for an audit process. Divide the source database into functions and procedures and create a test case for each function or procedure. In each test case, state what you are going to test, define the testing criteria, and describe the expected results. Record the expected result of each test case, and verify that the actual results meet the expected results for each test. Define test cases that produce negative results as well as those for which you expect a positive result.

2.2.10.2.2 Example of a Unit Test Case.

The following is a sample unit test plan for Windows:

Name: Jane Harrison
Module: Table Test Emp
Date test completed: 23 May 2007
Coverage log file location: mwb\database\TableTestEmp
Description: This unit test verifies that the emp table was migrated successfully.
Reviewed by: John Smith

Run a row count on the destination database for each table: the count(*) value corresponds to the number of rows in the new Oracle table, and the number of rows in each table must be the same in the source and destination databases. Similarly, run a sum over the salary column: on the destination database, sum(salary) corresponds to the sum of the salaries in the emp table, and the sum for each table must be the same in the source and destination databases.

2.2.11 Deploying the Oracle Database.

Deploying the migrated and tested Oracle database within a business environment can be difficult. Therefore, you may need to consider different rollout strategies depending on your environment. Several rollout strategies are identified for you, but you may use another approach if one is recommended by your organization. During the deployment phase, you move the destination database from a development to a production environment.
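The sample test plan's two checks (row count and sum(salary)) can be sketched as follows. This is an illustrative harness only: SQLite stands in for both the source and the destination database, the emp/salary names follow the sample plan, and the data is invented.

```python
import sqlite3

def table_checksum(conn, table, numeric_col):
    """Row count plus a numeric-column sum, compared between source and target."""
    count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    total = conn.execute(f"SELECT SUM({numeric_col}) FROM {table}").fetchone()[0]
    return count, total

# Two in-memory databases stand in for the source database and the
# migrated Oracle database.
source, target = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE emp (empno INTEGER PRIMARY KEY, salary REAL)")
    db.executemany("INSERT INTO emp VALUES (?, ?)", [(1, 1000.0), (2, 2500.0)])

checksums_match = (table_checksum(source, "emp", "salary")
                   == table_checksum(target, "emp", "salary"))
```

In a real migration the two connections would point at the legacy database and the Oracle destination, and the comparison would be repeated for every migrated table and numeric column.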
A group separate from the migration and testing teams, such as the in-house IT department, may perform the deployment phase. Deployment involves the tasks described in the following sections.

2.2.11.1 Choosing a Rollout Strategy.

The strategy that you use for migrating a third-party database to an Oracle database must take into consideration the users and the type of business that may be affected during the transition period. For example, you may use the Big Bang approach because you do not have enough systems to run the source database and the Oracle database simultaneously. Alternatively, you may want to use the Phased approach to make sure that the system operates correctly in the user environment before it is released to the general user population. You can use one of the following approaches.

2.2.11.1.1 Phased Approach.

Using the Phased approach, you migrate groups of users at different times. You may decide to migrate a department or a subset of the complete user base. The users that you select should represent a cross-section of the complete user base. This approach allows you to profile users as you introduce them to the Oracle database. You can reconfigure the system so that only selected users are affected by the migration, and unscheduled outages affect only a small percentage of the user population. This approach may affect the work of the users you migrate; however, because the number of users is limited, support services are not overloaded with issues. The Phased approach allows you to debug scalability issues as the number of migrated users increases. However, using this approach may mean that you must migrate data to and from legacy systems during the migration process, and the application architecture must support a phased approach.

2.2.11.1.2 Big Bang Approach.

Using the Big Bang approach, you migrate all of the users at the same time.
This approach may cause scheduled outages during the time you are removing the old system, migrating the data, deploying the Oracle system, and testing that the system is operating correctly. It relies on you testing the database on the same scale as the original database. It has the advantage of minimal data conversion and synchronization with the original database, because that database is simply switched off. The disadvantage is that this approach can be labor intensive and disruptive to business activities because of the switchover period needed to install the Oracle database and perform the other migration project tasks.

2.2.11.1.3 Parallel Approach.

Using the Parallel approach, you maintain both the source database and the destination Oracle database simultaneously. To ensure that the application behaves the same way in the production environment for the source database and the destination database, you enter data in both databases and analyze the results. The advantage of this approach is that if problems occur in the destination database, users can continue using the source database. The disadvantage of the Parallel approach is that running and maintaining both the source and the destination database may require more resources and hardware than the other approaches.

2.2.11.2 Deploying the Destination Database.

There are several ways to deploy the destination database. The following task is an example that you should use as a guideline for deploying the destination database. If you have a complex scenario as defined in Table 2-1, Oracle recommends that you complete all of the deployment tasks. However, if you have a simple scenario, you should choose the deployment tasks appropriate to your organization.

Configure the hardware, if necessary. In a large-scale or complex environment, you must design the disk layout to correspond with the database design. If you use redundant disks, align them in stripes that you can increase as the destination database evolves.
You must install and configure the necessary disks, check the memory, and configure the system. Make sure the operating system meets the parameters of the Oracle configuration. Before installing any Oracle software, make sure that you have modified all system parameters. For more information about modifying system parameters, see the relevant installation guide for your platform, such as the Solaris Operating System.

Install the Oracle software. Aside from the Oracle software that allows you to create an Oracle database, you may need to install ancillary software to support the application, such as Extract, Transformation, and Load (ETL) software for data warehousing.

Create the destination database from the source database and migrate the data to the Oracle database. There are several ways of putting the destination database into production after testing it, such as:

- Place the successfully tested database into production. The test system is now the production system.
- Use Oracle Export to extract the destination database from the successfully tested database, and use Oracle Import to create that database within the production environment.
- Use the tested migration scripts to create the Oracle database and populate it with data using SQL*Loader.

Perform the final checks on the destination database and applications. Place the destination database into production using one of the rollout strategies. Perform a final audit: audit the integrity of the data, audit the validity of processes such as backup and recovery, and obtain sign-off for the project, if necessary.

2.3 SQL Developer User Interface for Migration.

If you are performing a database migration, you need to use some migration-specific features in addition to those described in Section 1.3, "SQL Developer User Interface". The user interface includes an additional navigator (Migration Projects), a Migration submenu under Tools, and many smaller changes throughout the interface.
Figure 2-2, "Main Window for a Database Migration" shows the SQL Developer main window with objects reflecting the migration of a Sybase database. It also shows the Migration submenu.

Figure 2-2 Main Window for a Database Migration.

The Connections navigator shows a connection named sybase_15 , which is to the Sybase database to be migrated to Oracle. This connection name also appears in a drop-down control in the upper right area. In the Migration Projects navigator, <repository-connection> after "Projects -" will be the actual connection name for the migration repository. The migration project name is sybase_15_migr . Under the project name are trees (hierarchies) for Captured Database Objects and Converted Database Objects. As an alternative to using the SQL Developer graphical interface for migration tasks, you can use the command line, which is explained in Section 1.24, "Command-Line Interface for SQL Developer".

2.3.1 Migration Submenu.

The Migration submenu contains options related to migrating third-party databases to Oracle. To display the Migration submenu, click Tools , then Migration .

Migrate : Displays a wizard for performing an efficient migration. The wizard displays steps and options relevant to your specified migration.
Scan Application : Displays the Application Migration wizard.
Scratch Editor : Displays the translation scratch editor, which is explained in Section 2.3.5.
Create Database Capture Scripts : Specifies options for creating script files, including an offline capture properties (.ocp) file, which you can later load and run.
Repository Management : Enables you to create (associate) or delete a migration repository, disconnect from the current repository (which deactivates the current repository but does not disconnect from the database), or truncate (remove all data from) the repository.

2.3.2 Other Menus: Migration Items.
The View menu has the following item related to database migration:

Migration Projects : Displays the Migration Projects navigator, which includes any captured models and converted models in the currently selected migration repository.

2.3.3 Migration Preferences.

The SQL Developer user preferences window (displayed by clicking Tools , then Preferences ) contains a Migration pane with several related subpanes, and a Translation pane with a Translation Preferences subpane. For information about these preferences, click Help in the pane, or see Section 1.21.11, "Migration".

2.3.4 Migration Log Panes.

Migration Log : Contains errors, warnings, and informational messages relating to migration operations.
Logging Page : Contains an entry for each migration-related operation.
Data Editor Log : Contains entries when data is being manipulated by SQL Developer. For example, the output of a Microsoft Excel import operation is reported here as a series of INSERT statements.

2.3.5 Using the Translation Scratch Editor.

You can use the translation scratch editor to enter third-party database SQL statements and have them translated to Oracle PL/SQL statements. You can specify translation from Microsoft SQL Server T-SQL to PL/SQL, from Sybase T-SQL to PL/SQL, or from Microsoft Access SQL to PL/SQL. You can display the scratch editor by clicking Tools , then Migration , then Translation Scratch Editor . The scratch editor consists of two SQL Worksheet windows side by side, as shown in the following figure. To translate a statement to its Oracle equivalent, enter the third-party SQL statement or statements; select the specific translation from the Translator drop-down (for example, Access SQL to PL/SQL ) and, optionally, the applicable schema from the Captured Schema drop-down; then click the Translate ( >> ) icon to display the generated PL/SQL statement or statements. SQL keywords are automatically highlighted.
For a Microsoft SQL Server or Sybase Adaptive Server connection, the worksheet does not support running T-SQL statements. It only supports SELECT, CREATE, INSERT, UPDATE, DELETE, and DROP statements. The first time you save the contents of either worksheet window in the translation scratch editor, you are prompted for the file location and name. If you perform any subsequent Save operations (regardless of whether you have erased or changed the content of the window), the contents are saved to the same file. To save the contents to a different file, click File , then Save As . For detailed information about the worksheet windows, see Section 1.8, "Using the SQL Worksheet". 2.4 Command-Line Interface for Migration. As an alternative to using the SQL Developer graphical interface for migration operations, you can use the command-line interface, explained in Section 1.24, "Command-Line Interface for SQL Developer". For a walk-through of a typical migration, go to the sqldeveloper\sqldeveloper\bin folder and enter the following command:

Data conversion testing strategy

How to test data migration procedure?

We are migrating our production data (DB and filesystem) from a fairly complex data model to another fairly complex data model. The migration process is given by the specification as a set of mapping rules and transformation functions. I'm planning the testing strategy for this migration process. To identify the riskiest areas here, I've read some lessons learned from migration testing and ETL tools testing in general and compared them with our migration process specification.

Where the things may go wrong?

- For backward compatibility reasons, the new data will be provided by the new application to the old modules through adapters. Old modules will do their job using migrated data as they used to.
- The new data model will support some new application features.
- The new data model will no longer support some old (usually infrequently used) application features.
- Nullable fields in the old model are no longer nullable.
- Files in the filesystem are defined in a different format.
- Data that used to be decoupled in the old system will now be joined (e.g., customers with their projects) based on automatic rules or administrator rules.
- New data formats will be used.
- Values for data used by new features may be incorrectly defaulted.
- The adapter may translate new data to the legacy format incorrectly.
- Not all entities may be migrated; e.g., some old client configurations cannot be found in the new system.
- Not all data (properties) may be migrated.
- The new data model may use data formats that are unable to store some old data structures (e.g., VARCHAR vs. BLOB).
- Migrated files may not be readable by the old modules, etc.

My very first idea is to combine two testing strategies here:

A low-level testing strategy that will be relatively thorough but will have little coverage. The idea is to isolate a most representative sample of the original data (one that makes sense for the business, e.g., a single customer and its documents).
Then, for each mapping rule, define the real expected output for the corresponding part of the sample data, and automate the checks.

A high-level testing strategy that will be less thorough but will cover the majority of application functionality and try to identify integration problems with the old modules. The idea is to migrate all data first, then perform user test scenarios that involve all functionality of the new application and of the old modules.

What other strategies would you suggest to discover bugs for the identified risks?

Having been involved in a few data migrations myself, I'd say you have a pretty good start down the right track. Creating baselines of expected behavior prior to the migration and comparing them after the migration will be useful, but as you mentioned, a number of things are expected to change, so the results will not be exactly the same and there will be some manual effort involved to compare the results and ensure they meet expectations. As you already outlined, you plan on running regression tests against the application to ensure that the functionality has not been broken. Hopefully you already have automated tests or a well-defined manual test suite for most of this. Some additional things that I would watch out for, from my own personal experience: make sure you take a look at the current edge cases and that they are covered in the migration. It is easy for a developer to handle all of the data that is normally expected; however, there are always edge cases where data for particular customers or particular configurations is stored differently, and you need to ensure that those special cases are handled properly as well. (I once found an issue that would have caused 200,000 people in a "special" state, out of millions, to be locked out of their accounts during a migration.)
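The low-level strategy described in the question (one automated check per mapping rule, with hand-written expected output for a representative data sample) can be sketched like this. The full_name splitting rule and the sample record are entirely hypothetical; a real suite would have one such check per rule in the migration specification.

```python
def split_full_name(record):
    """Hypothetical mapping rule: the old model's single full_name field is
    split into first_name / last_name in the new model."""
    first, _, last = record["full_name"].partition(" ")
    return {"first_name": first, "last_name": last}

# A representative sample record plus the hand-written expected output
# that a domain expert would sign off on.
sample = {"full_name": "Ada Lovelace"}
expected = {"first_name": "Ada", "last_name": "Lovelace"}

rule_passes = split_full_name(sample) == expected
```

Because each rule gets its own function and its own expected output, a failing check points directly at the mapping rule that was implemented incorrectly.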
If the system will be online during the migration, (I'm thinking of a migration that may take hours or even days or weeks) make sure that after the initial pass, the cleanup process to get all of the data since the start of the migration is just as robust as the original process. It's easy for developers to spend less time on this part, but it is just as critical. It is also harder to test if you are only testing with a small subset of the data. If at all possible, go through a full migration with the entire dataset as a dry-run prior to the migration. Even if you can't test every piece of data, this will give developers a lot of information and they may well find that some of their migration scripts errored out on certain data that they will need to analyze a bit more closely to ensure they handle correctly. Then go to town on your regression testing. There is a set of tools from Red-Gate that allow you to do data comparisons between databases. Even if the data is stored differently, you can compare the results of specific queries, so if you know that a query in the old system should be equivalent to a query in the new system you can compare them that way. Migration can be a messy process to test, and some testers might approach it half-heartedly. It is good to see that you are taking it seriously. I might think about how migration could be impacted by configuration settings. If your product has per-customer configuration settings that could impact migration, you might consider how to test an appropriate sample of settings. Your definition of "appropriate" may depend on factors such as who your customers are, which settings are most important, risky, or complicated. Other questions in this forum discuss approaches to combinatorial testing. There's some really good advice on this thread. The only thing I can add is possibly a little off-topic but still pertinent: make sure you are fresh when the production environment gets migrated. 
I worked on a project a while ago that switched over at four in the morning, and the testers had all worked a 9-5 day and then turned up onsite for the migration at midnight. When we came to do our sanity tests, our concentration was completely shot and we ended up missing some nasty issues!

I gained a lot of information from the question as well as from all the answers. In the question, under the heading "Where the things may go wrong?" (or "What may go wrong"), I feel that, apart from the functional perspective, the impact on the performance of the system should also be considered, as you mentioned that the migration involves changes to a complex data model as well as the file system.

Some great answers above, and I think every one is right. The more directions you come at this from, the better. Have you investigated some of the semantic profiling tools like Rever? (rever.eu). They have a case study of a similar transformation problem, but because their tools analyse the code as well as the schema, they can show the metadata structures of the transformation logic for a logical review of the code before the expense of the physical testing recommended above. This is in addition to a metadata view of source and target based on code and schema; as we all know, many of the edge cases will only be visible in the code, not the schema.

Is there any business statistics functionality in the old and in the new system? If so, you can compute the same statistics in both versions and compare them. For example, the number of customers should be the same, as well as the revenue of the best and the worst customer. These can act as a kind of checksum that makes sure that no critical value is lost.

Testing migration requires good analysis of the current and the target data models and the mapping. Make sure all the data fields are being mapped. You also need to check whether a data field has been broken down into multiple fields, or whether two or more fields have been combined into one.
Check for what has been eliminated, as it may no longer be required. There are ETL tools available now that can help you with testing, but when I did a similar project a long time back, I wrote my own tool, which connected to both databases, extracted the structure, and asked the user to map the data from old to new. It analyzed both structures based on the user input and displayed the results (error cases).
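The business-statistics "checksum" idea suggested in one of the answers above can be sketched as follows. Everything here is illustrative: the customer/revenue fields are hypothetical, and in practice each side's statistics would come from queries against the old and new systems rather than in-memory lists.

```python
def business_stats(customers):
    """Compute business statistics that should be invariant across the migration."""
    revenues = [c["revenue"] for c in customers]
    return {
        "customer_count": len(customers),
        "best_revenue": max(revenues),   # revenue of the best customer
        "worst_revenue": min(revenues),  # revenue of the worst customer
    }

# Hypothetical extracts from the old and the new system.
old_system = [{"id": 1, "revenue": 900.0}, {"id": 2, "revenue": 150.0}]
new_system = [{"id": 1, "revenue": 900.0}, {"id": 2, "revenue": 150.0}]

stats_match = business_stats(old_system) == business_stats(new_system)
```

A mismatch in any of these aggregates is a cheap early signal that some critical value was lost or corrupted, without comparing every row.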