Rubrics, and Validity, and Reliability: Oh My!

Main headings in this presentation

● Rubrics, and Validity, and Reliability: Oh My!
● CPPA Members
● Agenda
● Table Discussion
● Rubric Design
● Resources for Rubric Design
● Introduction to Validity
● Table Discussion
● Approaches to Establishing Validity
● Tagging Example
● Table Discussion
● Introduction to Reliability
● Approaches to Establishing Reliability
● Time to Practice
● Criterion Validity
● Content Validity
● Inter-Rater Reliability (extent to which different assessors produce consistent results)
● Inter-Rater Reliability
● Table Discussion: What would you do to increase inter-rater reliability?
● Summary


Language: English · File size: 1.9 MB
File type: PowerPoint slides (pptx) · Slide count: 28
Extracted: 2019/06/05 09:20:35


Source: https://secure.aacte.org/apps/planner/uploads/10/1408/131948/cppa-pre-con-aacte-2016-am—2316.pptx

Slide text content

Rubrics and Validity and Reliability: Oh My!
Pre-conference session of the Committee on Preparation and Professional Accountability (CPPA), AACTE Annual Meeting 2016

CPPA Members
● George Drake, Dean, College of Education and Human Services, Millersville University
● Mark Meyers, Educational Administration Program Director, Xavier University
● Trish Parrish (Committee Chair), Associate Vice President, Academic Affairs, Saint Leo University
● Debbie Rickey, Associate Dean, College of Education, Grand Canyon University
● Carol Ryan, Associate Dean, College of Education and Human Services, Northern Kentucky University
● Jill Shedd, Assistant Dean for Teacher Education, School of Education, Indiana University
● Carol Vukelich (Board Liaison), Interim Dean, College of Education and Human Development, University of Delaware

Agenda
● Welcome and introductions
● Rubric design
● Break
● Table work: improving rubrics
● Application: rubrics for field experiences
● Debrief and wrap-up

Presenter notes (Trish): Our goal for you today is that you will be actively engaged in trying some of the processes we describe. We recognize that, given time limitations, we cannot provide practice with all approaches, so when you return to your institutions, be sure to explore other approaches to find the one that best suits your institution and your needs. Now let's get started with some discussion at your tables. [Advance slide.]

Table Discussion
● What type of assessment system does your EPP use?
● Do you use locally developed instruments as part of your key assessments of teacher candidates?
● Have you followed a formal process to establish the reliability and validity of these instruments?

Presenter notes (Trish to introduce, Mark to conclude): Five minutes. Have table groups discuss these questions. We can circulate to get a feeling for where people are with the topic. This will allow us to make some on-the-fly adjustments as needed.

Rubric Design (slide 5)
Role of rubrics in assessment: formative and summative feedback, transparency of expectations, illustrative value.

Presenter notes (Mark): Rubrics that have been defined and agreed upon by the expected evaluators increase the likelihood of comparable ratings, thus increasing the inter-rater reliability of the instrument. The development process helps clarify the expectations for performance. The resulting well-designed rubrics ensure that the information gathered can be used to make changes in the instruction or the curriculum.

Rubric Design (slide 6)
Criteria for sound rubric development:
● Appropriate — aligns with some aspect of the standards or outcomes that have been articulated for the learner.
● Definable — uses clear, agreed-upon meanings of language and symbols.
● Observable — relates each criterion to a quality of performance that can be perceived.
● Diagnostic — defines distinct levels of candidate performance that permit detection of both strengths and weaknesses.
● Complete — describes the whole of the learning outcome, or makes clear any limits in scope.
● Retrievable — establishes a real-time setting or archival repository so that a sample, non-scored version can be accessed prior to use, during use, and afterward.

Rubric Design (slide 7)
Steps in writing rubrics:
● Select criteria — draw the rubric's criteria from the learning outcomes of your academic program or from the standards of a professional organization; typically a single outcome is reflected by one criterion on the rubric.
● Set the scale — an odd- or even-numbered scale should be chosen deliberately and revisited after piloting; avoid resemblance to a grading scale by avoiding a 4-point scale.
● Label the ratings — give the steps distinctive labels that clearly distinguish them in plain language.
● Identify basic meaning — are the levels measurable and clear? Is the connection to the evaluative criteria clear?
● Describe performance — descriptors should be so clearly written that any learner could work backward and plan a project or paper based on how it will be evaluated with the rubric; for both learners and evaluators, definitions and examples may be appropriate as annotations to the rubric.

Resources for Rubric Design (slide 8)
● National Postsecondary Education Cooperative: http://nces.ed.gov/pubs2005/2005832.pdf
● Rubric Bank at the University of Hawaii: http://www.manoa.hawaii.edu/assessment/resources/rubricbank.htm
● University of Minnesota: http://www.carla.umn.edu/assessment/vac/improvement/p_4.html
● Penn State rubric basics: http://www.schreyerinstitute.psu.edu/pdf/rubricbasics.pdf
● AAC&U VALUE project: http://www.aacu.org/value
● VALUE rubrics: http://www.aacu.org/value/rubrics
● iRubric: http://www.rcampus.com/indexrubric.cfm

Final examination of the rubric:
● The rubric has clear and distinct performance levels.
● The rubric's criteria are observable and independent, yet work together.
● The rubric's descriptors are clear and distinguish between levels.
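The rubric-writing steps described here can be made concrete as a data structure: criteria drawn from outcomes, an ordered scale with distinctive labels, and a performance descriptor in each cell. The following is a minimal, hypothetical sketch (the class names, example criterion, and descriptor wording are ours, not from the session materials):

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str                    # drawn from a learning outcome or standard
    descriptors: dict[str, str]  # rating label -> performance descriptor

@dataclass
class Rubric:
    criteria: list[Criterion]
    scale: list[str]  # ordered rating labels, lowest to highest

    def score(self, ratings: dict[str, str]) -> int:
        """Total a candidate's ratings as positions on the ordered scale."""
        return sum(self.scale.index(ratings[c.name]) for c in self.criteria)

# Hypothetical example: one criterion on a 3-point scale
# (avoiding the grade-like 4-point scale the presenters caution against).
rubric = Rubric(
    criteria=[Criterion(
        name="Developing objectives",
        descriptors={
            "Ineffective": "Objectives do not reflect key concepts of the discipline.",
            "Emerging": "Objectives reflect key concepts but are not aligned with standards.",
            "Target": "Objectives reflect key concepts and align with relevant standards.",
        },
    )],
    scale=["Ineffective", "Emerging", "Target"],
)
print(rubric.score({"Developing objectives": "Emerging"}))  # 1
```

Writing the descriptors as explicit data mirrors the "describe performance" step: a learner could read the cells and work backward to plan the work being assessed.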
Introduction to Validity
● Construct validity — how well a rubric measures what it claims to measure
● Content validity — an estimate of how well the rubric aligns with all elements of a construct
● Criterion validity — correlation with standards
● Face validity — a measure of how representative a rubric is at face value

Presenter notes (Trish; 15 minutes total for validity and the table discussion): Why does establishing the validity of a rubric matter? Define each type of validity. Construct: how well a rubric measures what it claims to measure (https://explorable.com/types-of-validity). Content: an estimate of how well the rubric aligns with all elements of a construct (https://explorable.com/types-of-validity); this is the one that CAEP is most interested in. Criterion: correlation with standards (http://changingminds.org/explanations/research/design/types_validity.htm). Face: a measure of how representative a rubric is at face value (https://explorable.com/types-of-validity).

Table Discussion
Which of the types of validity would be most helpful for locally developed rubrics: construct, content, criterion, or face?

Approaches to Establishing Validity
● Locally established methodology, such as tagging — developed by the EPP, with the rationale provided by the EPP
● Research-based methodology, such as Lawshe — removes the need for the EPP to develop a rationale

Presenter notes (Trish): Reiterate that there is not one right approach for this. We selected tagging because of its relevance in documenting standards, and Lawshe because it is one of the approaches CAEP discusses in its workshops. Give a rubric with one standard — where does it align? Discuss in tables.

Tagging Example
CAEP 1.2: "Providers ensure that candidates use research and evidence to develop an understanding of the teaching profession and use both to measure their P-12 students' progress and their own professional practice."

Sample rubric excerpt — component: Developing objectives
● Ineffective: Lists learning objectives that do not reflect key concepts of the discipline.
● Emerging: Lists learning objectives that reflect key concepts of the discipline but are not aligned with relevant state or national standards.
● …
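The Lawshe method named above quantifies content validity from expert-panel judgments: each panelist rates an item (for example, a rubric criterion) as "essential" or not, and the content validity ratio is CVR = (n_e − N/2) / (N/2), where n_e is the number of "essential" ratings and N is the panel size. A minimal sketch (the function name is ours):

```python
def content_validity_ratio(essential_votes: int, panel_size: int) -> float:
    """Lawshe's CVR: (n_e - N/2) / (N/2), ranging from -1 to +1.

    n_e = panelists rating the item "essential"; N = panel size.
    CVR > 0 means more than half of the panel rated the item essential.
    """
    half = panel_size / 2
    return (essential_votes - half) / half

# Example: 8 of 10 panelists rate a rubric criterion "essential".
print(content_validity_ratio(8, 10))  # 0.6
```

In practice, items whose CVR falls below a critical value for the given panel size are revised or dropped, which is why this approach removes the need for the EPP to construct its own rationale.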
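The outline's later slides turn to inter-rater reliability, defined there as the extent to which different assessors produce consistent results. One standard way to quantify it for two raters (not specified in the slides themselves; this is a common choice) is Cohen's kappa, κ = (p_o − p_e) / (1 − p_e), which corrects raw agreement for agreement expected by chance:

```python
from collections import Counter

def cohens_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    """Cohen's kappa for two raters: chance-corrected agreement.

    p_o = observed proportion of agreement; p_e = agreement expected if
    both raters assigned levels at random according to their own
    marginal rating frequencies.
    """
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[level] * freq_b[level] for level in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Two raters scoring the same five candidate artifacts on a 3-level scale:
a = ["Target", "Emerging", "Emerging", "Ineffective", "Target"]
b = ["Target", "Emerging", "Target", "Ineffective", "Target"]
print(round(cohens_kappa(a, b), 3))  # 0.688
```

Low kappa despite high raw agreement is a signal that raters may be agreeing by chance, which is exactly the situation rubric calibration sessions are meant to fix.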
