A common format for an introduction (but by no means a required format) is to explain the background, explain the thesis, and then outline the structure of the paper.
Conducting the Legal Analysis³
The key to writing a good legal research paper lies in conducting sound legal analysis of the issue that you are writing about. The format laid out below supports a methodical approach to analyzing legal issues. The initial “thinking time” that it takes to move through the steps is crucial to good legal research; it will help you to identify the primary legal issues and relevant facts. In doing so, you will be able to focus your attention on the legal question.
The following procedure for legal analysis is recommended:
1. Determine the Legal Issue
2. Identify the Applicable Law
Identify the convention, treaty, domestic legislation, or item of customary law applicable to the issue
3. Identify the Relevant Facts
Who is involved?
Where did the incident occur?
How did it happen?
Why did it happen?
4. Apply the Law to the Facts
Does the conduct complained of breach a law?
What does the law say?
What do the courts say?
Is there a defence or a reasonable explanation?
5. Conduct your Legal Analysis
Discuss the fundamental nature of the legal conflict and the context in which the matter at issue occurred
Comment on the law, how the event at issue relates to that law, any loopholes or lack of clarity
6. State your Legal Conclusions
³ This portion of the guide was created by Phillip Drew.
INTRODUCTION AND LEARNING OUTCOMES
On completion of this week's activities, students should be able to:
describe similarities and differences between evaluation and research
assess barriers to evaluation
examine methods for conducting an evaluation
MONITORING AND EVALUATION
An important component of being a reflective educator is program evaluation. There are many similarities between program evaluation and research methods; however, evaluation is not necessarily research, nor is research necessarily evaluation.
Boland, D. L. (2015). Program Evaluation. In M. H. Oermann (Ed.), Teaching in nursing and role of the educator (pp. 275-302). New York: Springer Publishing Company.
There are three additional readings on curriculum evaluation approaches in the reference list available through eReserve (Hall 2014; Kesting 2015; Lindemann & Lipsett 2016). Please review these if you are interested in building on your knowledge of the field of evaluation.
Boland (2015, pp. 278-9) cites Wandersman et al.'s (2012) nine principles of empowerment that educators should incorporate into their program evaluations; these are valuable principles to consider when planning and conducting an evaluation. Boland (2015, p. 285) then presents a list of considerations for developing and conducting a program evaluation:
identify and engage stakeholders
clarify goals of the evaluation
assess resources needed for evaluation
design the evaluation
determine appropriate methods of measurement and procedures
develop a work-plan, budget, and timeline for evaluation
collect the data using agreed upon methods and procedures
process and analyse data
interpret and disseminate the results
Given that the purpose of undertaking a program evaluation is to judge the worth and value of a program, and its ability to meet the intended aims, it is vital that the results be acted upon: the evaluation must lead to a modified curriculum where required.
Drawing on similar points of view, the following reading is another look at curriculum evaluation, but with a focus on technology-enhanced learning.
All curricula, including those for the health professions, benefit from timely periodic revision. The revision process, which usually draws on curriculum monitoring evidence but also embraces emerging societal trends, health care innovations and educational practices, is called curriculum renewal. The next reading articulates twelve tips on how to assure dynamic, ongoing curriculum renewal. The overall goal of renewal should be to assure timely, evidence-based curriculum responsiveness to changes in practice, health care, student needs and educational approaches, based on quality research (McLeod & Steinert 2015).
As mentioned in the presentation, evaluation methods draw on research methods for data collection, interpretation and analysis. The following link is to the BetterEvaluation site, which provides more detail on specific processes as well as philosophical approaches and theoretical underpinnings.
BetterEvaluation: An international collaboration to improve evaluation practice and theory by sharing and generating information about options (methods or processes) and approaches
The nine procedural steps promoted by BetterEvaluation are:
1. Decide how decisions about the evaluation will be made
2. Scope the evaluation
3. Develop the Terms of Reference (ToR)
4. Engage the evaluation team
5. Manage development of the evaluation methodology
6. Manage development of the evaluation work plan including logistics
7. Manage implementation of the evaluation(s)
8. Guide production of a quality report
9. Disseminate reports and support use of the evaluation
Evaluation and research often overlap. In the following reading, McKenna & Williams (2017) set out to study near-peer learner and teacher experiences of participating in near-peer learning, and to explore students' engagement beyond the skill being learnt. What they found related to the hidden curriculum of the educational activities. Such information, emanating from educational research, should still be used to inform curriculum renewal.
PUTTING IT ALL TOGETHER!
Many of the programs at Flinders University have layers of curriculum evaluation:
Tutors can gauge class learning and topic understanding through techniques such as the 'muddiest point'
Tutors can engage student feedback on teaching, peer evaluation, or 360-degree review
Topic coordinators can do topic evaluations – to check for consistency across classes and to ensure that ILOs are being met and teachers are delivering the syllabus as intended
Topic coordinators can do learning analytics of student engagement
Topic coordinators are required to review assessment outcomes for students and report aberrations from a normal distribution curve
The university requires SETs (Student Evaluations of Topic/Teaching) to be done
TC/School/College review of SETs
Annual Course advisory committees
Five year course accreditation reviews
Annual university student satisfaction survey (not specific to individual topics though)
Annual national graduate surveys
Any of the aforementioned approaches to evaluation may be employed.
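The distribution check mentioned in the list above (reporting aberrations from a normal distribution of assessment outcomes) can be sketched in code. The following is a minimal illustration only, assuming a simple z-score screen; the function name, the threshold, and the sample scores are all hypothetical and do not reflect any particular institutional procedure.

```python
import statistics

def flag_aberrations(scores, z_threshold=2.0):
    """Crude screen for aberrations from a normal distribution of scores.

    Flags individual scores more than `z_threshold` standard deviations
    from the cohort mean, and notes gross skew via the mean-median gap.
    (Hypothetical helper for illustration, not an institutional tool.)
    """
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    # Scores far from the mean relative to the spread are flagged.
    outliers = [s for s in scores if abs(s - mean) / sd > z_threshold]
    # A large mean-median gap is a rough sign of a skewed distribution.
    skewed = abs(mean - statistics.median(scores)) > 0.5 * sd
    return {"mean": mean, "sd": sd, "outliers": outliers, "skewed": skewed}

cohort = [55, 60, 62, 63, 65, 65, 66, 68, 70, 72, 74, 95]
report = flag_aberrations(cohort)
print(report["outliers"])  # the isolated high score is flagged
```

A real review would use a formal normality test and pedagogical judgement; this sketch only shows the kind of automated first pass a coordinator might run before looking at the results by hand.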