Pathfinder network Several psychometric scaling methods start from proximity data and yield structures revealing the underlying organization of the data. Data clustering and multidimensional scaling are two such methods; network scaling, based on graph theory, is a third. Pathfinder networks are derived from proximities for pairs of entities. Proximities can be obtained from similarities, correlations, distances, conditional probabilities, or any other measure of the relationships among entities. The entities are often concepts of some sort, but they can be anything with a pattern of relationships. One example of an undirected Pathfinder network was derived from average similarity ratings of a group of biology graduate students. The Pathfinder algorithm uses two parameters: (1) the q parameter constrains the number of indirect proximities examined in generating the network (a path may have at most q links), and (2) the r parameter defines the Minkowski metric used to compute the weight of a path from the weights of its individual links.
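The pruning rule can be sketched for the common special case q = n − 1, r = ∞, where a link survives only if no indirect path connects its endpoints with a smaller maximum single-link weight (the minimax distance). A minimal sketch, not the reference implementation; the function name and toy weight matrix are illustrative:

```python
import numpy as np

def pathfinder_network(w, eps=1e-9):
    """Sketch of PFnet(q = n-1, r = infinity) for a symmetric distance
    matrix w: keep a link only if no indirect path has a smaller
    maximum-weight step."""
    n = w.shape[0]
    d = w.copy()
    # Floyd-Warshall with min-max composition: d[i, j] becomes the
    # smallest possible value of the largest single link on any path
    # from i to j (the r = infinity path weight).
    for k in range(n):
        for i in range(n):
            for j in range(n):
                d[i, j] = min(d[i, j], max(d[i, k], d[k, j]))
    # An edge survives when its direct weight equals the minimax distance.
    return np.abs(w - d) < eps

# Toy symmetric distance matrix: the 0-2 link (weight 3) is beaten by the
# indirect path 0-1-2, whose largest step is 1, so it is pruned.
w = np.array([[0., 1., 3.],
              [1., 0., 1.],
              [3., 1., 0.]])
adj = pathfinder_network(w)
print(adj)
```

With smaller q the algorithm would only examine paths of up to q links, which generally retains more edges.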
Factor analysis Factor analysis is related to principal component analysis (PCA), but the two are not identical. Latent variable models, including factor analysis, use regression modelling techniques to test hypotheses producing error terms, while PCA is a descriptive statistical technique.[1] There has been significant controversy in the field over the equivalence or otherwise of the two techniques (see exploratory factor analysis versus principal components analysis).[citation needed] Statistical model[edit] Definition[edit] Suppose we have a set of p observable random variables x_1, ..., x_p with means μ_1, ..., μ_p. Suppose that, for some unknown constants l_{ij} and k unobserved random variables F_j, where i = 1, ..., p and j = 1, ..., k with k < p, we have x_i − μ_i = l_{i1}F_1 + ... + l_{ik}F_k + ε_i. Here, the ε_i are independently distributed error terms with zero mean and finite variance, which may not be the same for all i: Var(ε_i) = ψ_i, so that Cov(ε) = Diag(ψ_1, ..., ψ_p) = Ψ and E(ε) = 0. In matrix terms, we have x − μ = LF + ε. If we have n observations, then x has dimensions p × n, L has dimensions p × k, and F has dimensions k × n. Each column of x and F denotes values for one particular observation, and the matrix L does not vary across observations. We also assume that F and ε are independent, E(F) = 0, and Cov(F) = I, the identity matrix.
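Under these assumptions the model implies Cov(x) = LLᵀ + Ψ, which a simulation can check numerically. A minimal sketch assuming μ = 0; the dimensions and random loadings are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
p, k, n = 4, 2, 200_000

L = rng.normal(size=(p, k))          # loading matrix (p x k)
psi = rng.uniform(0.5, 1.5, size=p)  # specific (error) variances

F = rng.normal(size=(k, n))          # common factors: E(F) = 0, Cov(F) = I
eps = rng.normal(size=(p, n)) * np.sqrt(psi)[:, None]  # independent errors
x = L @ F + eps                      # x - mu = L F + eps, with mu = 0

# Model-implied covariance versus the sample covariance of the draws.
implied = L @ L.T + np.diag(psi)
sample = np.cov(x)
print(np.max(np.abs(sample - implied)))  # shrinks as n grows
```

The agreement between the two matrices is what estimation procedures such as maximum likelihood exploit when fitting L and Ψ to observed data.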
Co-citation Figure visualizing co-citation on the left and a refinement of co-citation, Co-citation Proximity Analysis (CPA), on the right. Co-citation, like bibliographic coupling, is a semantic similarity measure for documents that makes use of citation relationships. Co-citation is defined as the frequency with which two documents are cited together by other documents.[1] If at least one other document cites two documents in common, these documents are said to be co-cited. The more co-citations two documents receive, the higher their co-citation strength, and the more likely they are to be semantically related.[1] The figure to the right illustrates the concept of co-citation and a more recent variation that accounts for the placement of citations in the full text of documents. The figure's right image shows a citing document which cites Documents 1, 2 and 3. Over the decades, researchers have proposed variants or enhancements of the original co-citation concept. Considerations[edit]
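Given a 0/1 citation matrix A whose rows are citing documents and columns are cited documents, co-citation strengths are the off-diagonal entries of AᵀA: entry (j, k) counts how many documents cite both j and k. A small sketch with an invented toy matrix:

```python
import numpy as np

# A[i, j] = 1 means citing document i cites document j.
A = np.array([
    [1, 1, 0],   # cites documents 0 and 1 -> co-cites the pair (0, 1)
    [1, 1, 1],   # cites all three documents
    [0, 1, 1],   # cites documents 1 and 2 -> co-cites the pair (1, 2)
])

# Co-citation strength matrix: (A^T A)[j, k] = number of documents
# citing both j and k; the diagonal (citation counts) is zeroed out.
C = A.T @ A
np.fill_diagonal(C, 0)
print(C)
```

Here documents 0 and 1 are co-cited twice, so their co-citation strength (2) is higher than that of the pair (0, 2), which share only one citing document.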
Domain analysis In software engineering, domain analysis, or product line analysis, is the process of analyzing related software systems in a domain to find their common and variable parts. It provides a model of the wider business context for the system. The term was coined in the early 1980s by James Neighbors.[1][2] Domain analysis is the first phase of domain engineering. It is a key method for realizing systematic software reuse.[3] Domain analysis produces domain models using methodologies such as domain-specific languages, feature tables, facet tables, facet templates, and generic architectures, which describe all of the systems in a domain. Several methodologies for domain analysis have been proposed.[4] The products, or "artifacts", of a domain analysis are sometimes object-oriented models (e.g. represented with the Unified Modeling Language (UML)) or data models represented with entity-relationship diagrams (ERD). Domain analysis techniques[edit] References[edit] Neighbors, J.M. See also[edit]
Scientometrics Scientometrics is the study of measuring and analysing science research. In practice, scientometrics is often done using bibliometrics, which is a measurement of the impact of (scientific) publications. Modern scientometrics is mostly based on the work of Derek J. de Solla Price and Eugene Garfield. The latter founded the Institute for Scientific Information, which is heavily used for scientometric analysis. Methods of research include qualitative, quantitative and computational approaches. One significant finding in the field is a principle of cost escalation, to the effect that achieving further findings at a given level of importance grows exponentially more costly in the expenditure of effort and resources. See also[edit] Further reading[edit] Derek J. de Solla Price, Little Science, Big Science (New York, 1963) External links[edit]
Meta-analysis In statistics, meta-analysis comprises statistical methods for contrasting and combining results from different studies, in the hope of identifying patterns among study results, sources of disagreement among those results, or other interesting relationships that may come to light in the context of multiple studies.[1] Meta-analysis can be thought of as "conducting research about previous research." In its simplest form, meta-analysis is done by identifying a common statistical measure that is shared between studies, such as effect size or p-value, and calculating a weighted average of that common measure. This weighting is usually related to the sample sizes of the individual studies, although it can also include other factors, such as study quality. The motivation of a meta-analysis is to aggregate information in order to achieve a higher statistical power for the measure of interest, as opposed to a less precise measure derived from a single study. History[edit] Advantages[edit]
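In the simplest fixed-effect case, the weighted average described above is an inverse-variance weighted mean of the study effect sizes, so more precise studies count for more. A minimal sketch with invented study values:

```python
import numpy as np

# Effect sizes and standard errors from three hypothetical studies.
effects = np.array([0.30, 0.45, 0.25])
se = np.array([0.10, 0.20, 0.15])

# Fixed-effect model: weight each study by the inverse of its variance,
# so larger / more precise studies dominate the pooled estimate.
w = 1.0 / se**2
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print(pooled, pooled_se)
```

The pooled standard error is smaller than any single study's standard error, which is the gain in statistical power the text refers to.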
Social influence Morton Deutsch and Harold Gerard described two psychological needs that lead humans to conform to the expectations of others. These include our need to be right (informational social influence) and our need to be liked (normative social influence).[3] Informational influence (or social proof) is an influence to accept information from another as evidence about reality. Informational influence comes into play when people are uncertain, either because stimuli are intrinsically ambiguous or because there is social disagreement. Types[edit] Social influence is a broad term that relates to many different phenomena. Kelman's varieties[edit] There are three processes of attitude change as defined by Harvard psychologist Herbert Kelman in his 1958 paper in the Journal of Conflict Resolution.[2] The purpose of defining these processes was to help determine the effects of social influence: for example, to separate public conformity (behavior) from private acceptance (personal belief). Status[edit]
Types of research methods and disciplines A dissertation is an extended piece of writing based on comprehensive reading and research, written by an academic scholar at undergraduate, master's or postgraduate level. In some cases, a dissertation refers to an academic research document written at PhD level, while a thesis is one written at master's or undergraduate level. However, the opposite convention also holds in other cases. Etymology[edit] The word ‘dissertation’ derives from the Latin dissertātiō, which means ‘discourse’. Types of Research[edit] There are two types of research which can be done to develop a thesis or dissertation: Practical Research: The practical approach consists of the empirical study of the topic under research and chiefly consists of a hands-on approach. Types of Research Method[edit] Descriptive/Qualitative[edit] This type of research method involves describing specific situations in detail using research tools such as interviews, surveys, and observations.[3]
Interoperability Interoperability is the ability of systems and organizations to work together (inter-operate). While the term was initially defined for information technology or systems engineering services to allow for information exchange,[1] a broader definition takes into account social, political, and organizational factors that impact system-to-system performance.[2] The "interoperability" issue has featured increasingly in U.S. antitrust scholarship over the past two years.[3] Syntactic interoperability[edit] If two or more systems are capable of communicating and exchanging data, they are exhibiting syntactic interoperability. Specified data formats, communication protocols and the like are fundamental. XML and SQL standards are among the tools of syntactic interoperability. Syntactic interoperability is a necessary condition for further interoperability. Semantic interoperability[edit] Cross-domain interoperability[edit] Interoperability and open standards[edit] Open standards[edit]
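As a toy illustration of syntactic interoperability, two independently written systems can exchange data as long as both follow an agreed-upon, specified format; JSON stands in here for such a format, and the record is invented:

```python
import json

# "System A" serializes a record using the agreed format (JSON).
record = {"id": 42, "name": "sensor-7", "reading": 19.5}
wire = json.dumps(record)

# "System B", written independently, can parse the message because both
# sides follow the same syntax rules; whether it also interprets the
# fields' meaning correctly is a question of semantic interoperability.
received = json.loads(wire)
print(received["reading"])
```

The same division of labor applies to XML schemas or SQL standards: they fix the syntax of the exchange, not the shared meaning of its contents.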
Postpositivism In philosophy and models of scientific inquiry, postpositivism (also called postempiricism) is a metatheoretical stance that critiques and amends positivism. While positivists believe that the researcher and the researched person are independent of each other, postpositivists accept that theories, background, knowledge and values of the researcher can influence what is observed.[1] However, like positivists, postpositivists pursue objectivity by recognizing the possible effects of biases.[1] Postpositivists believe that human knowledge is based not on unchallengeable, rock-solid foundations, but rather upon human conjectures. Postpositivists believe that a reality exists, like positivists do, though they hold that it can be known only imperfectly and probabilistically.[1] One of the first thinkers to criticize logical positivism was Sir Karl Popper. Main publications[edit] See also[edit] Notes[edit] Robson, Colin (2002). References[edit] External links[edit]
Social Etymology[edit] Definition[edit] In the absence of agreement about its meaning, the term "social" is used in many different senses and regarded as a fuzzy concept, referring among other things to: The adjective "social" is also used often in political discourse, although its meaning in a context depends heavily on who is using it. Social theorists[edit] In the view of Karl Marx[1] human beings are intrinsically, necessarily and by definition social beings who, beyond being "gregarious creatures", cannot survive and meet their needs other than through social co-operation and association. By contrast, the sociologist Max Weber[1] for example defines human action as "social" if, by virtue of the subjective meanings attached to the action by individuals, it "takes account of the behavior of others, and is thereby oriented in its course". Social in "Socialism"[edit] The modern concept of socialism evolved in response to the development of industrial capitalism. Modern uses[edit] See also[edit]