
Aspect (computer programming)


In computer programming, an aspect of a program is a feature that is linked to many other parts of the program but is not related to the program's primary function. An aspect crosscuts the program's core concerns and therefore violates the separation of concerns, which tries to keep unrelated functions encapsulated apart. For example, logging code can crosscut many modules, yet the aspect of logging should be kept separate from the functional concerns of the modules it crosscuts. Isolating aspects such as logging and persistence from business logic is at the core of the aspect-oriented programming (AOP) paradigm.[1]
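A minimal sketch of this idea in plain Java follows; it is not from the article, and the names (AccountService, deposit, LoggingHandler) are hypothetical. It uses JDK dynamic proxies, one mainstream way to attach a cross-cutting logging concern to business code without a dedicated AOP language such as AspectJ: the business class contains no logging code, and the logging logic lives in exactly one place.

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Method;
    import java.lang.reflect.Proxy;

    // Core concern: business logic only, with no logging inside it.
    interface AccountService {
        void deposit(double amount);
    }

    class AccountServiceImpl implements AccountService {
        public void deposit(double amount) {
            System.out.println("deposited " + amount);
        }
    }

    // Cross-cutting concern: logging, written once instead of being
    // scattered across every module it applies to.
    class LoggingHandler implements InvocationHandler {
        private final Object target;

        LoggingHandler(Object target) { this.target = target; }

        public Object invoke(Object proxy, Method method, Object[] args)
                throws Throwable {
            System.out.println("LOG: entering " + method.getName());
            Object result = method.invoke(target, args);
            System.out.println("LOG: leaving " + method.getName());
            return result;
        }
    }

    public class LoggingAspectDemo {
        public static void main(String[] args) {
            // The proxy applies the logging concern to every call
            // without the business code being touched.
            AccountService service = (AccountService) Proxy.newProxyInstance(
                    AccountService.class.getClassLoader(),
                    new Class<?>[] { AccountService.class },
                    new LoggingHandler(new AccountServiceImpl()));
            service.deposit(100.0);
        }
    }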

Aspect-orientation is not limited to programming, since it is also useful for identifying, analysing, tracing and modularising concerns during requirements elicitation, specification, and design. Aspects can be multi-dimensional, allowing both functional and non-functional behaviour to crosscut any other concerns, rather than only mapping non-functional concerns onto functional requirements.[citation needed]

One view of aspect-oriented software development is that every major feature of the program, whether a core concern (business logic) or a cross-cutting concern (additional features), is an aspect, and that by weaving them together (a process also called composition) one produces a whole out of the separate aspects. This approach is known as pure aspect programming, but hybrid approaches are more common. Functional concerns can crosscut non-functional or other functional concerns (e.g., the need for more features can harm mobility). A uniform approach to representing and composing concerns, similar to the pure approach in AOP, is termed multidimensional representation.[citation needed]
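The weaving step can be illustrated, again as a hypothetical sketch rather than anything from the article, by modelling each aspect as a wrapper around the core concern; composing the wrappers "weaves" independently written aspects and business logic into one whole. In dedicated AOP systems such as AspectJ this composition is performed by a weaver at compile or load time; here it is written out by hand in plain Java, and the names (Greeter, greet, WeavingDemo) are invented for illustration.

    import java.util.function.UnaryOperator;

    public class WeavingDemo {
        interface Greeter { String greet(String name); }

        public static void main(String[] args) {
            // Core concern (business logic).
            Greeter core = name -> "Hello, " + name;

            // Two aspects, written independently of each other
            // and of the core concern.
            UnaryOperator<Greeter> logging = next -> name -> {
                System.out.println("LOG: greet(" + name + ")");
                return next.greet(name);
            };
            UnaryOperator<Greeter> timing = next -> name -> {
                long t0 = System.nanoTime();
                String result = next.greet(name);
                System.out.println("TIME: " + (System.nanoTime() - t0) + " ns");
                return result;
            };

            // "Weaving" (composition): the separate aspects and the
            // core concern are combined into a single whole.
            Greeter woven = logging.apply(timing.apply(core));
            System.out.println(woven.greet("Ada"));
        }
    }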

References

  1. Rashid, Awais (2004). Aspect-Oriented Database Systems. Springer. ISBN 3-540-00948-5.