ABSTRACT — The rapid growth of the Android market in various developing countries has driven the demand for higher-
quality applications. Developing Android-based applications presents specific challenges, such as the need for responsive
designs and optimization for devices with diverse specifications. Design patterns like model-view-controller (MVC), model-
view-presenter (MVP), and model-view-viewmodel (MVVM) have become popular approaches to address these issues.
However, studies on the performance of design patterns in Android applications, especially in modern programming
languages like Kotlin, remain limited. This research aims to compare the performance of the MVP and MVVM design
patterns in an Android-based boarding house management application, KosGX. This application utilized Kotlin and featured
an interactive dashboard requiring significant device resources. Testing was conducted by measuring performance across
three key aspects: central processing unit (CPU) usage, memory usage, and system response time. The results of the study
showed that MVVM outperformed MVP in CPU efficiency, with an average usage of 8.92% compared to 11.15%. In
terms of memory usage, MVVM was also slightly more efficient, with an average usage of 121.48 MB compared to 121.55
MB for MVP. However, MVP excelled in response time, averaging 236.88 ms, whereas MVVM reached 252.68 ms. This
study underscores that the choice of design pattern affects application performance. MVVM is more efficient in CPU and
memory usage, while MVP offers better response times. These findings provide valuable insights for developers in selecting
the optimal design pattern based on the specific needs of their applications.
KEYWORDS — Kotlin, CPU Efficiency, Response Time, MVP, MVVM, Memory Usage, Android Profiling.
regarding design pattern usage in Android applications have reported that MVVM offers better CPU utilization and faster response times compared to MVP [17]. However, MVP performs better in memory management. These tests were conducted on a point of sale (PoS) application developed in Java, leaving a research gap for performance evaluations in more modern programming languages.

This study introduced a new approach by evaluating the performance of an application named KosGX, built using Kotlin, through a comparative analysis of the MVP and MVVM design patterns. Unlike previous studies that focused solely on the Java programming language with limited use cases, this study leveraged Kotlin—a modern programming language officially supported by Android—to address the lack of research evaluating performance in newer programming environments. KosGX, a boarding house management application with complex interactive elements, provides a comprehensive testing platform to assess design pattern performance in real-world scenarios. The application serves as a general example of a dashboard-based system, presenting data akin to typical dashboard systems. Memory usage reflects the amount of RAM allocated to the application, making it critical for low-end devices where excessive RAM allocation can degrade performance [18]. System response time (SRT) is a key factor influencing user satisfaction [19].

This study offers three main contributions: empirical evidence, practical implications, and actionable insights. The empirical evidence covers the performance trade-offs between MVP and MVVM in Kotlin-based Android applications, addressing gaps in previous research on contemporary software development. The practical implications highlight the significance of choosing design patterns for resource-constrained software, ensuring more optimal memory usage and shorter response times. The actionable insights help developers improve application quality and user experience through informed architectural decisions.

From a societal perspective, these findings benefit both developers and end-users. For developers, the study equips them with deeper insights into design pattern performance, enabling them to create more efficient and maintainable applications. For end-users, especially those in developing regions who rely on low-end Android devices, optimized applications contribute to smoother user experiences and longer device lifespans. Furthermore, as digital solutions continue to address critical societal needs such as education, healthcare, and financial inclusion, developing high-performance applications becomes a cornerstone for technology-driven progress.

This research aims to identify the design pattern that offers better data presentation and optimal resource utilization for the KosGX application. Significant performance differences indicate that the choice of design pattern is an essential factor in performance testing. Design pattern selection impacts device performance when running applications, making it essential for developers to choose the most suitable pattern to ensure a positive user experience. Application performance plays a vital role in user satisfaction, and poor performance can adversely affect users' perceptions of the application [20]. Consequently, developers must consider system performance as an integral aspect of user experience design.

II. RELATED WORKS

A. DESIGN PATTERN
A design pattern is a solution to common problems encountered during system development, particularly those related to design, code organization, and system efficiency [14]. In mobile applications, system development also requires design patterns to ensure architectural structures are more organized and that the function and purpose of every line of code written by the developer can be easily identified. Officially, Android recommends that applications consist of two layers: the presentation layer and the data layer, with an additional layer acting as an intermediary to facilitate interaction between these two layers [15]. Design patterns are essential in Android system development to create efficient, maintainable applications with high scalability [21]. Various types of design patterns can be applied in Android development, each with its unique characteristics, advantages, and disadvantages. For instance, applying the flyweight design pattern in Android has proven to enhance awareness of memory consumption during mobile application development [22]. One study compared the flyweight pattern with traditional object-oriented programming, showing that the flyweight pattern does not negatively impact memory usage, enabling professional software design without sacrificing efficiency. Another study discussed the importance of improving software quality and reusability in Android systems [23]. As a result, a paper proposed PatRoid, a framework for automatically detecting the presence of design patterns in source code. Preliminary evaluations demonstrated that PatRoid successfully detected 23 gang of four (GoF) design patterns in Android applications.

B. MODEL VIEW VIEWMODEL (MVVM) AND MODEL VIEW PRESENTER (MVP)
The MVVM and MVP architectural patterns are widely used in software development, particularly in Android applications. Both patterns aim to decouple concerns, improving maintainability and testability, but they differ in their implementation approaches and data handling. MVVM is a variation of the MVC architecture designed to achieve a complete separation between the model and the view components [24]. The model is a class containing data, the view represents the application's user interface (UI) and is responsible for displaying information, and the viewmodel handles the application's business logic, providing data streams to the view component without being directly tied to it. In other words, the viewmodel has no knowledge of the existence of a view.

In contrast, MVP is an architecture similar to MVC but with some differences. The workflow begins with the view capturing user input, which is then passed to the presenter [25]. The model contains the data to be displayed, the view represents the application's interface, and the presenter manages all interactions between the model and the view [26]. The presenter retrieves data from the model and delivers it to the view. MVP leverages interface classes, which are empty function definitions that can be extended by other classes. These interfaces are used in both the presenter and view components to implement functions as required, whether to retrieve or send data [27]. Figure 1 illustrates the workflow of the MVP and MVVM architectures.

MVVM excels in terms of modifiability, featuring the lowest modification index, while MVP performs better in terms of raw performance. Both architectures are part of the clean architecture approach.
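To ground the two descriptions, the following Kotlin sketch contrasts the intermediary component of each pattern for a hypothetical dashboard screen. It is illustrative only: the class names, the RoomSummary type, and the repository interface are assumptions, not code from KosGX, and it presumes the AndroidX lifecycle and Kotlin coroutines libraries.

```kotlin
import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch

// Illustrative model types; these are assumptions, not the KosGX domain model.
data class RoomSummary(val name: String, val occupied: Boolean)

interface DashboardRepository {
    fun fetchRoomSummaries(): List<RoomSummary>
}

// MVVM: the viewmodel exposes a data stream and holds no reference to the view.
class DashboardViewModel(private val repository: DashboardRepository) : ViewModel() {

    private val _items = MutableLiveData<List<RoomSummary>>()
    val items: LiveData<List<RoomSummary>> = _items   // the view observes this stream

    fun loadDashboard() {
        // Fetch on a background dispatcher; postValue delivers the result to
        // observers on the main thread.
        viewModelScope.launch(Dispatchers.IO) {
            _items.postValue(repository.fetchRoomSummaries())
        }
    }
}

// MVP: the presenter and the view communicate through interface contracts.
interface DashboardContract {
    interface View {
        fun showDashboard(items: List<RoomSummary>)
    }
    interface Presenter {
        fun loadDashboard()
    }
}

class DashboardPresenter(
    private val view: DashboardContract.View,        // the presenter keeps a view reference
    private val repository: DashboardRepository
) : DashboardContract.Presenter {

    override fun loadDashboard() {
        val items = repository.fetchRoomSummaries()  // pull data from the model
        view.showDashboard(items)                    // push it directly to the view
    }
}
```

The viewmodel above never references the view, matching the MVVM description, while the presenter holds an explicit view reference and drives the UI update itself, matching the MVP workflow.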
Figure 2. Example of KosGX application UI.

scalability, stability, and resource usage [32]. Dummy data, or synthetic data, are displayed on the application's dashboard for testing. Each test case is identified using a specific code format: "TC-X-Y-Z," where X represents the design pattern, Y represents the test type, and Z indicates the test case number. Data for each test case consists of 10 data points. Tables I and II show the test cases for each design pattern.

TABLE I
TEST CASE MVVM

Data Range      | CPU          | Memory       | Response Time
10,000–19,000   | TC-VM-CPU-01 | TC-VM-MEM-01 | TC-VM-RT-01
20,000–29,000   | TC-VM-CPU-02 | TC-VM-MEM-02 | TC-VM-RT-02
30,000–39,000   | TC-VM-CPU-03 | TC-VM-MEM-03 | TC-VM-RT-03
40,000–49,000   | TC-VM-CPU-04 | TC-VM-MEM-04 | TC-VM-RT-04

TABLE II
TEST CASE MVP

Data Range      | CPU         | Memory      | Response Time
10,000–19,000   | TC-P-CPU-01 | TC-P-MEM-01 | TC-P-RT-01
20,000–29,000   | TC-P-CPU-02 | TC-P-MEM-02 | TC-P-RT-02
30,000–39,000   | TC-P-CPU-03 | TC-P-MEM-03 | TC-P-RT-03
40,000–49,000   | TC-P-CPU-04 | TC-P-MEM-04 | TC-P-RT-04
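As an illustration of the "TC-X-Y-Z" scheme in the tables above (not the actual test harness), the sketch below enumerates the four data ranges and builds the corresponding identifiers; the TestCase type and helper names are assumptions introduced for this example.

```kotlin
// Illustrative reconstruction of the "TC-X-Y-Z" test-case naming scheme:
// X = design pattern (VM for MVVM, P for MVP), Y = test type, Z = case number.
data class TestCase(val id: String, val minRecords: Int, val maxRecords: Int)

val dataRanges = listOf(
    10_000 to 19_000,
    20_000 to 29_000,
    30_000 to 39_000,
    40_000 to 49_000
)

fun buildTestCases(pattern: String, metric: String): List<TestCase> =
    dataRanges.mapIndexed { index, (min, max) ->
        TestCase("TC-$pattern-$metric-%02d".format(index + 1), min, max)
    }

fun main() {
    // Produces TC-VM-CPU-01 .. TC-VM-CPU-04, matching Table I.
    buildTestCases("VM", "CPU").forEach { println("${it.id}: ${it.minRecords}-${it.maxRecords} records") }
    // Produces TC-P-RT-01 .. TC-P-RT-04, matching Table II.
    buildTestCases("P", "RT").forEach { println("${it.id}: ${it.minRecords}-${it.maxRecords} records") }
}
```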
C. EXPERIMENT
The testing involves running the application populated with dummy data according to the designated test cases. During testing, the device is connected to Android Profiler to record CPU and memory usage while opening the dashboard. For response time, logs from Android Studio are used to measure the time the application takes to display data. Before testing begins, it is ensured that no other applications are running on the device. The testing was conducted on an Android device, specifically a Samsung Galaxy A52 with a Snapdragon 778G processor, 8 GB of RAM, and the Android 14 operating system.

D. DATA COLLECTION
CPU and memory performance were recorded using Android Profiler, while dashboard response times were collected using the Android activity's built-in methods. Data collection occurs exclusively during the dashboard activity. When the dashboard was opened, CPU and memory usage appeared in the Android Profiler. CPU usage data were summed and averaged, while memory usage was measured at the start, peak, and end of the activity, then averaged.
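To illustrate how a dashboard response time can be captured from Android Studio logs, the sketch below timestamps a hypothetical dashboard activity at creation and writes the elapsed milliseconds to Logcat once the data have been rendered. The class, tag, and method names are illustrative assumptions, not code from KosGX.

```kotlin
import android.os.Bundle
import android.os.SystemClock
import android.util.Log
import androidx.appcompat.app.AppCompatActivity

// Hypothetical dashboard activity used only to illustrate the measurement idea.
class DashboardActivity : AppCompatActivity() {

    private var startMs = 0L

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Mark the start of the dashboard activity.
        startMs = SystemClock.elapsedRealtime()
        // ... set the content view and trigger data loading here ...
    }

    // Call this once the dashboard list/chart has been populated with data.
    private fun onDashboardRendered() {
        val elapsed = SystemClock.elapsedRealtime() - startMs
        // The elapsed value is read from Logcat and recorded as the response time.
        Log.d("ResponseTime", "Dashboard displayed in $elapsed ms")
    }
}
```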
E. DATA ANALYSIS
Quantitative data were analyzed using parametric or nonparametric statistical techniques, depending on the characteristics of the data. Parametric statistics are typically used when the data meet certain assumptions, including normal distribution, linearity, and homogeneity of variance. These assumptions are crucial as they ensure the reliability and validity of tests such as t-tests or ANOVA, which rely on precise mathematical models to evaluate differences or relationships between variables. For instance, the normality of the data distribution is often assessed using tools such as the Shapiro-Wilk test, while homogeneity of variance can be evaluated using Levene's test.

If the data fail to meet these assumptions, nonparametric statistical techniques are used as an alternative. Methods such as the Mann-Whitney U test or Kruskal-Wallis test do not rely on strict assumptions about the underlying data distribution, making them more robust for analyzing skewed or nonlinear data. While nonparametric tests are less sensitive to outliers and irregularities, they may lack the statistical power of parametric tests, meaning that detecting significant effects might require larger sample sizes or more pronounced differences.

In this study, hypotheses were formulated to examine whether there are significant differences in application performance metrics—CPU usage, memory usage, and response time—between the MVVM and MVP design patterns. These hypotheses were tested using appropriate statistical techniques based on the processed research data. For example, an independent sample t-test was applied if the data met parametric criteria, allowing for accurate comparisons of mean differences between the two design patterns. Conversely, if assumptions were violated, the Mann-Whitney U test was used to compare the distributions of performance metrics without relying on normality.

The completion of the data analysis phase provided critical insights into the performance characteristics of each design pattern. The results determined whether observed differences in CPU usage, memory consumption, or response time were statistically significant or merely due to random variation. These findings not only validated the hypotheses but also offered actionable conclusions regarding the suitability of MVVM and MVP for various application scenarios, contributing to a deeper understanding of their performance trade-offs in Android development.

By employing a rigorous and adaptive statistical approach, this study ensured that the analysis was scientifically robust and capable of adapting to the nuances of the data, providing reliable evidence to support the conclusions drawn. This meticulous methodology reinforced the validity of the research findings and emphasized the importance of selecting appropriate statistical techniques tailored to the characteristics of the data.

F. RESULTS AND DISCUSSION
After statistical analysis is completed, the results will yield conclusions about whether there is a significant difference between the MVVM and MVP design patterns in displaying
MVP. In TC-2, with a range of 20,000–29,000 data, the average memory usage was 119.84 MB for MVVM and 120.39 MB for MVP. For TC-3, with 30,000–39,000 data, the average memory usage was 124.10 MB for MVVM and 123.37 MB for MVP. Finally, in TC-4, with 40,000–49,000 data, the average memory usage was 128.96 MB for MVVM and 129.10 MB for MVP. The visualization of the increasing average memory usage is shown in Figure 5, with MVVM represented by a blue line and MVP by a red line. In the memory performance category, MVVM and MVP were closely matched, with MVVM performing slightly better in TC-1, TC-2, and TC-4, while MVP outperformed in TC-3.

Figure 5. Memory performance comparison.

The distribution of memory usage data can be visualized using a boxplot, as shown in Figure 6, with MVVM represented by a blue color and MVP by a red color. The bottommost point represents the lowest memory usage at 107.37 MB for MVVM and 108.63 MB for MVP. The highest memory usage is represented by the topmost point at 137.83 MB for MVVM and 133.97 MB for MVP. The median memory usage is indicated by the line within the box, at 122.70 MB for MVVM and 122.23 MB for MVP. Visually, the MVVM design pattern exhibits a wider data spread and the lowest memory usage, while the MVP design pattern has a narrower data spread.

In TC-1, with a data range of 10,000–19,000, the average response time was 144.79 ms for MVVM and 134.19 ms for MVP. In TC-2, with a data range of 20,000–29,000, the average response time was 217.16 ms for MVVM and 202.28 ms for MVP. In TC-3, with a data range of 30,000–39,000, the average response time was 285.17 ms for MVVM and 270.49 ms for MVP. Finally, in TC-4, with a data range of 40,000–49,000, the average response time was 363.61 ms for MVVM and 340.57 ms for MVP. The visualization of the increasing average response time for each design pattern is shown in Figure 7, with MVVM represented by a blue line and MVP by a red line. MVP performed better in all four test cases, with a relatively close gap between the two design patterns. Graphically, MVP is faster at displaying data compared to MVVM, though the difference in speed between the two design patterns is quite narrow.

The distribution of response time data can be visualized using a boxplot, as shown in Figure 8, with MVVM represented in blue and MVP in red. The lowest response time is at the bottommost point, 116.30 ms for MVVM and 107.70 ms for MVP. The highest response time is at the topmost point, 407.50 ms for MVVM and 390.80 ms for MVP. The median response time is indicated by the line within the box, at 256.50 ms for MVVM and 231.65 ms for MVP. Visually, both design patterns have boxes of the same size, but MVP is slightly more optimal than MVVM.

Figure 8. Boxplot diagram of response time.
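As a consistency check, the overall response-time averages quoted in the abstract follow directly from the four per-test-case averages reported above:

mean response time (MVVM) = (144.79 + 217.16 + 285.17 + 363.61) / 4 = 252.68 ms
mean response time (MVP)  = (134.19 + 202.28 + 270.49 + 340.57) / 4 = 236.88 ms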
TABLE III
NORMALITY TESTING (SHAPIRO-WILK p-VALUES)

Design Pattern | CPU    | Memory | Response Time
MVVM           | 0.986  | 0.674  | 0.1768
MVP            | 0.8933 | 0.1883 | 0.1809
Once the experimental data were collected, the next step was to analyze them. The first analysis involved testing the normality of the data, which determines the type of difference test to use. If the data are normally distributed, a parametric test is applied. Otherwise, a nonparametric test is used.

For CPU usage with MVVM, the p-value was 0.986, which is greater than 0.05, indicating a normal distribution. Similarly, CPU usage for MVP had a p-value of 0.8933, which is also greater than 0.05, confirming normal distribution. For memory usage, MVVM had a p-value of 0.674, while MVP had a p-value of 0.1883. These results indicate normal distribution. For response time, MVVM and MVP had p-values of 0.1768 and 0.1809, respectively, both greater than 0.05, confirming normality.

Thus, all data categories were found to be normally distributed, allowing for parametric testing. Table III summarizes the Shapiro-Wilk normality test results for each performance category.

Homogeneity testing was conducted to determine whether the variance between groups is equal or homogeneous, which is required for an independent t-test. For CPU, the p-value was 0.5931, greater than 0.05, indicating homogeneous variance. For memory, the p-value was 0.9689, also greater than 0.05, confirming homogeneity. For response time, the p-value was 0.756, greater than 0.05, indicating homogeneous variance. Since all data categories had homogeneous variances, an independent t-test was conducted.

With the prerequisites fulfilled, the final testing was conducted to determine whether there was a significant difference between the MVVM and MVP design patterns in
each performance category. The hypotheses for the independent t-test were as follows:
1. null hypothesis (H0): there is no significant difference between the design patterns (retained if p > 0.05);
2. alternative hypothesis (H1): there is a significant difference between the design patterns (accepted if p ≤ 0.05).
For CPU usage, the p-value was 1.866e-10 (less than 0.05), indicating a significant difference between the design patterns. For memory usage, the p-value was 0.9608, greater than 0.05, indicating no significant difference. For response time, the p-value was 0.756, also greater than 0.05, indicating no significant difference. Thus, a significant difference was observed in CPU usage between the design patterns, while no significant difference was found for memory usage and response time. Additionally, the analysis results indicate that in terms of CPU usage, MVVM is more efficient than MVP, with Cohen's d = -1.64, representing a very large and significant effect. Conversely, in response time, MVP demonstrates better performance than MVVM, with Cohen's d = 1.07, which is also a large effect. Meanwhile, in memory usage, the difference between the two design patterns is not significant (Cohen's d = -0.03) with a high p-value, indicating that memory usage between MVVM and MVP is relatively the same. Regarding potential biases, the results of the normality test (Shapiro-Wilk test) and homogeneity of variance test (Levene's test) confirm that the statistical assumptions are met, ensuring that the t-test results are reliable and free from significant bias.
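The effect sizes quoted above can be recomputed from the raw per-run measurements. The snippet below is a minimal sketch, not the authors' analysis script: it computes the pooled-standard-deviation Cohen's d and the Student's t statistic for two independent samples, with placeholder arrays standing in for the recorded CPU percentages.

```kotlin
import kotlin.math.sqrt

// Sample variance with the (n - 1) denominator.
fun variance(x: DoubleArray): Double {
    val mean = x.average()
    return x.fold(0.0) { acc, v -> acc + (v - mean) * (v - mean) } / (x.size - 1)
}

// Pooled standard deviation of two independent samples.
fun pooledSd(a: DoubleArray, b: DoubleArray): Double =
    sqrt(((a.size - 1) * variance(a) + (b.size - 1) * variance(b)) / (a.size + b.size - 2))

// Cohen's d: standardized mean difference (sign follows meanA - meanB).
fun cohensD(a: DoubleArray, b: DoubleArray): Double =
    (a.average() - b.average()) / pooledSd(a, b)

// Student's t statistic for the independent two-sample test with equal variances,
// matching the homogeneity-of-variance assumption checked with Levene's test.
fun studentT(a: DoubleArray, b: DoubleArray): Double =
    (a.average() - b.average()) / (pooledSd(a, b) * sqrt(1.0 / a.size + 1.0 / b.size))

fun main() {
    // Placeholder samples; substitute the per-run CPU percentages recorded for each pattern.
    val mvvmCpu = doubleArrayOf(8.7, 9.1, 8.9, 9.0)
    val mvpCpu = doubleArrayOf(11.0, 11.3, 11.1, 11.2)
    println("Cohen's d = %.2f".format(cohensD(mvvmCpu, mvpCpu)))
    println("t statistic = %.2f".format(studentT(mvvmCpu, mvpCpu)))
}
```

The corresponding p-value comes from the t distribution with n1 + n2 - 2 degrees of freedom, which a statistics library or table supplies.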
B. DISCUSSION
Among the three performance categories—CPU, memory, and response time—only CPU usage showed a significant difference between the MVVM and MVP design patterns. MVVM was superior in CPU usage, with an average usage of 8.92%, compared to 11.15% for MVP. For memory usage, MVVM slightly outperformed MVP with an average of 121.48 MB versus 121.55 MB. However, MVP excelled in response time, with an average of 236.88 ms compared to 252.68 ms for MVVM. This finding differs from previous research [17], which reported that MVVM was superior in both CPU usage and response time, while MVP excelled in memory usage. This discrepancy could be attributed to differences in testing environments, application complexity, or the specific use of Kotlin in this study, which may influence performance metrics. Additionally, variations in how the UI rendering pipeline is managed between different versions of Android could also contribute to these differences, as certain optimizations in UI thread execution may favor one architecture over the other.

Based on the results of the independent sample t-test, only the CPU category showed a significant difference in application performance between the design patterns for Android-based applications developed using Kotlin. This difference is assumed to arise from the distinct ways each design pattern manages data flow and presents data to the user. The key distinction lies in the intermediary component used to handle requests to the model and deliver data to the view: the viewmodel in MVVM and the presenter in MVP. The viewmodel facilitates a reactive data-binding mechanism that reduces the overhead of frequent updates, thereby optimizing CPU usage. This finding is consistent with [16], which highlighted the efficiency of reactive programming paradigms in reducing CPU load. Furthermore, since MVVM leverages LiveData and flows in Kotlin, it offloads computation-heavy operations to background threads more efficiently than MVP, where the presenter actively controls UI updates and might keep unnecessary UI-bound computations within the main thread.

Conversely, MVP demonstrated superior response times, likely due to its simpler data flow architecture, where the presenter acts as a direct channel between the model and the view. This aligns with [28], which observed that minimizing the number of intermediary layers in the data flow can result in faster response times. However, this advantage comes at the cost of higher CPU usage, as the presenter requires more frequent interactions with the view and model components, especially in complex applications.

Interestingly, MVVM's slight advantage in memory usage compared to MVP contradicts previous findings [17], where MVP was reported to excel in this category. A possible explanation is MVVM's more efficient handling of lifecycle-aware components, which reduces the likelihood of memory leaks—a common issue in MVP when managing long-lived presenters. This aligns with [20], which emphasizes the importance of lifecycle-aware components in effectively managing memory consumption.

Overall, these findings underscore the nuanced trade-offs between MVVM and MVP, particularly when applied to Android development using Kotlin. While MVVM demonstrates superior CPU efficiency and marginally better memory usage, MVP offers a more responsive user experience. The choice between the two should consider the application's performance priorities and complexity, as well as the development team's familiarity with each design pattern. Future research could further explore these trade-offs by including more complex scenarios or integrating additional performance metrics, such as energy consumption or maintainability.

This study provides valuable insights into the comparative performance of the MVVM and MVP design patterns in Android applications developed with Kotlin. One of the main advantages of this research is its empirical evaluation of real-world application scenarios, ensuring that the findings are relevant and applicable to modern Android development. Additionally, the use of statistical analysis strengthens the validity of the conclusions, offering developers concrete data to support design pattern selection based on specific performance needs. However, the study also has limitations. The testing was conducted on a single device model, which may not fully capture variations in hardware performance across different Android devices. Furthermore, the study primarily focuses on CPU usage, memory consumption, and response time, while other important factors such as energy efficiency, maintainability, and scalability were not explored in depth. Future research could address these limitations by expanding the scope to include a broader range of devices and additional performance metrics to provide a more comprehensive evaluation of these design patterns.

V. CONCLUSION
This study compared the performance of the MVVM and MVP design patterns in an Android application built with Kotlin, focusing on CPU usage, memory consumption, and response time. The results indicate that MVVM outperforms MVP in CPU efficiency, with an average usage of 8.92% compared to 11.15%. MVVM also demonstrated slightly better memory efficiency (121.48 MB vs. 121.55 MB), though the difference was not statistically significant. In contrast, MVP exhibited
faster response times, averaging 236.88 ms compared to 252.68 ms for MVVM.

These findings suggest that the choice of design pattern should be based on the application's performance priorities. MVVM is more suitable for applications requiring optimized CPU and memory management, particularly those with complex data-binding scenarios or resource constraints. MVP, on the other hand, is preferable for applications demanding real-time responsiveness with minimal latency.

Future research could expand on these findings by incorporating additional metrics such as energy consumption, maintainability, and scalability. Furthermore, testing on a wider range of devices and real-world scenarios could provide deeper insights into the performance of these design patterns under various conditions, ensuring broader applicability of the conclusions drawn from this study.

CONFLICTS OF INTEREST
The authors declare that there is no conflict of interest in the research and preparation of this paper.

AUTHORS' CONTRIBUTIONS
Conceptualization, Fajar Pradana and Raziqa Izza Langundi; methodology, Fajar Pradana; software, Raziqa Izza Langundi; validation, Djoko Pramono; formal analysis, Nur Ida Iriani; investigation, Fajar Pradana; resources, Raziqa Izza Langundi; data curation, Raziqa Izza Langundi; writing—original draft preparation, Fajar Pradana; writing—reviewing and editing, Fajar Pradana; visualization, Raziqa Izza Langundi; supervision, Fajar Pradana; project administration, Nur Ida Iriani; funding acquisition, Nur Ida Iriani.

ACKNOWLEDGMENT
This research was funded by the Faculty of Computer Science, Universitas Brawijaya, Malang.

REFERENCES
[1] International Data Corporation (IDC), "Worldwide smartphone market forecast to grow 6.2% in 2024, fueled by robust growth for Android in emerging markets and China, according to IDC." Access date: 23-Jan-2025. [Online]. Available: https://www.idc.com/getdoc.jsp?containerId=prUS52757624
[2] A. Karapantelakis et al., "Generative AI in mobile networks: A survey," Ann. Telecommun., vol. 79, no. 1–2, pp. 15–33, Feb. 2024, doi: 10.1007/s12243-023-00980-9.
[3] D. Rimawi and S. Zein, "A static analysis of Android source code for design patterns usage," Int. J. Adv. Trends Comput. Sci. Eng., vol. 9, no. 2, pp. 2178–2186, Mar./Apr. 2020, doi: 10.30534/ijatcse/2020/194922020.
[4] S. Papadakis, M. Kalogiannakis, and N. Zaranis, "Educational apps from the Android Google Play for Greek preschoolers: A systematic review," Comput. Educ., vol. 116, pp. 139–160, Jan. 2018, doi: 10.1016/j.compedu.2017.09.007.
[5] J.B. Jorgensen et al., "Variability handling for mobile banking apps on iOS and Android," in 2016 13th Work. IEEE/IFIP Conf. Softw. Archit. (WICSA), 2016, pp. 283–286, doi: 10.1109/WICSA.2016.29.
[6] F.M. Kundi, A. Habib, A. Habib, and M.Z. Asghar, "Android-based health care management system," Int. J. Comput. Sci. Inf. Secur. (IJCSIS), vol. 14, no. 7, pp. 77–87, Jul. 2016.
[7] M. Prakash, U. Gowshika, and T. Ravichandran, "A smart device integrated with an Android for alerting a person's health condition: Internet of things," Indian J. Sci. Technol., vol. 9, no. 6, pp. 1–6, Feb. 2016, doi: 10.17485/ijst/2016/v9i6/69545.
[8] G.H. Prakash et al., "Development and validation of Android mobile application in the management of mental health," Clin. Epidemiol. Glob. Health, vol. 31, pp. 1–7, Jan./Feb. 2025, doi: 10.1016/j.cegh.2024.101894.
[9] W. Li, Y. Zhou, S. Luo, and Y. Dong, "Design factors to improve the consistency and sustainable user experience of responsive interface design," Sustainability, vol. 14, no. 15, pp. 1–26, Aug. 2022, doi: 10.3390/su14159131.
[10] D. Amalfitano, M. Júnior, A.R. Fasolino, and M. Delamaro, "A GUI-based metamorphic testing technique for detecting authentication vulnerabilities in Android mobile apps," J. Syst. Softw., vol. 224, pp. 1–17, Jun. 2025, doi: 10.1016/j.jss.2025.112364.
[11] N. Hoshieah, S. Zein, N. Salleh, and J. Grundy, "A static analysis of Android source code for lifecycle development usage patterns," J. Comput. Sci., vol. 15, no. 1, pp. 92–107, Jan. 2019, doi: 10.3844/jcssp.2019.92.107.
[12] B.S. Panca, S. Mardiyanto, and B. Hendradjaya, "Evaluation of software design pattern on mobile application based service development related to the value of maintainability and modularity," in 2016 Int. Conf. Data Softw. Eng. (ICoDSE), 2016, pp. 1–5, doi: 10.1109/ICODSE.2016.7936132.
[13] B.B. Mayvan, A. Rasoolzadegan, and Z.G. Yazdi, "The state of the art on design patterns: A systematic mapping of the literature," J. Syst. Softw., vol. 125, pp. 93–118, Mar. 2017, doi: 10.1016/j.jss.2016.11.030.
[14] A. Naghdipour, S.M.H. Hasheminejad, and M.R. Keyvanpour, "DPSA: A brief review for design pattern selection approaches," in 2021 26th Int. Comput. Conf. Comput. Soc. Iran (CSICC), 2021, pp. 1–6, doi: 10.1109/CSICC52343.2021.9420629.
[15] D. Panchal, "Comparative study on Android design patterns," Int. Res. J. Eng. Technol., vol. 7, no. 9, pp. 833–840, Sep. 2020.
[16] R.L.B. Baptista, "Framedrop-Mobile Client," M.S. thesis, University of Coimbra, Coimbra, Portugal, 2023.
[17] B. Wisnuadhi, G. Munawar, and U. Wahyu, "Performance comparison of native Android application on MVP and MVVM," in Proc. Int. Semin. Sci. Appl. Technol. (ISSAT 2020), 2020, pp. 276–282, doi: 10.2991/aer.k.201221.047.
[18] M. Willocx, J. Vossaert, and V. Naessens, "Comparing performance parameters of mobile app development strategies," in MOBILESoft '16, Proc. Int. Conf. Mob. Softw. Eng. Syst., 2016, pp. 38–47, doi: 10.1145/2897073.2897092.
[19] R.A. Doherty and P. Sorenson, "Keeping users in the flow: Mapping system responsiveness with user experience," Procedia Manuf., vol. 3, pp. 4384–4391, 2015, doi: 10.1016/j.promfg.2015.07.436.
[20] F. Rösler, A. Nitze, and A. Schmietendorf, "Towards a mobile application performance benchmark," in ICIW 2014, 9th Int. Conf. Internet Web Appl. Serv., 2014, pp. 55–59.
[21] G. Lim, C. Min, and Y.I. Eom, "Enhancing application performance by memory partitioning in Android platforms," in 2013 IEEE Int. Conf. Consum. Electron. (ICCE), 2013, pp. 649–650, doi: 10.1109/ICCE.2013.6487055.
[22] W. Ngaogate, "Applying the Flyweight design pattern to Android application development," ASEAN J. Sci. Technol. Rep. (AJSTR), vol. 26, no. 2, pp. 49–57, Apr.–Jun. 2023, doi: 10.55164/ajstr.v26i2.247607.
[23] D. Rimawi and S. Zein, "A model based approach for Android design patterns detection," in 2019 3rd Int. Symp. Multidiscip. Stud. Innov. Technol. (ISMSIT), 2019, pp. 1–10, doi: 10.1109/ISMSIT.2019.8932921.
[24] R.F. García, "MVVM: Model–view–viewmodel," in iOS Architecture Patterns. Berkeley, CA, USA: Apress, 2023, pp. 145–224.
[25] X. Li, S. Wang, Z. Liu, and G. Wu, "Design and implementation of enterprise web application common framework based on model-view-viewmodel architecture," in 5th Int. Conf. Mechatron. Comput. Technol. Eng. (MCTE 2022), 2022, pp. 1–4, doi: 10.1117/12.2661040.
[26] M.I. Alfathar et al., "Penerapan MVVM (model view viewmodel) pada pengembangan aplikasi bank sampah digital," J. Ris. Apl. Mhs. Inform. (JRAMI), vol. 5, no. 2, pp. 406–414, Apr. 2024, doi: 10.30998/jrami.v5i2.11071.
[27] C.J. Sampayo-Rodríguez, R. González-Ambriz, B.A. Gonzalez-Martinez, and J. Aldana-Herrera, "Processor and memory performance with design patterns in a native Android application," J. Appl. Comput., vol. 6, no. 18, pp. 53–61, Jun. 2022, doi: 10.35429/JCA.2022.18.6.53.61.
[28] F.F. Anhar, M.H.P. Swari, and F.P. Aditiawan, "Analisis perbandingan implementasi clean architecture menggunakan MVP, MVI, dan MVVM pada pengembangan aplikasi Android native," Jupiter, Publ. Ilmu Keteknikan Ind. Tek. Elekt. Inform., vol. 2, no. 2, pp. 181–191, Mar. 2024, doi: 10.61132/jupiter.v2i2.155.
[29] A. Moreno-Azze, D. López-Plaza, F. Alacid, and D. Falcón-Miguel, "Validity and reliability of an iOS mobile application for measuring change of direction across health, performance, and school sports contexts," Appl. Sci., vol. 15, no. 4, pp. 1–11, Feb. 2025, doi: 10.3390/app15041891.
[30] L. Corral, A. Sillitti, and G. Succi, "Mobile multiplatform development: An experiment for performance analysis," Procedia Comput. Sci., vol. 10, pp. 736–743, 2012, doi: 10.1016/j.procs.2012.06.094.
[31] F. Pradana, P. Setyosari, S. Ulfa, and T. Hirashima, "Development of gamification-based e-learning on web design topic," Int. J. Interact. Mob. Technol. (iJIM), vol. 17, no. 3, pp. 21–38, Feb. 2023, doi: 10.3991/ijim.v17i03.36957.
[32] S. Pargaonkar, "A comprehensive review of performance testing methodologies and best practices: Software quality engineering," Int. J. Sci. Res. (IJSR), vol. 12, no. 8, pp. 2008–2014, Aug. 2023, doi: 10.21275/SR23822111402.