Posts

Post not yet marked as solved
0 Replies
14 Views
Hi, I have been noticing some strange issues with using Core ML models in my app. I am using the whisper.cpp implementation, which has a Core ML option. This speeds up transcription compared to Metal. However, every time I use it, the app size shown in iPhone Settings > General > Storage increases, specifically the "Documents & Data" part; the bundle size stays consistent. The app seems to grow by roughly the size of the Core ML model, and after a few reloads it can reach over 3-4 GB! I thought that maybe the Core ML model (which is in the bundle) is being saved to a file, but I can't see where. I have tried Instruments and Xcode, plus lots of printing of the cache and temp directories, deleting the caches, etc., but with no effect. I have downloaded the app's container from Xcode and inspected it: there are some files stored in the cache, but only a few KB, and even though Settings > Storage shows a few GB, the container is only a few MB.

Please can someone help or give me some guidance on how to figure out why Documents & Data keeps increasing? Where could this storage be that is not in the container downloaded from Xcode? This is the repo I am using: https://github.com/ggerganov/whisper.cpp. The SwiftUI app and the Objective-C app both show the same behaviour when using Core ML. Thanks in advance for any help; I am totally baffled by this behaviour.
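A minimal sketch (standard FileManager APIs only, nothing Whisper.cpp-specific) that walks the sandbox's top-level directories and prints their sizes, which can help pin down whether the growth is inside the container at all:

```swift
import Foundation

/// Minimal sketch: walk the app sandbox and print the total size of each
/// top-level directory, to see where the extra gigabytes are accumulating.
func dumpSandboxSizes() {
    let fm = FileManager.default
    let roots: [(String, URL)] = [
        ("Documents", fm.urls(for: .documentDirectory, in: .userDomainMask)[0]),
        ("Library",   fm.urls(for: .libraryDirectory,  in: .userDomainMask)[0]),
        ("Caches",    fm.urls(for: .cachesDirectory,   in: .userDomainMask)[0]),
        ("tmp",       fm.temporaryDirectory)
    ]

    for (name, root) in roots {
        var total: Int64 = 0
        if let enumerator = fm.enumerator(at: root,
                                          includingPropertiesForKeys: [.totalFileAllocatedSizeKey, .isRegularFileKey]) {
            for case let url as URL in enumerator {
                let values = try? url.resourceValues(forKeys: [.totalFileAllocatedSizeKey, .isRegularFileKey])
                if values?.isRegularFile == true {
                    total += Int64(values?.totalFileAllocatedSize ?? 0)
                }
            }
        }
        print("\(name): \(ByteCountFormatter.string(fromByteCount: total, countStyle: .file)) at \(root.path)")
    }
}
```

If these totals stay in the megabytes while Settings keeps reporting gigabytes, the space being attributed to the app presumably lives outside the part of the container downloadable from Xcode, which would be worth reporting with a sysdiagnose attached.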
Post not yet marked as solved
0 Replies
10 Views
Hi, I am testing a consumable in-app purchase for my app on an iPad. Whenever I select to purchase, it always shows "For testing purposes only. You will not be charged for confirming this purchase." Then I tap the blue Purchase button. It instantly shows "Done", then alerts "You're all set. Your purchase was successful. [Environment: Xcode]". It never asks me to enter a sandbox account. I followed the instructions on the "Testing in-app purchases with sandbox" page, but I cannot find the sandbox account in Settings > App Store. I expected to see [Environment: Sandbox] so I could get the transaction ID for the App Store Server API. My iPadOS version is 17.4.1, my Xcode version is 15.1, and I use StoreKit with a SwiftUI view. Can someone please shed some light on why I always get [Environment: Xcode]? I have googled a lot, and the process to test with the sandbox seems straightforward, but I just cannot get it right. Thank you very much! Kind regards, Shih-Chin Yang
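One thing that commonly produces [Environment: Xcode] is a StoreKit configuration file selected in the scheme (Edit Scheme > Run > Options > StoreKit Configuration); with that set to None and the build run on the device, purchases should go through the sandbox and the sandbox sign-in should appear. For checking at runtime, here is a minimal StoreKit 2 sketch (iOS/iPadOS 16 or later, product ID is a placeholder) that logs which environment the resulting transaction came from:

```swift
import StoreKit

/// Minimal StoreKit 2 sketch: buy a product and log which environment the
/// resulting transaction came from (xcode, sandbox, or production).
/// "productID" is a placeholder for the real consumable's identifier.
func purchaseAndLogEnvironment(productID: String) async throws {
    guard let product = try await Product.products(for: [productID]).first else { return }
    let result = try await product.purchase()
    if case .success(.verified(let transaction)) = result {
        print("Transaction \(transaction.id) environment: \(transaction.environment)")
        await transaction.finish()
    }
}
```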
Post not yet marked as solved
0 Replies
19 Views
Hi guys, I'm investigating a failure to play a low-latency live HLS stream and I'm getting the following error:

<AVPlayerItemErrorLog: 0x30367da10>
#Version: 1.0
#Software: AppleCoreMedia/1.0.0.21L227 (Apple TV; U; CPU OS 17_4 like Mac OS X; en_us)
#Date: 2024/05/17 13:11:46.046
#Fields: date time uri cs-guid s-ip status domain comment cs-iftype
2024/05/17 13:11:16.016 https://s2-h21-nlivell01.cdn.xxxxxx.***/..../xxxx.m3u8 -15410 "CoreMediaErrorDomain" "Low Latency: Server must support http2 ECN and SACK" -
2024/05/17 13:11:17.017 -15410 "CoreMediaErrorDomain" "Invalid server blocking reload behavior for low latency" -
2024/05/17 13:11:17.017

The stream works when loading from a dev server with TLS 1.3, but fails on CDN servers with TLS 1.2. Regular live streams and VOD streams work normally on those CDN servers. I tried to configure TLSv1.2 in Info.plist, but that didn't help. When running nscurl --ats-diagnostics --verbose, it passes for the server with TLS 1.3 but fails for the CDN servers with TLS 1.2 with error Code=-1005 "The network connection was lost."

Is TLS 1.3 required or just recommended? Referring to https://developer.apple.com/documentation/http-live-streaming/enabling-low-latency-http-live-streaming-hls and https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis

Is it possible to configure AVPlayer to skip ECN and SACK validation? Thanks.
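For capturing that log programmatically while comparing the dev origin against the CDN edges, a small sketch using the standard AVPlayerItem error-log notification:

```swift
import AVFoundation

/// Minimal sketch: log new AVPlayerItem error log entries as they arrive,
/// so the CoreMediaErrorDomain events (-15410 etc.) can be collected per server.
func observeErrorLog(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: AVPlayerItem.newErrorLogEntryNotification,
        object: item,
        queue: .main
    ) { note in
        guard let item = note.object as? AVPlayerItem,
              let event = item.errorLog()?.events.last else { return }
        print("HLS error: \(event.errorDomain) \(event.errorStatusCode) \(event.errorComment ?? "-") uri: \(event.uri ?? "-")")
    }
}
```

Keep the returned observer token alive for as long as the player item is in use.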
Post not yet marked as solved
0 Replies
15 Views
Our app is a time reporting service with various functions around that. The user checks in at work and checks out when they go home. We thought it would be useful to provide a Live Activity to show how long they have worked for. There are also a couple of other cool things we could do that users would love, but I couldn't find definitive answers to the questions below.

1. We have a geofence-based function that checks the user in when they arrive at work, for example, and checks them out when they go home, so that they don't have to open the app. However, this means that we would need to start and end the Live Activity from within a geofence trigger. Is this possible?

2. It seems that the maximum duration of a Live Activity is 8 hours. Sometimes people work for longer. How would we solve this? I would be fine with 12 hours, since that would cover most cases. Is it somehow possible to go beyond 8 hours, up to 12? If not, is there a callback that "8 hours are up!" so that I could do a final update on the Live Activity, changing it from a counter to "You started working at 09:04"?

3. I have seen that some Live Activities have buttons. It would be neat if the user could check out via a button on the Live Activity. However, since we take the location and call our servers when checking out, we need to be able to use both the location manager and make a network call from the Live Activity. Is this possible?

Thanks in advance, cheers
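On question 1, a rough sketch of what starting and ending a Live Activity from the region-monitoring callbacks could look like with ActivityKit (iOS 16.2 APIs). WorkdayAttributes is a made-up attributes type, and whether the system grants enough background runtime at the geofence event to actually make the request is exactly the part that needs verifying:

```swift
import ActivityKit
import CoreLocation

// Hypothetical attributes type, for illustration only.
struct WorkdayAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var checkedInAt: Date
    }
    var workplaceName: String
}

final class CheckInController: NSObject, CLLocationManagerDelegate {
    private var activity: Activity<WorkdayAttributes>?

    // Sketch: start the Live Activity when the geofence is entered...
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        let attributes = WorkdayAttributes(workplaceName: region.identifier)
        let state = WorkdayAttributes.ContentState(checkedInAt: Date())
        activity = try? Activity.request(
            attributes: attributes,
            content: .init(state: state, staleDate: nil)
        )
    }

    // ...and end it when the geofence is exited.
    func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
        guard let activity else { return }
        Task {
            await activity.end(activity.content, dismissalPolicy: .default)
        }
    }
}
```

If the app was relaunched in between, Activity<WorkdayAttributes>.activities can be used to find the running activity instead of relying on the stored reference.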
Post not yet marked as solved
0 Replies
9 Views
Good morning everyone. I have a problem of understanding. My app unlocks content after registration: we offer a trial period of xx days to allow new users to use the full version, and at the end of those days, if they want to continue using the full app, they have to register, without any payment. The App Review documentation mentions an xx-day trial period in the Payments section: "Apps without a subscription may offer a free time-based trial period before presenting a full unlock option by configuring a non-consumable IAP item at price level 0 that follows the naming convention: 'XX-day trial.' Prior to the start of the trial period, the app must clearly identify the duration, the content or services that will no longer be accessible at the end of the trial period, and any downstream costs that the user will have to pay for full functionality." Should we also use the IAP method even though there is never any mention of purchase, only registration? Thank you in advance for your help.
Post not yet marked as solved
0 Replies
17 Views
We started to see some crashes in our iOS app a few seconds after launch with the following reason:

Exception Type: EXC_CRASH (SIGILL)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 4 Illegal instruction: 4

The crash occurs when calling [NSPersistentContainer loadPersistentStoresWithCompletionHandler:]. I've looked through the "Understanding the exception types in a crash report" page but couldn't find this type there. Any ideas what it could be?
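Not an answer to the SIGILL itself, but a small sketch (placeholder model name) that logs the store description and any error surfaced by the completion handler, which can help tie the crash to a specific store file or migration:

```swift
import CoreData

/// Minimal sketch: log each store description and any error returned by
/// loadPersistentStores, so crashes can be correlated with a specific store.
/// "Model" is a placeholder for the real model name.
func makeContainer() -> NSPersistentContainer {
    let container = NSPersistentContainer(name: "Model")
    container.loadPersistentStores { description, error in
        print("Loaded store at \(description.url?.path ?? "nil"), type: \(description.type)")
        if let error = error as NSError? {
            print("Store load failed: \(error), userInfo: \(error.userInfo)")
        }
    }
    return container
}
```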
Post not yet marked as solved
0 Replies
7 Views
Hi everyone, I was creating an IAP. While I was filling in the required information, my internet connection went down. When the internet was working again, I created it again (I had not saved it before I lost my connection), and the system reported an error: "The Product ID you entered is already being used by another in-app purchase associated with this app." Question: why can't I reuse a Product ID that I never saved? Please reply to me soon. Thank you so much.
Post not yet marked as solved
0 Replies
8 Views
We have been trying to programmatically send data to Final Cut Pro by using an Apple event, as described in "Sending Data Programmatically to Final Cut Pro":

tell application "Final Cut Pro"
    activate
    open POSIX file "/Users/JohnDoe/Documents/UberMAM/MyEvents.fcpxml"
end tell

This works fine in Script Editor, but we run into problems when trying to do the same in our macOS app. We found interesting information in Workflow Extensions SDK 1.0.2 Release Notes.pdf.

A) Hardened Runtime has "Apple Events" enabled.

B) Info.plist contains NSAppleEventsUsageDescription:

<key>NSAppleEventsUsageDescription</key>
<string>Test string</string>

C) We added the following entitlements:

<key>com.apple.security.scripting-targets</key>
<dict>
    <key>com.apple.FinalCut</key>
    <array>
        <string>com.apple.FinalCut.library.inspection</string>
    </array>
    <key>com.apple.FinalCutTrial</key>
    <array>
        <string>com.apple.FinalCut.library.inspection</string>
    </array>
</dict>
<key>com.apple.security.automation.apple-events</key>
<true/>

With this configuration in place, our app is able to call AppleScript to activate the Final Cut Pro application, but it is unable to open the file. The following error is returned:

Error executing AppleScript: {
    NSAppleScriptErrorAppName = "Final Cut Pro Trial";
    NSAppleScriptErrorBriefMessage = "A privilege violation occurred.";
    NSAppleScriptErrorMessage = "Final Cut Pro Trial got an error: A privilege violation occurred.";
    NSAppleScriptErrorNumber = "-10004";
    NSAppleScriptErrorRange = "NSRange: {56, 64}";
}

Also, there is no prompt asking the user to allow Automation from our app to Final Cut Pro. I am not sure whether the prompt is to be expected when developing an application in Xcode. Our current workaround is to add (or even replace com.apple.security.scripting-targets with) the com.apple.security.temporary-exception.apple-events entitlement, like this:

<key>com.apple.security.temporary-exception.apple-events</key>
<array>
    <string>com.apple.FinalCutTrial</string>
</array>

However, while this approach might work in development, we know it would probably prevent us from publishing the app to the Mac App Store. I think we are missing something obvious. Could you help? :-)
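For comparison, a sketch of how we would drive the same script from Swift while logging the full error dictionary (the file path is a placeholder); error -1743 in that dictionary would mean the user declined the Automation consent, whereas -10004 is the privilege violation above:

```swift
import Foundation

/// Rough sketch: run the same AppleScript from Swift and log the complete
/// error dictionary. The fcpxml path is a placeholder.
func openInFinalCut(fcpxmlPath: String) {
    let source = """
    tell application "Final Cut Pro"
        activate
        open POSIX file "\(fcpxmlPath)"
    end tell
    """
    var errorInfo: NSDictionary?
    guard let script = NSAppleScript(source: source) else { return }
    script.executeAndReturnError(&errorInfo)
    if let errorInfo {
        // -10004 (privilege violation) or -1743 (consent denied) show up here.
        print("AppleScript error: \(errorInfo)")
    }
}
```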
Post not yet marked as solved
0 Replies
9 Views
I recently converted a Core Data app to use CloudKit. Before the conversion, the SQLite file was around 25 MB. After the conversion, the file grew to over one gigabyte. I ran sqlite3_analyzer on the new file and found that a single table, ANSCKRECORDMETADATA, used 95% of the storage: 998 MB. Is this to be expected? I confess that the conversion was a bit bumpy; maybe I did something to create a monstrosity. If this is not expected, should I revert to the old Core Data file, create a new CloudKit container, and repeat the conversion?
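If it helps to keep an eye on the on-device footprint as well, a small sketch that prints the size of each loaded store file and its -wal/-shm companions (standard FileManager APIs):

```swift
import CoreData

/// Minimal sketch: print the size of each loaded store file plus its -wal
/// and -shm companions, to track whether the growth shows up on device too.
func logStoreSizes(for container: NSPersistentContainer) {
    let fm = FileManager.default
    for store in container.persistentStoreCoordinator.persistentStores {
        guard let url = store.url else { continue }
        let files = [url,
                     URL(fileURLWithPath: url.path + "-wal"),
                     URL(fileURLWithPath: url.path + "-shm")]
        for fileURL in files {
            let attributes = try? fm.attributesOfItem(atPath: fileURL.path)
            let bytes = (attributes?[.size] as? NSNumber)?.int64Value ?? 0
            print("\(fileURL.lastPathComponent): \(ByteCountFormatter.string(fromByteCount: bytes, countStyle: .file))")
        }
    }
}
```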
Post not yet marked as solved
0 Replies
14 Views
I have a macOS application and I want to provide an action for it in the Shortcuts app's action list. I was using in-app intent handling to present my intent, created in the .intentdefinition file, as an action in the Shortcuts app, and it was working. However, this action performs a very lightweight task, so I intend to have it implemented as an extension in my Xcode project. Given my minimum deployment target (macOS 11.0), I found that an Intents extension could be used. I have added the Intents extension target to my main application and created an intent using the .intentdefinition file. However, my intent does not appear in the Shortcuts app. I have verified the setup multiple times to ensure I am not missing anything, but the intent is still not present among the Shortcuts actions. I want to know: is this even possible? The Apple documentation only mentions iOS and watchOS apps, and it does not say whether a custom intent created in an Intents extension can be exposed to the Shortcuts app. For macOS 13.0+, I have used an AppIntents extension and am able to achieve this, so I assumed the same should be possible using an Intents extension.
Post not yet marked as solved
0 Replies
22 Views
Hi! I am developing an application that should use ScriptingBridge.framework to interact with another process. First, I created a separate test application, for which I added the Apple Events entitlement via the "Signing & Capabilities" section in Xcode and updated its Info.plist to include "Privacy - AppleEvents Sending Usage Description". While the test app works fine (I see an automation request popup and the process executes as expected), the main application where I want to integrate this functionality gets closed immediately after reaching the code that talks to Scripting Bridge. On its launch, I see the following error message from tccd in Console:

Prompting policy for hardened runtime; service: kTCCServiceAppleEvents requires entitlement com.apple.security.automation.apple-events but it is missing for accessing={TCCDProcess: identifier=<app bundleID>, ..., binary_path=<path to the app's binary>}

I had no such issues with the test app. Moreover, I should mention that the bundle I want to have this functionality in is stored inside another bundle, both the main and the inner bundle are not sandboxed, and the target app has the "Application is agent (UIElement)" key set in Info.plist. Can you suggest any ideas as to why the processes behave so differently despite having pretty much the same build configurations?
Post not yet marked as solved
0 Replies
21 Views
Long story short, I had my app and Watch app already uploaded to the App Store. However, I needed to add WatchConnectivity to have app-to-watch communication. At the beginning, my app bundle ID was com.x and my watch bundle ID was com.x.watchkitapp. While developing Watch Connectivity, I noticed that my apps would not connect unless I changed the watch bundle ID from com.x.watchkitapp to com.x.watch. However, after changing it I cannot submit my bundle anymore. I am getting an "Asset validation failed" error:

Invalid Bundle Identifier. Attempting to change bundle identifier from com.x.watchkitapp to com.x.watch is disallowed for bundle x.app/Watch/WatchX Watch App.app. (ID: 75a4621a-7e28-411d-a2a7-84674e460656)

Any ideas how this could be solved?
Post not yet marked as solved
0 Replies
11 Views
My app uses Core Data to store and synchronise its data. It runs on iOS, iPadOS and now watchOS. The records are stored in the private and shared CloudKit databases. Running the app against the development environment, syncing is fine on all devices. But as soon as I switch to the production environment using TestFlight, the Watch only syncs the shared records, while iPad and iPhone still sync all records (private and shared) correctly. I have run out of ideas where to look for the cause of this problem. I use one module to set up the persistent store for iOS, iPadOS and watchOS. Any ideas are welcome.
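For comparison, this is the general shape of a setup that mirrors both databases into separate stores; the container identifier and file names are placeholders, and the key detail is that only the shared description sets databaseScope = .shared while the private one keeps the default private scope:

```swift
import CoreData
import CloudKit

/// Sketch of an NSPersistentCloudKitContainer that mirrors both the private
/// and the shared CloudKit database. "Model", "Shared.sqlite" and the
/// container identifier are placeholders.
func makeSyncedContainer() -> NSPersistentCloudKitContainer {
    let container = NSPersistentCloudKitContainer(name: "Model")

    guard let privateDescription = container.persistentStoreDescriptions.first else {
        fatalError("Missing default store description")
    }
    privateDescription.cloudKitContainerOptions =
        NSPersistentCloudKitContainerOptions(containerIdentifier: "iCloud.com.example.app")

    let sharedURL = NSPersistentContainer.defaultDirectoryURL().appendingPathComponent("Shared.sqlite")
    let sharedDescription = NSPersistentStoreDescription(url: sharedURL)
    let sharedOptions = NSPersistentCloudKitContainerOptions(containerIdentifier: "iCloud.com.example.app")
    sharedOptions.databaseScope = .shared
    sharedDescription.cloudKitContainerOptions = sharedOptions

    container.persistentStoreDescriptions = [privateDescription, sharedDescription]
    container.loadPersistentStores { description, error in
        if let error { print("Store load error for \(description): \(error)") }
    }
    return container
}
```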
Post not yet marked as solved
0 Replies
10 Views
I am able to see my iPhone in Devices and Simulators, but I am unable to select it as the run destination to run my application. I have macOS Sonoma 14.5, Xcode 15.4, and iOS 17.5 on the iPhone. Here are the things I have tried:

1) Updated all devices and restarted
2) Changed the cable and used a different iPhone
3) Installed Xcode on another Mac, but the problem remains
4) Deleted the derived data and restarted
5) Deleted and reinstalled Xcode

[Edited by Moderator]
Post not yet marked as solved
0 Replies
15 Views
Hello, we're developing a framework that needs to talk to a camera on iOS. We've written a DriverKit extension to enable this, but we're having trouble working out how to distribute it to third parties. We have to specify the application's bundle ID in the DriverKit extension's bundle ID, but as far as I can tell that means that if there are multiple consumers, they each need their own specially built DriverKit extension. For macOS, we can see an entitlement that allows any user client to connect to our DriverKit extension, but from what I can tell, the same doesn't seem to exist for iOS. Am I missing something? Or is it expected that we have to build a new DriverKit extension with a different bundle ID for every app that every third party wants to develop? Let me know if I'm missing too much context; thanks in advance!
Post not yet marked as solved
0 Replies
13 Views
We want to build a multi-person networked application for Vision Pro. The first step is that multiple Vision Pro devices need the same spatial information, so that their world coordinates are consistent. Can we completely copy the spatial information created by one Vision Pro's scan to the other devices?
Post not yet marked as solved
0 Replies
15 Views
Hi, I am using compositional layout in my UICollectionView. For my sections I have added a decoration item, which is a UICollectionReusableView. Everything seems to work fine on all iOS versions except iOS 16, where I'm getting an unexpected crash. The Firebase Crashlytics report says the following:

Fatal Exception: NSInternalInconsistencyException
Invalid parameter not satisfying: sectionIndex < self.solutionBookmarks.count

This happens right after I reload my collection view after fetching data from an API. Any help would be appreciated, since I can't seem to debug this issue at all. Thanks
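For reference, the general pattern we would expect for a section background decoration is sketched below; the view class and element-kind string are placeholders. If the crash fires right after reloading, it may also be worth checking that the data source update and the layout's section provider cannot momentarily disagree about the number of sections:

```swift
import UIKit

/// Sketch of a section background decoration with compositional layout.
/// "SectionBackgroundView" and the element kind string are placeholders.
final class SectionBackgroundView: UICollectionReusableView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .secondarySystemBackground
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}

func makeLayout() -> UICollectionViewCompositionalLayout {
    let layout = UICollectionViewCompositionalLayout { _, _ in
        let size = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1),
                                          heightDimension: .estimated(44))
        let group = NSCollectionLayoutGroup.vertical(layoutSize: size,
                                                     subitems: [NSCollectionLayoutItem(layoutSize: size)])
        let section = NSCollectionLayoutSection(group: group)
        // The decoration item is declared per section...
        section.decorationItems = [
            NSCollectionLayoutDecorationItem.background(elementKind: "section-background")
        ]
        return section
    }
    // ...and its view class is registered on the layout itself.
    layout.register(SectionBackgroundView.self, forDecorationViewOfKind: "section-background")
    return layout
}
```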
Post not yet marked as solved
0 Replies
12 Views
Is it possible to get the process (name, executable location) that triggers a CryptoTokenKit extension security operation, such as signData or decryptData? We are developing smart card middleware for both Windows (minidriver) and macOS (CryptoTokenKit extension). We would like the possibility to configure various parts of our implementation based on the calling process. For example, we would like to cache the PIN code in memory for a particular amount of time that is different for a web browser and an email client. On Windows this can be done, since the minidriver DLL is loaded into the calling application's process; by calling GetCurrentProcess() inside our minidriver, we can figure out which application is using it. On macOS, however, there is a single process that handles all requests from the apps using smart cards, so getting the current process info does not help. Is there a way to identify the calling application somehow?
Post not yet marked as solved
0 Replies
33 Views
Hello, I am developing a private internal Flutter app for our customer, which will not be published on the Apple Store. One of the key features of this app is to collect RF strength metrics to share the user's experience with the network.

For Android, we successfully implemented the required functionality and are able to collect the following metrics:

Signal strength level (0-4)
Signal strength in dBm
RSSI
RSRQ
Cell ID
Location Area Code
Carrier name
Mobile country code
Mobile network code
Radio access technology
Connection status
Duplex mode

However, for iOS we are facing challenges with CoreTelephony, which is not returning the necessary data. We are aware that CoreTelephony is deprecated and are looking for alternatives. We noticed that a lot of the information we need is available via FTMInternal-4. Is there a way to access this data for a private app? Are there any other recommended approaches or frameworks that can be used to gather cellular network information on iOS for an app that won't be distributed via the Apple Store?

My Swift code:

import Foundation
import CoreTelephony

class RfSignalStrengthImpl: RfSignalStrengthApi {
    func getCellularSignalStrength(completion: @escaping (Result<CellularSignalStrength, Error>) -> Void) {
        let networkInfo = CTTelephonyNetworkInfo()
        guard let carrier = networkInfo.serviceSubscriberCellularProviders?.values.first else {
            completion(.failure(NSError(domain: "com.xxxx.yyyy", code: 0, userInfo: [NSLocalizedDescriptionKey: "Carrier not found"])))
            return
        }
        let carrierName = carrier.carrierName ?? "Unknown"
        let mobileCountryCode = carrier.mobileCountryCode ?? "Unknown"
        let mobileNetworkCode = carrier.mobileNetworkCode ?? "Unknown"
        let radioAccessTechnology = networkInfo.serviceCurrentRadioAccessTechnology?.values.first ?? "Unknown"
        var connectionStatus = "Unknown"
        ...
        ...
    }

Thank you for your assistance.
Post not yet marked as solved
1 Reply
34 Views
In Declarative Device Management there is the Get Server Supported Declarations endpoint, which is reached via an MDM check-in request. Is this supposed to return all of the declarations supported by the server, or only the ones that are intended for the device making the request? If my assumption is correct, this seems like a bad choice of naming for the endpoint, and it should be named something more along the lines of "Get Device Declarations". Or am I fundamentally misunderstanding DDM, and should our server send all the declarations we have to the device, with the device controlling them via activations? That seems counter to the pitch around scalability and performance improvements that DDM offers, if we have to send literally everything to the device even when it is known not to be needed; and similarly, if the device doesn't support a declaration but the server does, then obviously(?) the server shouldn't send it to the device.
